3.2 FIR Architecture
3.2.1 Spatial Temporal Representations
Most often in engineering, the input signals to the neural network have gone through some form of filtering before becoming part of the observation set. This also coincides with the form of potential maintained at the axon hillock region of the neural cell. With this in mind, we may modify Eq. (3.1) by replacing each multiplicative weight with a linear filter:

$$s(t)=\sum_{i=1}^{N}\big(h_i * x_i\big)(t),\qquad y(t)=f\big(s(t)\big), \qquad (3.15)$$

where $h_i$ denotes the impulse response of the filter acting on input $x_i$ and $*$ denotes convolution.
By adding filtering operations, we have included the equally important temporal dimension in the static model. For our purposes, we will now be interested in adapting the filters. To this end, we assume a discrete FIR representation for each filter. This yields

$$s(k)=\sum_{i=1}^{N}\sum_{n=0}^{M} w_i(n)\,x_i(k-n), \qquad (3.16)$$

with $k$ being the discrete time index for some sampling rate $\Delta t$, and $w_i(n)$ being the coefficients of the FIR filters. In the following, we will write the coefficient vector as $\mathbf{w}_i = [w_i(0), w_i(1), \ldots, w_i(M)]$ and the vector of delayed states as $\mathbf{x}_i(k) = [x_i(k), x_i(k-1), \ldots, x_i(k-M)]$. A filter operation can then be written as the vector dot product $\mathbf{w}_i \cdot \mathbf{x}_i(k)$, with time implicitly included in the notation.
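As a concrete illustration, the following is a minimal Python sketch of one synaptic filter as a tap delay line evaluated by the dot product of Eq. (3.16); the class name and coefficient values are illustrative, not from the text.

```python
import numpy as np

class FIRSynapse:
    """One synaptic FIR filter: a tap delay line evaluated as w . x(k), Eq. (3.16)."""

    def __init__(self, w):
        self.w = np.asarray(w, dtype=float)  # coefficients w(0), ..., w(M)
        self.x = np.zeros_like(self.w)       # delayed states, newest first

    def step(self, x_new):
        """Shift the delay line, insert x(k), and return the dot product w . x(k)."""
        self.x = np.roll(self.x, 1)          # age every stored sample by one step
        self.x[0] = x_new                    # insert the current input x(k)
        return float(np.dot(self.w, self.x))

# Example: an order M = 3 synapse driven by a short input sequence.
syn = FIRSynapse([0.5, 0.3, -0.2, 0.1])
for k, sample in enumerate([1.0, 0.0, -1.0, 0.5]):
    print(k, syn.step(sample))
```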
The top part of Figure 3.5 shows a standard representation of an FIR filter as a tap delay line. Although this filter structure models several biological processes, as well as many engineering solutions, for ease of reference to a network of real neurons we will refer to an FIR filter as a synaptic filter or simply a synapse. The output of the neuron is, as before, $y(k) = f(s(k))$ with $f(x) = \tanh(x)$; we have only added the time index $k$.
We use the same approach to network modeling as in the previous section, except that each link in the network is now realized as an FIR filter (see Figure 3.5). The neural network no longer performs a simple static mapping from input to output; internal memory has now been added to it. At the same time, since there are no feedback loops, the overall network is still FIR [2–5]. With the notation summarized in Table 3.2, the summing junction and activation value of neuron $j$ in layer $l$ become

$$s_{j}^{l}(k)=\sum_{i}\mathbf{w}_{ij}^{l}\cdot\mathbf{a}_{i}^{l-1}(k)+b_{j}^{l},\qquad a_{j}^{l}(k)=f\big(s_{j}^{l}(k)\big),$$

where $\mathbf{a}_{i}^{l-1}(k)$ is the vector of current and delayed activation values of neuron $i$ in layer $l-1$. For all filters in a given layer $l$, we will assume that the filter order $M^{l}$ is the same.
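A rough sketch of one forward time step through such a layer follows, assuming $\tanh$ activations and an array layout `W[j, i, :]` for the coefficient vectors $\mathbf{w}_{ij}^{l}$; the names, shapes, and values below are illustrative assumptions, not the book's code.

```python
import numpy as np

rng = np.random.default_rng(0)
N_in, N_out, M = 2, 3, 4                              # fan-in, neurons, filter order
W = rng.normal(scale=0.1, size=(N_out, N_in, M + 1))  # W[j, i, :] = w_ij(0..M)
b = np.zeros(N_out)                                   # bias weights b_j

def layer_step(x_k, buf):
    """One time step: shift delay lines, then a_j(k) = tanh(sum_i w_ij . x_i(k) + b_j)."""
    buf = np.roll(buf, 1, axis=1)                     # age every delayed state
    buf[:, 0] = x_k                                   # insert current inputs x_i(k)
    s = np.einsum('jim,im->j', W, buf) + b            # filter, then sum over inputs
    return np.tanh(s), buf

buf = np.zeros((N_in, M + 1))                         # tap delay lines, newest first
for k in range(5):
    x_k = np.array([np.sin(0.3 * k), np.cos(0.3 * k)])
    a_k, buf = layer_step(x_k, buf)
    print(k, a_k)
```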
3.2.2 Neural Network Unfolding
An interesting and more insightful representation of the FIR network is derived by using a concept known as unfolding in time. The general strategy is to remove all time delays by expanding the network into a larger, equivalent static structure.
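The following toy check illustrates the unfolding idea for a single synapse: running the tap delay line sample by sample gives exactly the same output sequence as applying the coefficients statically to stacked, delayed copies of the input. Variable names are illustrative.

```python
import numpy as np

M = 3
w = np.array([0.5, 0.3, -0.2, 0.1])            # FIR coefficients w(0..M)
x = np.random.default_rng(1).normal(size=10)   # input sequence x(0..9)

# Folded form: evaluate the tap delay line recursively, one sample at a time.
buf = np.zeros(M + 1)
y_fir = []
for sample in x:
    buf = np.roll(buf, 1)
    buf[0] = sample
    y_fir.append(np.dot(w, buf))

# Unfolded form: stack delayed copies of the input (zero-padded at the
# start) and apply the same coefficients as a single static weight vector.
X = np.stack([np.concatenate((np.zeros(n), x))[:x.size] for n in range(M + 1)])
y_static = w @ X

print(np.allclose(y_fir, y_static))            # True: the two forms agree
```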
Figure 3.5 Finite impulse response (FIR) neuron and neural network.
Table 3.2 Finite impulse response (FIR) multi‐layer network notation.
| Symbol | Description |
| --- | --- |
| $\mathbf{w}_{ij}^{l}$ | Vector of FIR filter coefficients connecting neuron $i$ in layer $l-1$ to neuron $j$ in layer $l$ |
| $b_{j}^{l}$ | Bias weight for neuron $j$ in layer $l$ |
| $s_{j}^{l}(k)$ | Summing junction for neuron $j$ in layer $l$ |
| $a_{j}^{l}(k)$ | Activation value of neuron $j$ in layer $l$ |