Artificial Intelligence and Quantum Computing for Advanced Wireless Networks. Savo G. Glisic

$\partial e^{2}(k)/\partial b = -2e(k)\,u(k)$, which yields the standard LMS update $\Delta b(k) = 2\mu e(k)u(k)$. For filter a, we have

      (3.37) $\begin{aligned} \partial e^{2}(k)/\partial a &= -2e(k)\,\partial y(k)/\partial a \\ &= -2e(k)\,\partial\bigl(b_0 u(k) + b_1 u(k-1) + b_2 u(k-2)\bigr)/\partial a \\ &= -2e(k)\bigl[b_0\,\partial u(k)/\partial a + b_1\,\partial u(k-1)/\partial a + b_2\,\partial u(k-2)/\partial a\bigr] \\ &= -2e(k)\bigl[b_0 x(k) + b_1 x(k-1) + b_2 x(k-2)\bigr]. \end{aligned}$

      which yields

      (3.38) $\Delta a(k) = 2\mu e(k)\bigl[b_0 x(k) + b_1 x(k-1) + b_2 x(k-2)\bigr].$

      Each iteration of this update requires approximately 3M multiplications, the product of the orders of the two filters. This computational cost corresponds to the original approach of unfolding a network in time to derive the gradient. Observe, however, that the same products are recomputed at successive iterations. Explicitly writing out the product terms for several iterations, we get

Iteration   Calculation
k           $e(k)\,\bigl[\,\boxed{b_0 x(k)} + b_1 x(k-1) + b_2 x(k-2)\,\bigr]$
k + 1       $e(k+1)\,\bigl[\,b_0 x(k+1) + \boxed{b_1 x(k)} + b_2 x(k-1)\,\bigr]$
k + 2       $e(k+2)\,\bigl[\,b_0 x(k+2) + b_1 x(k+1) + \boxed{b_2 x(k)}\,\bigr]$
k + 3       $e(k+3)\,\bigl[\,b_0 x(k+3) + b_1 x(k+2) + b_2 x(k+1)\,\bigr]$

      Therefore, rather than grouping along the horizontal in the above equations, we may group along the diagonal (boxed terms). Gathering these terms, we get

      (3.39) $\bigl[b_0 e(k) + b_1 e(k+1) + b_2 e(k+2)\bigr]x(k) = \delta(k)\,x(k)$

      where δ(k) is simply the error filtered backward through the second cascaded filter, as illustrated in Figure 3.8. The alternative weight update is thus given by

      (3.40) $\Delta a(k) = 2\mu\,\delta(k)\,x(k).$
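The equivalence of the horizontal grouping of terms in (3.38) and the diagonal grouping in (3.39) can be checked numerically: summed over a data block with b held fixed, both orderings enumerate exactly the same products. A minimal sketch (the coefficient values and random test sequences below are assumptions for illustration, not values from the text):

```python
# Numerical check: horizontal vs. diagonal grouping of the gradient terms.
import random

random.seed(0)
N = 64
b = [0.5, -0.3, 0.2]                             # b0, b1, b2 (illustrative)
e = [random.gauss(0.0, 1.0) for _ in range(N)]   # error sequence e(k)
x = [random.gauss(0.0, 1.0) for _ in range(N)]   # state sequence x(k)

# Horizontal: sum over k of e(k)[b0 x(k) + b1 x(k-1) + b2 x(k-2)]
horizontal = sum(b[i] * e[k] * x[k - i]
                 for k in range(N) for i in range(3) if k - i >= 0)

# Diagonal: sum over k of [b0 e(k) + b1 e(k+1) + b2 e(k+2)] x(k),
# i.e. delta(k) x(k) with the error filtered backward through filter b
diagonal = sum(b[i] * e[k + i] * x[k]
               for k in range(N) for i in range(3) if k + i < N)

assert abs(horizontal - diagonal) < 1e-9
```

Substituting j = k − i in the horizontal sum maps it term by term onto the diagonal sum, which is why the two totals agree to floating-point precision.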

      In general, here we deal with the problem of predicting future samples of a time series using a number of samples from the past [6, 7]. Given M samples of the series, an autoregressive (AR) model is fit to the data as

      (3.41) $y(k) = \sum_{n=1}^{M} a(n)\,y(k-n) + e(k).$

      The model assumes that y(k) is obtained as a weighted sum of the past values of the sequence plus a modeling error e(k). This error represents the difference between the actual series y(k) and the single‐step prediction

      (3.42) $\hat{y}(k) = \sum_{n=1}^{M} a(n)\,y(k-n).$
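The coefficients a(n) in (3.42) can be adapted with the same LMS rule used above for the filter weights. A minimal sketch, in which the order M, the step size mu, and the sinusoidal test series are illustrative assumptions:

```python
# LMS fit of the AR(M) single-step predictor of Eq. (3.42).
import math

M, mu = 2, 0.01
a = [0.0] * M                                   # a(1), ..., a(M)
y = [math.sin(0.3 * k) for k in range(4000)]    # example series (assumed)

errors = []
for k in range(M, len(y)):
    y_hat = sum(a[n] * y[k - 1 - n] for n in range(M))  # prediction, Eq. (3.42)
    e = y[k] - y_hat                                    # modeling error e(k)
    errors.append(abs(e))
    for n in range(M):
        a[n] += 2 * mu * e * y[k - 1 - n]               # LMS coefficient update

# the one-step prediction error shrinks as the coefficients adapt
early = sum(errors[:100]) / 100
late = sum(errors[-100:]) / 100
assert late < 0.5 * early
```

For a pure sinusoid an AR(2) model can predict exactly, so the residual error decays toward zero as the LMS iteration converges.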

      In nonlinear prediction, the model is based on the following nonlinear AR:

      (3.43) $y(k) = g\bigl(y(k-1), y(k-2), \ldots, y(k-M)\bigr) + e(k).$
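A minimal concrete instance of the nonlinear AR model (3.43), with M = 1, e(k) = 0, and g chosen, purely as an assumption for illustration, to be the chaotic logistic map:

```python
# Nonlinear AR(1) example: y(k) = g(y(k-1)) with a logistic-map g.
def g(y_prev):
    return 3.8 * y_prev * (1.0 - y_prev)   # nonlinear regression function

y = [0.4]                                  # initial condition (assumed)
for k in range(1, 200):
    y.append(g(y[k - 1]))                  # y(k) = g(y(k-1)), Eq. (3.43)

# the series stays bounded in (0, 1) but is chaotic; no linear AR model
# reproduces this recursion, while the nonlinear predictor g is exact here
assert all(0.0 < v < 1.0 for v in y)
```

With e(k) = 0 the nonlinear single-step predictor recovers the series exactly, whereas a linear fit of form (3.42) leaves a structured residual.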