2 research outputs found

    Feedback Capacity of the First-Order Moving Average Gaussian Channel

    The feedback capacity of the stationary Gaussian additive noise channel has been open, except for the case where the noise is white. Here we find the feedback capacity of the stationary first-order moving average additive Gaussian noise channel in closed form. Specifically, the channel is given by $Y_i = X_i + Z_i$, $i = 1, 2, \ldots$, where the input $\{X_i\}$ satisfies a power constraint and the noise $\{Z_i\}$ is a first-order moving average Gaussian process defined by $Z_i = \alpha U_{i-1} + U_i$, $|\alpha| \le 1$, with white Gaussian innovations $U_i$, $i = 0, 1, \ldots$. We show that the feedback capacity of this channel is $-\log x_0$, where $x_0$ is the unique positive root of the equation $\rho x^2 = (1 - x^2)(1 - |\alpha| x)^2$, and $\rho$ is the ratio of the average input power per transmission to the variance of the noise innovation $U_i$. The optimal coding scheme parallels the simple linear signalling scheme by Schalkwijk and Kailath for the additive white Gaussian noise channel: the transmitter sends a real-valued information-bearing signal at the beginning of communication and subsequently refines the receiver's error by processing the feedback noise signal through a linear stationary first-order autoregressive filter. The resulting error probability of maximum likelihood decoding decays doubly exponentially in the duration of the communication. This feedback capacity of the first-order moving average Gaussian channel is very similar in form to the best known achievable rate for the first-order \emph{autoregressive} Gaussian noise channel studied by Butman, Wolfowitz, and Tiernan, although the optimality of the latter is yet to be established.
    Comment: Updated version, 36 pages, 4 figures, submitted to IEEE Trans. Inform. Theory
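For illustration only, here is a minimal numerical sketch (not from the paper) of how the closed-form expression can be evaluated: it finds the unique positive root $x_0$ of $\rho x^2 = (1 - x^2)(1 - |\alpha| x)^2$ on $(0, 1)$ and returns $-\log x_0$. The function name and the use of SciPy's brentq root finder are my own choices.

```python
import numpy as np
from scipy.optimize import brentq

def ma1_feedback_capacity(rho, alpha):
    """Feedback capacity -log(x0) of the MA(1) Gaussian channel (in nats),
    where x0 is the unique positive root of
        rho * x^2 = (1 - x^2) * (1 - |alpha| * x)^2,
    and rho = average input power / variance of the innovation U_i."""
    a = abs(alpha)
    f = lambda x: rho * x**2 - (1 - x**2) * (1 - a * x)**2
    # f(0) = -1 < 0 and f(1) = rho > 0, so the root lies in (0, 1).
    x0 = brentq(f, 1e-12, 1.0)
    return -np.log(x0)

# Sanity check: for alpha = 0 the noise is white and the value reduces to
# the familiar 0.5 * log(1 + rho).
print(ma1_feedback_capacity(rho=1.0, alpha=0.0), 0.5 * np.log(2.0))
print(ma1_feedback_capacity(rho=1.0, alpha=0.5))
```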

    Concatenated Coding for the AWGN Channel with Noisy Feedback

    Open-loop coding can be extended to a closed-loop concatenated code when the channel has access to feedback, by introducing a feedback transmission scheme as an inner code. In this paper, this approach is investigated for the case where a linear feedback scheme is implemented as the inner code, in particular over an additive white Gaussian noise (AWGN) channel with noisy feedback. We first derive an optimal linear feedback scheme by optimizing over the received signal-to-noise ratio. This optimization yields an asymptotically optimal linear feedback scheme, which is compared to other well-known schemes. The linear feedback scheme is then implemented as the inner code of a concatenated code over the AWGN channel with noisy feedback. This code shows improvements not only in error exponent bounds, but also in bit error rate and frame error rate. It is also shown that if the concatenated code has total blocklength L and the inner code has blocklength N, then the inner code blocklength should scale as N = O(C/R), where C is the capacity of the channel and R is the rate of the concatenated code. Simulations with low-density parity-check (LDPC) and turbo codes are provided to illustrate practical applications and their error-rate benefits.
    Comment: Accepted to IEEE Trans. on Information Theory, January 201
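As a rough illustration of the stated N = O(C/R) scaling (the proportionality constant and function names below are assumptions, not taken from the paper), one might size the inner feedback-code blocklength from the AWGN capacity C and the overall concatenated-code rate R as follows:

```python
import math

def awgn_capacity(snr):
    """Capacity of the real-valued AWGN channel in bits per channel use."""
    return 0.5 * math.log2(1 + snr)

def inner_blocklength(snr, outer_rate, scale=1.0):
    """Hypothetical rule of thumb for the inner (feedback) code blocklength N,
    following the N = O(C / R) scaling; 'scale' is an assumed constant."""
    C = awgn_capacity(snr)
    return max(1, math.ceil(scale * C / outer_rate))

# Example: 10 dB SNR and an overall concatenated-code rate R = 1/2.
snr = 10 ** (10 / 10)
print(inner_blocklength(snr, outer_rate=0.5))
```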