6 research outputs found

    A Rate-Splitting Approach to Fading Channels with Imperfect Channel-State Information

    As shown by Médard, the capacity of fading channels with imperfect channel-state information (CSI) can be lower-bounded by assuming a Gaussian channel input X with power P and by upper-bounding the conditional entropy h(X|Y,Ĥ) by the entropy of a Gaussian random variable with variance equal to the linear minimum mean-square error in estimating X from (Y,Ĥ). We demonstrate that, using a rate-splitting approach, this lower bound can be sharpened: by expressing the Gaussian input X as the sum of two independent Gaussian variables X1 and X2 and by applying Médard's lower bound first to bound the mutual information between X1 and Y while treating X2 as noise, and by applying it a second time to the mutual information between X2 and Y while assuming X1 to be known, we obtain a capacity lower bound that is strictly larger than Médard's lower bound. We then generalize this approach to an arbitrary number L of layers, where X is expressed as the sum of L independent Gaussian random variables of respective variances Pℓ, ℓ = 1, …, L, summing up to P. Among all such rate-splitting bounds, we determine the supremum over power allocations Pℓ and total number of layers L. This supremum is achieved for L → ∞ and gives rise to an analytically expressible capacity lower bound. For Gaussian fading, this novel bound is shown to converge to the Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows, provided that the variance of the channel estimation error H − Ĥ tends to zero as the SNR tends to infinity.
    Comment: 28 pages, 8 figures, submitted to IEEE Transactions on Information Theory. Revised according to first round of review.
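    As a rough numerical illustration of the worst-case-noise bound that the abstract above starts from, the sketch below evaluates Médard's lower bound E[log2(1 + |Ĥ|²P / (σ_e²P + σ²))] by Monte Carlo for scalar Rayleigh fading. The error variance σ_e², noise power, and sample size are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    # Scalar Rayleigh fading H ~ CN(0, 1), decomposed as H = Hhat + E with an
    # independent estimation error E of (assumed) variance sigma_e2, so that
    # the estimate Hhat has variance 1 - sigma_e2.
    sigma_e2 = 0.1
    noise_var = 1.0
    hhat = np.sqrt((1 - sigma_e2) / 2) * (
        rng.standard_normal(n) + 1j * rng.standard_normal(n)
    )

    def medard_bound(power):
        """Worst-case-noise (Medard) lower bound in bits per channel use:
        E[ log2(1 + |Hhat|^2 P / (sigma_e^2 P + noise_var)) ]."""
        snr_eff = np.abs(hhat) ** 2 * power / (sigma_e2 * power + noise_var)
        return np.mean(np.log2(1 + snr_eff))

    for p in (1.0, 10.0, 100.0):
        print(f"P = {p:6.1f}: R_M >= {medard_bound(p):.3f} bit/channel use")
    ```

    With σ_e² held fixed, the bound saturates as P grows, which is why the abstract's high-SNR tightness result requires the estimation-error variance to vanish as the SNR tends to infinity.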

    Generalized Nearest Neighbor Decoding

    It is well known that for Gaussian channels, a nearest neighbor decoding rule, which seeks the minimum Euclidean distance between a codeword and the received channel output vector, is the maximum likelihood solution and hence capacity-achieving. Nearest neighbor decoding remains a convenient yet mismatched solution for general channels, and the key message of this paper is that the performance of nearest neighbor decoding can be improved by generalizing its decoding metric to incorporate channel-state-dependent output processing and codeword scaling. Using the generalized mutual information, which is a lower bound to the mismatched capacity under an independent and identically distributed codebook ensemble, as the performance measure, this paper establishes the optimal generalized nearest neighbor decoding rule under Gaussian channel input. Several restricted forms of the generalized nearest neighbor decoding rule are also derived and compared with existing solutions. The results are illustrated through several case studies for fading channels with imperfect receiver channel state information and for channels with quantization effects.
    Comment: 30 pages, 8 figures.
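    To make the decoding rule concrete, here is a toy sketch of nearest neighbor decoding over a small Gaussian codebook, with an optional codeword-scaling factor standing in for the restricted generalizations the abstract mentions. The codebook size, blocklength, and SNR are illustrative assumptions, not parameters from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy setup (assumed values): M codewords of length n from a Gaussian
    # codebook with per-symbol power `snr` (unit-variance noise).
    M, n, snr = 16, 8, 10.0
    codebook = rng.standard_normal((M, n)) * np.sqrt(snr)

    def nearest_neighbor(y, scale=1.0):
        """Decode to the codeword minimizing ||y - scale * x||^2.
        scale = 1.0 is classical nearest neighbor decoding; a codeword
        scaling factor is one of the generalized metrics discussed above."""
        d = np.sum((y[None, :] - scale * codebook) ** 2, axis=1)
        return int(np.argmin(d))

    # Transmit codeword 3 over an AWGN channel, where NN decoding is ML.
    m = 3
    y = codebook[m] + rng.standard_normal(n)
    print(nearest_neighbor(y))
    ```

    For an AWGN channel the rule above coincides with maximum likelihood; the paper's point is that on mismatched channels (imperfect CSI, quantization) a well-chosen output processing and scaling in the metric improves the achievable rate.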

    A rate-splitting approach to fading channels with imperfect channel-state information

    As shown by Médard, the capacity of fading channels with imperfect channel-state information can be lower-bounded by assuming a Gaussian channel input X with power P and by upper-bounding the conditional entropy h(X|Y,Ĥ) by the entropy of a Gaussian random variable with variance equal to the linear minimum mean-square error in estimating X from (Y, Ĥ). We demonstrate that, using a rate-splitting approach, this lower bound can be sharpened: by expressing the Gaussian input X as the sum of two independent Gaussian variables X1 and X2 and by applying Médard's lower bound first to bound the mutual information between X1 and Y while treating X2 as noise, and by applying it a second time to the mutual information between X2 and Y while assuming X1 to be known, we obtain a capacity lower bound that is strictly larger than Médard's lower bound. We then generalize this approach to an arbitrary number L of layers, where X is expressed as the sum of L independent Gaussian random variables of respective variances Pℓ, ℓ = 1, …, L, summing up to P. Among all such rate-splitting bounds, we determine the supremum over power allocations Pℓ and total number of layers L. This supremum is achieved for L → ∞ and gives rise to an analytically expressible capacity lower bound. For Gaussian fading, this novel bound is shown to converge to the Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows, provided that the variance of the channel estimation error H − Ĥ tends to zero as the SNR tends to infinity.
    Peer Reviewed

    A Rate-Splitting Approach to Fading Channels With Imperfect Channel-State Information


    Communication rates for fading channels with imperfect channel-state information

    The present thesis studies information rates for reliable transmission of information over fading channels in the realistic situation where the receiver has only imperfect channel-state knowledge. Of particular interest are analytical expressions for achievable transmission rates under imperfect and no channel-state information (CSI), that is, lower bounds on the mutual information and on the Shannon capacity. A well-known mutual information lower bound for Gaussian codebooks is obtained by conflating the additive (thermal) noise and the multiplicative noise due to the imperfections of the channel-state information at the receiver (CSIR) into a single effective noise term, and then assuming that this term is independent and Gaussian. This so-called worst-case-noise approach yields a strikingly simple and well-known lower bound on the mutual information of the channel. A first part of this thesis proposes a simple way to improve this worst-case-noise bound by means of a rate-splitting approach: by expressing the Gaussian input as a sum of several independent Gaussian inputs, and by assuming that the receiver performs successive decoding of the corresponding information streams, we show how to derive a larger mutual information lower bound. On channels with a single transmit antenna, the optimal allocation of transmit power across the different inputs is found to be approached as the number of inputs (so-called layers) tends to infinity and the power assigned to each layer tends to zero. This infinite-layering limit gives rise to a mutual information bound expressible as an integral. On channels with multiple transmit antennas, an analogous result is derived. However, since multiple transmit antennas open up more possibilities for spatial multiplexing, the rate-splitting approach gives rise to a whole family of infinite-layering bounds. This family of bounds is studied closely for independent and identically distributed zero-mean Gaussian fading coefficients (so-called i.i.d. Rayleigh fading).
Most notably, it is shown that for asymptotically perfect CSIR, any bound from the family is asymptotically tight at high signal-to-noise ratio (SNR). Specifically, this means that the difference between the mutual information and its lower bound tends to zero as the SNR tends to infinity, provided that the CSIR tends to be exact as the SNR tends to infinity. A second part of this thesis proposes a framework for the optimization of a class of utility functions in block-Rayleigh-fading multiple-antenna channels with transmit-side antenna correlation and no a priori CSI at the receiver. A fraction of each fading block is reserved for transmitting a sequence of training symbols, while the remaining time instants are used for the transmission of data. The receiver estimates the channel matrix based on the noisy training observation and then decodes the data signal using this channel estimate. For utilities that are symmetric functions of the eigenvalues of the matrix-valued effective SNR (such as, e.g., the worst-case-noise bound), the problems of optimizing the pilot sequence and the linear precoder are cast as convex (or quasi-convex) problems for concave (or quasi-concave) utility functions. We also study an important subproblem of the joint optimization, which consists in computing jointly Pareto-optimal pilot sequences and precoders. By wrapping these optimization procedures into a cyclic iteration, we obtain an algorithm which converges to a local joint optimum for any utility.
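The training-based setup of the thesis's second part can be sketched for the scalar (single-antenna) case: spend part of each fading block on pilots, form an MMSE channel estimate, and score the pilot/data split by the worst-case-noise rate. The block length, powers, and the textbook error-variance formula σ_e² = 1/(1 + pilot energy) for H ~ CN(0,1) are standard assumptions for illustration, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Block fading with block length T; k symbols of each block carry pilots
# at per-symbol power `power`, the remaining T - k symbols carry data.
T, power = 20, 1.0

def wcn_rate(n_pilots):
    """Per-block worst-case-noise rate: (T - n_pilots)/T data symbols, each
    achieving E[ log2(1 + |Hhat|^2 P / (sigma_e^2 P + 1)) ] (Monte Carlo)."""
    p_train = power * n_pilots          # total energy spent on training
    sigma_e2 = 1.0 / (1.0 + p_train)    # MMSE estimation error variance
    hhat = np.sqrt((1 - sigma_e2) / 2) * (
        rng.standard_normal(n) + 1j * rng.standard_normal(n)
    )
    r = np.mean(np.log2(1 + np.abs(hhat) ** 2 * power / (sigma_e2 * power + 1)))
    return (T - n_pilots) / T * r

rates = {k: wcn_rate(k) for k in range(1, T)}
best = max(rates, key=rates.get)
print(f"best pilot count: {best}, rate: {rates[best]:.3f} bit/channel use")
```

The interior optimum reflects the tradeoff the thesis optimizes in matrix form: more pilots sharpen the estimate (smaller σ_e²) but leave fewer symbols for data.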