3 research outputs found

    Easily Computed Lower Bounds on the Information Rate of Intersymbol Interference Channels

    Provable lower bounds are presented for the information rate I(X; X+S+N), where X is a symbol drawn independently and uniformly from a finite alphabet, S is a discrete-valued random variable (RV), and N is a Gaussian RV. It is well known that, with S representing the precursor intersymbol interference (ISI) at the decision feedback equalizer (DFE) output, I(X; X+S+N) serves as a tight lower bound on the symmetric information rate (SIR), as well as on the capacity, of the ISI channel corrupted by Gaussian noise. When evaluated on a number of well-known finite-ISI channels, the new bounds are essentially as tight with respect to the SIR as the conjectured lower bound of Shamai and Laroia across all signal-to-noise ratio (SNR) ranges, and are in fact slightly tighter when examined closely at high SNRs. The new lower bounds are obtained in two steps: first, a "mismatched" mutual information function is introduced and proved to be a lower bound on I(X; X+S+N); second, this function is further bounded from below by an expression that can be computed easily via a few one-dimensional integrations at a small computational cost.
    Comment: 14 pages, 14 figures including subfigures. arXiv admin note: substantial text overlap with arXiv:1001.391
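
    In the ISI-free special case S = 0, the quantity I(X; X+S+N) above reduces to the mutual information of a finite-alphabet input over a scalar Gaussian channel, which is itself computable with a single one-dimensional integration of the kind the abstract refers to. The sketch below is only an illustration of that computation, not code from the paper; the M-PAM constellation, the SNR convention, and the truncated integration range are assumptions made here.

```python
# Minimal sketch (not the paper's bound): I(X; X + N) in bits for an
# equiprobable M-PAM input X and Gaussian noise N, via one 1-D integration.
import numpy as np
from scipy.integrate import quad

def pam_alphabet(M):
    """Unit-energy, equally spaced M-PAM constellation (illustrative choice)."""
    x = np.arange(-(M - 1), M, 2, dtype=float)
    return x / np.sqrt(np.mean(x**2))

def mutual_information_pam(M, snr_db):
    x = pam_alphabet(M)
    sigma2 = 10 ** (-snr_db / 10.0)        # noise variance for unit input power
    def p_y(y):                            # Gaussian-mixture density of Y = X + N
        return np.mean(np.exp(-(y - x)**2 / (2 * sigma2))) / np.sqrt(2 * np.pi * sigma2)
    # integrate h(Y) over a range wide enough that the truncated tails are negligible
    lo, hi = x.min() - 10 * np.sqrt(sigma2), x.max() + 10 * np.sqrt(sigma2)
    h_y, _ = quad(lambda y: -p_y(y) * np.log2(p_y(y)), lo, hi, limit=200)
    h_n = 0.5 * np.log2(2 * np.pi * np.e * sigma2)   # h(N) for Gaussian noise
    return h_y - h_n                       # I(X; Y) = h(Y) - h(Y | X)

print(mutual_information_pam(M=4, snr_db=10))  # approaches log2(4) = 2 bits at high SNR
```

    Evaluating I(X; X+S+N) directly in the same way would require averaging the mixture density over every realization of the discrete ISI term S, whose number grows with the channel memory; avoiding that cost is what the easily computed bounds above are for.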

    Lower Bounds and Approximations for the Information Rate of the ISI Channel

    We consider the discrete-time intersymbol interference (ISI) channel model, with additive Gaussian noise and fixed i.i.d. inputs. In this setting, we investigate the expression put forth by Shamai and Laroia as a conjectured lower bound for the input-output mutual information after application of an MMSE-DFE receiver. A low-SNR expansion is used to prove that the conjectured bound does not hold under general conditions, and to characterize inputs for which it is particularly ill-suited. One such input is used to construct a counterexample, indicating that the Shamai-Laroia expression does not always bound even the achievable rate of the channel, thus excluding a natural relaxation of the original conjectured bound. However, this relaxed bound is then shown to hold for any finite-entropy input and ISI channel when the SNR is sufficiently high. Finally, new simple bounds for the achievable rate are proven and compared to other known bounds. Information-Estimation relations and estimation-theoretic bounds play a key role in establishing our results.
    Comment: 21 pages, 4 figures
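
    For reference, one standard way to write the Shamai-Laroia expression investigated above is as the mutual information of a scalar Gaussian channel whose SNR equals the unbiased MMSE-DFE output SNR; the notation below is an assumption of this note (unit-power i.i.d. input X, channel frequency response H(e^{jω}), noise variance σ²), not a quotation from the paper.

```latex
% Sketch of the Shamai-Laroia expression under the assumptions stated above.
\begin{align}
  I_{\mathrm{SL}} &= I\!\left(X;\ \sqrt{\mathrm{SNR}_{\mathrm{DFE\text{-}U}}}\, X + N\right),
    \qquad N \sim \mathcal{N}(0,1), \\
  \mathrm{SNR}_{\mathrm{DFE\text{-}U}} &=
    \exp\!\left(\frac{1}{2\pi}\int_{-\pi}^{\pi}
    \ln\!\left(1 + \frac{\bigl|H(e^{j\omega})\bigr|^{2}}{\sigma^{2}}\right) d\omega\right) - 1.
\end{align}
```

    The low-SNR counterexample and the high-SNR validity result described in the abstract are statements about when this expression does or does not sit below the true i.i.d. achievable rate of the ISI channel.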

    Comparison of the Achievable Rates in OFDM and Single Carrier Modulation with I.I.D. Inputs

    We compare the maximum achievable rates in single-carrier and OFDM modulation schemes, under the practical assumptions of i.i.d. finite-alphabet inputs and linear ISI with additive Gaussian noise. We show that the Shamai-Laroia approximation serves as a bridge between the two rates: while it is well known that this approximation is often a lower bound on the single-carrier achievable rate, it is revealed to also essentially upper-bound the OFDM achievable rate. We apply Information-Estimation relations both to rigorously establish this result for general input distributions and to sharpen it for commonly used PAM and QAM constellations. To this end, novel bounds on MMSE estimation of PAM inputs to a scalar Gaussian channel are derived, which may be of general interest. Our results show that, under reasonable assumptions, optimal single-carrier schemes may offer spectral efficiency significantly superior to that of OFDM, motivating further research into such systems.
    Comment: Revised version of IEEE IT submission. Includes new results on uniform input
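
    The MMSE estimation of PAM inputs mentioned above can be made concrete with a short numerical sketch; by the Guo-Shamai-Verdú I-MMSE relation, dI/d(snr) = MMSE(snr)/2 in nats, so bounds on this MMSE translate into bounds on achievable rates. The code below is an illustration only, not the paper's bounds; the constellation normalization and truncated integration range are assumptions made here.

```python
# Minimal sketch (not from the paper): MMSE of an equiprobable M-PAM input X
# observed through a scalar Gaussian channel Y = sqrt(snr)*X + N, N ~ N(0, 1),
# computed with a single 1-D integration of p_Y(y) * Var(X | Y = y).
import numpy as np
from scipy.integrate import quad

def pam_alphabet(M):
    """Unit-energy, equally spaced M-PAM constellation (illustrative choice)."""
    x = np.arange(-(M - 1), M, 2, dtype=float)
    return x / np.sqrt(np.mean(x**2))

def mmse_pam(M, snr):
    x = pam_alphabet(M)
    s = np.sqrt(snr)
    def integrand(y):
        w = np.exp(-(y - s * x)**2 / 2) / np.sqrt(2 * np.pi)  # per-symbol likelihoods
        p_y = np.mean(w)                      # output density (equiprobable inputs)
        x_hat = np.mean(w * x) / p_y          # conditional mean E[X | Y = y]
        x_sq = np.mean(w * x**2) / p_y        # conditional second moment E[X^2 | Y = y]
        return p_y * (x_sq - x_hat**2)        # p_Y(y) * Var(X | Y = y)
    lo, hi = s * x.min() - 10.0, s * x.max() + 10.0   # noise std is 1
    val, _ = quad(integrand, lo, hi, limit=200)
    return val

print(mmse_pam(M=4, snr=10.0))  # decays toward 0 as snr grows
```

    For the Gaussian-input benchmark the same quantity has the closed form 1/(1 + snr), which makes the finite-alphabet deviations that the paper's bounds quantify easy to visualize numerically.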