
    The chain rule implies Tsirelson's bound: an approach from generalized mutual information

    In order to analyze an information-theoretic derivation of Tsirelson's bound based on information causality, we introduce a generalized mutual information (GMI), defined as the optimal coding rate of a channel with classical inputs and general probabilistic outputs. In the case where the outputs are quantum, the GMI coincides with the quantum mutual information. In general, however, the GMI does not necessarily satisfy the chain rule. We prove that Tsirelson's bound can be derived by imposing the chain rule on the GMI. We formulate a principle, which we call the no-supersignalling condition, stating that the assistance of nonlocal correlations does not increase the capability of classical communication, and we prove that this condition is equivalent to the no-signalling condition. As a result, we show that Tsirelson's bound is implied by the nonpositivity of the quantitative difference between information causality and no-supersignalling.
    Comment: 23 pages, 8 figures. Added Section 2 and Appendix B, result unchanged; added reference.
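    For orientation, the two ingredients named above take their standard forms: the chain rule that the GMI is required to satisfy, and Tsirelson's bound on the CHSH correlator. These are the textbook expressions, not formulas reproduced from the paper:

        \[
          I(A;BC) = I(A;B) + I(A;C \mid B), \qquad
          \bigl| \langle A_0 B_0 \rangle + \langle A_0 B_1 \rangle
               + \langle A_1 B_0 \rangle - \langle A_1 B_1 \rangle \bigr| \le 2\sqrt{2}
        \]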

    Performance Metrics for Systems with Soft-Decision FEC and Probabilistic Shaping

    High-throughput optical communication systems utilize binary soft-decision forward error correction (SD-FEC) with bit interleaving over the bit channels. The generalized mutual information (GMI) is an achievable information rate (AIR) in such systems and is known to be a good predictor of the bit error rate after SD-FEC decoding (post-FEC BER) for uniform signaling. However, for probabilistically shaped (nonuniform) signaling, we find that the normalized AIR, defined as the AIR divided by the signal entropy, is less correlated with the post-FEC BER. We show that information quantities based on the distribution of the single-bit signal and its asymmetric log-likelihood ratio are better predictors of the post-FEC BER. In simulations over the Gaussian channel, we find that the prediction accuracy, quantified as the peak-to-peak deviation of the post-FEC BER within a set of different modulation formats and distributions, can be improved by more than a factor of 10 compared with the normalized AIR.
    Comment: 4 pages, 3 figures.
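    As a rough illustration of the quantities compared above, the sketch below estimates an AIR and the normalized AIR (AIR divided by the signal entropy) by Monte-Carlo simulation of shaped 4-PAM over an AWGN channel. The constellation, the Maxwell-Boltzmann shaping parameter, and the SNR are illustrative assumptions rather than the paper's setup, and the symbol-wise mutual information stands in here for the bit-wise AIR used in the paper.

        # Monte-Carlo sketch of a (symbol-wise) AIR and the normalized AIR
        # for probabilistically shaped 4-PAM over an AWGN channel.
        # Constellation, shaping parameter, and SNR are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)

        X = np.array([-3.0, -1.0, 1.0, 3.0])       # 4-PAM points
        nu = 0.1                                   # shaping parameter (assumed)
        P = np.exp(-nu * X**2); P /= P.sum()       # Maxwell-Boltzmann distribution

        snr_db = 8.0
        Es = np.sum(P * X**2)
        sigma2 = Es / 10**(snr_db / 10)            # noise variance for the target SNR

        N = 200_000
        x = rng.choice(X, size=N, p=P)
        y = x + rng.normal(scale=np.sqrt(sigma2), size=N)

        # Gaussian likelihoods p(y|x') for every candidate symbol (N x 4)
        lik = np.exp(-(y[:, None] - X[None, :])**2 / (2 * sigma2))

        # AIR: I(X;Y) = E[ log2 p(y|x) - log2 sum_x' P(x') p(y|x') ]
        num = np.exp(-(y - x)**2 / (2 * sigma2))
        air = np.mean(np.log2(num) - np.log2(lik @ P))

        H = -np.sum(P * np.log2(P))                # signal entropy H(X)
        print(f"AIR            ~ {air:.3f} bit/symbol")
        print(f"normalized AIR ~ {air / H:.3f}")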

    A Mixed-ADC Receiver Architecture for Massive MIMO Systems

    Motivated by the demand for energy-efficient communication solutions in next-generation cellular networks, a mixed-ADC receiver architecture for massive multiple-input multiple-output (MIMO) systems is proposed, which differs from previous works in that one-bit analog-to-digital converters (ADCs) partially replace the conventionally assumed high-resolution ADCs. The information-theoretic tool of generalized mutual information (GMI) is exploited to analyze the achievable data rates of the proposed architecture, and an array of analytical results of engineering interest is obtained. For deterministic single-input multiple-output (SIMO) channels, a closed-form expression of the GMI is derived, based on which the linear combiner is optimized. The asymptotic behavior of the GMI in both the low- and high-SNR regimes is then explored, and the analytical results suggest a plausible ADC assignment scheme. Finally, the analytical framework is applied to the multi-user access scenario, and the corresponding numerical results demonstrate that the mixed architecture with a relatively small number of high-resolution ADCs is able to achieve a large fraction of the channel capacity without output quantization.
    Comment: 5 pages, 5 figures; to appear in the IEEE Information Theory Workshop (ITW 2015).
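    For context, the GMI invoked here is the standard achievable rate under a fixed (possibly mismatched) decoding metric q(x, y); in its usual form, for an input distribution P_X,

        \[
          \mathrm{GMI} = \sup_{s > 0}\,
          \mathbb{E}\!\left[ \log_2 \frac{q(X,Y)^{s}}
                                         {\sum_{x'} P_X(x')\, q(x',Y)^{s}} \right]
        \]

    The closed-form SIMO expression mentioned above can be read as an evaluation of this quantity for the particular channel model and decoding rule adopted in the paper.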

    On the Information Loss of the Max-Log Approximation in BICM Systems

    We present a comprehensive study of the information rate loss of the max-log approximation for M-ary pulse-amplitude modulation (PAM) in a bit-interleaved coded modulation (BICM) system. It is widely assumed that the calculation of L-values using the max-log approximation leads to an information loss. We prove that this assumption is correct for all M-PAM constellations and labelings, with the exception of a symmetric 4-PAM constellation labeled with a Gray code. We also show that for max-log L-values, the BICM generalized mutual information (GMI), which is an achievable rate for a standard BICM decoder, is too pessimistic. In particular, it is proved that the so-called "harmonized" GMI, which can be seen as the sum of bit-level GMIs, is achievable without any modifications to the decoder. We then study how bit-level channel symmetrization and mixing affect the mutual information (MI) and the GMI for max-log L-values. Our results show that these operations, which are often used when analyzing BICM systems, preserve the GMI. However, this is not necessarily the case when the MI is considered. Necessary and sufficient conditions under which these operations preserve the MI are provided.
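    To make the max-log approximation concrete, the sketch below computes exact and max-log L-values for a Gray-labeled symmetric 4-PAM constellation over an AWGN channel; the constellation, labeling, and noise variance are illustrative assumptions. This particular constellation and labeling is exactly the exception identified above, for which the max-log L-values cause no loss of mutual information.

        # Exact vs. max-log L-values for Gray-labeled 4-PAM over AWGN.
        # Constellation, labeling, and noise variance are illustrative assumptions.
        import numpy as np

        X = np.array([-3.0, -1.0, 1.0, 3.0])                 # 4-PAM points
        labels = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])  # BRGC labeling (m = 2 bits)
        sigma2 = 0.5                                          # noise variance (assumed)

        def llrs(y, max_log=False):
            """Bit-wise L-values L_k = log p(y | b_k = 0) - log p(y | b_k = 1)."""
            metrics = -(y - X)**2 / (2 * sigma2)   # log-likelihoods up to a constant
            out = []
            for k in range(labels.shape[1]):
                m0 = metrics[labels[:, k] == 0]
                m1 = metrics[labels[:, k] == 1]
                if max_log:                        # max-log: replace log-sum-exp by max
                    out.append(m0.max() - m1.max())
                else:                              # exact: log-sum-exp over each subset
                    out.append(np.log(np.exp(m0).sum()) - np.log(np.exp(m1).sum()))
            return np.array(out)

        y = 0.4                                    # example channel observation
        print("exact  :", llrs(y))
        print("max-log:", llrs(y, max_log=True))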

    A Simple Approximation for the Bit-Interleaved Coded Modulation Capacity

    The generalized mutual information (GMI) is an achievable rate for bit-interleaved coded modulation (BICM) and is highly dependent on the binary labeling of the constellation. The BICM-GMI, sometimes called the BICM capacity, can be evaluated numerically. This approach, however, becomes impractical when the number of constellation points and/or the constellation dimensionality grows, or when many different labelings are considered. A simple approximation for the BICM-GMI based on the area theorem of the demapper's extrinsic information transfer (EXIT) function is proposed. Numerical results show that the proposed approximation gives good estimates of the BICM-GMI for labelings with close-to-linear EXIT functions, which include labelings of common interest such as the natural binary code and the binary reflected Gray code. This approximation is used to optimize the binary labeling of the 32-APSK constellation defined in the DVB-S2 standard, where gains of approximately 0.15 dB are obtained.
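    For orientation, the quantity being approximated and its link to the demapper's EXIT function can be stated in their standard forms: for an m-bit labeling with bit levels B_1, ..., B_m, channel output Y, and an APP demapper whose EXIT function I_E(I_A) is averaged over the bit levels,

        \[
          \mathrm{GMI}_{\mathrm{BICM}} = \sum_{k=1}^{m} I(B_k; Y) = m\, I_E(I_A = 0)
        \]

    The approximation described above uses the area theorem for I_E(I_A) rather than evaluating the bit-level terms directly.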

    Case Report: Generalized Mutual Information (GMI) Analysis of Sensory Motor Rhythm in a Subject Affected by Facioscapulohumeral Muscular Dystrophy after Ken Ware Treatment

    In this case report we study the dynamics of the sensory motor rhythm (SMR) band in a subject affected by Facioscapulohumeral Muscular Dystrophy and undergoing Ken Ware Neuro Physics treatment. We use the generalized mutual information (GMI) to analyze the SMR band at rest in detail during the treatment. Brain dynamics follows a chaotic-deterministic regime with complex behaviour that constantly self-rearranges and self-organizes as a function of outside requirements. We demonstrate that the chaotic SMR dynamics responds directly to this regime and that, even as it decreases in the EEG during muscular activity, it actually increases its ability for self-arrangement and self-organization in the brain. We have arranged the proposed GMI method so that it may be used in several cases of clinical interest. In the case of muscular dystrophy examined here, the GMI enables us to quantify accurately the improvement that the subject achieves during the treatment.
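    As a generic illustration only (the specific GMI construction used in the case report is not reproduced here), a common way to quantify this kind of dependence structure in a band-limited EEG trace is a time-delayed mutual information estimate; the histogram estimator, sampling rate, lag, and synthetic test signal below are all assumptions.

        # Generic time-delayed mutual information estimate for a 1-D signal
        # (illustrative only; not the specific GMI construction of the case report).
        import numpy as np

        def delayed_mutual_information(x, lag, bins=16):
            """Histogram estimate of I(x_t ; x_{t+lag}) in bits."""
            a, b = x[:-lag], x[lag:]
            joint, _, _ = np.histogram2d(a, b, bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            mask = pxy > 0
            return np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask]))

        # Example with a synthetic oscillation standing in for an SMR-band trace.
        rng = np.random.default_rng(1)
        t = np.arange(0, 10, 1 / 250)              # 250 Hz sampling (assumed)
        smr = np.sin(2 * np.pi * 13 * t) + 0.5 * rng.normal(size=t.size)
        print(delayed_mutual_information(smr, lag=5))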