    Relations between random coding exponents and the statistical physics of random codes

    The partition function pertaining to finite-temperature decoding of a (typical) randomly chosen code is known to exhibit three types of behavior, corresponding to three phases in the plane of rate vs. temperature: the ferromagnetic phase, corresponding to correct decoding; the paramagnetic phase, of complete disorder, which is dominated by exponentially many incorrect codewords; and the glassy phase (or condensed phase), where the system is frozen at minimum energy and dominated by subexponentially many incorrect codewords. We show that the statistical physics associated with the latter two phases is intimately related to random coding exponents. In particular, the exponent associated with the probability of correct decoding at rates above capacity is directly related to the free energy in the glassy phase, and the exponent associated with the probability of error (the error exponent) at rates below capacity is strongly related to the free energy in the paramagnetic phase. In fact, we derive alternative expressions for these exponents in terms of the corresponding free energies, and attempt to obtain some insights from these expressions. Finally, as a side result, we also compare the phase diagram associated with a simple finite-temperature universal decoder for discrete memoryless channels to that of the finite-temperature decoder that is aware of the channel statistics.
    Comment: 26 pages, 2 figures, submitted to IEEE Transactions on Information Theory
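For orientation, the finite-temperature decoder underlying this phase picture is conventionally defined through a Boltzmann-like posterior over the codewords; a standard formulation (the paper's exact normalization and energy convention may differ) is:

```latex
% Finite-temperature (stochastic) decoder at inverse temperature \beta:
% the message m is drawn from the tilted posterior
P_\beta(m \mid y) \;=\; \frac{P(y \mid x_m)^\beta}{Z(\beta \mid y)},
\qquad
Z(\beta \mid y) \;=\; \sum_{m'} P(y \mid x_{m'})^\beta ,
% with the associated free energy per symbol
F(\beta) \;=\; -\lim_{n \to \infty} \frac{1}{n\beta}\, \mathbb{E}\!\left[\ln Z(\beta \mid Y)\right].
```

As $\beta \to \infty$ this decoder concentrates on the maximum-likelihood codeword, while finite $\beta$ trades off energy (likelihood) against the entropy of competing codewords, which is what produces the ferromagnetic/paramagnetic/glassy phase boundaries in the rate-temperature plane.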

    Error exponents of typical random codes

    We define the error exponent of the typical random code as the long-block limit of the negative normalized expectation of the logarithm of the error probability of the random code, as opposed to the traditional random coding error exponent, which is the limit of the negative normalized logarithm of the expectation of the error probability. For the ensemble of uniformly randomly drawn fixed-composition codes, we provide exact error exponents of typical random codes for a general discrete memoryless channel (DMC) and a wide class of (stochastic) decoders, collectively referred to as the generalized likelihood decoder (GLD). This ensemble of fixed-composition codes is shown to be no worse than any other ensemble of independent codewords drawn under a permutation-invariant distribution (e.g., i.i.d. codewords). We also present relationships between the error exponent of the typical random code and the ordinary random coding error exponent, as well as the expurgated exponent, for the GLD. Finally, we demonstrate that our analysis technique is also applicable to more general communication scenarios, such as list decoding (for fixed-size lists) and decoding with an erasure/list option in Forney's sense.
    Comment: 26 pages, submitted for publication
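The distinction between the two exponents is the order of expectation and logarithm: the typical-random-code (TRC) exponent is $-\frac{1}{n}\mathbb{E}[\ln P_e]$ in the limit, while the classical random coding exponent is $-\frac{1}{n}\ln \mathbb{E}[P_e]$. By Jensen's inequality the TRC exponent can never be smaller, since the ensemble average $\mathbb{E}[P_e]$ is dominated by rare bad codes. A minimal numerical sketch of this ordering, using a purely illustrative toy ensemble (not the paper's channel model):

```python
import math
import random

random.seed(0)
n = 100            # toy block length
num_codes = 50000  # size of the toy ensemble of random codes

# Toy assumption: each random code C has error probability
# P_e(C) = exp(-n * E(C)), with a per-code exponent E(C)
# fluctuating around 0.10 (clipped at 0 so that P_e <= 1).
exps = [max(0.0, random.gauss(0.10, 0.03)) for _ in range(num_codes)]
log_pe = [-n * e for e in exps]  # ln P_e(C) for each code

# Traditional random coding exponent: -(1/n) ln E[P_e]
mean_pe = sum(math.exp(lp) for lp in log_pe) / num_codes
E_rce = -math.log(mean_pe) / n

# Typical-random-code exponent: -(1/n) E[ln P_e]
E_trc = -sum(log_pe) / (n * num_codes)

# Jensen: E[ln P_e] <= ln E[P_e], so E_trc >= E_rce always.
# The gap appears because a few atypically bad codes dominate
# the ensemble-average error probability.
assert E_trc >= E_rce
print(f"RCE ~ {E_rce:.3f}, TRC ~ {E_trc:.3f}")
```

In this toy model the TRC exponent sits near the typical per-code value 0.10, while the average-based exponent is pulled down by the exponentially heavy contribution of the worst codes, mirroring the qualitative relationship established in the abstract.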