
    Limit theorems for the sample entropy of hidden Markov chains

    The Shannon-McMillan-Breiman theorem asserts that the sample entropy of a stationary and ergodic stochastic process converges almost surely to the entropy rate of the process as the sample size tends to infinity. In this paper, we restrict our attention to the convergence behavior of the sample entropy of hidden Markov chains. Under certain positivity assumptions, we prove a central limit theorem (CLT) with a Berry-Esseen bound for the sample entropy of a hidden Markov chain, and we use this CLT to establish a law of the iterated logarithm (LIL) for the sample entropy. © 2011 IEEE. The 2011 IEEE International Symposium on Information Theory (ISIT), St. Petersburg, Russia, 31 July-5 August 2011. In Proceedings of ISIT, 2011, p. 3009-301
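
    As a point of reference, the three convergence statements in this abstract can be written out explicitly. The sketch below uses standard notation of ours, not the paper's: h_n is the sample entropy of a realization (X_1, ..., X_n), H the entropy rate, and sigma^2 the limiting variance furnished by the CLT.

        % Sample entropy: h_n = -(1/n) log p(X_1, ..., X_n)
        \begin{align*}
          h_n \;&\xrightarrow{\;\mathrm{a.s.}\;}\; H
            && \text{(Shannon-McMillan-Breiman)} \\
          \sqrt{n}\,\bigl(h_n - H\bigr) \;&\xrightarrow{\;d\;}\; \mathcal{N}(0,\sigma^2)
            && \text{(CLT, with a Berry-Esseen rate)} \\
          \limsup_{n\to\infty} \frac{\sqrt{n}\,\bigl(h_n - H\bigr)}{\sigma\sqrt{2\log\log n}} \;&=\; 1 \quad \text{a.s.}
            && \text{(LIL)}
        \end{align*}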

    Achieving the Capacity of any DMC using only Polar Codes

    We construct a channel coding scheme that achieves the capacity of any discrete memoryless channel based solely on the techniques of polar coding. In particular, we show how source polarization and randomness extraction via polarization can be employed to "shape" uniformly distributed i.i.d. random variables into approximately i.i.d. random variables distributed according to the capacity-achieving distribution. We then combine this shaper with a variant of polar channel coding, constructed by duality with source coding, to achieve the channel capacity. Our scheme inherits the low-complexity encoder and decoder of polar coding. It differs conceptually from Gallager's method for achieving capacity, and we discuss the advantages and disadvantages of the two schemes. An application to the AWGN channel is discussed. Comment: 9 pages, 7 figures
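
    The shaping construction itself is involved, but the butterfly kernel that both source and channel polarization build on is compact. A minimal sketch in Python (function name ours): it computes x = u F^{⊗n} over GF(2) with F = [[1,0],[1,1]], omitting the bit-reversal permutation, which only reorders indices.

        import numpy as np

        def polar_transform(u: np.ndarray) -> np.ndarray:
            """Compute x = u F^{(tensor)n} over GF(2), F = [[1, 0], [1, 1]].

            u must have length N = 2^n. This butterfly is the common
            kernel behind both source and channel polarization.
            """
            x = u.astype(np.uint8) % 2
            step = 1
            while step < len(x):
                for i in range(0, len(x), 2 * step):
                    # Upper branch absorbs the lower branch (mod-2 addition).
                    x[i:i + step] ^= x[i + step:i + 2 * step]
                step *= 2
            return x

        # Over GF(2) the transform is an involution, i.e. its own inverse:
        u = np.random.randint(0, 2, 8, dtype=np.uint8)
        assert np.array_equal(polar_transform(polar_transform(u)), u)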

    Decoding Cyclic Codes up to a New Bound on the Minimum Distance

    A new lower bound on the minimum distance of q-ary cyclic codes is proposed. This bound improves upon the Bose-Chaudhuri-Hocquenghem (BCH) bound and, for some codes, upon the Hartmann-Tzeng (HT) bound. Several Boston bounds are special cases of our bound. For some classes of codes the bound on the minimum distance is further refined. Furthermore, a quadratic-time decoding algorithm up to this new bound is developed. The determination of the error locations is based on the Euclidean algorithm and a modified Chien search. The error evaluation is done by solving a generalization of Forney's formula.
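
    For orientation, the classical BCH bound that the new bound improves upon can be stated as follows (textbook statement, not quoted from the paper):

        % BCH bound: let C be a q-ary cyclic code of length n with
        % gcd(n, q) = 1, and let \alpha be a primitive n-th root of
        % unity. If the generator polynomial of C has the consecutive
        % roots \alpha^{b}, \alpha^{b+1}, \ldots, \alpha^{b+\delta-2},
        % then the minimum distance d of C satisfies
        \[
          d \;\geq\; \delta .
        \]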

    Upper Bounds on the Capacity of Binary Channels with Causal Adversaries

    In this work we consider the communication of information in the presence of a causal adversarial jammer. In the setting under study, a sender wishes to communicate a message to a receiver by transmitting a codeword (x_1, ..., x_n) bit-by-bit over a communication channel. The sender and the receiver do not share common randomness. The adversarial jammer can view the transmitted bits x_i one at a time, and can change up to a p-fraction of them. However, the decisions of the jammer must be made in a causal manner: for each bit x_i, the jammer's decision on whether to corrupt it or not may depend only on x_j for j \leq i. This is in contrast to the "classical" adversarial jamming settings in which the jammer has no knowledge of (x_1, ..., x_n), or knows (x_1, ..., x_n) completely. In this work, we present upper bounds on the capacity that hold under both the average and maximal probability of error criteria, for both deterministic and stochastic encoding schemes. Comment: To appear in the IEEE Transactions on Information Theory; shortened version appeared at ISIT 201
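
    To make the causality constraint concrete, here is a toy model of the jammer in Python; the interface and the example greedy rule are our illustrative choices, not constructions from the paper.

        def causal_jammer(codeword, p, decide):
            """Flip at most a p-fraction of bits, causally.

            decide(prefix) sees only the bits sent so far,
            (x_1, ..., x_i), and returns True to flip the current bit.
            """
            budget = int(p * len(codeword))    # total flips allowed
            received = []
            for i, x in enumerate(codeword):
                prefix = codeword[:i + 1]      # causality: no lookahead
                if budget > 0 and decide(prefix):
                    received.append(x ^ 1)
                    budget -= 1
                else:
                    received.append(x)
            return received

        # Example: a jammer that spends its budget on the earliest bits.
        rx = causal_jammer([1, 0, 1, 1, 0, 0, 1, 0], p=0.25,
                           decide=lambda prefix: True)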

    S-AMP: Approximate Message Passing for General Matrix Ensembles

    In this work we propose a novel iterative estimation algorithm for linear observation systems, called S-AMP, whose fixed points are the stationary points of the exact Gibbs free energy under a set of first- and second-moment consistency constraints in the large-system limit. S-AMP extends the approximate message-passing (AMP) algorithm to general matrix ensembles. The generalization is based on the S-transform (in free probability) of the spectrum of the measurement matrix. Furthermore, we show that the optimality of S-AMP follows directly from its design rather than from solving a separate optimization problem, as is done for AMP. Comment: 5 pages, 1 figure
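
    For contrast, the classical AMP iteration that S-AMP generalizes has the following standard form for observations y = Ax + w, with A of size M x N, delta = M/N, eta_t a scalar denoiser applied componentwise, and angle brackets denoting the empirical average. (This is baseline AMP, not the S-AMP update itself.)

        \begin{align*}
          x^{t+1} &= \eta_t\!\left( x^{t} + A^{\mathsf{T}} z^{t} \right), \\
          z^{t}   &= y - A x^{t}
                     + \frac{1}{\delta}\, z^{t-1}
                       \bigl\langle \eta'_{t-1}\!\left( x^{t-1} + A^{\mathsf{T}} z^{t-1} \right) \bigr\rangle .
        \end{align*}

    The last term is the Onsager correction; roughly speaking, S-AMP replaces this correction with one tailored to the spectrum of A through its S-transform.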

    Cognitive Wyner Networks with Clustered Decoding

    We study an interference network where equally many transmitters and receivers lie on two parallel lines, each transmitter opposite its intended receiver. We consider two short-range interference models: the "asymmetric network," where the signal sent by each transmitter is interfered only by the signal sent by its left neighbor (if present), and a "symmetric network," where it is interfered by both its left and its right neighbors. Each transmitter is cognizant of its own message, the messages of the t_\ell transmitters to its left, and the messages of the t_r transmitters to its right. Each receiver decodes its message based on the signals received at its own antenna, at the r_\ell receive antennas to its left, and at the r_r receive antennas to its right. For such networks we provide upper and lower bounds on the multiplexing gain, i.e., on the high-SNR asymptotic logarithmic growth of the sum-rate capacity. In some cases our bounds meet, e.g., for the asymmetric network. Our results exhibit an equivalence between the transmitter side-information parameters t_\ell, t_r and the receiver side-information parameters r_\ell, r_r, in the sense that increasing/decreasing t_\ell or t_r by a positive integer \delta has the same effect on the multiplexing gain as increasing/decreasing r_\ell or r_r by \delta. Moreover, even in asymmetric networks, there is an equivalence between the left side-information parameters t_\ell, r_\ell and the right side-information parameters t_r, r_r. Comment: Second revision submitted to IEEE Transactions on Information Theory
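
    The figure of merit here, written out: with C_\Sigma(SNR) the sum-rate capacity, the multiplexing gain is the high-SNR pre-log. The 1/2 normalization below assumes real signaling; for complex channels the 1/2 is dropped.

        \[
          \mathrm{MG} \;=\; \lim_{\mathrm{SNR} \to \infty}
            \frac{C_{\Sigma}(\mathrm{SNR})}{\tfrac{1}{2}\log \mathrm{SNR}} .
        \]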