20,412 research outputs found

    Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

    During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory, information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools used to derive concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph also includes several recent results derived by the authors. The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach are exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication. The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities, which underlie the so-called functional approach to concentration of measure, and then from a complementary information-theoretic viewpoint based on transportation-cost inequalities and probability in metric spaces. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method to problems in communications and coding, including strong converses, empirical distributions of good channel codes, and an information-theoretic converse for concentration of measure.
    Comment: Foundations and Trends in Communications and Information Theory, vol. 10, no. 1-2, pp. 1-248, 2013. Second edition published in October 2014. ISBN of the printed book: 978-1-60198-906-
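    As a concrete instance of the classical martingale inequalities the first part surveys, the Azuma-Hoeffding bound (a standard result, stated here for orientation rather than quoted from the monograph) reads:

```latex
% Azuma-Hoeffding inequality: for a martingale X_0, X_1, ..., X_n
% with bounded differences |X_k - X_{k-1}| <= c_k almost surely,
\[
  \Pr\bigl(|X_n - X_0| \ge t\bigr)
    \le 2\exp\!\left(-\frac{t^2}{2\sum_{k=1}^{n} c_k^2}\right).
\]
```

    Applied to the Doob martingale of a function of the independent randomness in a code ensemble, this single bound already yields the kind of exponential concentration exploited for codes on graphs and iterative decoding.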

    Limits on Sparse Data Acquisition: RIC Analysis of Finite Gaussian Matrices

    One of the key issues in the acquisition of sparse data by means of compressed sensing (CS) is the design of the measurement matrix. Gaussian matrices have been proven to be information-theoretically optimal in terms of minimizing the required number of measurements for sparse recovery. In this paper we provide a new approach for analyzing the restricted isometry constant (RIC) of finite-dimensional Gaussian measurement matrices. The proposed method relies on the exact distributions of the extreme eigenvalues of Wishart matrices. First, we derive the probability that the restricted isometry property is satisfied for a given sufficient recovery condition on the RIC, and propose a probabilistic framework to study both the symmetric and asymmetric RICs. Then, we analyze the recovery of compressible signals in noise through the statistical characterization of stability and robustness. The presented framework determines limits on various sparse recovery algorithms for finite-size problems. In particular, it provides a tight lower bound on the maximum sparsity order of the acquired data that allows signal recovery with a given target probability. Also, we derive simple approximations for the RICs based on the Tracy-Widom distribution.
    Comment: 11 pages, 6 figures, accepted for publication in IEEE Transactions on Information Theory
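    The worst-case RIC is intractable to compute exactly, but the Wishart-eigenvalue viewpoint in the abstract is easy to probe numerically. The sketch below is our illustration, not the paper's method: it samples random k-column supports of a Gaussian matrix and records the restricted extreme eigenvalues. Since it probes random supports rather than all of them, it yields an optimistic (lower) estimate of the true RIC.

```python
import numpy as np

def restricted_extremes(m, n, k, trials=500, rng=None):
    """Monte Carlo over random k-column supports of an m x n Gaussian
    matrix with i.i.d. N(0, 1/m) entries. The true RIC delta_k is the
    worst case over ALL supports, so this only estimates the typical
    restricted deviation, a lower bound on delta_k."""
    rng = rng or np.random.default_rng(0)
    deltas = np.empty(trials)
    for t in range(trials):
        A = rng.normal(scale=1.0 / np.sqrt(m), size=(m, n))
        S = rng.choice(n, size=k, replace=False)
        # Spectrum of the k x k Wishart-type Gram matrix A_S^T A_S.
        eigs = np.linalg.eigvalsh(A[:, S].T @ A[:, S])
        deltas[t] = max(1.0 - eigs.min(), eigs.max() - 1.0)
    return deltas

# Fraction of sampled supports meeting the sufficient recovery
# condition delta_{2k} < sqrt(2) - 1 (Candes) for sparsity k = 10:
d = restricted_extremes(m=128, n=256, k=20)
print(f"empirical P[condition holds on a random support] ~ {(d < np.sqrt(2) - 1).mean():.2f}")
```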

    Aspects of generic entanglement

    We study entanglement and other correlation properties of random states in high-dimensional bipartite systems. These correlations are quantified by parameters that are subject to the "concentration of measure" phenomenon, meaning that on a large-probability set these parameters are close to their expectation. For the entropy of entanglement, this has the counterintuitive consequence that there exist large subspaces in which all pure states are close to maximally entangled. This, in turn, implies the existence of mixed states with entanglement of formation near that of a maximally entangled state, but with negligible quantum mutual information and, therefore, negligible distillable entanglement, secret key, and common randomness. It also implies a very strong locking effect for the entanglement of formation: its value can jump from maximal to near zero by tracing over a number of qubits negligible compared to the size of the total system. Furthermore, such properties are generic. Similar phenomena are observed for random multiparty states, leading us to speculate on the possibility that the theory of entanglement is much simplified when restricted to asymptotically generic states. Further consequences of our results include a complete derandomization of the protocol for universal superdense coding of quantum states.
    Comment: 22 pages, 1 figure, 1 table
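    The concentration of the entropy of entanglement is easy to see numerically. Here is a minimal sketch (our illustration; the dimensions and sample count are arbitrary choices): sample Haar-random pure states by normalizing a complex Gaussian matrix, whose squared singular values give the Schmidt spectrum. For dB much larger than dA, the entropy clusters tightly near its maximum log2(dA).

```python
import numpy as np

def entanglement_entropy(dA, dB, rng):
    # Haar-random bipartite pure state: normalize a complex Gaussian
    # dA x dB matrix; its squared singular values are the Schmidt
    # spectrum, i.e. the eigenvalues of the reduced state on A.
    psi = rng.normal(size=(dA, dB)) + 1j * rng.normal(size=(dA, dB))
    psi /= np.linalg.norm(psi)  # Frobenius norm = state norm
    p = np.linalg.svd(psi, compute_uv=False) ** 2
    p = p[p > 1e-15]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
dA, dB = 8, 64
samples = [entanglement_entropy(dA, dB, rng) for _ in range(200)]
print(f"mean entropy {np.mean(samples):.3f} vs maximum {np.log2(dA):.3f}")
```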

    Quantum data hiding in the presence of noise

    When classical or quantum information is broadcast to separate receivers, there exist codes that encrypt the encoded data such that the receivers cannot recover it when performing local operations and classical communication, but they can decode reliably if they bring their systems together and perform a collective measurement. This phenomenon is known as quantum data hiding and hitherto has been studied under the assumption that noise does not affect the encoded systems. With the aim of applying the quantum data hiding effect in practical scenarios, here we define the data-hiding capacity for hiding classical information using a quantum channel. Using this notion, we establish a regularized upper bound on the data-hiding capacity of any quantum broadcast channel, and we prove that coherent-state encodings have a strong limitation on their data-hiding rates. We then prove a lower bound on the data-hiding capacity of channels that map the maximally mixed state to the maximally mixed state (we call these channels "mictodiactic"; they can be seen as a generalization of unital channels to the case where the input and output spaces are not necessarily isomorphic) and argue how to extend this bound to generic channels and to more than two receivers.
    Comment: 12 pages, accepted for publication in IEEE Transactions on Information Theory
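    The "mictodiactic" property defined in the abstract is straightforward to test for a channel given in Kraus form. Below is a small sketch under that definition (the function name and the depolarizing-channel example are ours, not taken from the paper):

```python
import numpy as np

def is_mictodiactic(kraus, d_in, d_out, tol=1e-10):
    """Check the property named in the abstract: the channel maps the
    maximally mixed state on the input space to the maximally mixed
    state on the (possibly different-dimensional) output space."""
    rho_in = np.eye(d_in) / d_in
    rho_out = sum(K @ rho_in @ K.conj().T for K in kraus)
    return np.allclose(rho_out, np.eye(d_out) / d_out, atol=tol)

# Example: the qubit depolarizing channel is unital, hence mictodiactic.
p = 0.3
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1, -1]).astype(complex)
kraus = [np.sqrt(1 - 3 * p / 4) * I2] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]
print(is_mictodiactic(kraus, 2, 2))  # True
```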