    A Universal Scheme for Wyner–Ziv Coding of Discrete Sources

    We consider the Wyner–Ziv (WZ) problem of lossy compression where the decompressor observes a noisy version of the source, whose statistics are unknown. A new family of WZ coding algorithms is proposed and their universal optimality is proven. Compression consists of sliding-window processing followed by Lempel–Ziv (LZ) compression, while the decompressor is based on a modification of the discrete universal denoiser (DUDE) algorithm that takes advantage of side information. The new algorithms not only universally attain the fundamental limits, but also suggest a paradigm for practical WZ coding. The effectiveness of our approach is illustrated with experiments on binary images and English text, using a low-complexity algorithm motivated by our class of universally optimal WZ codes.
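
    The encoder pipeline described above (sliding-window processing of the source followed by LZ compression) can be illustrated with a deliberately simplified sketch. The majority-vote window rule and the use of zlib below are placeholder choices for illustration only, not the paper's construction, and the decoder side (the side-information-aware DUDE variant) is omitted.

        import zlib

        def sliding_window_encode(source: bytes, window: int = 1) -> bytes:
            """Toy stand-in for a WZ-style encoder: replace each symbol by a coarse
            function of its length-(2*window+1) neighbourhood (here, the majority
            bit of the window), then LZ-compress the coarse sequence."""
            coarse = bytearray()
            for i in range(len(source)):
                neighbourhood = source[max(0, i - window): i + window + 1]
                ones = sum(b & 1 for b in neighbourhood)
                coarse.append(1 if 2 * ones > len(neighbourhood) else 0)
            return zlib.compress(bytes(coarse))  # DEFLATE, an LZ77-family compressor

        # Example: a repetitive binary sequence compresses well after the coarse mapping.
        data = bytes([1, 1, 0, 1, 0, 0, 1, 1] * 64)
        print(len(data), "->", len(sliding_window_encode(data)), "bytes")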

    Centralized vs Decentralized Multi-Agent Guesswork

    We study a notion of guesswork where multiple agents intend to launch a coordinated brute-force attack to find a single binary secret string, and each agent has access to side information generated through either a BEC or a BSC. The average number of trials required to find the secret string grows exponentially with the length of the string, and the rate of this growth is called the guesswork exponent. We compute the guesswork exponent for several multi-agent attacks. We show that a multi-agent attack reduces the guesswork exponent compared to a single agent, even when the agents do not exchange information to coordinate their attack and instead individually guess the secret string using a predetermined scheme, in a decentralized fashion. Further, we show that the guesswork exponent of two agents who do coordinate their attack is strictly smaller than that of any finite number of agents individually performing decentralized guesswork. Comment: Accepted at the IEEE International Symposium on Information Theory (ISIT) 201
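
    For orientation, the single-agent baseline that the abstract's exponent refers to can be estimated with a small Monte Carlo experiment. The sketch below assumes a uniform secret, BSC side information, and an optimal guesser that tries candidates in order of increasing Hamming distance from the observed string; it is an illustrative toy, not the paper's analysis.

        import math
        import random

        def avg_guesswork_bsc(n=12, p=0.2, trials=2000, seed=0):
            """Monte Carlo estimate of the average single-agent guesswork when the
            side information is the secret passed through a BSC(p).  The optimal
            guesser tries candidates in order of increasing Hamming distance from
            the observed string, breaking ties uniformly at random."""
            rng = random.Random(seed)
            total = 0.0
            for _ in range(trials):
                # For a uniform secret, the Hamming distance between the secret and
                # the BSC output is Binomial(n, p), so sampling the distance suffices.
                d = sum(rng.random() < p for _ in range(n))
                closer = sum(math.comb(n, k) for k in range(d))  # strictly better candidates
                ties = math.comb(n, d)                           # candidates at the same distance
                total += closer + (ties + 1) / 2                 # expected rank among the ties
            avg = total / trials
            return avg, math.log2(avg) / n                       # rough guesswork-exponent estimate

        print(avg_guesswork_bsc())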

    Information Equation of State

    Landauer's principle is applied to information in the universe. Once stars began forming, the increasing proportion of matter at high stellar temperatures compensated for the expanding universe to provide a near-constant information energy density. The information equation of state was close to the dark energy value, w = -1, for a wide range of redshifts, 10 > z > 0.8, over one half of cosmic time. A reasonable universe information bit content of only 10^87 bits is sufficient for information energy to account for all dark energy. A time-varying equation of state with a direct link between dark energy and matter, and linked to star formation in particular, is clearly relevant to the cosmic coincidence problem. In answering the "Why now?" question we wonder "What next?", as we expect the information equation of state to tend towards w = 0 in the future. Comment: 10 pages, 2 figures
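
    The headline figure can be sanity-checked with a back-of-the-envelope calculation: Landauer's principle assigns an energy of k_B T ln 2 per bit, so roughly 10^87 bits held at stellar-interior temperatures carry about as much energy as the dark-energy content of the observable universe. The temperature, dark-energy density, and volume below are assumed round numbers for illustration, not values taken from the paper.

        import math

        # Back-of-the-envelope check (all numbers are rough, assumed values).
        k_B   = 1.380649e-23           # Boltzmann constant, J/K
        T     = 1e7                    # assumed typical stellar-interior temperature, K
        bits  = 1e87                   # bit content quoted in the abstract
        E_bit = k_B * T * math.log(2)  # Landauer energy per bit, ~1e-16 J

        rho_de = 6e-10                 # approximate dark-energy density, J/m^3
        volume = 4e80                  # approximate volume of the observable universe, m^3

        print(f"information energy ~ {bits * E_bit:.1e} J")
        print(f"dark energy        ~ {rho_de * volume:.1e} J")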

    Universal Estimation of Directed Information

    Four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes are proposed, based on universal probability assignments. The first one is a Shannon–McMillan–Breiman type estimator, similar to those used by Verdú (2005) and Cai, Kulkarni, and Verdú (2006) for estimation of other information measures. We show the almost sure and L_1 convergence properties of the estimator for any underlying universal probability assignment. The other three estimators map universal probability assignments to different functionals, each exhibiting relative merits such as smoothness, nonnegativity, and boundedness. We establish the consistency of these estimators in almost sure and L_1 senses, and derive near-optimal rates of convergence in the minimax sense under mild conditions. These estimators carry over directly to estimating other information measures of stationary ergodic finite-alphabet processes, such as entropy rate and mutual information rate, with near-optimal performance, and provide alternatives to classical approaches in the existing literature. Guided by these theoretical results, the proposed estimators are implemented using the context-tree weighting algorithm as the universal probability assignment. Experiments on synthetic and real data are presented, demonstrating the potential of the proposed schemes in practice and the utility of directed information estimation in detecting and measuring causal influence and delay. Comment: 23 pages, 10 figures, to appear in IEEE Transactions on Information Theory
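
    The plug-in idea behind such estimators can be sketched as follows: feed a sequential probability assignment once with access to the past of both processes and once with access only to the past of Y, and average the log-ratio of the two predictions of Y_i. The sketch below uses a crude order-1 Laplace predictor as a stand-in for context-tree weighting; it is not one of the paper's four estimators and is only meant to convey the shape of the computation.

        import math
        import random
        from collections import defaultdict

        class LaplacePredictor:
            """Order-1 sequential probability assignment with add-one smoothing;
            a crude stand-in for the context-tree weighting assignment."""
            def __init__(self, alphabet=2):
                self.counts = defaultdict(lambda: [1] * alphabet)
            def prob_and_update(self, context, symbol):
                c = self.counts[context]
                p = c[symbol] / sum(c)
                c[symbol] += 1
                return p

        def directed_information_rate(x, y):
            """Plug-in estimate of (1/n) I(X^n -> Y^n): average log-ratio between a
            predictor of Y_i that sees (Y_{i-1}, X_i) and one that sees Y_{i-1} only."""
            with_x, without_x = LaplacePredictor(), LaplacePredictor()
            total = 0.0
            for i in range(len(y)):
                prev_y = y[i - 1] if i > 0 else None
                p_num = with_x.prob_and_update((prev_y, x[i]), y[i])
                p_den = without_x.prob_and_update(prev_y, y[i])
                total += math.log2(p_num / p_den)
            return total / len(y)

        # Example: Y is X observed through a BSC(0.1), so the estimate should come
        # out near 1 - h(0.1), roughly 0.53 bits per symbol.
        rng = random.Random(1)
        x = [rng.randint(0, 1) for _ in range(20000)]
        y = [b ^ (rng.random() < 0.1) for b in x]
        print(directed_information_rate(x, y))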

    The Generalized Area Theorem and Some of its Consequences

    There is a fundamental relationship between belief propagation and maximum a posteriori decoding. The case of transmission over the binary erasure channel was investigated in detail in a companion paper. This paper investigates the extension to general memoryless channels (paying special attention to the binary case). An area theorem for transmission over general memoryless channels is introduced and some of its many consequences are discussed. We show that this area theorem gives rise to an upper bound on the maximum a posteriori threshold for sparse graph codes. In situations where this bound is tight, the extrinsic soft-bit estimates delivered by the belief propagation decoder coincide with the correct a posteriori probabilities above the maximum a posteriori threshold. More generally, it is conjectured that the fundamental relationship between the maximum a posteriori and the belief propagation decoder which was observed for transmission over the binary erasure channel carries over to the general case. We finally demonstrate that, in order for the design rate of an ensemble to approach capacity under belief propagation decoding, the component codes have to be perfectly matched, a statement which is well known for the special case of transmission over the binary erasure channel. Comment: 27 pages, 46 ps figures
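
    For readers who know the binary erasure channel case, the following is the special case being generalized, recalled here from the standard literature rather than quoted from the paper: for a code of design rate r, the MAP EXIT function integrates to the rate, it vanishes below the MAP threshold, and the BP EXIT function dominates it pointwise; together these facts yield the upper bound on the MAP threshold mentioned in the abstract.

        \int_0^1 h^{\mathrm{MAP}}(\epsilon)\,\mathrm{d}\epsilon = r,
        \qquad
        h^{\mathrm{BP}}(\epsilon) \ge h^{\mathrm{MAP}}(\epsilon)
        \;\Longrightarrow\;
        \epsilon^{\mathrm{MAP}} \le \epsilon^{*},
        \quad\text{where } \int_{\epsilon^{*}}^{1} h^{\mathrm{BP}}(\epsilon)\,\mathrm{d}\epsilon = r .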

    Implementation of two-party protocols in the noisy-storage model

    The noisy-storage model allows the implementation of secure two-party protocols under the sole assumption that no large-scale reliable quantum storage is available to the cheating party. No quantum storage is thereby required for the honest parties. Examples of such protocols include bit commitment, oblivious transfer and secure identification. Here, we provide a guideline for the practical implementation of such protocols. In particular, we analyze security in a practical setting where the honest parties themselves are unable to perform perfect operations and need to deal with practical problems such as errors during transmission and detector inefficiencies. We provide explicit security parameters for two different experimental setups, using weak coherent and parametric down-conversion sources. In addition, we analyze a modification of the protocols based on decoy states. Comment: 41 pages, 33 figures; this is a companion paper to arXiv:0906.1030 considering practical aspects. v2: published version, title changed in accordance with PRA guidelines

    A measure of statistical complexity based on predictive information with application to finite spin systems

    NOTICE: this is the author's version of a work that was accepted for publication in Physics Letters A. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Physics Letters A, 376 (4): 275-281, Jan 2012. DOI: 10.1016/j.physleta.2011.10.066

    Quantum to Classical Randomness Extractors

    The goal of randomness extraction is to distill (almost) perfect randomness from a weak source of randomness. When the source yields a classical string X, many extractor constructions are known. Yet, when considering a physical randomness source, X is itself ultimately the result of a measurement on an underlying quantum system. When characterizing the power of a source to supply randomness, it is hence natural to ask how much classical randomness we can extract from a quantum system. To tackle this question, we here take up the study of quantum-to-classical randomness extractors (QC-extractors). We provide constructions of QC-extractors based on measurements in a full set of mutually unbiased bases (MUBs), and certain single-qubit measurements. As the first application, we show that any QC-extractor gives rise to entropic uncertainty relations with respect to quantum side information. Such relations were previously only known for two measurements. As the second application, we resolve the central open question in the noisy-storage model [Wehner et al., PRL 100, 220502 (2008)] by linking security to the quantum capacity of the adversary's storage device. Comment: 6+31 pages, 2 tables, 1 figure; v2: improved converse parameters, typos corrected, new discussion; v3: new reference
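
    As a concrete toy picture of measuring in a full set of MUBs, the single-qubit case uses the three Pauli eigenbases. The sketch below simulates such a measurement with a randomly chosen basis; it only illustrates the measurement primitive on one qubit and is not the paper's extractor construction, which works with full sets of MUBs in higher dimensions and carries a security analysis against quantum side information.

        import numpy as np

        rng = np.random.default_rng(0)

        # The three single-qubit mutually unbiased bases: eigenbases of Pauli Z, X, Y.
        Z = np.array([[1, 0], [0, 1]], dtype=complex)
        X = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
        Y = np.array([[1, 1], [1j, -1j]], dtype=complex) / np.sqrt(2)
        MUBS = [Z, X, Y]  # columns of each matrix are the basis vectors

        def measure_in_random_mub(psi):
            """Pick one of the three MUBs uniformly at random, measure |psi> in it,
            and return (basis index, outcome bit)."""
            b = rng.integers(len(MUBS))
            amplitudes = MUBS[b].conj().T @ psi
            probs = np.abs(amplitudes) ** 2
            probs /= probs.sum()                  # guard against rounding error
            outcome = rng.choice(2, p=probs)      # Born rule
            return b, int(outcome)

        # For the fixed state |0>, the Z-basis outcome is deterministic while the
        # X- and Y-basis outcomes are fair coin flips.
        psi = np.array([1, 0], dtype=complex)
        samples = [measure_in_random_mub(psi) for _ in range(9000)]
        for b in range(3):
            outs = [o for (bb, o) in samples if bb == b]
            print(f"basis {b}: fraction of ones = {sum(outs) / len(outs):.2f}")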