
    Unique Information and Secret Key Decompositions

    The unique information (UI) is an information measure that quantifies a deviation from the Blackwell order. We have recently shown that this quantity is an upper bound on the one-way secret key rate. In this paper, we prove a triangle inequality for the UI, which implies that the UI is never greater than one of the best known upper bounds on the two-way secret key rate. We conjecture that the UI lower-bounds the two-way rate and discuss implications of the conjecture. Comment: 7 pages.
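
    To make the quantity concrete, the sketch below poses the BROJA-style unique information UI(S; X\Y) as a constrained minimization of I_Q(S; X | Y) over joint distributions Q that preserve the (S, X) and (S, Y) marginals, solved here by a generic SLSQP solver for binary variables. This is a minimal illustration under stated assumptions (tiny alphabets, a general-purpose solver rather than the dedicated convex solvers used in practice); the helper name `ui` is ours, not the paper's.

```python
# Brute-force sketch of the BROJA unique information for binary S, X, Y.
# Assumptions: 2x2x2 alphabets, generic SLSQP instead of a dedicated
# convex solver; illustrative only.
import numpy as np
from scipy.optimize import minimize

def ui(p):
    """min over Q preserving the (S,X) and (S,Y) marginals of I_Q(S; X | Y)."""
    p = np.asarray(p, float)
    psx, psy = p.sum(2), p.sum(1)          # the marginals fixed by Delta_P

    def cond_mi(q):                         # I_Q(S; X | Y) in bits
        q = q.reshape(2, 2, 2) + 1e-12
        qy, qsy, qxy = q.sum((0, 1)), q.sum(1), q.sum(0)
        return np.sum(q * np.log2(q * qy[None, None, :]
                                  / (qsy[:, None, :] * qxy[None, :, :])))

    cons = [{'type': 'eq',
             'fun': lambda q: q.reshape(2, 2, 2).sum(2).ravel() - psx.ravel()},
            {'type': 'eq',
             'fun': lambda q: q.reshape(2, 2, 2).sum(1).ravel() - psy.ravel()}]
    res = minimize(cond_mi, p.ravel(), bounds=[(0, 1)] * 8,
                   constraints=cons, method='SLSQP')
    return res.fun

# AND gate: S = X AND Y with X, Y independent uniform bits.
P = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        P[x & y, x, y] = 0.25
print(f"UI(S; X\\Y) for AND ~ {ui(P):.3f} bits")   # close to 0
```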

    Unique Informations and Deficiencies

    Given two channels that convey information about the same random variable, we introduce two measures of the unique information of one channel with respect to the other. The two quantities are based on the notion of generalized weighted Le Cam deficiencies and differ in whether one channel can approximate the other by a randomization at either its input or its output. We relate the proposed quantities to an existing measure of unique information, which we call the minimum-synergy unique information. We give an operational interpretation of the latter in terms of an upper bound on the one-way secret key rate and discuss the role of the unique informations in the context of nonnegative mutual information decompositions into unique, redundant, and synergistic components. Comment: 13 pages, 2 figures. The material in this manuscript was presented at the 56th Annual Allerton Conference on Communication, Control, and Computing, 2018. This manuscript contains some corrections: most notably, Lemma 18 was removed and Proposition 28 was corrected. The numbering of equations and results in this version agrees with the numbering of the published version.
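
    To ground the notion, the sketch below poses one simple variant, an input-weighted total-variation output deficiency (how well post-processing channel kappa by some channel lambda can imitate channel mu), as a linear program. The paper works with generalized weighted Le Cam deficiencies, which need not coincide with this variant; the helper `output_deficiency` and the formulation are illustrative assumptions.

```python
# Weighted total-variation output deficiency as a linear program:
# min over channels lambda of sum_s w[s] * sum_y |(lambda K)[y,s] - M[y,s]|.
# Illustrative variant, not necessarily the paper's exact deficiency.
import numpy as np
from scipy.optimize import linprog

def output_deficiency(K, M, w):
    """K[x,s], M[y,s]: column-stochastic channels from S; w: weights on S."""
    nx, ns = K.shape
    ny = M.shape[0]
    nL, nt = ny * nx, ny * ns                 # lambda entries, |.| slack vars
    c = np.concatenate([np.zeros(nL), np.tile(w, ny)])   # t[y,s] costs w[s]

    A_eq = np.zeros((nx, nL + nt))            # each lambda column sums to 1
    for x in range(nx):
        A_eq[x, [y * nx + x for y in range(ny)]] = 1.0
    b_eq = np.ones(nx)

    A_ub = np.zeros((2 * nt, nL + nt))        # +-(lambda K - M)[y,s] <= t[y,s]
    b_ub = np.zeros(2 * nt)
    for y in range(ny):
        for s in range(ns):
            r = y * ns + s
            A_ub[r, y * nx:(y + 1) * nx] = K[:, s]
            A_ub[r + nt, y * nx:(y + 1) * nx] = -K[:, s]
            A_ub[r, nL + r] = A_ub[r + nt, nL + r] = -1.0
            b_ub[r], b_ub[r + nt] = M[y, s], -M[y, s]

    res = linprog(c, A_ub, b_ub, A_eq, b_eq,
                  bounds=[(0, None)] * (nL + nt))
    return res.fun

# A binary symmetric channel can imitate a noisier one, but not vice versa.
bsc = lambda e: np.array([[1 - e, e], [e, 1 - e]])
w = np.array([0.5, 0.5])
print(output_deficiency(bsc(0.1), bsc(0.2), w))   # ~ 0.0
print(output_deficiency(bsc(0.2), bsc(0.1), w))   # > 0
```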

    Unique Information and Secret Key Agreement

    The partial information decomposition (PID) is a promising framework for decomposing a joint random variable into the amount of influence each source variable X_i has on a target variable Y, relative to the other sources. For two sources, this influence breaks down into the information that both X_0 and X_1 redundantly share with Y, what X_0 uniquely shares with Y, what X_1 uniquely shares with Y, and finally what X_0 and X_1 synergistically share with Y. Unfortunately, considerable disagreement has arisen as to how these four components should be quantified. Drawing from cryptography, we consider the secret key agreement rate as an operational method of quantifying unique informations. The secret key agreement rate comes in several forms, depending upon which parties are permitted to communicate. We demonstrate that three of these four forms are inconsistent with the PID. The remaining form implies certain interpretations as to the PID's meaning---interpretations not present in the PID's definition but that, we argue, need to be made explicit. These reveal an inconsistency between third-order connected information, the two-way secret key agreement rate, and synergy. Similar difficulties arise with a popular PID measure in light of the results here, as well as from a maximum-entropy viewpoint. We close by reviewing the challenges facing the PID. Comment: 9 pages, 3 figures, 4 tables; http://csc.ucdavis.edu/~cmg/compmech/pubs/pid_skar.htm. arXiv admin note: text overlap with arXiv:1808.0860
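
    The bivariate bookkeeping this abstract refers to is mechanical once a redundancy measure is fixed: the three mutual informations pin down the unique and synergistic parts. The sketch below uses Williams and Beer's I_min purely as a placeholder redundancy (the abstract's point is precisely that the right choice is contested) and checks the XOR gate, where all the information is synergistic.

```python
# Bivariate PID bookkeeping: R + U0 = I(X0;Y), R + U1 = I(X1;Y),
# R + U0 + U1 + S = I(X0,X1;Y). Redundancy here is Williams-Beer I_min,
# used only as an illustrative placeholder.
import numpy as np

def mi(pxy):
    """Mutual information (bits) of a 2-D joint distribution."""
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

def i_min(p):
    """Williams-Beer redundancy for p[x0, x1, y]."""
    py = p.sum((0, 1))
    red = 0.0
    for y in np.nonzero(py)[0]:
        specs = []
        for pxiy in (p.sum(1), p.sum(0)):   # p(x0, y) and p(x1, y)
            pxi, col = pxiy.sum(1), pxiy[:, y]
            nz = col > 0
            # specific information I(Y=y; Xi)
            specs.append(np.sum(col[nz] / py[y]
                                * np.log2(col[nz] / (pxi[nz] * py[y]))))
        red += py[y] * min(specs)
    return red

def pid(p):
    i0, i1 = mi(p.sum(1)), mi(p.sum(0))
    i01 = mi(p.reshape(-1, p.shape[2]))
    r = i_min(p)
    u0, u1 = i0 - r, i1 - r
    return r, u0, u1, i01 - r - u0 - u1

# XOR: Y = X0 ^ X1 -- all information is synergistic.
p = np.zeros((2, 2, 2))
for a in (0, 1):
    for b in (0, 1):
        p[a, b, a ^ b] = 0.25
print("R, U0, U1, S =", [round(v, 3) for v in pid(p)])   # [0, 0, 0, 1]
```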

    Secret Sharing and Shared Information

    Secret sharing is a cryptographic discipline in which the goal is to distribute information about a secret over a set of participants in such a way that only specific authorized combinations of participants can together reconstruct the secret. Thus, secret sharing schemes are systems of variables in which it is very clearly specified which subsets have information about the secret. As such, they provide perfect model systems for information decompositions. However, following this intuition too far leads to an information decomposition with negative partial information terms, which are difficult to interpret. One possible explanation is that the partial information lattice proposed by Williams and Beer is incomplete and has to be extended to incorporate terms corresponding to higher-order redundancy. These results put bounds on information decompositions that follow the partial information framework, and they hint at where the partial information lattice needs to be improved. Comment: 9 pages, 1 figure. The material was presented at a workshop on information decompositions at FIAS, Frankfurt, in December 2016. The revision includes changes in the definition of combinations of secret sharing schemes; Section 3 and the Appendix now discuss to what extent existing measures satisfy the proposed properties, and the concluding section is considerably revised.
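
    The simplest model system meant here is 2-out-of-2 XOR secret sharing, whose access structure can be verified directly with mutual information: each share alone carries nothing about the secret, while both together determine it. This is a self-contained illustration, not code from the paper.

```python
# 2-out-of-2 XOR secret sharing, checked with mutual information.
from collections import Counter
from itertools import product
from math import log2

def mi(pairs):
    """Mutual information (bits) from equally likely (a, b) outcome pairs."""
    n = len(pairs)
    pab = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum(c / n * log2((c / n) / (pa[a] / n * pb[b] / n))
               for (a, b), c in pab.items())

# Secret s, uniform random mask r; the shares are r and s XOR r.
samples = [(s, r, s ^ r) for s, r in product((0, 1), repeat=2)]
print("I(secret; share1)        =", mi([(s, r) for s, r, _ in samples]))       # 0.0
print("I(secret; share2)        =", mi([(s, t) for s, _, t in samples]))       # 0.0
print("I(secret; share1,share2) =", mi([(s, (r, t)) for s, r, t in samples]))  # 1.0
```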

    A Perspective on Unique Information: Directionality, Intuitions, and Secret Key Agreement

    Recently, the partial information decomposition emerged as a promising framework for identifying the meaningful components of the information contained in a joint distribution. Its adoption and practical application, however, have been stymied by the lack of a generally accepted method of quantifying its components. Here, we briefly discuss the bivariate (two-source) partial information decomposition and two implicitly directional interpretations used to intuitively motivate alternative component definitions. Drawing parallels with secret key agreement rates from information-theoretic cryptography, we demonstrate that these intuitions are mutually incompatible and suggest that this underlies the persistence of competing definitions and interpretations. Having highlighted this hitherto unacknowledged issue, we outline several possible solutions. Comment: 5 pages, 3 tables; http://csc.ucdavis.edu/~cmg/compmech/pubs/pid_intuition.ht

    Unique Information via Dependency Constraints

    The partial information decomposition (PID) is perhaps the leading proposal for resolving the information shared between a set of sources and a target into redundant, synergistic, and unique constituents. Unfortunately, the PID framework has been hindered by the lack of a generally agreed-upon, multivariate method of quantifying the constituents. Here, we take a step toward rectifying this by developing a decomposition based on a new method of quantifying unique information. We first develop a broadly applicable method---the dependency decomposition---that delineates how statistical dependencies influence the structure of a joint distribution. The dependency decomposition then allows us to define a measure of the information about a target that can be uniquely attributed to a particular source, as the least amount by which the source-target statistical dependency can influence the information shared between the sources and the target. The result is the first measure that satisfies the core axioms of the PID framework while not satisfying the Blackwell relation, which depends on a particular interpretation of how the variables are related. This marks a key step toward a practical PID. Comment: 15 pages, 7 figures, 2 tables, 3 appendices; http://csc.ucdavis.edu/~cmg/compmech/pubs/idep.ht
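
    A concrete ingredient behind constructions of this kind is the maximum-entropy joint distribution that retains only a chosen set of marginals. The sketch below finds it by iterative proportional fitting; this is a generic construction for illustration, not the authors' dependency lattice or their exact procedure, and the helper name `maxent_pairwise` is ours.

```python
# Maximum-entropy distribution matching selected pairwise marginals,
# via iterative proportional fitting (IPF). Illustrative only.
import numpy as np

def maxent_pairwise(p, keep, iters=500):
    """Max-entropy q matching p's marginals over the axis tuples in `keep`."""
    q = np.full(p.shape, 1.0 / p.size)               # start from uniform
    for _ in range(iters):
        for axes in keep:
            drop = tuple(i for i in range(p.ndim) if i not in axes)
            target = p.sum(axis=drop, keepdims=True)
            cur = q.sum(axis=drop, keepdims=True)
            ratio = np.divide(target, cur, out=np.zeros_like(target),
                              where=cur > 0)
            q = q * ratio                            # rescale to match marginal
    return q

# AND gate: keep the X0-Y and X1-Y dependencies, drop the X0-X1 one.
p = np.zeros((2, 2, 2))
for a in (0, 1):
    for b in (0, 1):
        p[a, b, a & b] = 0.25
q = maxent_pairwise(p, keep=[(0, 2), (1, 2)])

H = lambda d: -np.sum(d[d > 0] * np.log2(d[d > 0]))
print(f"H(p) = {H(p):.3f} bits, H(q) = {H(q):.3f} bits")   # H(q) >= H(p)
```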

    Tensor-based trapdoors for CVP and their application to public key cryptography

    We propose two trapdoors for the Closest Vector Problem (CVP) in lattices, related to the lattice tensor product. Using these trapdoors, we set up a lattice-based cryptosystem that resembles the McEliece scheme.
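
    As background intuition for why such trapdoors can work (an illustration of the generic GGH-style idea, not the paper's tensor construction): Babai rounding solves CVP well given a short, nearly orthogonal private basis, and poorly given an equivalent skewed public basis of the same lattice.

```python
# Babai rounding for approximate CVP: good vs. bad basis of one lattice.
import numpy as np

def babai_round(B, t):
    """Approximate closest lattice vector to t, for basis B (columns)."""
    return B @ np.rint(np.linalg.solve(B, t))

good = np.array([[17.0, 1.0], [2.0, 19.0]])   # short, nearly orthogonal
U = np.array([[1.0, 3.0], [2.0, 7.0]])        # unimodular (det = 1)
bad = good @ U                                # same lattice, skewed basis

v = good @ np.array([3.0, -2.0])              # a lattice point
t = v + np.array([1.3, -0.9])                 # nearby non-lattice target

print("good basis error:", np.linalg.norm(babai_round(good, t) - v))  # 0.0
print("bad basis error: ", np.linalg.norm(babai_round(bad, t) - v))   # ~44.3
```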

    Measuring multivariate redundant information with pointwise common change in surprisal

    The problem of how to properly quantify redundant information is an open question that has been the subject of much recent research. Redundant information refers to information about a target variable S that is common to two or more predictor variables X_i. It can be thought of as quantifying overlapping information content or similarities in the representation of S between the X_i. We present a new measure of redundancy which measures the common change in surprisal shared between variables at the local, or pointwise, level. We provide a game-theoretic operational definition of unique information and use this to derive constraints which are used to obtain a maximum-entropy distribution. Redundancy is then calculated from this maximum-entropy distribution by counting only those local co-information terms which admit an unambiguous interpretation as redundant information. We show how this redundancy measure can be used within the framework of the partial information decomposition (PID) to give an intuitive decomposition of the multivariate mutual information into redundant, unique, and synergistic contributions. We compare our new measure to existing approaches over a range of example systems, including continuous Gaussian variables. Matlab code for the measure is provided, including all considered examples.
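
    The pointwise quantity at the center of this measure can be written down directly: the local co-information of a single outcome, whose sign is what gets inspected before a term is counted as redundant. The sketch below shows only this local term (in Python rather than the provided Matlab); the full measure additionally requires the paper's game-theoretically derived maximum-entropy distribution.

```python
# Local (pointwise) co-information i(x0; x1; y) for one outcome of
# p[x0, x1, y]; positive values are the candidates for redundancy.
import numpy as np

def local_coinfo(p, idx):
    """i(x0; x1; y) in bits, by inclusion-exclusion over marginals."""
    x0, x1, y = idx
    p01, p0y, p1y = p.sum(2), p.sum(1), p.sum(0)
    p0, p1, py = p.sum((1, 2)), p.sum((0, 2)), p.sum((0, 1))
    return np.log2(p01[x0, x1] * p0y[x0, y] * p1y[x1, y]
                   / (p0[x0] * p1[x1] * py[y] * p[x0, x1, y]))

# Fully redundant system (X0 = X1 = Y, one shared fair bit): every
# realized outcome carries +1 bit of local co-information.
p = np.zeros((2, 2, 2))
p[0, 0, 0] = p[1, 1, 1] = 0.5
print(local_coinfo(p, (0, 0, 0)))   # 1.0
```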