Quantifying unique information
We propose new measures of shared information, unique information and
synergistic information that can be used to decompose the multi-information of
a pair of random variables (Y, Z) with a third random variable X. Our
measures are motivated by an operational idea of unique information which
suggests that shared information and unique information should depend only on
the marginal distributions of the pairs (X, Y) and (X, Z). Although this
invariance property has not been studied before, it is satisfied by other
proposed measures of shared information. The invariance property does not
uniquely determine our new measures, but it implies that the functions that we
define are bounds to any other measures satisfying the same invariance
property. We study properties of our measures and compare them to other
candidate measures.
Comment: 24 pages, 2 figures. Version 2 contains fewer typos than version 1.
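For concreteness, here is a minimal numerical sketch of the optimization behind these measures, assuming the formulation in which the unique information UI(X; Y\Z) is the minimum of the conditional mutual information I_Q(X; Y | Z) over all joint distributions Q that share the (X, Y) and (X, Z) marginals of P. The sketch uses scipy's general-purpose SLSQP solver on the binary AND gate (X = Y AND Z with uniform inputs); it is not the authors' construction, convergence is not guaranteed in general, and the dedicated algorithm described under "Computing the Unique Information" below targets the same problem.

import itertools
import numpy as np
from scipy.optimize import minimize

# Joint distribution of (X, Y, Z) for the AND gate: Y, Z uniform bits, X = Y AND Z.
P = np.zeros((2, 2, 2))  # axes: x, y, z
for y, z in itertools.product((0, 1), repeat=2):
    P[y & z, y, z] = 0.25

def cmi_x_y_given_z(q_flat):
    """Conditional mutual information I_Q(X; Y | Z) in bits."""
    q = q_flat.reshape(2, 2, 2)
    total = 0.0
    for z in (0, 1):
        qz = q[:, :, z].sum()
        if qz <= 0:
            continue
        qxy = q[:, :, z] / qz                    # Q(x, y | z)
        qx = qxy.sum(axis=1, keepdims=True)      # Q(x | z)
        qy = qxy.sum(axis=0, keepdims=True)      # Q(y | z)
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(qxy > 0, qxy * np.log2(qxy / (qx * qy)), 0.0)
        total += qz * terms.sum()
    return total

# Feasible set Delta_P: nonnegative Q(x, y, z) with the same (X, Y) and (X, Z)
# marginals as P (normalization follows from the marginal constraints).
constraints = []
for x in (0, 1):
    for y in (0, 1):
        constraints.append({"type": "eq", "fun":
            lambda q, x=x, y=y: q.reshape(2, 2, 2)[x, y, :].sum() - P[x, y, :].sum()})
    for z in (0, 1):
        constraints.append({"type": "eq", "fun":
            lambda q, x=x, z=z: q.reshape(2, 2, 2)[x, :, z].sum() - P[x, :, z].sum()})

result = minimize(cmi_x_y_given_z, P.ravel() + 1e-3, method="SLSQP",
                  bounds=[(0.0, 1.0)] * 8, constraints=constraints,
                  options={"maxiter": 1000, "ftol": 1e-12})

print("UI(X; Y\\Z) for AND ~ %.4f bits" % result.fun)  # the known value for AND is 0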
A Perspective on Unique Information: Directionality, Intuitions, and Secret Key Agreement
Recently, the partial information decomposition emerged as a promising
framework for identifying the meaningful components of the information
contained in a joint distribution. Its adoption and practical application,
however, have been stymied by the lack of a generally-accepted method of
quantifying its components. Here, we briefly discuss the bivariate (two-source)
partial information decomposition and two implicitly directional
interpretations used to intuitively motivate alternative component definitions.
Drawing parallels with secret key agreement rates from information-theoretic
cryptography, we demonstrate that these intuitions are mutually incompatible
and suggest that this underlies the persistence of competing definitions and
interpretations. Having highlighted this hitherto unacknowledged issue, we
outline several possible solutions.
Comment: 5 pages, 3 tables; http://csc.ucdavis.edu/~cmg/compmech/pubs/pid_intuition.ht
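For orientation, the bivariate decomposition discussed here splits the mutual informations into redundant (R), unique (U_0, U_1) and synergistic (S) parts, tied together by the usual consistency equations of the Williams and Beer framework (notation ours):

\[
\begin{aligned}
I(X_0 X_1 : Y) &= R + U_0 + U_1 + S,\\
I(X_0 : Y) &= R + U_0,\\
I(X_1 : Y) &= R + U_1.
\end{aligned}
\]

Fixing any one of the four parts therefore determines the other three, which is why a single well-motivated definition of, say, unique information suffices to pin down the whole bivariate decomposition.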
Unique Information and Secret Key Agreement
The partial information decomposition (PID) is a promising framework for
decomposing a joint random variable into the amount of influence each source
variable Xi has on a target variable Y, relative to the other sources. For two
sources, influence breaks down into the information that both X0 and X1
redundantly share with Y, what X0 uniquely shares with Y, what X1 uniquely
shares with Y, and finally what X0 and X1 synergistically share with Y.
Unfortunately, considerable disagreement has arisen as to how these four
components should be quantified. Drawing from cryptography, we consider the
secret key agreement rate as an operational method of quantifying unique
informations. Secret key agreement rate comes in several forms, depending upon
which parties are permitted to communicate. We demonstrate that three of these
four forms are inconsistent with the PID. The remaining form implies certain
interpretations as to the PID's meaning---interpretations not present in the
PID's definition but that, we argue, need to be made explicit. These reveal an
inconsistency between third-order connected information, two-way secret key
agreement rate, and synergy. Similar difficulties arise with a popular PID
measure in light of the results here, as well as from a maximum entropy viewpoint.
We close by reviewing the challenges facing the PID.
Comment: 9 pages, 3 figures, 4 tables; http://csc.ucdavis.edu/~cmg/compmech/pubs/pid_skar.htm. arXiv admin note: text overlap with arXiv:1808.0860
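The several forms of the secret key agreement rate referred to above differ in which of the two key-agreeing parties may send public messages while the remaining variable plays the eavesdropper. Writing S with a subscript for the allowed communication pattern (none, one-way in either direction, or two-way), the rates are ordered as below, since allowing more communication can never decrease the achievable rate (notation illustrative, not the paper's):

\[
S_{\varnothing}(X_0 : Y \,\|\, X_1)\;\le\;S_{X_0\to Y}(X_0 : Y \,\|\, X_1),\ S_{Y\to X_0}(X_0 : Y \,\|\, X_1)\;\le\;S_{\leftrightarrow}(X_0 : Y \,\|\, X_1).
\]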
Unique Information via Dependency Constraints
The partial information decomposition (PID) is perhaps the leading proposal
for resolving information shared between a set of sources and a target into
redundant, synergistic, and unique constituents. Unfortunately, the PID
framework has been hindered by a lack of a generally agreed-upon, multivariate
method of quantifying the constituents. Here, we take a step toward rectifying
this by developing a decomposition based on a new method that quantifies unique
information. We first develop a broadly applicable method---the dependency
decomposition---that delineates how statistical dependencies influence the
structure of a joint distribution. The dependency decomposition then allows us
to define a measure of the information about a target that can be uniquely
attributed to a particular source as the least amount by which the source-target
statistical dependency can influence the information shared between the sources
and the target. The result is the first measure that satisfies the core axioms
of the PID framework while not satisfying the Blackwell relation, which depends
on a particular interpretation of how the variables are related. This marks a key step toward a practical PID.
Comment: 15 pages, 7 figures, 2 tables, 3 appendices; http://csc.ucdavis.edu/~cmg/compmech/pubs/idep.ht
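The dependency decomposition works with distributions in which some pairwise source-target dependencies are retained while others are removed. One generic way to construct such constrained distributions, offered here only as an illustration and not as the paper's exact machinery, is iterative proportional fitting, which converges to the maximum-entropy distribution matching a chosen set of marginals. A minimal sketch for three binary variables:

import itertools
import numpy as np

def ipf(p, marginal_axes, iters=200):
    """Iterative proportional fitting: starting from the uniform distribution,
    repeatedly rescale so that each requested marginal matches that of p.
    The limit is the maximum-entropy distribution with those marginals
    (assuming the constraints are consistent)."""
    q = np.full(p.shape, 1.0 / p.size)
    for _ in range(iters):
        for axes in marginal_axes:
            drop = tuple(i for i in range(p.ndim) if i not in axes)
            target = p.sum(axis=drop, keepdims=True)
            current = q.sum(axis=drop, keepdims=True)
            with np.errstate(divide="ignore", invalid="ignore"):
                q = q * np.where(current > 0, target / current, 0.0)
    return q

# Example: X0, X1 independent uniform bits and Y = X0 XOR X1 (axes: x0, x1, y).
p = np.zeros((2, 2, 2))
for x0, x1 in itertools.product((0, 1), repeat=2):
    p[x0, x1, x0 ^ x1] = 0.25

# Keep only the (X0, Y) and (X1, Y) dependencies and drop the rest; for XOR the
# result is the uniform distribution, i.e. neither single-source dependency
# carries any information about Y on its own.
q = ipf(p, marginal_axes=[(0, 2), (1, 2)])
print(np.round(q.ravel(), 3))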
The Blackwell relation defines no lattice
Blackwell's theorem shows the equivalence of two preorders on the set of
information channels. Here, we restate, and slightly generalize, his result in
terms of random variables. Furthermore, we prove that the corresponding partial
order is not a lattice; that is, least upper bounds and greatest lower bounds
do not exist.
Comment: 5 pages, 1 figure.
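For reference, one standard way to state the preorder in question: a channel kappa is at least as informative as a channel mu (with the same input alphabet) exactly when mu can be obtained from kappa by post-processing with some garbling channel lambda,

\[
\kappa \succeq \mu \quad\Longleftrightarrow\quad \exists\,\lambda:\ \mu = \lambda \circ \kappa,
\]

and Blackwell's theorem identifies this garbling order with, roughly, the order induced by expected utility across arbitrary decision problems.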
Unique Information and Secret Key Decompositions
The unique information (UI) is an information measure that quantifies a
deviation from the Blackwell order. We have recently shown that this quantity
is an upper bound on the one-way secret key rate. In this paper, we prove a
triangle inequality for the UI, which implies that the UI is never greater
than one of the best known upper bounds on the two-way secret key rate. We
conjecture that the UI lower bounds the two-way rate and discuss implications
of the conjecture.
Comment: 7 pages.
Secret Sharing and Shared Information
Secret sharing is a cryptographic discipline in which the goal is to
distribute information about a secret over a set of participants in such a way
that only specific authorized combinations of participants together can
reconstruct the secret. Thus, secret sharing schemes are systems of variables
in which it is very clearly specified which subsets have information about the
secret. As such, they provide perfect model systems for information
decompositions. However, following this intuition too far leads to an
information decomposition with negative partial information terms, which are
difficult to interpret. One possible explanation is that the partial
information lattice proposed by Williams and Beer is incomplete and has to be
extended to incorporate terms corresponding to higher order redundancy. These
results put bounds on information decompositions that follow the partial
information framework, and they hint at where the partial information lattice
needs to be improved.
Comment: 9 pages, 1 figure. The material was presented at a Workshop on information decompositions at FIAS, Frankfurt, in 12/2016. The revision includes changes in the definition of combinations of secret sharing schemes. Section 3 and the Appendix now discuss to what extent existing measures satisfy the proposed properties. The concluding section is considerably revised.
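As a concrete instance of a scheme in which it is completely clear which subsets know the secret, here is the textbook 2-out-of-2 one-time-pad construction (not taken from the paper): each share on its own is uniform and independent of the secret, while the two shares together determine it, so the secret is carried purely synergistically.

import secrets

def split(secret: int, nbits: int = 8) -> tuple[int, int]:
    # 2-out-of-2 XOR secret sharing: share0 is a fresh uniform pad,
    # share1 is the secret masked by that pad.
    share0 = secrets.randbits(nbits)
    share1 = secret ^ share0
    return share0, share1

def reconstruct(share0: int, share1: int) -> int:
    # Only the full authorized set {share0, share1} recovers the secret;
    # each share alone is statistically independent of it.
    return share0 ^ share1

s = 0b10110010
s0, s1 = split(s)
assert reconstruct(s0, s1) == s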
Computing the Unique Information
Given a pair of predictor variables and a response variable, how much
information do the predictors have about the response, and how is this
information distributed between unique, redundant, and synergistic components?
Recent work has proposed to quantify the unique component of the decomposition
as the minimum value of the conditional mutual information over a constrained
set of information channels. We present an efficient iterative divergence
minimization algorithm to solve this optimization problem with convergence
guarantees and evaluate its performance against other techniques.
Comment: To appear in 2018 IEEE International Symposium on Information Theory (ISIT); 18 pages; 4 figures, 1 table; GitHub link to source code: https://github.com/infodeco/computeU
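Spelled out in the notation of the first abstract above (our transcription), the optimization problem this algorithm solves is

\[
UI(X; Y \backslash Z) \;=\; \min_{Q \in \Delta_P} I_Q(X; Y \mid Z),
\qquad
\Delta_P \;=\; \{\, Q : Q_{XY} = P_{XY},\ Q_{XZ} = P_{XZ} \,\},
\]

a convex program over the polytope of joint distributions with prescribed pairwise marginals; the remaining components of the decomposition then follow from the consistency equations listed earlier.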