An operational information decomposition via synergistic disclosure
Abstract: Multivariate information decompositions hold promise to yield insight into complex systems, and stand out for their ability to identify synergistic phenomena. However, the adoption of these approaches has been hindered by there being multiple possible decompositions, and no precise guidance for preferring one over the others. At the heart of this disagreement lies the absence of a clear operational interpretation of what synergistic information is. Here we fill this gap by proposing a new information decomposition based on a novel operationalisation of informational synergy, which leverages recent developments in the data privacy literature. Our decomposition is defined for any number of information sources, and its atoms can be calculated using elementary optimisation techniques. The decomposition provides a natural coarse-graining that scales gracefully with the system's size, and is applicable in a wide range of scenarios of practical interest.
System Information Decomposition
To characterize complex higher-order interactions among the variables of a system, we introduce a new framework for decomposing the information entropy of those variables, termed System Information Decomposition (SID). Unlike Partial Information Decomposition (PID) methods, which quantify the interaction between a single target variable and a collection of source variables, SID extends those approaches by treating the interactions among all system variables on an equal footing. Specifically, we establish the robustness of the SID framework by proving that all of its information atoms are symmetric, which detaches unique, redundant, and synergistic information from any specific target variable and allows these atoms to describe relationships among the variables themselves. Additionally, we analyze the relationship between SID and existing information measures, and propose several properties that quantitative SID methods should satisfy. Furthermore, using an illustrative example, we demonstrate that SID uncovers higher-order interaction relationships among variables that cannot be captured by current probabilistic and information-theoretic measures, and we provide two approximate calculation methods verified on this case. This advance in higher-order measures enables SID to explain why holism posits that some systems cannot be decomposed without loss of characteristics under existing measures, and offers a potential quantitative framework for higher-order relationships across a broad spectrum of disciplines.
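As a concrete illustration of the synergistic interactions these decompositions target, the XOR gate is the textbook case: neither input alone carries any information about the output, yet the two inputs jointly determine it completely. The sketch below (an illustration assumed here, not code from either paper; the helper `mutual_information` is a hypothetical name) computes the relevant mutual informations directly from the joint distribution.

```python
import math
from collections import Counter
from itertools import product

def mutual_information(pairs):
    """Mutual information (in bits) between the two components of (a, b)
    samples, assuming each listed outcome is equally likely."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), count in p_ab.items():
        p = count / n
        mi += p * math.log2(p / ((p_a[a] / n) * (p_b[b] / n)))
    return mi

# XOR gate with uniform binary inputs: Z = X xor Y
samples = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

mi_xz = mutual_information([(x, z) for x, y, z in samples])        # 0 bits
mi_yz = mutual_information([(y, z) for x, y, z in samples])        # 0 bits
mi_xyz = mutual_information([((x, y), z) for x, y, z in samples])  # 1 bit
```

The joint sources carry one full bit about the output while each source alone carries none; that one bit is purely synergistic, which is exactly the kind of atom these decompositions are built to isolate.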
Reconciling emergences: An information-theoretic approach to identify causal emergence in multivariate data
The broad concept of emergence is instrumental in several of the most challenging open scientific questions, yet few quantitative theories of what constitutes an emergent phenomenon have been proposed. This article introduces a formal theory of causal emergence in multivariate systems, which studies the relationship between the dynamics of a system's parts and macroscopic features of interest. Our theory provides a quantitative definition of downward causation, and introduces a complementary modality of emergent behaviour, which we refer to as causal decoupling. Moreover, the theory yields practical criteria that can be efficiently calculated in large systems, making our framework applicable in a range of scenarios of practical interest. We illustrate our findings in a number of case studies, including Conway's Game of Life, Reynolds' flocking model, and neural activity as measured by electrocorticography.
Data disclosure under perfect sample privacy
Perfect data privacy seems to stand in fundamental opposition to the economic and scientific opportunities associated with extensive data exchange. Defying this intuition, this paper develops a framework that allows the disclosure of collective properties of datasets without compromising the privacy of individual data samples. We present an algorithm to build an optimal disclosure strategy/mapping, and discuss its fundamental limits on finite and asymptotically large datasets. Furthermore, we present explicit expressions for the asymptotic performance of this scheme in some scenarios, and study cases where our approach attains maximal efficiency. Finally, we discuss suboptimal schemes that provide sample privacy guarantees for large datasets at reduced computational cost.