Entropic Dynamics
Entropic Dynamics is a framework in which dynamical laws are derived as an
application of entropic methods of inference. No underlying action principle is
postulated. Instead, the dynamics is driven by entropy subject to the
constraints appropriate to the problem at hand. In this paper we review three
examples of entropic dynamics. First we tackle the simpler case of a standard
diffusion process which allows us to address the central issue of the nature of
time. Then we show that imposing the additional constraint that the dynamics be
non-dissipative leads to Hamiltonian dynamics. Finally, considerations from
information geometry naturally lead to the type of Hamiltonian that describes
quantum theory.
Comment: Invited contribution to the Entropy special volume on Dynamical Equations and Causal Structures from Observation.
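As a minimal illustration of the entropic-inference idea (a sketch of the general method, not the paper's derivation of quantum theory): maximizing Shannon entropy subject to a constraint on the expected value yields the familiar Gibbs/exponential form p_i ∝ exp(-λ x_i), with the Lagrange multiplier λ fixed by the constraint. The helper name `maxent_distribution` is illustrative.

```python
# Sketch of entropic inference: maximize Shannon entropy over p_i subject to
# a fixed mean sum_i p_i * x_i = mu.  The maximizer has the Gibbs form
# p_i ∝ exp(-lam * x_i); we solve for the multiplier lam by bisection.
import math

def maxent_distribution(xs, mu, lo=-50.0, hi=50.0):
    def mean(lam):
        ws = [math.exp(-lam * x) for x in xs]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, xs)) / z
    # mean(lam) is strictly decreasing in lam, so bisection converges.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(mid) > mu:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    ws = [math.exp(-lam * x) for x in xs]
    z = sum(ws)
    return [w / z for w in ws], lam

xs = [0, 1, 2, 3]
p, lam = maxent_distribution(xs, mu=1.0)
print(sum(pi * x for pi, x in zip(p, xs)))  # ≈ 1.0: the constraint is met
```

Different constraints select different maximum-entropy distributions; in entropic dynamics the constraints are what encode the physics of the problem.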
The Inflation Technique for Causal Inference with Latent Variables
The problem of causal inference is to determine if a given probability
distribution on observed variables is compatible with some causal structure.
The difficult case is when the causal structure includes latent variables. We
here introduce the inflation technique for tackling this problem. An
inflation of a causal structure is a new causal structure that can contain
multiple copies of each of the original variables, but where the ancestry of
each copy mirrors that of the original. To every distribution of the observed
variables that is compatible with the original causal structure, we assign a
family of marginal distributions on certain subsets of the copies that are
compatible with the inflated causal structure. It follows that compatibility
constraints for the inflation can be translated into compatibility constraints
for the original causal structure. Even if the constraints at the level of
inflation are weak, such as observable statistical independences implied by
disjoint causal ancestry, the translated constraints can be strong. We apply
this method to derive new inequalities whose violation by a distribution
witnesses that distribution's incompatibility with the causal structure (of
which Bell inequalities and Pearl's instrumental inequality are prominent
examples). We describe an algorithm for deriving all such inequalities for the
original causal structure that follow from ancestral independences in the
inflation. For three observed binary variables with pairwise common causes, it
yields inequalities that are stronger in at least some aspects than those
obtainable by existing methods. We also describe an algorithm that derives a
weaker set of inequalities but is more efficient. Finally, we discuss which
inflations are such that the inequalities one obtains from them remain valid
even for quantum (and post-quantum) generalizations of the notion of a causal
model.
Comment: Minor final corrections, updated to match the published version as closely as possible.
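A toy illustration of the weakest constraint type mentioned above (an assumption-laden sketch, not the paper's algorithm): copies with disjoint causal ancestry in an inflation must be statistically independent, and the basic primitive for testing this is the mutual information I(X;Y) = Σ p(x,y) log[p(x,y)/(p(x)p(y))], which vanishes exactly when the joint factorizes.

```python
# Primitive behind ancestral-independence constraints: mutual information
# between two discrete variables, computed from their joint distribution.
import math

def mutual_information(pxy):
    px = [sum(row) for row in pxy]          # marginal of X (rows)
    py = [sum(col) for col in zip(*pxy)]    # marginal of Y (columns)
    mi = 0.0
    for i, row in enumerate(pxy):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log(p / (px[i] * py[j]))
    return mi

# Copies with disjoint ancestry: the joint factorizes, so I(X;Y) = 0.
independent = [[0.25, 0.25], [0.25, 0.25]]
# Copies sharing an ancestor may be correlated: I(X;Y) > 0.
correlated = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(independent))  # 0.0
print(mutual_information(correlated))   # log 2 ≈ 0.693
```

In the inflation technique, such independences among copies, combined with the requirement that certain marginals match the original distribution, are what get translated into nontrivial constraints on the original causal structure.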
Which causal structures might support a quantum-classical gap?
A causal scenario is a graph that describes the cause and effect
relationships between all relevant variables in an experiment. A scenario is
deemed `not interesting' if there is no device-independent way to distinguish
the predictions of classical physics from any generalised probabilistic theory
(including quantum mechanics). Conversely, an interesting scenario is one in
which there exists a gap between the predictions of different operational
probabilistic theories, as occurs for example in Bell-type experiments. Henson,
Lal and Pusey (HLP) recently proposed a sufficient condition for a causal
scenario to not be interesting. In this paper we supplement their analysis with
some new techniques and results. We first show that existing graphical
techniques due to Evans can be used to confirm by inspection that many graphs
are interesting without having to explicitly search for inequality violations.
For three exceptional cases -- the graphs numbered 15,16,20 in HLP -- we show
that there exist non-Shannon type entropic inequalities that imply these graphs
are interesting. In doing so, we find that existing methods of entropic
inequalities can be greatly enhanced by conditioning on the specific values of
certain variables.
Comment: 13 pages, 9 figures, 1 bicycle. Added an appendix showing that e-separation is strictly more general than the skeleton method. Added journal reference.
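To make the entropic approach concrete (a hedged sketch of the general machinery, not the non-Shannon inequalities used for graphs 15, 16, and 20): entropic techniques reduce to computing Shannon entropies of subsets of variables and testing linear inequalities among them. Below we verify submodularity, H(XZ) + H(YZ) ≥ H(XYZ) + H(Z), equivalently I(X;Y|Z) ≥ 0, which is a Shannon-type inequality holding for every distribution; the non-Shannon-type inequalities invoked in the paper are additional constraints beyond this class.

```python
# Checking a Shannon-type entropic inequality for a random joint
# distribution on three bits.
import itertools
import math
import random

def entropy(joint, subset):
    # Marginal entropy H(subset), in bits, from a dict {(x, y, z): prob}.
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in subset)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

random.seed(0)
raw = [random.random() for _ in range(8)]
total = sum(raw)
joint = {xyz: w / total
         for xyz, w in zip(itertools.product([0, 1], repeat=3), raw)}

X, Y, Z = 0, 1, 2
lhs = entropy(joint, (X, Z)) + entropy(joint, (Y, Z))
rhs = entropy(joint, (X, Y, Z)) + entropy(joint, (Z,))
print(lhs >= rhs - 1e-12)  # True: Shannon-type inequalities hold universally
```

Conditioning on specific values of certain variables, as the paper proposes, amounts to applying the same entropy computations to the conditional distributions, which can tighten the resulting constraints.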
Information Recovery In Behavioral Networks
In the context of agent based modeling and network theory, we focus on the
problem of recovering behavior-related choice information from
origin-destination type data, a topic also known under the name of network
tomography. As a basis for predicting agents' choices we emphasize the
connection between adaptive intelligent behavior, causal entropy maximization
and self-organized behavior in an open dynamic system. We cast this problem in
the form of binary and weighted networks and suggest information theoretic
entropy-driven methods to recover estimates of the unknown behavioral flow
parameters. Our objective is to recover the unknown behavioral values across
the ensemble analytically, without explicitly sampling the configuration space.
In order to do so, we consider the Cressie-Read family of entropic functionals,
enlarging the set of estimators commonly employed to make optimal use of the
available information. More specifically, we explicitly work out two cases of
particular interest: the Shannon functional and the likelihood functional. We then
employ them for the analysis of both univariate and bivariate data sets,
comparing their accuracy in reproducing the observed trends.
Comment: 14 pages, 6 figures, 4 tables.
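A minimal sketch of the Cressie-Read family referenced above (an illustrative parametrisation, not the paper's estimator): CR(γ; p, q) = 1/(γ(γ+1)) Σ p_i [(p_i/q_i)^γ − 1]. The limit γ → 0 recovers the Shannon/Kullback-Leibler functional Σ p_i log(p_i/q_i), and γ → −1 recovers the likelihood functional Σ q_i log(q_i/p_i), which are the two cases the abstract singles out.

```python
# Cressie-Read power-divergence family between distributions p and q.
# gamma -> 0 gives the Shannon/KL functional; gamma -> -1 the likelihood one.
import math

def cressie_read(p, q, gamma):
    if abs(gamma) < 1e-12:        # Shannon / KL limit
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    if abs(gamma + 1.0) < 1e-12:  # likelihood / reverse-KL limit
        return sum(qi * math.log(qi / pi) for pi, qi in zip(p, q) if qi > 0)
    c = 1.0 / (gamma * (gamma + 1.0))
    return c * sum(pi * ((pi / qi) ** gamma - 1.0) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [1 / 3] * 3
for g in (-1.0, -0.5, 0.0, 1.0):
    print(g, cressie_read(p, q, g))
```

Varying γ sweeps through a one-parameter family of estimation criteria, which is what lets the family enlarge the set of estimators beyond the usual maximum-entropy and maximum-likelihood choices.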
Relating the thermodynamic arrow of time to the causal arrow
Consider a Hamiltonian system that consists of a slow subsystem S and a fast
subsystem F. The autonomous dynamics of S is driven by an effective
Hamiltonian, but its thermodynamics is unexpected. We show that a well-defined
thermodynamic arrow of time (second law) emerges for S whenever there is a
well-defined causal arrow from S to F and the back-action is negligible. This
is because the back-action of F on S is described by a non-globally Hamiltonian
Born-Oppenheimer term that violates the Liouville theorem, and makes the second
law inapplicable to S. If S and F are mixing, under the causal arrow condition
they are described by microcanonical distributions P(S) and P(S|F). Their
structure supports a causal inference principle proposed recently in machine
learning.
Comment: 10 pages.