Prediction, Retrodiction, and The Amount of Information Stored in the Present
We introduce an ambidextrous view of stochastic dynamical systems, comparing
their forward-time and reverse-time representations and then integrating them
into a single time-symmetric representation. The perspective is useful
theoretically, computationally, and conceptually. Mathematically, we prove that
the excess entropy--a familiar measure of organization in complex systems--is
the mutual information not only between the past and future, but also between
the predictive and retrodictive causal states. Practically, we exploit the
connection between prediction and retrodiction to directly calculate the excess
entropy. Conceptually, these lead one to discover new system invariants for
stochastic dynamical systems: crypticity (information accessibility) and causal
irreversibility. Ultimately, we introduce a time-symmetric representation that
unifies all these quantities, compressing the two directional representations
into one. The resulting compression offers a new conception of the amount of
information stored in the present.
Comment: 17 pages, 7 figures, 1 table; http://users.cse.ucdavis.edu/~cmg/compmech/pubs/pratisp.ht
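The excess entropy discussed above can be estimated empirically as the mutual information between past and future blocks of a sampled process. A minimal sketch (plug-in estimator on a toy two-state Markov chain; all names are illustrative, not the authors' code):

```python
# Sketch: excess entropy as mutual information I(past; future), estimated
# from one long sample of a binary Markov chain (illustrative, not the
# authors' implementation).
from collections import Counter
from math import log2
import random

def block_entropy(counts, total):
    return -sum(c / total * log2(c / total) for c in counts.values())

def excess_entropy_estimate(seq, k):
    """Plug-in estimate of I(past_k; future_k) from a single sample path."""
    past, future, joint = Counter(), Counter(), Counter()
    n = 0
    for i in range(k, len(seq) - k):
        p = tuple(seq[i - k:i])
        f = tuple(seq[i:i + k])
        past[p] += 1; future[f] += 1; joint[(p, f)] += 1
        n += 1
    return (block_entropy(past, n) + block_entropy(future, n)
            - block_entropy(joint, n))

# Two-state Markov chain with P(flip) = 0.1: strong memory, so past and
# future share a substantial amount of information.
random.seed(0)
seq, x = [], 0
for _ in range(200_000):
    if random.random() < 0.1:
        x = 1 - x
    seq.append(x)

print(excess_entropy_estimate(seq, k=4))  # positive for this persistent chain
```

For a first-order Markov chain the true excess entropy is H(X) minus the entropy rate, here 1 - H(0.1) ≈ 0.53 bits, which the block estimate approaches as k grows.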
Information sharing in Quantum Complex Networks
We introduce the use of entanglement entropy as a tool for studying the
amount of information shared between the nodes of quantum complex networks. By
considering the ground state of a network of coupled quantum harmonic
oscillators, we compute the information that each node has on the rest of the
system. We show that the nodes storing the largest amount of information are
not the ones with the highest connectivity, but those with intermediate
connectivity, thus breaking down the usual hierarchical picture of classical
networks. We show both numerically and analytically that the mutual information
characterizes the network topology. As a byproduct, our results point out that
the amount of information available for an external node connecting to a
quantum network allows one to determine the network topology.
Comment: text and title updated, published version [Phys. Rev. A 87, 052312 (2013)]
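The per-node entropies described above can be computed for a Gaussian ground state from the covariance matrix. A minimal sketch for a ring of coupled oscillators (the ring and all parameter values are illustrative assumptions, not the paper's networks):

```python
# Sketch: single-node entanglement entropy in the ground state of coupled
# harmonic oscillators H = (p.p)/2 + (x.V.x)/2 (units hbar = m = 1).
# The ring topology below is an illustrative stand-in for a complex network.
import numpy as np

def ground_state_covariances(V):
    """Return <x x^T> and <p p^T> for the ground state of H."""
    w2, U = np.linalg.eigh(V)                    # V = U diag(w^2) U^T
    sqrtV = U @ np.diag(np.sqrt(w2)) @ U.T
    inv_sqrtV = U @ np.diag(1 / np.sqrt(w2)) @ U.T
    return inv_sqrtV / 2, sqrtV / 2              # X = V^{-1/2}/2, P = V^{1/2}/2

def node_entropy(X, P, i):
    """von Neumann entropy of node i's reduced one-mode Gaussian state."""
    nu = np.sqrt(X[i, i] * P[i, i])              # symplectic eigenvalue >= 1/2
    if nu <= 0.5:
        return 0.0
    a, b = nu + 0.5, nu - 0.5
    return a * np.log(a) - b * np.log(b)

# Ring of N oscillators, nearest-neighbour coupling strength g.
N, g = 10, 0.4
V = np.eye(N)
for i in range(N):
    V[i, i] += 2 * g
    V[i, (i + 1) % N] -= g
    V[(i + 1) % N, i] -= g

X, P = ground_state_covariances(V)
print([round(node_entropy(X, P, i), 4) for i in range(N)])
```

By symmetry every node of the ring carries the same entropy; on heterogeneous graphs the same routine exposes the connectivity dependence the abstract describes.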
Estimating the Amount of Information Carried by a Neuronal Population
Although all brain functions require coordinated activity of many neurons, it has been difficult to estimate the amount of information carried by a population of spiking neurons. We present here a Fourier-based method for estimating the information delivery rate from a population of neurons, which allows us to measure the redundancy of information within and between functional neuronal classes. We illustrate the use of the method on some artificial spike trains and on simultaneous recordings from a small population of neurons from the lateral geniculate nucleus of an anesthetized macaque monkey.
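A standard Fourier-domain estimator in this spirit is the coherence-based lower bound on information rate (this is the textbook Gaussian-channel bound, not necessarily the authors' exact method; the signal model below is a toy assumption):

```python
# Sketch: coherence-based Fourier estimate of the information rate between a
# stimulus and a noisy response: rate = -integral log2(1 - coherence(f)) df.
# Toy additive-noise model, not neural data.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
fs = 1000.0                                    # sample rate, Hz
t = np.arange(0, 20.0, 1 / fs)
stimulus = rng.standard_normal(t.size)
response = stimulus + 2.0 * rng.standard_normal(t.size)  # noisy copy

f, C = coherence(stimulus, response, fs=fs, nperseg=1024)
df = f[1] - f[0]
info_rate = -np.sum(np.log2(1 - np.clip(C, 0.0, 0.999999))) * df  # bits/s
print(f"{info_rate:.1f} bits/s")
```

With signal power 1 and noise power 4 the true per-band coherence is 0.2, giving roughly log2(1.25) ≈ 0.32 bits/s per Hz of bandwidth.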
Amount of Information Needed for Model Choice in Approximate Bayesian Computation
Approximate Bayesian Computation (ABC) has become a popular technique in evolutionary genetics for elucidating population structure and history due to its flexibility. The statistical inference framework has benefited from significant progress in recent years. In population genetics, however, its outcome depends heavily on the amount of information in the dataset, whether that be the level of genetic variation or the number of samples and loci. Here we look at the power to reject a simple constant population size coalescent model in favor of a bottleneck model in datasets of varying quality. Not only is this power dependent on the number of samples and loci, but it also depends strongly on the level of nucleotide diversity in the observed dataset. Whilst overall model choice in an ABC setting is fairly powerful and quite conservative with regard to false positives, detecting weaker bottlenecks is problematic in smaller or less genetically diverse datasets and limits the inferences possible in non-model organisms, where the amount of information regarding the two models is often limited. Our results show it is important to consider these limitations when performing an ABC analysis, and that studies should perform simulations based on the size and nature of the dataset in order to fully assess the power of the study.
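The ABC model-choice procedure being assessed reduces to rejection sampling: simulate under each model, accept simulations whose summary statistic lands near the observed one, and read posterior model probabilities off the acceptance fractions. A minimal sketch (the simulators and summary below are toy stand-ins, not coalescent simulators):

```python
# Sketch: ABC rejection-based model choice between a "constant" model and a
# lower-diversity "bottleneck" model, using one summary statistic.
# Toy exponential simulators stand in for coalescent simulation.
import random

random.seed(42)

def simulate(model, n=50):
    """Toy per-site diversity draws; the bottleneck model is depressed."""
    scale = 1.0 if model == "constant" else 0.5
    return [random.expovariate(1 / scale) for _ in range(n)]

def summary(data):
    return sum(data) / len(data)               # mean diversity

def abc_model_choice(observed, n_sims=20_000, eps=0.05):
    """Posterior model probabilities ~ acceptance fractions (uniform prior)."""
    s_obs = summary(observed)
    accepted = {"constant": 0, "bottleneck": 0}
    for _ in range(n_sims):
        m = random.choice(["constant", "bottleneck"])
        if abs(summary(simulate(m)) - s_obs) < eps:
            accepted[m] += 1
    total = sum(accepted.values()) or 1
    return {m: k / total for m, k in accepted.items()}

observed = simulate("bottleneck")              # pretend this is the real data
posterior = abc_model_choice(observed)
print(posterior)                               # bottleneck should dominate
```

Shrinking `n` in `simulate` widens the sampling noise of the summary, which is exactly how smaller or less diverse datasets erode the power the abstract discusses.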
Optimizing a Decision-Maker's Preferences with a Minimum Amount of Information
The aim here is to obtain the minimum amount of information required from a decision maker in order to find his best alternative in a given problem.
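A classical baseline for this question: identifying the best of n alternatives under an unknown total order requires exactly n - 1 pairwise queries. A sketch of that baseline (illustrative only, not the paper's elicitation scheme):

```python
# Sketch: finding a decision maker's best alternative with n - 1 pairwise
# queries, the classical comparison lower bound (not the paper's method).
def find_best(alternatives, prefers):
    """`prefers(a, b)` asks the decision maker whether a beats b."""
    best = alternatives[0]
    for a in alternatives[1:]:
        if prefers(a, best):
            best = a
    return best

queries = 0
def prefers(a, b):
    global queries
    queries += 1
    return a > b            # stand-in for the decision maker's judgement

items = [3, 1, 4, 1, 5, 9, 2, 6]
assert find_best(items, prefers) == 9
print(queries)              # 7 queries = n - 1
```

Methods that beat this baseline must exploit extra structure, e.g. a known utility model over the alternatives.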
Global and local complexity in weakly chaotic dynamical systems
In a topological dynamical system the complexity of an orbit is a measure of
the amount of information (algorithmic information content) that is necessary
to describe the orbit. This indicator is invariant up to topological
conjugation. We consider this indicator of local complexity of the dynamics and
provide different examples of its behavior, showing how it can be useful to
characterize various kinds of weakly chaotic dynamics. We also provide criteria
to find systems with nontrivial orbit complexity (systems where the
description of the whole orbit requires an infinite amount of information). We
also consider a global indicator of the complexity of the system. This global
indicator generalizes the topological entropy, taking into account systems where
the number of essentially different orbits increases less than exponentially.
We then prove that if the system is constructive (roughly speaking: if the map
can be defined up to any given accuracy using a finite amount of information),
the orbit complexity is everywhere less than or equal to the generalized
topological entropy. Conversely, there are compact non-constructive examples
where the inequality is reversed, suggesting that this notion arises
naturally in this kind of complexity question.
Comment: 23 pages
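Orbit complexity in the algorithmic-information sense is uncomputable, but compressed length gives a practical upper-bound proxy. A sketch contrasting a chaotic and a periodic orbit of the logistic map (zlib as a crude stand-in for algorithmic information content; parameters are illustrative):

```python
# Sketch: estimating orbit complexity via compressed length of a symbolic
# orbit. Logistic map x -> r x (1 - x); symbol 0/1 for x < 0.5 / x >= 0.5.
import zlib

def symbolic_orbit(r, x0=0.3, n=50_000):
    x, bits = x0, bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bytes(bits)

def compressed_len(r):
    return len(zlib.compress(symbolic_orbit(r), 9))

chaotic = compressed_len(4.0)    # positive entropy: orbit barely compresses
periodic = compressed_len(3.2)   # periodic regime: orbit compresses to almost nothing
print(chaotic, periodic)
```

The gap between the two compressed lengths mirrors the distinction the abstract draws: orbits needing an infinite amount of information per unit length versus orbits whose full description is finite.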
Computational capacity of the universe
Merely by existing, all physical systems register information. And by
evolving dynamically in time, they transform and process that information. The
laws of physics determine the amount of information that a physical system can
register (number of bits) and the number of elementary logic operations that a
system can perform (number of ops). The universe is a physical system. This
paper quantifies the amount of information that the universe can register and
the number of elementary operations that it can have performed over its
history. The universe can have performed no more than 10^120 ops on 10^90 bits.
Comment: 17 pages, TeX; submitted to Nature
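The op-count side of this bound can be reproduced to order of magnitude with the Margolus-Levitin limit, 2Et/(πℏ) operations from energy E over time t. A back-of-envelope sketch (all constants rounded, horizon taken crudely as ct; the paper's own accounting differs in detail):

```python
# Back-of-envelope sketch of the op-count bound: Margolus-Levitin limit
# 2 E t / (pi * hbar) applied to the energy within a crude horizon ct.
# Rounded constants; order-of-magnitude only.
import math

hbar = 1.05e-34          # J s
c = 3.0e8                # m/s
t = 4.3e17               # age of the universe, s
rho = 9.5e-27            # critical density, kg/m^3

R = c * t                                    # crude horizon radius
E = rho * c**2 * (4 / 3) * math.pi * R**3    # energy within the horizon
ops = 2 * E * t / (math.pi * hbar)           # Margolus-Levitin op count
print(f"ops ~ 10^{math.log10(ops):.0f}")
```

The result lands within an order of magnitude or two of the paper's figure; the residual gap comes from the crude horizon and density choices above.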
Estimating the Amount of Information Conveyed by a Population of Neurons
Recent technological advances have made the simultaneous recording of the activity of many neurons common. However, estimating the amount of information conveyed by the discharge of a neural population remains a significant challenge. Here we describe our recently published analysis method that assists in such estimates. We describe the key concepts and assumptions on which the method is based, illustrate its use with data from both simulated and real neurons recorded from the lateral geniculate nucleus of a monkey, and show how it can be used to calculate redundancy and synergy among neuronal groups.