Fast and exact search for the partition with minimal information loss
In analysis of multi-component complex systems, such as neural systems,
identifying groups of units that share similar functionality will aid
understanding of the underlying structures of the system. To find such a
grouping, it is useful to evaluate to what extent the units of the system are
separable. Separability or inseparability can be evaluated by quantifying how
much information would be lost if the system were partitioned into subsystems,
and the interactions between the subsystems were hypothetically removed. A
system of two independent subsystems is completely separable without any loss
of information, while a system of strongly interacting subsystems cannot be
separated without a large loss of information. Among all the possible
partitions of a system, the partition that minimizes the loss of information,
called the Minimum Information Partition (MIP), can be considered as the
optimal partition for characterizing the underlying structures of the system.
Although the MIP would reveal novel characteristics of the neural system, an
exhaustive search for the MIP is numerically intractable due to the
combinatorial explosion of possible partitions. Here, we propose a
computationally efficient search to precisely identify the MIP among all
possible partitions by exploiting the submodularity of the measure of
information loss. Mutual information is one such submodular information loss
function, and is a natural choice for measuring the degree of statistical
dependence between paired sets of random variables. By using mutual information
as a loss function, we show that the search for the MIP can be performed in a
practical order of computational time for a reasonably large system. We also
demonstrate that MIP search allows for the detection of underlying global
structures in a network of nonlinear oscillators.
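As a rough illustration of the quantity being minimized (assumed notation and a hypothetical Gaussian toy system, not the paper's implementation), the sketch below takes the loss of a bipartition to be the mutual information between the two parts and finds the MIP by exhaustive enumeration. The enumeration makes the combinatorial cost explicit; the point of the paper is that, because this loss function is submodular, the same minimum can be found exactly in polynomial time (for example with Queyranne's algorithm for symmetric submodular function minimization) instead of by enumeration.

from itertools import combinations
import numpy as np

def gaussian_mi(cov, part):
    # Mutual information I(X_S; X_{S^c}) between one part and its complement
    # for a zero-mean Gaussian with covariance matrix cov.
    n = cov.shape[0]
    comp = [i for i in range(n) if i not in part]
    det = np.linalg.det
    return 0.5 * np.log(det(cov[np.ix_(part, part)]) * det(cov[np.ix_(comp, comp)]) / det(cov))

def brute_force_mip(cov):
    # Exhaustive search over bipartitions; cost grows exponentially with system size.
    n = cov.shape[0]
    best = None
    for k in range(1, n // 2 + 1):
        for part in combinations(range(n), k):
            loss = gaussian_mi(cov, list(part))
            if best is None or loss < best[0]:
                best = (loss, set(part))
    return best

# Toy system: two strongly coupled pairs that are only weakly coupled to each other.
cov = np.array([[1.00, 0.80, 0.05, 0.00],
                [0.80, 1.00, 0.00, 0.05],
                [0.05, 0.00, 1.00, 0.80],
                [0.00, 0.05, 0.80, 1.00]])
print(brute_force_mip(cov))  # expected MIP: {0, 1} versus {2, 3}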
Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory
The ability to integrate information in the brain is considered to be an
essential property for cognition and consciousness. Integrated Information
Theory (IIT) hypothesizes that the amount of integrated information (Φ) in
the brain is related to the level of consciousness. IIT proposes that to
quantify information integration in a system as a whole, integrated information
should be measured across the partition of the system at which information loss
caused by partitioning is minimized, called the Minimum Information Partition
(MIP). The computational cost for exhaustively searching for the MIP grows
exponentially with system size, making it difficult to apply IIT to real neural
data. It has been previously shown that if a measure of Φ satisfies a
mathematical property, submodularity, the MIP can be found in a polynomial
order by an optimization algorithm. However, although the first version of Φ
is submodular, the later versions are not. In this study, we empirically
explore to what extent the algorithm can be applied to the non-submodular
measures of Φ by evaluating the accuracy of the algorithm in simulated
data and real neural data. We find that the algorithm identifies the MIP in a
nearly perfect manner even for the non-submodular measures. Our results show
that the algorithm allows us to measure Φ in large systems within a
practical amount of time.
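For reference, the submodularity condition referred to above can be written as follows (notation assumed here): a set function f over subsets of the system's units is submodular if, for all subsets S and T,

\[
  f(S) + f(T) \;\ge\; f(S \cup T) + f(S \cap T).
\]

When f is in addition symmetric, f(S) = f(S^c), as the mutual-information loss is, its minimum over nontrivial subsets can be found with O(n^3) evaluations of f (for example with Queyranne's algorithm), which is what makes a polynomial-time MIP search possible. The later versions of Φ studied here violate submodularity, so the algorithm is no longer guaranteed to be exact and its accuracy has to be checked empirically, as done in this work.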
Mean Field Analysis of Stochastic Neural Network Models with Synaptic Depression
We investigated the effects of synaptic depression on the macroscopic
behavior of stochastic neural networks. Dynamical mean field equations were
derived for such networks by taking the average of two stochastic variables: a
firing state variable and a synaptic variable. In these equations, their
average product is decoupled as the product of their averages because the two
stochastic variables are independent. We proved the independence of these two
stochastic variables assuming that the synaptic weight is of the order of 1/N
with respect to the number of neurons N. Using these equations, we derived
macroscopic steady state equations for a network with uniform connections and a
ring attractor network with Mexican hat type connectivity and investigated the
stability of the steady state solutions. An oscillatory uniform state was
observed in the network with uniform connections due to a Hopf instability.
With the ring network, high-frequency perturbations were shown not to affect
system stability. Two mechanisms destabilize the inhomogeneous steady state,
leading to two oscillatory states. A Turing instability leads to a rotating bump
state, while a Hopf instability leads to an oscillatory bump state, which was
previously unreported. Various oscillatory states take place in a network with
synaptic depression depending on the strength of the interneuron connections.
Comment: 26 pages, 13 figures. Preliminary results for the present work have
been published elsewhere (Y. Igarashi et al., 2009,
http://www.iop.org/EJ/abstract/1742-6596/197/1/012018).
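The decoupling step mentioned above can be written schematically as follows (notation assumed here): with a binary firing-state variable s_i and a synaptic-depression variable x_i for neuron i, the mean field equations rely on

\[
  \langle s_i \, x_i \rangle \;\approx\; \langle s_i \rangle \, \langle x_i \rangle ,
\]

which is exact when the two stochastic variables are independent. The abstract's claim is that this independence can be proved when the synaptic weights scale as O(1/N) in the number of neurons N, so that the macroscopic dynamics close in the averaged variables alone.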
A unified framework for information integration based on information geometry
We propose a unified theoretical framework for quantifying spatio-temporal
interactions in a stochastic dynamical system based on information geometry. In
the proposed framework, the degree of interactions is quantified by the
divergence between the actual probability distribution of the system and a
constrained probability distribution where the interactions of interest are
disconnected. This framework provides novel geometric interpretations of
various information theoretic measures of interactions, such as mutual
information, transfer entropy, and stochastic interaction, in terms of how
interactions are disconnected. The framework therefore provides an intuitive
understanding of the relationships between the various quantities. By extending
the concept of transfer entropy, we propose a novel measure of integrated
information which measures causal interactions between parts of a system.
Integrated information quantifies the extent to which the whole is more than
the sum of the parts and can be potentially used as a biological measure of the
levels of consciousness.
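A schematic form of the framework (notation assumed here): the degree of interaction is quantified by projecting the actual distribution p onto a manifold M of constrained distributions in which the interactions of interest are disconnected,

\[
  \mathcal{I} \;=\; \min_{q \in \mathcal{M}} D_{\mathrm{KL}}\!\left[\, p \,\|\, q \,\right].
\]

For example, if M is the set of product distributions q(x, y) = q(x) q(y), the minimized divergence equals the mutual information I(X; Y); other choices of the constraint set correspond to transfer entropy, stochastic interaction, and the proposed measure of integrated information.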
Unified framework for the entropy production and the stochastic interaction based on information geometry
We show a relationship between the entropy production in stochastic
thermodynamics and the stochastic interaction in integrated information
theory. To clarify this relationship, we introduce a new information
geometric interpretation of the entropy production for a total system and the
partial entropy productions for subsystems. We show that the violation of the
additivity of the entropy productions is related to the stochastic interaction.
This framework provides a thermodynamic foundation for integrated information
theory. We also show that our information geometric formalism leads to a novel
expression of the entropy production related to an optimization problem
minimizing the Kullback-Leibler divergence. We analytically illustrate this
interpretation by using a spin model.
Comment: 13 pages, 4 figures
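One standard identity that situates this result (a background fact, possibly differing in detail from the paper's formulation): for a stationary Markov process, the entropy production per step can be written as a Kullback-Leibler divergence between the two-time joint distribution and its time reversal,

\[
  \sigma \;=\; D_{\mathrm{KL}}\!\big[\, p(x_t = a,\, x_{t+1} = b) \,\big\|\, p(x_t = b,\, x_{t+1} = a) \,\big].
\]

Writing σ^{(k)} for the partial entropy production of subsystem k, the abstract's statement is that the additivity violation, σ − Σ_k σ^{(k)}, is what connects this thermodynamic picture to the stochastic interaction of integrated information theory.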
Measuring integrated information from the decoding perspective
Accumulating evidence indicates that the capacity to integrate information in
the brain is a prerequisite for consciousness. Integrated Information Theory
(IIT) of consciousness provides a mathematical approach to quantifying the
information integrated in a system, called integrated information, Φ.
Integrated information is defined theoretically as the amount of information a
system generates as a whole, above and beyond the sum of the amount of
information its parts independently generate. IIT predicts that the amount of
integrated information in the brain should reflect levels of consciousness.
Empirical evaluation of this theory requires computing integrated information
from neural data acquired from experiments, although difficulties with using
the original measure preclude such computations. Although some
practical measures have been previously proposed, we found that these measures
fail to satisfy the theoretical requirements as a measure of integrated
information. Measures of integrated information should satisfy the lower and
upper bounds as follows: The lower bound of integrated information should be 0
when the system does not generate information (no information) or when the
system comprises independent parts (no integration). The upper bound of
integrated information is the amount of information generated by the whole
system and is realized when the amount of information generated independently
by its parts equals 0. Here, we derive a novel practical measure, Φ*,
by introducing a concept of mismatched decoding developed in information
theory. We show that Φ* is properly bounded from below and above, as
required, as a measure of integrated information. We derive the analytical
expression of Φ* under the Gaussian assumption, which makes it readily
applicable to experimental data.
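The required bounds can be stated compactly (notation assumed here, with the information generated by the whole system written as the mutual information between its past and present states):

\[
  0 \;\le\; \Phi \;\le\; I\big(X_{t-\tau};\, X_t\big),
\]

where Φ = 0 when the system generates no information or consists of independent parts, and Φ reaches the upper bound when the parts generate no information independently. The proposed measure Φ* is shown to respect both bounds, unlike the earlier practical measures.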