Fast and exact search for the partition with minimal information loss
In analysis of multi-component complex systems, such as neural systems,
identifying groups of units that share similar functionality will aid
understanding of the underlying structures of the system. To find such a
grouping, it is useful to evaluate to what extent the units of the system are
separable. Separability or inseparability can be evaluated by quantifying how
much information would be lost if the system were partitioned into subsystems,
and the interactions between the subsystems were hypothetically removed. A
system of two independent subsystems is completely separable without any loss
of information, while a system of strongly interacting subsystems cannot be
separated without a large loss of information. Among all the possible
partitions of a system, the partition that minimizes the loss of information,
called the Minimum Information Partition (MIP), can be considered as the
optimal partition for characterizing the underlying structures of the system.
Although the MIP would reveal novel characteristics of the neural system, an
exhaustive search for the MIP is numerically intractable due to the
combinatorial explosion of possible partitions. Here, we propose a
computationally efficient search to precisely identify the MIP among all
possible partitions by exploiting the submodularity of the measure of
information loss. Mutual information is one such submodular information loss
function, and is a natural choice for measuring the degree of statistical
dependence between paired sets of random variables. By using mutual information
as a loss function, we show that the search for MIP can be performed in a
practical order of computational time for a reasonably large system. We also
demonstrate that MIP search allows for the detection of underlying global
structures in a network of nonlinear oscillators
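To make the search problem concrete, the following sketch performs the naive exhaustive bipartition search that the abstract describes as intractable, using mutual information as the loss function for a small zero-mean Gaussian system. This is an illustration only, not the paper's efficient algorithm; the covariance matrix and function names are assumptions for the example.

```python
import itertools
import numpy as np

def gaussian_mi(cov, part):
    """Mutual information (in nats) between the variables indexed by `part`
    and the remaining variables, for a zero-mean Gaussian with covariance `cov`."""
    n = cov.shape[0]
    a = list(part)
    b = [i for i in range(n) if i not in a]
    det = np.linalg.det
    return 0.5 * np.log(det(cov[np.ix_(a, a)]) * det(cov[np.ix_(b, b)]) / det(cov))

def brute_force_mip(cov):
    """Exhaustive bipartition search: exponentially many evaluations,
    so feasible only for tiny systems."""
    n = cov.shape[0]
    best_loss, best_part = np.inf, None
    for k in range(1, n // 2 + 1):
        for part in itertools.combinations(range(n), k):
            loss = gaussian_mi(cov, part)
            if loss < best_loss:
                best_loss, best_part = loss, set(part)
    return best_part, best_loss

# Two independent, internally correlated pairs: the MIP should cut
# between the pairs with (numerically) zero information loss.
cov = np.array([[1.0, 0.8, 0.0, 0.0],
                [0.8, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.8],
                [0.0, 0.0, 0.8, 1.0]])
part, loss = brute_force_mip(cov)
```

The exhaustive loop evaluates every bipartition, which is exactly the combinatorial explosion the submodularity-based search avoids.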
Laser Sintering Fabrication of Highly Porous Models Utilizing Water Leachable Filler-Experimental Investigation into Process Parameters
The authors are developing a laser sintering process to fabricate highly porous
models with porosities as high as 90% and more. In the process, a water-soluble filler is
mixed with the designated plastic powder and leached out after the laser sintering process
is finished, generating pores where the filler grains used to exist. Previously, the authors
reported the successful application of this technology to a tissue engineering scaffold.
However, the relationship between the process parameters and the obtained results has not
been clarified. This paper reports an experimental investigation into the effects of
process parameters, such as the mixture and the grain size of the filler, on the resultant
porosity, pore size, and process resolution
Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory
The ability to integrate information in the brain is considered to be an
essential property for cognition and consciousness. Integrated Information
Theory (IIT) hypothesizes that the amount of integrated information (Φ) in
the brain is related to the level of consciousness. IIT proposes that to
quantify information integration in a system as a whole, integrated information
should be measured across the partition of the system at which information loss
caused by partitioning is minimized, called the Minimum Information Partition
(MIP). The computational cost for exhaustively searching for the MIP grows
exponentially with system size, making it difficult to apply IIT to real neural
data. It has been previously shown that if a measure of Φ satisfies a
mathematical property, submodularity, the MIP can be found in a polynomial
order by an optimization algorithm. However, although the first version of Φ
is submodular, the later versions are not. In this study, we empirically
explore to what extent the algorithm can be applied to the non-submodular
measures of Φ by evaluating the accuracy of the algorithm in simulated
data and real neural data. We find that the algorithm identifies the MIP in a
nearly perfect manner even for the non-submodular measures. Our results show
that the algorithm allows us to measure Φ in large systems within a
practical amount of time
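The polynomial-order search for a symmetric submodular loss that the abstract refers to is, in this literature, Queyranne's minimization algorithm. A compact sketch follows, again using Gaussian mutual information as the loss; the function names and the example covariance are illustrative assumptions, not the paper's code.

```python
import numpy as np

def gaussian_mi(cov, part):
    """Symmetric submodular loss: mutual information (nats) between the
    variables in `part` and the rest of a zero-mean Gaussian system."""
    a = list(part)
    b = [i for i in range(cov.shape[0]) if i not in a]
    det = np.linalg.det
    return 0.5 * np.log(det(cov[np.ix_(a, a)]) * det(cov[np.ix_(b, b)]) / det(cov))

def queyranne_mip(f, n):
    """Queyranne's algorithm: exact minimizer of a symmetric submodular set
    function f over nontrivial bipartitions of {0..n-1}, using O(n^3)
    evaluations of f instead of an exponential exhaustive search."""
    V = [(i,) for i in range(n)]          # "super-elements": tuples of merged atoms
    best_val, best_set = float("inf"), None
    while len(V) > 1:
        # Maximum-adjacency-style ordering: the element minimizing
        # f(W + u) - f(u) is the one most tightly connected to W.
        order, rest = [V[0]], V[1:]
        while rest:
            W = tuple(a for grp in order for a in grp)
            u = min(rest, key=lambda x: f(W + x) - f(x))
            order.append(u)
            rest.remove(u)
        s, t = order[-2], order[-1]       # pendant pair
        if f(t) < best_val:               # candidate cut: ({t}, everything else)
            best_val, best_set = f(t), set(t)
        V = order[:-2] + [s + t]          # merge the pendant pair and repeat
    return best_set, best_val

# Two independent correlated pairs: the MIP cuts between them at ~zero loss.
cov = np.array([[1.0, 0.8, 0.0, 0.0],
                [0.8, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.8],
                [0.0, 0.0, 0.8, 1.0]])
part, loss = queyranne_mip(lambda p: gaussian_mi(cov, p), 4)
```

When f is genuinely submodular (as mutual information is), the returned bipartition is exact; the abstract's empirical question is how well this same procedure behaves when the later, non-submodular versions of Φ are substituted for f.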
Geometry of Information Integration
Information geometry is used to quantify the amount of information
integration within multiple terminals of a causal dynamical system. Integrated
information quantifies how much information is lost when a system is split into
parts and information transmission between the parts is removed. Multiple
measures of integrated information have been proposed. Here, we
analyze four of the previously proposed measures and elucidate their relations
from a viewpoint of information geometry. Two of them use dually flat manifolds
and the other two use curved manifolds to define a split model. We show that
there are hierarchical structures among the measures. We provide explicit
expressions of these measures
Mean Field Analysis of Stochastic Neural Network Models with Synaptic Depression
We investigated the effects of synaptic depression on the macroscopic
behavior of stochastic neural networks. Dynamical mean field equations were
derived for such networks by taking the average of two stochastic variables: a
firing state variable and a synaptic variable. In these equations, the
average of their product decouples into the product of their averages because the two
stochastic variables are independent. We proved the independence of these two
stochastic variables assuming that the synaptic weight is of the order of 1/N
with respect to the number of neurons N. Using these equations, we derived
macroscopic steady state equations for a network with uniform connections and a
ring attractor network with Mexican hat type connectivity and investigated the
stability of the steady state solutions. An oscillatory uniform state was
observed in the network with uniform connections due to a Hopf instability.
With the ring network, high-frequency perturbations were shown not to affect
system stability. Two mechanisms destabilize the inhomogeneous steady state,
leading to two oscillatory states. A Turing instability leads to a rotating bump
state, while a Hopf instability leads to an oscillatory bump state, which was
previously unreported. Various oscillatory states take place in a network with
synaptic depression depending on the strength of the interneuron connections.
Comment: 26 pages, 13 figures. Preliminary results for the present work have
been published elsewhere (Y. Igarashi et al., 2009,
http://www.iop.org/EJ/abstract/1742-6596/197/1/012018)
A unified framework for information integration based on information geometry
We propose a unified theoretical framework for quantifying spatio-temporal
interactions in a stochastic dynamical system based on information geometry. In
the proposed framework, the degree of interactions is quantified by the
divergence between the actual probability distribution of the system and a
constrained probability distribution where the interactions of interest are
disconnected. This framework provides novel geometric interpretations of
various information theoretic measures of interactions, such as mutual
information, transfer entropy, and stochastic interaction in terms of how
interactions are disconnected. The framework therefore provides an intuitive
understanding of the relationships between the various quantities. By extending
the concept of transfer entropy, we propose a novel measure of integrated
information which measures causal interactions between parts of a system.
Integrated information quantifies the extent to which the whole is more than
the sum of the parts and can be potentially used as a biological measure of the
levels of consciousness
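A worked instance of the framework's central idea, using standard Gaussian identities (the example is illustrative; the framework in the abstract is more general): when the constrained "disconnected" model is the set of product distributions over two parts, the minimizing divergence is attained at the product of the marginals, and the KL divergence from the actual distribution to that split model equals the mutual information between the parts.

```python
import numpy as np

# Actual distribution: zero-mean Gaussian over two 1-D parts with correlation.
cov = np.array([[1.0, 0.6],
                [0.6, 1.0]])

# Split model: interactions between the parts disconnected, i.e. the product
# of the marginals, whose covariance keeps only the diagonal blocks.
cov_split = np.diag(np.diag(cov))

def gaussian_kl(c0, c1):
    """KL divergence D(N(0, c0) || N(0, c1)) in nats."""
    n = c0.shape[0]
    inv1 = np.linalg.inv(c1)
    return 0.5 * (np.trace(inv1 @ c0) - n
                  + np.log(np.linalg.det(c1) / np.linalg.det(c0)))

# Mutual information between the two parts from the determinant formula.
mi = 0.5 * np.log(cov[0, 0] * cov[1, 1] / np.linalg.det(cov))

# The divergence to the split model coincides with the mutual information.
kl = gaussian_kl(cov, cov_split)
```

The same construction with different constraints on the split model recovers the other quantities the abstract mentions, such as transfer entropy and stochastic interaction.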