What Is a Macrostate? Subjective Observations and Objective Dynamics
We consider the question of whether thermodynamic macrostates are objective
consequences of dynamics, or subjective reflections of our ignorance of a
physical system. We argue that they are both; more specifically, that the set
of macrostates forms the unique maximal partition of phase space which 1) is
consistent with our observations (a subjective fact about our ability to
observe the system) and 2) obeys a Markov process (an objective fact about the
system's dynamics). We review the ideas of computational mechanics, an
information-theoretic method for finding optimal causal models of stochastic
processes, and argue that macrostates coincide with the "causal states" of
computational mechanics. Defining a set of macrostates thus consists of an
inductive process where we start with a given set of observables, and then
refine our partition of phase space until we reach a set of states which
predict their own future, i.e. which are Markovian. Macrostates arrived at in
this way are provably optimal statistical predictors of the future values of
our observables.

Comment: 15 pages, no figures
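The inductive refinement this abstract describes can be illustrated with a small sketch (function names are ours; real causal-state reconstruction algorithms use statistical tests, whereas this toy version merges histories only when their empirical next-symbol distributions coincide exactly):

```python
from collections import defaultdict

def causal_states(sequence, history_len=1):
    """Group histories into candidate causal states: histories with the
    same empirical next-symbol distribution are merged into one state."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(history_len, len(sequence)):
        hist = tuple(sequence[i - history_len:i])
        counts[hist][sequence[i]] += 1
    # Normalize counts into conditional distributions over the next symbol.
    dists = {}
    for hist, nxt in counts.items():
        total = sum(nxt.values())
        dists[hist] = tuple(sorted((sym, c / total) for sym, c in nxt.items()))
    # Merge histories whose predictive distributions coincide.
    states = defaultdict(list)
    for hist, d in dists.items():
        states[d].append(hist)
    return list(states.values())

# A period-2 sequence: seeing a 0 predicts a 1 and vice versa, so the
# two length-1 histories remain distinct causal states.
print(causal_states([0, 1] * 50, history_len=1))
```

The resulting partition is Markovian in the sense of the abstract: each state's next-symbol distribution depends only on the state, not on the finer history within it.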
Ensemble Inhibition and Excitation in the Human Cortex: an Ising Model Analysis with Uncertainties
The pairwise maximum entropy model, also known as the Ising model, has been
widely used to analyze the collective activity of neurons. However, controversy
persists in the literature about seemingly inconsistent findings, whose
significance is unclear due to a lack of reliable error estimates. We therefore
develop a method for accurately estimating parameter uncertainty based on
random walks in parameter space using adaptive Markov Chain Monte Carlo after
the convergence of the main optimization algorithm. We apply our method to the
spiking patterns of excitatory and inhibitory neurons recorded with
multielectrode arrays in the human temporal cortex during the wake-sleep cycle.
Our analysis shows that the Ising model captures neuronal collective behavior
much better than the independent model during wakefulness, light sleep, and
deep sleep when both excitatory (E) and inhibitory (I) neurons are modeled;
ignoring the inhibitory effects of I-neurons dramatically overestimates
synchrony among E-neurons. Furthermore, information-theoretic measures reveal
that the Ising model explains about 80%-95% of the correlations, depending on
sleep state and neuron type. Thermodynamic measures show signatures of
criticality, although we take this with a grain of salt as it may be merely a
reflection of long-range neural correlations.

Comment: 17 pages, 8 figures
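A toy sketch of the pairwise maximum-entropy model and a plain Metropolis random walk in parameter space (a simplification of the adaptive MCMC the abstract describes; the exact partition function is only tractable for a handful of neurons, and all function names are ours):

```python
import itertools
import math
import random

def ising_logprob(s, h, J):
    """Unnormalized log-probability of a spin pattern s (entries +/-1)
    under the pairwise maximum-entropy (Ising) model."""
    n = len(s)
    e = sum(h[i] * s[i] for i in range(n))
    e += sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
    return e

def log_likelihood(data, h, J):
    """Exact log-likelihood; the partition function enumerates all 2^n
    patterns, so this only works for small n."""
    n = len(h)
    logZ = math.log(sum(math.exp(ising_logprob(s, h, J))
                        for s in itertools.product([-1, 1], repeat=n)))
    return sum(ising_logprob(s, h, J) - logZ for s in data)

def metropolis_walk(data, h, J, steps=1000, scale=0.05):
    """Plain Metropolis random walk over the parameters (h, J); the
    spread of the retained samples gives an error estimate for each
    field h_i and coupling J_ij."""
    n = len(h)
    ll = log_likelihood(data, h, J)
    samples = []
    for _ in range(steps):
        # Gaussian proposal on the fields and the upper-triangular couplings.
        h2 = [x + random.gauss(0, scale) for x in h]
        J2 = [[J[i][j] + (random.gauss(0, scale) if j > i else 0.0)
               for j in range(n)] for i in range(n)]
        ll2 = log_likelihood(data, h2, J2)
        if math.log(random.random()) < ll2 - ll:  # accept/reject step
            h, J, ll = h2, J2, ll2
        samples.append((h, J))
    return samples
```

Starting the walk at the optimum found by the main fitting algorithm, as the abstract describes, lets the sample spread be read as a parameter uncertainty.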
k-MLE: A fast algorithm for learning statistical mixture models
We describe k-MLE, a fast and efficient local search algorithm for learning
finite statistical mixtures of exponential families such as Gaussian mixture
models. Mixture models are traditionally learned using the
expectation-maximization (EM) soft clustering technique that monotonically
increases the incomplete (expected complete) likelihood. Given prescribed
mixture weights, the hard clustering k-MLE algorithm iteratively assigns data
to the most likely weighted component and updates the component models using
Maximum Likelihood Estimators (MLEs). Using the duality between exponential
families and Bregman divergences, we prove that the local convergence of the
complete likelihood of k-MLE follows directly from the convergence of a dual
additively weighted Bregman hard clustering. The inner loop of k-MLE can be
implemented using any k-means heuristic, such as the celebrated Lloyd's batched
or Hartigan's greedy swap updates. We then show how to update the mixture
weights by minimizing a cross-entropy criterion, which amounts to setting each
weight to the relative proportion of points in its cluster; the component and
weight updates are then alternated until convergence. Hard EM
is interpreted as a special case of k-MLE when both the component update and
the weight update are performed successively in the inner loop. To initialize
k-MLE, we propose k-MLE++, a careful initialization of k-MLE that
probabilistically guarantees a global bound on the best possible complete
likelihood.

Comment: 31 pages. Extends a preliminary paper presented at IEEE ICASSP 201
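The alternating component and weight updates described above can be sketched for the simplest case of a 1-D Gaussian mixture (function names are ours; the paper's algorithm handles general exponential families via Bregman divergences, and this sketch uses a fixed initialization rather than the proposed careful one):

```python
import math
import random

def k_mle(points, k, init_mus=None, iters=50, seed=0):
    """Hard-clustering k-MLE sketch for a 1-D Gaussian mixture: assign
    each point to its most likely weighted component, refit each
    component by maximum likelihood on its cluster, and set the weights
    to the cluster proportions."""
    mus = list(init_mus) if init_mus else random.Random(seed).sample(points, k)
    sigmas = [1.0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in points:
            # Log of the weighted Gaussian density, up to a shared constant.
            scores = [math.log(weights[j]) - math.log(sigmas[j])
                      - 0.5 * ((x - mus[j]) / sigmas[j]) ** 2
                      for j in range(k)]
            clusters[scores.index(max(scores))].append(x)
        for j, c in enumerate(clusters):
            if c:  # MLE refit on the hard cluster (skip empty clusters)
                mus[j] = sum(c) / len(c)
                var = sum((x - mus[j]) ** 2 for x in c) / len(c)
                sigmas[j] = max(math.sqrt(var), 1e-6)
                weights[j] = len(c) / len(points)
    return mus, sigmas, weights
```

Unlike EM's soft responsibilities, each point contributes to exactly one component per iteration, which is what makes the inner loop a (weighted) k-means-style hard clustering.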