Optimal Kullback-Leibler Aggregation via Information Bottleneck
In this paper, we present a method for reducing a regular, discrete-time
Markov chain (DTMC) to another DTMC with a given, typically much smaller number
of states. The cost of reduction is defined as the Kullback-Leibler divergence
rate between a projection of the original process through a partition function
and a DTMC on the correspondingly partitioned state space. Finding the reduced
model with minimal cost is computationally expensive, as it requires an
exhaustive search among all state space partitions, and an exact evaluation of
the reduction cost for each candidate partition. Our approach deals with the
latter problem by minimizing an upper bound on the reduction cost instead of
minimizing the exact cost; the proposed upper bound is easy to compute and is
tight if the original chain is lumpable with respect to the partition. We then
express the problem as an information bottleneck optimization and propose using
the agglomerative information bottleneck algorithm to search for a sub-optimal
partition greedily rather than exhaustively. The theory is
illustrated with examples and one application scenario in the context of
modeling bio-molecular interactions.

Comment: 13 pages, 4 figures
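The cost structure described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the paper's implementation: it assumes the upper bound takes the form of a weighted KL divergence between each state's partition-aggregated transition row and the corresponding row of the lumped chain, which is zero exactly when the chain is lumpable with respect to the partition. Function names (`lumped_chain`, `kl_upper_bound`) and the 4-state example are illustrative.

```python
import numpy as np

def stationary(P):
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    return pi / pi.sum()

def lumped_chain(P, part):
    # part[i] = block index of state i. The lumped DTMC transition
    # probability Q[b, c] averages the block-to-block mass of each
    # state i in block b, weighted by pi_i / pi_b.
    pi = stationary(P)
    k = part.max() + 1
    Q = np.zeros((k, k))
    for b in range(k):
        idx = np.where(part == b)[0]
        pi_b = pi[idx].sum()
        for c in range(k):
            jdx = np.where(part == c)[0]
            Q[b, c] = (pi[idx] @ P[np.ix_(idx, jdx)].sum(axis=1)) / pi_b
    return Q, pi

def kl_upper_bound(P, part):
    # Assumed form of the bound: sum_i pi_i * KL( aggregated row i
    # || lumped row of block(i) ). Vanishes iff P is lumpable w.r.t. part.
    Q, pi = lumped_chain(P, part)
    bound = 0.0
    for i in range(P.shape[0]):
        for c in range(Q.shape[0]):
            p = P[i, part == c].sum()  # mass state i sends into block c
            if p > 0:
                bound += pi[i] * p * np.log(p / Q[part[i], c])
    return bound
```

With this bound in hand, a greedy agglomerative search would start from singleton blocks and repeatedly merge the pair of blocks whose merge increases the bound the least, mirroring the agglomerative information bottleneck strategy mentioned in the abstract.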
Learning in evolutionary environments
Not available
The Logic of Experimental Tests, Particularly of Everettian Quantum Theory
Claims that the standard methodology of scientific testing is inapplicable to
Everettian quantum theory, and hence that the theory is untestable, are due to
misconceptions about probability and about the logic of experimental testing.
Refuting those claims by correcting those misconceptions leads to various
simplifications, notably the elimination of everything probabilistic from
fundamental physics (stochastic processes) and from the methodology of testing
('Bayesian' credences).