Expressing the entropy of lattice systems as sums of conditional entropies
Whether a system is to be considered complex or not depends on how one
searches for correlations. We propose a general scheme for calculation of
entropies in lattice systems that has high flexibility in how correlations are
successively taken into account. Compared to the traditional approach for
estimating the entropy density, in which successive approximations build on
step-wise extensions of blocks of symbols, we show that one can take larger
steps when collecting the statistics necessary to calculate the entropy density
of the system. In one dimension this means that, instead of a single sweep over
the system in which states are read sequentially, one takes several sweeps with
larger steps so that eventually the whole lattice is covered. As a consequence,
the information in correlations is captured in a different way, and in some
situations this leads to considerably faster convergence of the
entropy density estimate as a function of the size of the configurations used
in the estimate. The formalism is exemplified with both a free-energy
minimisation scheme for the two-dimensional Ising model and the increasingly
complex spatial correlations generated by the time evolution of elementary
cellular automaton rule 60.
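The traditional baseline that the abstract compares against can be sketched as follows; this is a generic illustration of the standard block-entropy estimator (function names are my own), not the paper's generalized scheme:

```python
from collections import Counter
from math import log2

def block_entropy(seq, n):
    """Shannon entropy (in bits) of the length-n blocks read along seq."""
    counts = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

def entropy_density_estimate(seq, n):
    """Traditional estimate h_n = H(n+1) - H(n): the conditional entropy of
    the next symbol given a preceding n-block, which converges to the
    entropy density as n grows."""
    return block_entropy(seq, n + 1) - block_entropy(seq, n)
```

For a fair-coin sequence the estimate approaches 1 bit per symbol, while for the periodic sequence 0101... it is already (up to finite-sample effects) zero at n = 1, since one preceding symbol determines the next. The paper's scheme generalizes the order in which such conditional entropies are accumulated.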
The approach towards equilibrium in a reversible Ising dynamics model -- an information-theoretic analysis based on an exact solution
We study the approach towards equilibrium in a dynamic Ising model, the Q2R
cellular automaton, with microscopic reversibility and conserved energy for an
infinite one-dimensional system. Starting from a low-entropy state with
positive magnetisation, we investigate how the system approaches equilibrium
characteristics given by statistical mechanics. We show that the magnetisation
converges to zero exponentially. The reversibility of the dynamics implies that
the entropy density of the microstates is conserved in the time evolution.
Still, it appears as if equilibrium, with a higher entropy density, is
approached. In order to understand this process, we solve the dynamics by
formally proving how the information-theoretic characteristics of the
microstates develop over time. With this approach we can show that an estimate
of the entropy density based on finite length statistics within microstates
converges to the equilibrium entropy density. The process behind this apparent
entropy increase is a dissipation of correlation information over increasing
distances. It is shown that the average information-theoretic correlation
length increases linearly in time, being equivalent to a corresponding increase
in excess entropy.
Comment: 15 pages, 2 figures
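One common way to realize Q2R-type dynamics in one dimension is a checkerboard scheme, sketched below for illustration (the paper's exact formulation may differ): the two sublattices are updated alternately, and a spin flips only when its neighbours are anti-aligned, so each flip costs zero energy and every sweep is its own inverse:

```python
def q2r_sweep(s, parity):
    """Update one sublattice (parity 0 or 1) of a periodic chain of +/-1
    spins: flip s[i] iff its two neighbours are anti-aligned, so the flip
    changes the Ising bond energy by exactly zero."""
    s = list(s)
    n = len(s)  # n must be even for a consistent checkerboard
    for i in range(parity, n, 2):
        if s[i - 1] != s[(i + 1) % n]:
            s[i] = -s[i]
    return s

def q2r_step(s):
    """One forward time step: even sweep followed by odd sweep."""
    return q2r_sweep(q2r_sweep(s, 0), 1)

def q2r_unstep(s):
    """Inverse step: the sweeps applied in the opposite order (each sweep
    is an involution, making the dynamics microscopically reversible)."""
    return q2r_sweep(q2r_sweep(s, 1), 0)

def energy(s):
    """Nearest-neighbour Ising energy on the periodic chain."""
    n = len(s)
    return -sum(s[i] * s[(i + 1) % n] for i in range(n))
```

Because the flip condition depends only on the other (momentarily frozen) sublattice, the energy is conserved exactly at every step, and applying `q2r_unstep` retraces the trajectory back to its initial state.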
Renormalization of cellular automata and self-similarity
We study self-similarity in one-dimensional probabilistic cellular automata
(PCA) using the renormalization technique. We introduce a general framework for
algebraic construction of renormalization groups (RG) on cellular automata and
apply it to exhaustively search the rule space for automata displaying dynamic
criticality. Previous studies have shown that there exist several exactly
renormalizable deterministic automata. We show that the RG fixed points for
such self-similar CA are unstable in all directions under renormalization. This
implies that the large scale structure of self-similar deterministic elementary
cellular automata is destroyed by any finite error probability. As a second
result we show that the only non-trivial critical PCA are the different
versions of the well-studied phenomenon of directed percolation. We discuss how
the second result supports a conjecture regarding the universality class for
dynamic criticality defined by directed percolation.
Comment: 14 pages, 4 figures
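Exact renormalizability of a deterministic automaton can be illustrated with elementary rule 90, a standard example of a self-similar ECA (the sketch below is mine, not code from the paper): sampling every second site at every second time step reproduces rule 90 itself, since x_i(t+2) = x_{i-2}(t) XOR x_{i+2}(t).

```python
def rule90_step(row):
    """One step of elementary CA rule 90 on a periodic row:
    x_i(t+1) = x_{i-1}(t) XOR x_{i+1}(t)."""
    n = len(row)
    return [row[i - 1] ^ row[(i + 1) % n] for i in range(n)]

def evolve(row, steps):
    """Return the space-time history [row(0), ..., row(steps)]."""
    rows = [row]
    for _ in range(steps):
        rows.append(rule90_step(rows[-1]))
    return rows
```

Coarse-graining the history by keeping even rows and even columns yields exactly the rule-90 evolution of the coarse-grained initial row; with periodic boundaries the lattice size must be even for the block map to close.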
Complexity of Two-Dimensional Patterns
In dynamical systems such as cellular automata and iterated maps, it is often
useful to look at a language or set of symbol sequences produced by the system.
There are well-established classification schemes, such as the Chomsky
hierarchy, with which we can measure the complexity of these sets of sequences,
and thus the complexity of the systems which produce them.
In this paper, we look at the first few levels of a hierarchy of complexity
for two-or-more-dimensional patterns. We show that several definitions of
"regular language" or "local rule" that are equivalent in d=1 lead to
distinct classes in d >= 2. We explore the closure properties and computational
complexity of these classes, including undecidability and L-, NL- and
NP-completeness results.
We apply these classes to cellular automata, in particular to their sets of
fixed and periodic points, finite-time images, and limit sets. We show that it
is undecidable whether a CA in d >= 2 has a periodic point of a given period,
and that certain "local lattice languages" are not finite-time images or
limit sets of any CA. We also show that the entropy of a d-dimensional CA's
finite-time image cannot decrease faster than t^{-d} unless it maps every
initial condition to a single homogeneous state.
Comment: To appear in J. Stat. Phys.
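The undecidability is specific to d >= 2: on a finite cyclic lattice in one dimension the corresponding question can be settled by exhaustive search, as in this illustrative sketch (function names are my own):

```python
from itertools import product

def eca_step(row, rule):
    """One step of the elementary CA with the given Wolfram rule number
    on a cyclic row of 0/1 cells."""
    n = len(row)
    return tuple((rule >> (4 * row[i - 1] + 2 * row[i] + row[(i + 1) % n])) & 1
                 for i in range(n))

def fixed_points(rule, n):
    """All configurations of a size-n cyclic lattice that the rule maps to
    themselves (brute force over the 2**n configurations)."""
    return [c for c in product((0, 1), repeat=n) if eca_step(c, rule) == c]
```

For example, rule 204 is the identity map, so every configuration is a fixed point, while rule 0 fixes only the all-zero configuration; the search is finite here precisely because the lattice is.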