Abstract

We develop information-theoretic measures of spatial structure and pattern in more than one dimension. As is well known, the entropy density of a two-dimensional configuration can be efficiently and accurately estimated via a converging sequence of conditional entropies. We show that the manner in which these conditional entropies converge to their asymptotic value serves as a measure of global correlation and structure for spatial systems in any dimension. We compare and contrast entropy-convergence with mutual-information and structure-factor techniques for quantifying and detecting spatial structure.

Comment: 11 pages, 5 figures, http://www.santafe.edu/projects/CompMech/papers/2dnnn.htm
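As a rough illustration of the entropy-convergence idea, the sketch below estimates block entropies of square n x n templates on a 2D binary configuration and forms the differences h(n) = [H(n) - H(n-1)] / (2n - 1), watching how quickly they approach their limiting value. This is a simplified stand-in, not the paper's exact construction: the choice of square blocks, the specific difference formula, and the function names (block_entropy, entropy_convergence) are assumptions made for this example.

```python
import numpy as np
from collections import Counter

def block_entropy(config, n):
    """Shannon entropy (bits) of the distribution of n x n blocks,
    estimated by sliding a window over the 2D configuration."""
    rows, cols = config.shape
    counts = Counter()
    for i in range(rows - n + 1):
        for j in range(cols - n + 1):
            counts[config[i:i + n, j:j + n].tobytes()] += 1
    total = sum(counts.values())
    p = np.array(list(counts.values()), dtype=float) / total
    return float(-np.sum(p * np.log2(p)))

def entropy_convergence(config, n_max):
    """Sequence h(n) = [H(n) - H(n-1)] / (2n - 1): the estimated entropy per
    site added when an (n-1) x (n-1) block grows to n x n.  How h(n) settles
    toward its asymptotic value is the convergence behaviour of interest."""
    H = [0.0] + [block_entropy(config, n) for n in range(1, n_max + 1)]
    return [(H[n] - H[n - 1]) / (2 * n - 1) for n in range(1, n_max + 1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    iid = rng.integers(0, 2, size=(256, 256))        # uncorrelated spins
    checker = np.indices((256, 256)).sum(0) % 2      # perfectly ordered pattern
    print("i.i.d.  :", entropy_convergence(iid, 3))      # stays near 1 bit/site
    print("checker :", entropy_convergence(checker, 3))  # drops to 0 after n = 1
```

For the uncorrelated configuration the sequence sits near 1 bit per site from the start, while for the ordered checkerboard it falls to zero once a single neighbour is known; structured but disordered patterns fall between these extremes, and the shape of that decay is what carries the structural information.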
