127,659 research outputs found

    A Proof of Entropy Minimization for Outputs in Deletion Channels via Hidden Word Statistics

    From the output produced by a memoryless deletion channel acting on a uniformly random input of known length $n$, one obtains a posterior distribution on the channel input. The difference between the Shannon entropy of this distribution and that of the uniform prior measures the amount of information about the channel input conveyed by the output of length $m$, and it is natural to ask for which outputs this is extremized. This question was posed in a previous work, where it was conjectured on the basis of experimental data that the entropy of the posterior is minimized by the constant strings $\texttt{000}\ldots$ and $\texttt{111}\ldots$ and maximized by the alternating strings $\texttt{0101}\ldots$ and $\texttt{1010}\ldots$. In the present work we confirm the minimization conjecture in the asymptotic limit using results from hidden word statistics. We show how the analytic-combinatorial methods of Flajolet, Szpankowski and Vallée for the hidden pattern matching problem can be applied to resolve the case of fixed output length and $n \rightarrow \infty$, by obtaining estimates for the entropy in terms of the moments of the posterior distribution and establishing its minimization via a measure of autocorrelation.
    Comment: 11 pages, 2 figures
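
    The posterior in question has a convenient form: for a memoryless deletion channel with a uniform prior, $P(x \mid y)$ is proportional to the number of ways the output $y$ embeds in the input $x$ as a subsequence, so the deletion probability cancels. The brute-force Python sketch below (an illustration, not the paper's code; the names embedding_count and posterior_entropy are mine) computes this posterior entropy over a binary alphabet for small $n$ and $m$, which makes the conjectured extremizers easy to check numerically.

    ```python
    import itertools
    import math

    def embedding_count(x, y):
        """Number of ways y occurs as a (not necessarily contiguous) subsequence of x."""
        # dp[j] = number of embeddings of y[:j] into the prefix of x seen so far
        dp = [1] + [0] * len(y)
        for xc in x:
            for j in range(len(y) - 1, -1, -1):  # backwards so xc is used once per embedding
                if y[j] == xc:
                    dp[j + 1] += dp[j]
        return dp[len(y)]

    def posterior_entropy(y, n):
        """Shannon entropy (bits) of the posterior over uniform binary inputs of length n,
        given that the deletion channel emitted y; P(x|y) is proportional to the embedding
        count, since the deletion probability factors out."""
        weights = [embedding_count(x, y) for x in itertools.product("01", repeat=n)]
        total = sum(weights)
        return -sum(w / total * math.log2(w / total) for w in weights if w > 0)

    if __name__ == "__main__":
        n, m = 10, 4
        for y in ("".join(t) for t in itertools.product("01", repeat=m)):
            print(y, round(posterior_entropy(y, n), 4))
        # Per the conjecture, the constant outputs 0000/1111 should give the smallest
        # entropy and the alternating outputs 0101/1010 the largest.
    ```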

    Polydisperse Adsorption: Pattern Formation Kinetics, Fractal Properties, and Transition to Order

    We investigate the process of random sequential adsorption of polydisperse particles whose size distribution exhibits a power-law dependence in the small-size limit, $P(R)\sim R^{\alpha-1}$. We reveal a relation between the pattern formation kinetics and the structural properties of the arising patterns. We propose a mean-field theory which provides a fair description for sufficiently small $\alpha$. When $\alpha \to \infty$, highly ordered structures locally identical to the Apollonian packing are formed. We introduce a quantitative criterion for the regularity of the pattern formation process. When $\alpha \gg 1$, a sharp transition from the irregular to the regular pattern formation regime is found to occur near the jamming coverage of standard random sequential adsorption with a monodisperse size distribution.
    Comment: 8 pages, LaTeX, 5 figures, to appear in Phys. Rev.
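
    As a concrete illustration of the model (a minimal sketch under simple assumptions, not the authors' code; the function names and parameters are mine), radii can be drawn from $P(R)\propto R^{\alpha-1}$ on $(0, R_{\rm max}]$ by inverse-CDF sampling, $R = R_{\rm max}\,u^{1/\alpha}$, and deposited by random sequential adsorption in the unit square, rejecting any trial disk that overlaps a previously adsorbed one:

    ```python
    import math
    import random

    def sample_radius(alpha, r_max, rng):
        """Inverse-CDF sample from P(R) proportional to R^(alpha-1) on (0, r_max]."""
        return r_max * rng.random() ** (1.0 / alpha)

    def polydisperse_rsa(alpha, r_max=0.1, n_attempts=20_000, seed=1):
        """Random sequential adsorption of non-overlapping disks in the unit square.
        Returns the accepted disks (x, y, R) and the summed disk area (boundary
        overhang is ignored).  A production code would use a spatial grid instead
        of the O(N) overlap scan below."""
        rng = random.Random(seed)
        disks, covered = [], 0.0
        for _ in range(n_attempts):
            r = sample_radius(alpha, r_max, rng)
            x, y = rng.random(), rng.random()
            if all((x - a) ** 2 + (y - b) ** 2 >= (r + c) ** 2 for a, b, c in disks):
                disks.append((x, y, r))
                covered += math.pi * r * r
        return disks, covered

    if __name__ == "__main__":
        for alpha in (1.0, 3.0, 10.0):
            disks, theta = polydisperse_rsa(alpha)
            print(f"alpha = {alpha}: {len(disks)} disks adsorbed, coverage ~ {theta:.3f}")
    ```

    Smaller $\alpha$ puts more weight on tiny particles, which keep finding room, so the process continues filling space indefinitely; for $\alpha \gg 1$ the early stage is nearly monodisperse, which is the regime where the sharp transition near the standard jamming coverage appears.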

    Chaos for Liouville probability densities

    Using the method of symbolic dynamics, we show that a large class of classical chaotic maps exhibits exponential hypersensitivity to perturbation, i.e., a rapid increase with time of the information needed to describe the perturbed time evolution of the Liouville density, the information attaining values that are exponentially larger than the entropy increase that results from averaging over the perturbation. The exponential rate of growth of the ratio of information to entropy is given by the Kolmogorov-Sinai entropy of the map. These findings generalize and extend results obtained for the baker's map [R. Schack and C. M. Caves, Phys. Rev. Lett. 69, 3413 (1992)].
    Comment: 26 pages in REVTeX, no figures, submitted to Phys. Rev.
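
    The Kolmogorov-Sinai entropy that sets this growth rate can itself be estimated from symbolic dynamics: record an orbit's itinerary with respect to a generating partition and take increments of the block entropies. The sketch below (my illustration, not the paper's construction) does this for the fully chaotic logistic map $x \mapsto 4x(1-x)$ with the partition at $x = 1/2$, for which $h_{\rm KS} = \ln 2 \approx 0.693$.

    ```python
    import math
    from collections import Counter

    def block_entropy(symbols, k):
        """Shannon entropy (nats) of length-k words along the symbolic orbit."""
        counts = Counter(tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1))
        total = sum(counts.values())
        return -sum(c / total * math.log(c / total) for c in counts.values())

    def logistic_itinerary(x0, n_steps):
        """Binary itinerary of x -> 4x(1-x) with the generating partition at x = 1/2."""
        x, syms = x0, []
        for _ in range(n_steps):
            syms.append(0 if x < 0.5 else 1)
            x = 4.0 * x * (1.0 - x)
            if x <= 0.0 or x >= 1.0:   # guard: finite precision can collapse the orbit
                x = 0.123456789
        return syms

    if __name__ == "__main__":
        syms = logistic_itinerary(0.3141592653589793, 200_000)
        # h_KS is the limit of H(k+1) - H(k); for this map it equals ln 2 ~ 0.693 nats.
        for k in range(1, 8):
            print(k, round(block_entropy(syms, k + 1) - block_entropy(syms, k), 4))
    ```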

    Fractal formation and ordering in random sequential adsorption

    We reveal the fractal nature of patterns arising in random sequential adsorption of particles with a continuum power-law size distribution, $P(R)\sim R^{\alpha-1}$, $R \le R_{\rm max}$. We find that the patterns become more and more ordered as $\alpha$ increases, and that the Apollonian packing is obtained in the $\alpha \to \infty$ limit. We introduce the entropy production rate as a quantitative criterion of regularity and observe a transition from an irregular regime of pattern formation to a regular one. We develop a scaling theory that relates the kinetic and structural properties of the system.
    Comment: 4 pages, RevTeX, 4 postscript figures. To appear in Phys. Rev. Lett.
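
    One way to probe the fractal claim in a simulation (a rough sketch of my own, not the authors' method) is to rasterize the uncovered region and box-count it at several scales. After any finite deposition the uncovered set still has positive area, so the fitted exponent only drifts away from 2 once many decades of particle sizes have been laid down; the code just shows the measurement machinery.

    ```python
    import math
    import random
    import numpy as np

    def deposit_power_law_disks(alpha, r_max=0.1, n_attempts=5_000, seed=2):
        """Minimal RSA deposit of disks with P(R) ~ R^(alpha-1) in the unit square."""
        rng = random.Random(seed)
        disks = []
        for _ in range(n_attempts):
            r = r_max * rng.random() ** (1.0 / alpha)        # inverse-CDF sampling
            x, y = rng.random(), rng.random()
            if all((x - a) ** 2 + (y - b) ** 2 >= (r + c) ** 2 for a, b, c in disks):
                disks.append((x, y, r))
        return disks

    def uncovered_grid(disks, n=256):
        """Boolean grid marking cells whose centres lie outside every deposited disk."""
        centres = (np.arange(n) + 0.5) / n
        X, Y = np.meshgrid(centres, centres, indexing="ij")
        covered = np.zeros((n, n), dtype=bool)
        for a, b, c in disks:
            covered |= (X - a) ** 2 + (Y - b) ** 2 < c * c
        return ~covered

    def box_counting_dimension(free, box_sizes=(1, 2, 4, 8, 16)):
        """Fit the slope of log N(b) versus log(n/b), where N(b) counts b-by-b boxes
        containing at least one uncovered cell."""
        n = free.shape[0]
        pts = []
        for b in box_sizes:
            m = n // b
            blocks = free[:m * b, :m * b].reshape(m, b, m, b).any(axis=(1, 3))
            pts.append((math.log(n / b), math.log(int(blocks.sum()))))
        xs, ys = zip(*pts)
        xm, ym = sum(xs) / len(xs), sum(ys) / len(ys)
        return sum((x - xm) * (y - ym) for x, y in pts) / sum((x - xm) ** 2 for x in xs)

    if __name__ == "__main__":
        disks = deposit_power_law_disks(alpha=3.0)
        dim = box_counting_dimension(uncovered_grid(disks))
        print(f"box-counting estimate for the uncovered set: {dim:.3f}")
    ```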

    Structural Information in Two-Dimensional Patterns: Entropy Convergence and Excess Entropy

    We develop information-theoretic measures of spatial structure and pattern in more than one dimension. As is well known, the entropy density of a two-dimensional configuration can be efficiently and accurately estimated via a converging sequence of conditional entropies. We show that the manner in which these conditional entropies converge to their asymptotic value serves as a measure of global correlation and structure for spatial systems in any dimension. We compare and contrast entropy-convergence with mutual-information and structure-factor techniques for quantifying and detecting spatial structure.
    Comment: 11 pages, 5 figures, http://www.santafe.edu/projects/CompMech/papers/2dnnn.htm
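
    The converging sequence of conditional entropies can be estimated directly from empirical block counts. The sketch below (my illustration; the raster-scan neighbourhood convention and the synthetic test pattern are assumptions, not necessarily the template used in the paper) conditions each site on its previously scanned neighbours within Chebyshev distance r. On a pattern whose rows are independent two-state Markov chains with flip probability 0.1, the estimate drops from about 1 bit per site at r = 0 to roughly 0.47 bits once the relevant neighbour enters the template.

    ```python
    import math
    from collections import Counter
    import numpy as np

    def conditional_entropy(config, r):
        """Empirical conditional entropy (bits/site) of a cell given its raster-scan
        'past' neighbours within Chebyshev distance r.  Large templates need far more
        data than this to avoid a downward finite-sample bias."""
        offsets = [(dy, dx) for dy in range(-r, 1) for dx in range(-r, r + 1)
                   if dy < 0 or (dy == 0 and dx < 0)]
        height, width = config.shape
        joint, ctx = Counter(), Counter()
        for y in range(r, height):
            for x in range(r, width - r):
                context = tuple(config[y + dy, x + dx] for dy, dx in offsets)
                joint[(context, config[y, x])] += 1
                ctx[context] += 1
        n = sum(joint.values())
        h_joint = -sum(c / n * math.log2(c / n) for c in joint.values())
        h_ctx = -sum(c / n * math.log2(c / n) for c in ctx.values())
        return h_joint - h_ctx

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        height, width, p_flip = 300, 300, 0.1
        config = np.zeros((height, width), dtype=int)
        config[:, 0] = rng.integers(0, 2, size=height)
        flips = rng.random((height, width)) < p_flip
        for x in range(1, width):                 # rows are independent Markov chains
            config[:, x] = np.where(flips[:, x], 1 - config[:, x - 1], config[:, x - 1])
        for r in (0, 1, 2):
            print(r, round(conditional_entropy(config, r), 4))
    ```

    The true entropy density of this test pattern is the binary entropy of the flip probability, about 0.469 bits per site, which the r >= 1 estimates approach.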