13,981 research outputs found

    Limits of relative entropies associated with weakly interacting particle systems

    The limits of scaled relative entropies between probability distributions associated with N-particle weakly interacting Markov processes are considered. The convergence of such scaled relative entropies is established in various settings. The analysis is motivated by the role relative entropy plays as a Lyapunov function for the (linear) Kolmogorov forward equation associated with an ergodic Markov process, and Lyapunov function properties of these scaling limits with respect to nonlinear finite-state Markov processes are studied in the companion paper [6].
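The Lyapunov-function role mentioned above can be seen numerically: for an ergodic finite-state Markov process with generator Q and stationary distribution pi, the relative entropy H(p_t | pi) is nonincreasing along the forward equation dp/dt = pQ. A minimal sketch (the 3-state generator below is a made-up example, not from the paper):

```python
import numpy as np

def relative_entropy(p, q):
    """H(p | q) = sum_x p(x) log(p(x)/q(x)), in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# A 3-state ergodic generator: off-diagonals nonnegative, rows sum to zero.
Q = np.array([[-1.0, 0.7, 0.3],
              [0.4, -0.9, 0.5],
              [0.2, 0.6, -0.8]])

# Stationary distribution: solve pi Q = 0 with pi summing to one.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Euler steps of the forward equation dp/dt = p Q. With dt this small,
# I + dt*Q is itself a stochastic matrix, so each step is an exact Markov
# kernel and H(p_t | pi) is nonincreasing by data processing.
p = np.array([0.9, 0.05, 0.05])
dt = 0.01
entropies = []
for _ in range(500):
    entropies.append(relative_entropy(p, pi))
    p = p + dt * (p @ Q)

assert all(a >= c - 1e-12 for a, c in zip(entropies, entropies[1:]))
```

The monotone decay of `entropies` is the (linear, fixed-N) property whose scaling limits the paper studies.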

    Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined by both the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.
    Comment: 25 pages, 13 figures, 1 table
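The block-entropy convergence the abstract builds on can be illustrated with a plug-in estimator: for a stationary process, the conditional entropies h(L) = H(L) - H(L-1) of the Shannon block entropy H(L) converge to the entropy rate. A sketch on a made-up two-state Markov source (our example, not the paper's process):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Transition matrix of an illustrative two-state binary Markov process:
# row s gives P(next symbol | current symbol s).
T = np.array([[0.5, 0.5],
              [0.9, 0.1]])

# Simulate a long sample path; the emitted symbol is the state itself.
n = 200_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.random() < T[x[t - 1], 1]

def block_entropy(seq, L):
    """Plug-in estimate of H(L) in nats from overlapping length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log(p)).sum())

H = [0.0] + [block_entropy(x, L) for L in range(1, 7)]
h = [H[L] - H[L - 1] for L in range(1, 7)]  # conditional entropies h(L)
# For an order-1 Markov source, h(L) is already at the entropy rate by L = 2.
```

The state-block and block-state entropies studied in the paper refine this picture by tracking the observer's uncertainty about the hidden state jointly with the observed blocks.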

    On Convergence Properties of Shannon Entropy

    Convergence properties of Shannon entropy are studied. In the differential setting, it is shown that weak convergence of probability measures (convergence in distribution) is not enough for convergence of the associated differential entropies. A general result for the desired differential entropy convergence is provided, taking into account both compactly and non-compactly supported densities. Convergence of differential entropy is also characterized in terms of the Kullback-Leibler discriminant for densities with fairly general supports, and it is shown that convergence in variation of probability measures guarantees such convergence under an appropriate boundedness condition on the densities involved. Results for the discrete setting are also provided, allowing for infinitely supported probability measures, by taking advantage of the equivalence between weak convergence and convergence in variation in this setting.
    Comment: Submitted to IEEE Transactions on Information Theory
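A standard discrete counterexample (our own illustration, not taken from the paper) shows why convergence of measures alone is not enough without a boundedness condition: let p_n put mass 1 - 1/log(n) on a single point and spread the remaining 1/log(n) uniformly over n further points. Then p_n converges to the point mass in total variation, but the escaping tail contributes about (1/log n) * log(n) = 1 nat of entropy forever, while the limit has entropy 0:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

tvs, ents = [], []
for n in [10**2, 10**4, 10**6]:
    eps = 1.0 / np.log(n)
    # mass 1-eps at one point, eps spread uniformly over n more points
    p = np.concatenate(([1.0 - eps], np.full(n, eps / n)))
    tvs.append(eps)  # total variation distance to the point mass
    ents.append(shannon_entropy(p))
# tvs -> 0, yet every entry of ents stays above 1 nat.
```

A uniform bound on the spread of the tail (the kind of boundedness condition the abstract refers to) rules out exactly this escape of entropy to infinity.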

    Structural Information in Two-Dimensional Patterns: Entropy Convergence and Excess Entropy

    We develop information-theoretic measures of spatial structure and pattern in more than one dimension. As is well known, the entropy density of a two-dimensional configuration can be efficiently and accurately estimated via a converging sequence of conditional entropies. We show that the manner in which these conditional entropies converge to their asymptotic value serves as a measure of global correlation and structure for spatial systems in any dimension. We compare and contrast entropy-convergence with mutual-information and structure-factor techniques for quantifying and detecting spatial structure.
    Comment: 11 pages, 5 figures, http://www.santafe.edu/projects/CompMech/papers/2dnnn.htm
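The "manner of convergence" idea can be made concrete in one dimension via the excess entropy E = sum over L of (h(L) - h), where h(L) = H(L) - H(L-1) are the conditional entropies and h is their limit, the entropy density. A minimal sketch on a toy process of ours (a random-phase period-2 pattern, not a 2D example from the paper), where h = 0 and E = log 2, i.e. one bit of pure structure:

```python
import numpy as np

def block_probs(L):
    """Exact length-L block distribution of the random-phase period-2 process
    ...010101..., whose two phases each occur with probability 1/2."""
    blocks = {}
    for phase in (0, 1):
        word = tuple((phase + i) % 2 for i in range(L))
        blocks[word] = blocks.get(word, 0.0) + 0.5
    return np.array(list(blocks.values()))

def H(L):
    """Shannon block entropy H(L) in nats; H(0) = 0 by convention."""
    if L == 0:
        return 0.0
    p = block_probs(L)
    return float(-(p * np.log(p)).sum())

h_est = [H(L) - H(L - 1) for L in range(1, 8)]  # conditional entropies h(L)
h = h_est[-1]                                   # entropy density (0 here)
E = sum(hL - h for hL in h_est)                 # excess entropy = log 2
```

In the paper's higher-dimensional setting the same quantity is built from conditional entropies over growing neighborhood templates rather than 1D blocks.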

    Origin of entropy convergence in hydrophobic hydration and protein folding

    An information theory model is used to construct a molecular explanation of why hydrophobic solvation entropies measured in calorimetry of protein unfolding converge at a common temperature. The entropy convergence follows from the weak temperature dependence of occupancy fluctuations for molecular-scale volumes in water. The macroscopic expression of the contrasting entropic behavior between water and common organic solvents is the relative temperature insensitivity of the water isothermal compressibility. The information theory model provides a quantitative description of small molecule hydration and predicts a negative entropy at convergence. Interpretations of entropic contributions to protein folding should account for this result.
    Comment: Phys. Rev. Lett. (in press 1996), 3 pages, 3 figures
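To give a flavor of how such a model works (a schematic sketch with illustrative numbers, not the paper's water data or its exact maximum-entropy construction): given only the mean and variance of the solvent-occupancy fluctuations in a molecular-scale volume, a discretized Gaussian default model estimates p(0), the probability that the volume is empty, and hence a hydration free energy mu_ex = -kT ln p(0):

```python
import numpy as np

def mu_excess_over_kT(mean_n, var_n):
    """-ln p(0) from a discretized Gaussian model of occupancy statistics.
    mean_n, var_n: mean and variance of the occupancy count n >= 0."""
    n = np.arange(0, int(mean_n + 10 * np.sqrt(var_n)) + 1)
    w = np.exp(-((n - mean_n) ** 2) / (2.0 * var_n))
    p = w / w.sum()          # normalized model distribution over counts
    return float(-np.log(p[0]))

# Illustrative (made-up) moments: if the variance changes only weakly while
# the mean occupancy grows, mu_ex/kT is driven by the mean alone -- the
# weak temperature dependence of the fluctuations is the paper's key point.
vals = [mu_excess_over_kT(m, v) for m, v in [(3.0, 3.0), (4.0, 3.2), (5.0, 3.4)]]
```

The values increase with mean occupancy, reflecting the growing improbability of spontaneously emptying a larger effective volume.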

    The conditional entropy power inequality for quantum additive noise channels

    We prove the quantum conditional Entropy Power Inequality for quantum additive noise channels. This inequality lower bounds the quantum conditional entropy of the output of an additive noise channel in terms of the quantum conditional entropies of the input state and the noise when they are conditionally independent given the memory. We also show that this conditional Entropy Power Inequality is optimal in the sense that we can achieve equality asymptotically by choosing a suitable sequence of Gaussian input states. We apply the conditional Entropy Power Inequality to find an array of information-theoretic inequalities for conditional entropies which are the analogues of inequalities which have already been established in the unconditioned setting. Furthermore, we give a simple proof of the convergence rate of the quantum Ornstein-Uhlenbeck semigroup based on Entropy Power Inequalities.
    Comment: 26 pages; updated to match published version
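As a classical warm-up (our sketch, not the quantum result), the Entropy Power Inequality states that N(X+Y) >= N(X) + N(Y) for independent continuous X, Y, where N(X) = exp(2 h(X)) / (2 pi e) is the entropy power, with equality exactly for Gaussians. This Gaussian saturation is the same phenomenon the paper establishes for the conditional quantum inequality:

```python
import numpy as np

def gaussian_entropy(var):
    """Differential entropy h = 0.5 * log(2*pi*e*var) of N(0, var), in nats."""
    return 0.5 * np.log(2.0 * np.pi * np.e * var)

def entropy_power(h):
    """Entropy power N = exp(2h) / (2*pi*e); for N(0, var) this equals var."""
    return np.exp(2.0 * h) / (2.0 * np.pi * np.e)

vx, vy = 1.7, 0.4   # made-up variances of independent Gaussians X and Y
lhs = entropy_power(gaussian_entropy(vx + vy))  # N(X+Y); X+Y ~ N(0, vx+vy)
rhs = entropy_power(gaussian_entropy(vx)) + entropy_power(gaussian_entropy(vy))
# For Gaussian inputs: lhs == rhs == vx + vy (the EPI holds with equality).
```

For non-Gaussian X or Y the left-hand side strictly exceeds the right; the quantum conditional version replaces differential entropies by conditional entropies given a memory system.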

    Symmetries and global solvability of the isothermal gas dynamics equations

    We study the Cauchy problem associated with the system of two conservation laws arising in isothermal gas dynamics, in which the pressure and the density are related by the $\gamma$-law equation $p(\rho) \sim \rho^\gamma$ with $\gamma = 1$. Our results complete those obtained earlier for $\gamma > 1$. We prove the global existence and compactness of entropy solutions generated by the vanishing viscosity method. The proof relies on compensated compactness arguments and symmetry group analysis. Interestingly, we make use here of the fact that the isothermal gas dynamics system is invariant under a linear scaling of the density. This property enables us to reduce our problem to one with a small initial density. One symmetry group associated with the linear hyperbolic equations describing all entropies of the Euler equations gives rise to a fundamental solution with initial data imposed on the line $\rho = 1$. This is in contrast to the common approach (when $\gamma > 1$), which prescribes initial data on the vacuum line $\rho = 0$. The entropies we construct here are weak entropies, i.e. they vanish when the density vanishes. Another feature of our proof lies in the reduction theorem, which uses the family of weak entropies to show that a Young measure must reduce to a Dirac mass. This step is based on new convergence results for regularized products of measures and functions of bounded variation.
    Comment: 29 pages
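The density-scaling symmetry the abstract exploits can be checked numerically. A sketch of ours (a Lax-Friedrichs scheme, whose numerical diffusion plays the role of the vanishing viscosity; parameters and data are illustrative) for the isothermal system rho_t + (rho u)_x = 0, (rho u)_t + (rho u^2 + rho)_x = 0, with p(rho) = rho for gamma = 1:

```python
import numpy as np

def flux(rho, m):
    """Flux of the isothermal Euler system in conserved variables (rho, m)."""
    u = m / rho
    return np.array([m, m * u + rho])  # pressure p(rho) = rho when gamma = 1

def lax_friedrichs(rho, m, dx, dt, steps):
    """Periodic Lax-Friedrichs evolution; its diffusion mimics viscosity."""
    for _ in range(steps):
        U = np.array([rho, m])
        F = flux(rho, m)
        Up, Um = np.roll(U, -1, axis=1), np.roll(U, 1, axis=1)
        Fp, Fm = np.roll(F, -1, axis=1), np.roll(F, 1, axis=1)
        U = 0.5 * (Up + Um) - 0.5 * (dt / dx) * (Fp - Fm)
        rho, m = U
    return rho, m

# Riemann-type initial data on a periodic grid.
N = 400
x = np.linspace(0.0, 1.0, N, endpoint=False)
rho0 = np.where(x < 0.5, 1.0, 0.25)
m0 = np.zeros(N)
rho1, m1 = lax_friedrichs(rho0, m0, dx=1.0 / N, dt=0.001, steps=200)

# The linear scaling symmetry: since flux(lam*rho, lam*m) = lam*flux(rho, m),
# evolving lam*(rho0, m0) yields exactly lam times the evolution of (rho0, m0).
lam = 7.0
rho2, m2 = lax_friedrichs(lam * rho0, lam * m0, dx=1.0 / N, dt=0.001, steps=200)
```

This is the invariance (here for the velocity held fixed) that lets the paper reduce the general problem to one with small initial density; for $\gamma > 1$ the pressure term breaks it.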