4 research outputs found

    Polyadic Entropy, Synergy and Redundancy among Statistically Independent Processes in Nonlinear Statistical Physics with Microphysical Codependence

    The information shared among observables representing processes of interest is traditionally evaluated with macroscale measures that characterize aggregate properties of the underlying processes and their interactions. Traditional information measures are grounded in the assumption that each observable represents a memoryless process without any interaction among microstates. Generalized entropy measures have been formulated in non-extensive statistical mechanics to take microphysical codependence into account in entropy quantification. Taking these generalized entropies into consideration when formulating information measures raises the question of whether, and if so how much, information permeates across scales to affect macroscale information measures. The present study investigates and quantifies the emergence of macroscale information from microscale codependence. To isolate the information emergence arising solely from nonlinearly interacting microphysics, redundancy and synergy are evaluated among macroscale variables that are statistically independent of each other but not necessarily so within their own microphysics. Synergistic and redundant information are found when microphysical interactions take place, even if the statistical distributions are factorable. These findings stress the added value of nonlinear statistical physics to information theory in coevolutionary systems.
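
    As a minimal, illustrative sketch (not the paper's own method), the contrast between classical and generalized entropy, together with a crude redundancy-versus-synergy indicator, can be written in a few lines of Python. The toy XOR-coupled distribution, the variable names, and the choice of three-variable co-information as the indicator are assumptions made purely for illustration: they show how pairwise-independent variables can still share synergistic information, but they do not reproduce the paper's polyadic formulation.

```python
import numpy as np

def shannon_entropy(p):
    """Classical Shannon entropy H = -sum p*ln(p), in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    """Generalized (non-extensive) Tsallis entropy S_q = (1 - sum p**q)/(q - 1).
    Recovers the Shannon entropy in the limit q -> 1."""
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Hypothetical joint distribution over three binary variables X, Y, Z with
# Z = X XOR Y and X, Y independent and uniform: every pair of variables is
# statistically independent, yet the triplet carries (synergistic) information.
p_xyz = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p_xyz[x, y, x ^ y] = 0.25

def marginal_entropy(axes):
    """Shannon entropy of the marginal over the listed axes (0=X, 1=Y, 2=Z)."""
    sum_out = tuple(a for a in range(3) if a not in axes)
    marg = p_xyz.sum(axis=sum_out) if sum_out else p_xyz
    return shannon_entropy(marg.ravel())

# Three-variable co-information (interaction information): commonly read as
# redundancy-dominated when positive and synergy-dominated when negative.
co_info = (marginal_entropy((0,)) + marginal_entropy((1,)) + marginal_entropy((2,))
           - marginal_entropy((0, 1)) - marginal_entropy((0, 2)) - marginal_entropy((1, 2))
           + marginal_entropy((0, 1, 2)))

print(f"co-information: {co_info:.3f} nats")  # about -0.693 nats: synergy-dominated
print(f"Tsallis S_q (q=2) of the joint: {tsallis_entropy(p_xyz.ravel(), 2.0):.3f}")
```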

    Debates: Does Information Theory Provide a New Paradigm for Earth Science? Emerging Concepts and Pathways of Information Physics

    Entropy and Information are key concepts not only in Information Theory but also in Physics: historically in the fields of Thermodynamics, Statistical and Analytical Mechanics, and, more recently, in the field of Information Physics. In this paper we argue that Information Physics reconciles and generalizes the statistical, geometric, and mechanistic views on information. We start by demonstrating how the use and interpretation of Entropy and Information coincide in Information Theory, Statistical Thermodynamics, and Analytical Mechanics, and how this can be exploited when addressing Earth Science problems in general and hydrological problems in particular. In the second part we discuss how Information Physics provides ways to quantify Information and Entropy from fundamental physical principles. This extends their use to cases where the preconditions for calculating Entropy in the classical manner, as an aggregate statistical measure, are not met. Indeed, these preconditions are rarely met in the Earth Sciences, due either to limited observations or to the far-from-equilibrium nature of evolving systems. Information Physics therefore offers new opportunities for improving the treatment of Earth Science problems.
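
    As a brief illustration of the coincidence the abstract refers to (a standard textbook relation, not a result taken from the paper), the Shannon entropy of a discrete distribution and the Gibbs entropy of statistical thermodynamics differ only by a constant factor:

```latex
H = -\sum_i p_i \log_2 p_i \ \text{(Shannon, bits)}, \qquad
S = -k_B \sum_i p_i \ln p_i \ \text{(Gibbs)}, \qquad
S = (k_B \ln 2)\, H .
```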
