    Bits from Biology for Computational Intelligence

    Computational intelligence is broadly defined as biologically inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material covered includes the necessary introduction to information theory and the estimation of information-theoretic quantities from neural data. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely, redundantly, or synergistically together with others. Finally, we introduce the framework of local information dynamics, where information processing is decomposed into component processes of information storage, transfer, and modification -- locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
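
    These information-theoretic quantities can be estimated directly from discretized recordings. As a minimal sketch (not the article's own code; the plug-in estimator and the toy stimulus/response pair are assumptions for illustration), the following Python computes the mutual information between a binary stimulus and a noisy neural response:

        import numpy as np

        def mutual_information(x, y):
            # Plug-in estimate of I(X;Y) in bits from two discrete sequences.
            x, y = np.asarray(x), np.asarray(y)
            mi = 0.0
            for xv in np.unique(x):
                for yv in np.unique(y):
                    pxy = np.mean((x == xv) & (y == yv))  # joint relative frequency
                    if pxy > 0:
                        mi += pxy * np.log2(pxy / (np.mean(x == xv) * np.mean(y == yv)))
            return mi

        rng = np.random.default_rng(0)
        stimulus = rng.integers(0, 2, 10_000)          # hypothetical binary stimulus
        noisy = rng.random(10_000) < 0.1               # 10% of responses are flipped
        response = np.where(noisy, 1 - stimulus, stimulus)
        print(mutual_information(stimulus, response))  # ~0.53 bits, i.e. 1 - H(0.1)

    Transfer entropy and the local storage, transfer, and modification measures mentioned above are conditional and pointwise variants of the same construction, applied to past and present states of the recorded variables.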

    Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes

    Exploiting the theory of state-space models, we derive exact expressions for the information transfer, as well as for the redundant and synergistic transfer, of coupled Gaussian processes observed at multiple temporal scales. All of the terms constituting the frameworks known as interaction information decomposition and partial information decomposition can thus be obtained analytically, at different time scales, from the parameters of the vector autoregressive (VAR) model that fits the processes. We first apply the proposed methodology to benchmark Gaussian systems, showing that this class of systems may generate patterns of information decomposition characterized by mainly redundant or synergistic information transfer persisting across multiple time scales, or even by an alternating prevalence of redundant and synergistic source interaction depending on the time scale. We then apply our method to an important topic in neuroscience, i.e., the detection of causal interactions in human epilepsy networks, for which we show the relevance of partial information decomposition to the detection of multiscale information transfer spreading from the seizure onset zone.
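
    In the bivariate, single-scale special case, the exact computation reduces to Gaussian conditioning on the stationary covariance of the fitted model. A minimal sketch in Python (the VAR(1) coefficients below are assumptions for illustration; it computes only the total Gaussian transfer entropy, not the redundant/synergistic terms or the multiscale rescaling):

        import numpy as np
        from scipy.linalg import solve_discrete_lyapunov

        # Hypothetical bivariate VAR(1): z_t = A @ z_{t-1} + e_t, e_t ~ N(0, S).
        # Channel x is z[0], channel y is z[1]; A[0, 1] couples y into x.
        A = np.array([[0.5, 0.3],
                      [0.0, 0.7]])
        S = 0.1 * np.eye(2)

        # Stationary covariance solves the discrete Lyapunov equation Sigma = A Sigma A^T + S.
        Sigma = solve_discrete_lyapunov(A, S)
        C = A @ Sigma                 # C[i, j] = cov(z_t[i], z_{t-1}[j])

        # Residual variance of x_t given x_{t-1} alone (Gaussian conditioning) ...
        v_reduced = Sigma[0, 0] - C[0, 0] ** 2 / Sigma[0, 0]
        # ... and given the full past (x_{t-1}, y_{t-1}), which for a VAR(1) is just S[0, 0].
        v_full = S[0, 0]

        # Transfer entropy y -> x in nats, exact for jointly Gaussian processes.
        print(0.5 * np.log(v_reduced / v_full))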

    Extreme reductions of entropy in an electronic double dot

    We experimentally study negative fluctuations of stochastic entropy production in an electronic double dot operating in nonequilibrium steady-state conditions. We record millions of random electron tunneling events at different bias points, thus collecting extensive statistics. We show that for all bias voltages the experimental average values of the minima of stochastic entropy production lie above $-k_B$, where $k_B$ is the Boltzmann constant, in agreement with recent theoretical predictions for nonequilibrium steady states. Furthermore, we also demonstrate that the experimental cumulative distribution of the entropy production minima is bounded, at all times and for all bias voltages, by a universal expression predicted by the theory. We also extend our theory by deriving a general bound for the average value of the maximum heat absorbed by a mesoscopic system from the environment and compare this result with experimental data. Finally, we show by numerical simulations that these results are not necessarily valid under non-stationary conditions.
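
    The bound on the average minimum is easy to probe numerically. In a minimal Gaussian model of steady-state entropy production (an illustrative assumption, not the paper's double-dot model), increments with mean mu and variance 2*mu per step are consistent with the fluctuation theorem, and the average running minimum approaches $-k_B$ from above:

        import numpy as np

        rng = np.random.default_rng(1)
        n_traj, n_steps, mu = 2000, 2000, 0.01   # hypothetical simulation parameters

        # Entropy production in units of k_B: Gaussian increments whose
        # variance is twice their mean mimic a nonequilibrium steady state.
        s = np.cumsum(rng.normal(mu, np.sqrt(2 * mu), (n_traj, n_steps)), axis=1)

        s_min = np.minimum(s.min(axis=1), 0.0)   # trajectory minima, counting the start S = 0
        print(s_min.mean())                      # stays above -1, i.e. above -k_B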

    Unnatural Selection: A new formal approach to punctuated equilibrium in economic systems

    Generalized Darwinian evolutionary theory has emerged as central to the description of economic process (e.g., Aldrich et al., 2008). Here we demonstrate that, just as Darwinian principles provide necessary, but not sufficient, conditions for understanding the dynamics of social entities, so the asymptotic limit theorems of information theory provide another set of necessary conditions that constrain the evolution of socioeconomic process. These latter constraints can, however, be readily formulated as a statistics-like analytic toolbox for the study of empirical data that is consistent with a generalized Darwinism, and this is no small thing.

    Medicine beyond magic bullets: a formal case for multilevel interventions

    Western medicine's paradigmatic search for 'magic bullet' interventions is facing increasing difficulty: between 1950 and 2010 the inflation-adjusted cost per US FDA-approved drug increased exponentially in time, a draconian inverse of the famous Moore's Law of computing. A sequence of empirically oriented statistical models suggests that carefully designed synergistic multifactorial and multiscale strategies might evade this relationship.
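
    To make the exponential claim concrete: on a logarithmic scale the cost trend is linear, so a single doubling time summarizes it. A toy log-linear fit in Python (the doubling time and cost figures are hypothetical, not the actual FDA data):

        import numpy as np

        years = np.arange(1950, 2011, 10)
        cost = 2.0 ** ((years - 1950) / 9.0)   # assumed ~9-year doubling of cost per drug
        slope, _ = np.polyfit(years, np.log2(cost), 1)
        print(1.0 / slope)                     # recovered doubling time, ~9 years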

    Universal Coding on Infinite Alphabets: Exponentially Decreasing Envelopes

    This paper deals with the problem of universal lossless coding on a countably infinite alphabet. It focuses on classes of sources defined by an envelope condition on the marginal distribution, namely exponentially decreasing envelope classes with exponent $\alpha$. The minimax redundancy of exponentially decreasing envelope classes is proved to be equivalent to $\frac{1}{4\alpha \log e} \log^2 n$. A coding strategy is then proposed, with a Bayes redundancy equivalent to the maximin redundancy. Finally, an adaptive algorithm is provided whose redundancy is equivalent to the minimax redundancy.
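
    Written out explicitly (the notation $\Lambda_{C,\alpha}$ and $R^{+}$ is assumed here, following common usage, rather than quoted from the paper), the envelope condition and the minimax redundancy rate read:

        \[
          \Lambda_{C,\alpha} = \bigl\{\, p : p(k) \le C e^{-\alpha k} \ \text{for all } k \ge 1 \,\bigr\},
          \qquad
          R^{+}(\Lambda_{C,\alpha}, n) \sim \frac{1}{4\alpha \log e} \, \log^{2} n,
        \]

    where $R^{+}(\Lambda_{C,\alpha}, n)$ denotes the minimax redundancy over the class at block length $n$.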