
    PyPhi: A toolbox for integrated information theory

    Integrated information theory provides a mathematical framework to fully characterize the cause-effect structure of a physical system. Here, we introduce PyPhi, a Python software package that implements this framework for causal analysis and unfolds the full cause-effect structure of discrete dynamical systems of binary elements. The software allows users to easily study these structures, serves as an up-to-date reference implementation of the formalisms of integrated information theory, and has been applied in research on complexity, emergence, and certain biological questions. We first provide an overview of the main algorithm and demonstrate PyPhi's functionality in the course of analyzing an example system, and then describe details of the algorithm's design and implementation. PyPhi can be installed with Python's package manager via the command 'pip install pyphi' on Linux and macOS systems equipped with Python 3.4 or higher. PyPhi is open-source and licensed under the GPLv3; the source code is hosted on GitHub at https://github.com/wmayner/pyphi. Comprehensive and continually updated documentation is available at https://pyphi.readthedocs.io/. The pyphi-users mailing list can be joined at https://groups.google.com/forum/#!forum/pyphi-users. A web-based graphical interface to the software is available at http://integratedinformationtheory.org/calculate.html. Comment: 22 pages, 4 figures, 6 pages of appendices. Supporting information "S1 Calculating Phi" can be found in the ancillary file.
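The abstract describes analysis of discrete dynamical systems of binary elements, which tools like PyPhi consume as a state-by-node transition probability matrix (TPM). A minimal sketch of building such a TPM with NumPy alone (the three-gate wiring here is a hypothetical example, not a system from the paper):

```python
import numpy as np
from itertools import product

# Hypothetical 3-node deterministic binary network:
# A' = B OR C, B' = A AND C, C' = A XOR B.
def step(state):
    a, b, c = state
    return (b | c, a & c, a ^ b)

# Enumerate all 2^3 states in one fixed ordering; row i of the TPM gives,
# for each node, the probability it is ON at t+1 given state i at t
# (0 or 1 here, since the network is deterministic).
states = list(product((0, 1), repeat=3))
tpm = np.array([step(s) for s in states], dtype=float)

print(tpm.shape)  # (8, 3): 2^3 states by 3 nodes
```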

    A topological approach to neural complexity

    Considerable effort in modern statistical physics is devoted to the study of networked systems. One of the most important examples is the brain, which creates and continuously develops complex networks of correlated dynamics. An important quantity capturing fundamental aspects of brain network organization is the neural complexity C(X) introduced by Tononi et al. This work addresses the dependence of this measure on the topological features of a network in the case of a Gaussian stationary process. Both analytical and numerical results show that the degree of complexity has a clear and simple meaning from a topological point of view. Moreover, the analytical result offers a more straightforward algorithm for computing the complexity than the standard one. Comment: 6 pages, 4 figures.

    Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework

    This paper introduces a time- and state-dependent measure of integrated information, φ, which captures the repertoire of causal states available to a system as a whole. Specifically, φ quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) φ varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) φ varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) φ varies as a function of network architecture. High φ values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high φ because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high φ but are inefficient. 
    (iv) In Hopfield networks, φ is low for attractor states and neutral states, but increases if the networks are optimized to achieve tension between local and global interactions. These basic examples appear to match well against neurobiological evidence concerning the neural substrates of consciousness. More generally, φ appears to be a useful metric to characterize the capacity of any physical system to integrate information.
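The "information" ingredient of φ described above can be made concrete for a deterministic binary network: the information generated by entering a state is the uncertainty reduced about the previous state, log2(N / |preimage|). A toy illustration of just this ingredient (not the full φ algorithm, which also involves partitioning the system; the 2-node "swap" network is a hypothetical example):

```python
import numpy as np
from itertools import product

# Hypothetical deterministic 2-node network: A' = B, B' = A ("swap").
def step(state):
    a, b = state
    return (b, a)

states = list(product((0, 1), repeat=2))

# Information generated by entering `current`: starting from a uniform
# prior over all N states, the posterior is uniform over the preimage,
# so the uncertainty reduced is log2(N / |preimage|) bits.
def info_generated(current):
    preimage = [s for s in states if step(s) == current]
    return float(np.log2(len(states) / len(preimage)))

print(info_generated((0, 1)))  # 2.0: every state has a unique cause here
```

Because the swap network is invertible, each state rules out all three alternatives and generates the maximal 2 bits; a network that mapped every state to the same output would generate 0 bits.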

    Propagation of first and second sound in a two-dimensional Fermi superfluid

    Sound propagation is a macroscopic manifestation of the interplay between the equilibrium thermodynamics and the dynamical transport properties of fluids. Here, for a two-dimensional system of ultracold fermions, we calculate the first and second sound velocities across the whole BCS-BEC crossover and we analyze the system response to an external perturbation. In the low-temperature regime we reproduce the recent measurements [Phys. Rev. Lett. {\bf 124}, 240403 (2020)] of the first sound velocity, which, due to the decoupling of density and entropy fluctuations, is the sole mode excited by a density probe. Conversely, a heat perturbation excites only the second sound, which, being sensitive to the superfluid depletion, vanishes in the deep BCS regime, and jumps discontinuously to zero at the Berezinskii-Kosterlitz-Thouless superfluid transition. A mixing between the modes occurs only in the finite-temperature BEC regime, where our theory converges to the purely bosonic results. Comment: 6 pages, 3 figures; published version, correction of journal reference.
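For reference, in the regime of decoupled density and entropy fluctuations mentioned in the abstract, the standard Landau two-fluid expressions (textbook results, not reproduced from this paper) give the two sound velocities as

```latex
c_1^2 = \left(\frac{\partial P}{\partial \rho}\right)_{\bar{s}},
\qquad
c_2^2 = \frac{\rho_s}{\rho_n}\,\frac{\bar{s}^{\,2} T}{\bar{c}_v},
```

where $\rho_s$ and $\rho_n$ are the superfluid and normal densities, $\bar{s}$ is the entropy per unit mass, and $\bar{c}_v$ is the specific heat per unit mass. The factor $\rho_s$ in $c_2^2$ makes explicit why second sound is sensitive to the superfluid depletion and disappears when superfluidity is lost.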

    Modeling Resting-State Functional Networks When the Cortex Falls Asleep: Local and Global Changes

    The transition from wakefulness to sleep represents the most conspicuous change in behavior and the level of consciousness occurring in the healthy brain. It is accompanied by similarly conspicuous changes in neural dynamics, traditionally exemplified by the change from "desynchronized" electroencephalogram activity in wake to globally synchronized slow wave activity of early sleep. However, unit and local field recordings indicate that the transition is more gradual than it might appear: on one hand, local slow waves already appear during wake; on the other hand, slow sleep waves are only rarely global. Studies with functional magnetic resonance imaging also reveal changes in resting-state functional connectivity (FC) between wake and slow wave sleep. However, it remains unclear how resting-state networks may change during this transition period. Here, we employ large-scale modeling of the human cortico-cortical anatomical connectivity to evaluate changes in resting-state FC when the model "falls asleep" due to the progressive decrease in arousal-promoting neuromodulation. When cholinergic neuromodulation is parametrically decreased, local slow waves appear, while the overall organization of resting-state networks does not change. Furthermore, we show that these local slow waves are structured macroscopically in networks that resemble the resting-state networks. In contrast, when neuromodulation decreases further to very low levels, slow waves become global and resting-state networks merge into a single undifferentiated, broadly synchronized network.
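Resting-state functional connectivity of the kind evaluated above is commonly estimated as the Pearson correlation between regional time series. A minimal sketch with synthetic signals (the data and the 4-region setup are illustrative, not taken from the model in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_samples = 4, 1000

# Synthetic regional time series: a shared drive (standing in for a
# common network input) plus independent noise per region.
shared = rng.standard_normal(n_samples)
ts = 0.7 * shared + rng.standard_normal((n_regions, n_samples))

# FC matrix: pairwise Pearson correlations between regions.
fc = np.corrcoef(ts)

print(fc.shape)  # (4, 4), with ones on the diagonal
```

Off-diagonal entries are positive here because every region receives the same shared drive; in a model study, comparing such matrices across neuromodulation levels is what reveals whether network organization is preserved.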

    Integrated information increases with fitness in the evolution of animats

    One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent ("animat") evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its "fit" to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing and integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. Comment: 27 pages, 8 figures, one supplementary figure. Three supplementary video files available on request. Version commensurate with published text in PLoS Comput. Biol.

    Homeostatic regulation of sleep in the white-crowned sparrow (Zonotrichia leucophrys gambelii)

    Background: Sleep is regulated by both a circadian and a homeostatic process. The homeostatic process reflects the duration of prior wakefulness: the longer one stays awake, the longer and/or more intense is subsequent sleep. In mammals, the best marker of the homeostatic sleep drive is slow wave activity (SWA), the electroencephalographic (EEG) power spectrum in the 0.5-4 Hz frequency range during non-rapid eye movement (NREM) sleep. In mammals, NREM sleep SWA is high at sleep onset, when sleep pressure is high, and decreases progressively to reach low levels in late sleep. Moreover, SWA increases further with sleep deprivation, when sleep also becomes less fragmented (the duration of sleep episodes increases, and the number of brief awakenings decreases). Although avian and mammalian sleep share several features, the evidence of a clear homeostatic response to sleep loss has been conflicting in the few avian species studied so far. The aim of the current study was therefore to ascertain whether established markers of sleep homeostasis in mammals are also present in the white-crowned sparrow (Zonotrichia leucophrys gambelii), a migratory songbird of the order Passeriformes. To accomplish this goal, we investigated amount of sleep, sleep time course, and measures of sleep intensity in 6 birds during baseline sleep and during recovery sleep following 6 hours of sleep deprivation.
    Results: Continuous (24 hours) EEG and video recordings were used to measure baseline sleep and recovery sleep following short-term sleep deprivation. Sleep stages were scored visually based on 4-sec epochs. EEG power spectra (0.5-25 Hz) were calculated on consecutive 4-sec epochs. Four vigilance states were reliably distinguished based on behavior, visual inspection of the EEG, and spectral EEG analysis: wakefulness (W), drowsiness (D), slow wave sleep (SWS), and rapid eye movement (REM) sleep. During baseline, SWA during D, SWS, and NREM sleep (defined as D and SWS combined) was highest at the beginning of the major sleep period and declined thereafter. Moreover, peak SWA in both SWS and NREM sleep increased significantly immediately following sleep deprivation relative to baseline.
    Conclusion: As in mammals, sleep deprivation in the white-crowned sparrow increases the intensity of sleep as measured by SWA.
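The SWA measure used above, power in the 0.5-4 Hz band computed on 4-sec epochs, can be sketched with a plain FFT. The signal below is synthetic (a strong 2 Hz component plus a weak 10 Hz component) and the 128 Hz sampling rate is an assumption, not a parameter stated in the abstract:

```python
import numpy as np

fs = 128                       # Hz, assumed sampling rate
t = np.arange(4 * fs) / fs     # one 4-second epoch
# Synthetic EEG-like epoch: dominant slow wave at 2 Hz, weak 10 Hz activity.
eeg = np.sin(2 * np.pi * 2.0 * t) + 0.1 * np.sin(2 * np.pi * 10.0 * t)

# Power spectrum of the epoch via the real FFT.
freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2 / eeg.size

# SWA: total power in the 0.5-4 Hz band, as in the study's definition.
band = (freqs >= 0.5) & (freqs <= 4.0)
swa = power[band].sum()
alpha = power[(freqs >= 8.0) & (freqs <= 12.0)].sum()

print(swa > alpha)  # slow-wave power dominates this synthetic epoch
```

Averaging this band power over the NREM epochs of each recording interval gives the SWA time course whose post-deprivation rebound is the homeostatic marker.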

    Dreaming in NREM Sleep: A High-Density EEG Study of Slow Waves and Spindles.

    Dreaming can occur in both rapid eye movement (REM) and non-REM (NREM) sleep. We recently showed that in both REM and NREM sleep, dreaming is associated with local decreases in slow wave activity (SWA) in posterior brain regions. To expand these findings, here we asked how specific features of slow waves and spindles, the hallmarks of NREM sleep, relate to dream experiences. Fourteen healthy human subjects (10 females) underwent nocturnal high-density EEG recordings combined with a serial awakening paradigm. Reports of dreaming, compared with reports of no experience, were preceded by fewer, smaller, and shallower slow waves, and faster spindles, especially in central and posterior cortical areas. We also identified a minority of very steep and large slow waves in frontal regions, which occurred on a background of reduced SWA and were associated with high-frequency power increases (local "microarousals") heralding the successful recall of dream content. These results suggest that the capacity of the brain to generate experiences during sleep is reduced in the presence of neuronal off-states in posterior and central brain regions, and that dream recall may be facilitated by the intermittent activation of arousal systems during NREM sleep.
    SIGNIFICANCE STATEMENT: By combining high-density EEG recordings with a serial awakening paradigm in healthy subjects, we show that dreaming in non-rapid eye movement sleep occurs when slow waves in central and posterior regions are sparse, small, and shallow. We also identified a small subset of very large and steep frontal slow waves that are associated with high-frequency activity increases (local "microarousals") heralding successful recall of dream content. These results provide noninvasive measures that could represent a useful tool to infer the state of consciousness during sleep.