72 research outputs found

    Concomitant evaluation of cardiovascular and cerebrovascular controls via Geweke spectral causality to assess the propensity to postural syncope

    The evaluation of the propensity to postural syncope necessitates the concomitant characterization of the cardiovascular and cerebrovascular controls and a method capable of disentangling closed-loop relationships and decomposing causal links in the frequency domain. We applied Geweke spectral causality (GSC) to assess cardiovascular control from heart period and systolic arterial pressure variability, and cerebrovascular regulation from mean arterial pressure and mean cerebral blood velocity variability, in 13 control subjects and 13 individuals prone to develop orthostatic syncope. Analyses were performed at rest in the supine position and during head-up tilt at 60°, well before the appearance of presyncope signs. Two different linear model structures were compared, namely the bivariate autoregressive and bivariate dynamic adjustment classes. We found that (i) GSC markers did not depend on the model structure; (ii) the concomitant assessment of cardiovascular and cerebrovascular controls was useful for a deeper comprehension of postural disturbances; (iii) orthostatic syncope appeared to be favored by the loss of a coordinated behavior between the baroreflex feedback and the mechanical feedforward pathway in the frequency band typical of baroreflex functioning during the postural challenge, and by a weak cerebral autoregulation as revealed by the increased strength of the pressure-to-flow link in the respiratory band. GSC applied to spontaneous cardiovascular and cerebrovascular oscillations is a promising tool for describing and monitoring disturbances associated with posture modification.
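
    For orientation, the standard bivariate formulation of Geweke's measure (generic notation, not taken from this paper) reads as follows: given a bivariate VAR model with transfer function H(ω) and innovation covariance Σ, the spectral matrix is S(ω) = H(ω) Σ H*(ω), and the causality from X to Y at frequency ω is

    \[ f_{X \to Y}(\omega) = \ln \frac{S_{YY}(\omega)}{S_{YY}(\omega) - \left( \Sigma_{XX} - \Sigma_{XY}^2 / \Sigma_{YY} \right) \left| H_{YX}(\omega) \right|^2} \]

    so the causal contribution of X to the spectral power of Y can be read off band by band, e.g., in the baroreflex (low-frequency) or respiratory (high-frequency) band.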

    A New Framework for the Time- and Frequency-Domain Assessment of High-Order Interactions in Networks of Random Processes

    While the standard network description of complex systems is based on quantifying the link between pairs of system units, higher-order interactions (HOIs) involving three or more units often play a major role in governing the collective network behavior. This work introduces a new approach to quantify pairwise and HOIs for multivariate rhythmic processes interacting across multiple time scales. We define the so-called O-information rate (OIR) as a new metric to assess HOIs for multivariate time series, and present a framework to decompose the OIR into measures quantifying Granger-causal and instantaneous influences, as well as to expand all measures in the frequency domain. The framework exploits the spectral representation of vector autoregressive and state space models to assess the synergistic and redundant interaction among groups of processes, both in specific bands of interest and in the time domain after whole-band integration. Validation of the framework on simulated networks illustrates how the spectral OIR can highlight redundant and synergistic HOIs emerging at specific frequencies, which cannot be detected using time-domain measures. The applications to physiological networks described by heart period, arterial pressure and respiration variability measured in healthy subjects during a protocol of paced breathing, and to brain networks described by electrocorticographic signals acquired in an animal experiment during anesthesia, document the capability of our approach to identify informational circuits relevant to well-defined cardiovascular oscillations and brain rhythms and related to specific physiological mechanisms involving autonomic control and altered consciousness. The proposed framework allows a hierarchically organized evaluation of time- and frequency-domain interactions in dynamic networks mapped by multivariate time series, and its high flexibility and scalability make it suitable for the investigation of networks beyond pairwise interactions in neuroscience, physiology and many other fields.
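
    For reference, the time-domain O-information underlying the OIR has the standard form from the literature (the paper's rate-based notation may differ):

    \[ \Omega(X_1, \dots, X_n) = (n-2)\, H(X_1, \dots, X_n) + \sum_{j=1}^{n} \left[ H(X_j) - H(X_1, \dots, X_{j-1}, X_{j+1}, \dots, X_n) \right] \]

    with Ω > 0 indicating redundancy-dominated and Ω < 0 synergy-dominated interactions; the OIR replaces the entropies with entropy rates of the processes, which can then be expanded across frequencies.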

    Comparison of discretization strategies for the model-free information-theoretic assessment of short-term physiological interactions

    This work presents a comparison between different approaches for the model-free estimation of information-theoretic measures of the dynamic coupling between short realizations of random processes. The measures considered are the mutual information rate (MIR) between two random processes X and Y and the terms of its decomposition, evidencing either the individual entropy rates of X and Y and their joint entropy rate, or the transfer entropies from X to Y and from Y to X and the instantaneous information shared by X and Y. All measures are estimated through discretization of the random variables forming the processes, performed either via uniform quantization (binning approach) or rank ordering (permutation approach). The binning and permutation approaches are compared on simulations of two coupled non-identical Hénon systems and on three datasets, including short realizations of cardiorespiratory (CR, heart period and respiration flow), cardiovascular (CV, heart period and systolic arterial pressure), and cerebrovascular (CB, mean arterial pressure and cerebral blood flow velocity) series measured in different physiological conditions, i.e., spontaneous vs paced breathing or supine vs upright positions. Our results show that, with careful selection of the estimation parameters (i.e., the embedding dimension and, for the binning approach, the number of quantization levels), meaningful patterns of the MIR and of its components can be achieved in the analyzed systems. On physiological time series, we found that paced breathing at slow breathing rates induces less complex and more coupled CR dynamics, while postural stress leads to unbalancing of CV interactions with prevalent baroreflex coupling and to less complex pressure dynamics with preserved CB interactions. These results are better highlighted by the permutation approach, thanks to its more parsimonious representation of the discretized dynamic patterns, which allows one to explore interactions with longer memory while limiting the curse of dimensionality.
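
    A minimal sketch of the two discretization strategies follows (illustrative only; function names and parameter values are mine, not the paper's), assuming Python with NumPy:

        import numpy as np
        from collections import Counter

        def binning_symbols(x, n_bins=6):
            # Uniform quantization: assign each sample to one of n_bins equal-width bins
            edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]
            return list(np.digitize(x, edges))

        def permutation_symbols(x, m=3):
            # Rank ordering: map each length-m window to its ordinal pattern
            return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

        def plugin_entropy(symbols):
            # Plug-in Shannon entropy (bits) of a discretized sequence
            counts = np.array(list(Counter(symbols).values()), dtype=float)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        x = np.random.default_rng(0).standard_normal(300)   # a short realization
        print(plugin_entropy(binning_symbols(x)))           # binning approach
        print(plugin_entropy(permutation_symbols(x)))       # permutation approach

    Estimators of the MIR, transfer entropies, and the other components are then built from such plug-in entropies of jointly embedded past and present patterns; the permutation alphabet grows as m! rather than as the number of quantization levels raised to the embedding dimension, which is why it scales better with memory.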

    Bits from Biology for Computational Intelligence

    Computational intelligence is broadly defined as biologically inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material covered includes the necessary introduction to information theory and the estimation of information-theoretic quantities from neural data. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely, redundantly, or synergistically together with others. Finally, we introduce the framework of local information dynamics, where information processing is decomposed into the component processes of information storage, transfer, and modification, locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
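
    For reference, the component measures of local information dynamics take the standard pointwise forms (usual embedding notation, with x_{n-1}^{(k)} the length-k past of X):

    \[ a_X(n) = \log \frac{p(x_n \mid x_{n-1}^{(k)})}{p(x_n)}, \qquad t_{Y \to X}(n) = \log \frac{p(x_n \mid x_{n-1}^{(k)}, y_{n-1}^{(l)})}{p(x_n \mid x_{n-1}^{(k)})} \]

    whose averages are the active information storage and the transfer entropy; the local values resolve storage and transfer at each moment in time and each location in the system.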

    Connectivity Analysis in EEG Data: A Tutorial Review of the State of the Art and Emerging Trends

    Understanding how different areas of the human brain communicate with each other is a crucial issue in neuroscience. The concepts of structural, functional and effective connectivity have been widely exploited to describe the human connectome, consisting of brain networks, their structural connections and functional interactions. While high-spatial-resolution imaging techniques such as functional magnetic resonance imaging (fMRI) are widely used to map this complex network of multiple interactions, electroencephalographic (EEG) recordings offer high temporal resolution and are thus well suited to describe both spatially distributed and temporally dynamic patterns of neural activation and connectivity. In this work, we provide a technical account and a categorization of the most-used data-driven approaches to assess brain functional connectivity, intended as the study of the statistical dependencies between the recorded EEG signals. Different pairwise and multivariate, as well as directed and non-directed, connectivity metrics are discussed, weighing their pros and cons, in the time, frequency, and information-theoretic domains. The establishment of conceptual and mathematical relationships between metrics from these three frameworks, and the discussion of novel methodological approaches, allow the reader to go deeper into the problem of inferring functional connectivity in complex networks. Furthermore, emerging trends for the description of extended forms of connectivity (e.g., high-order interactions) are discussed, along with graph-theory tools for exploring the topological properties of the network of connections provided by the proposed metrics. Applications to EEG data are reviewed. In addition, the importance of source localization and the impact of signal acquisition and pre-processing techniques (e.g., filtering and artifact rejection) on the connectivity estimates are recognized and discussed. This review thus guides the reader through the entire process of EEG pre-processing and analysis for the study of brain functional connectivity, exploiting novel methodologies and approaches to the problem of inferring connectivity within complex networks.
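
    As a concrete instance of the simplest class of metrics discussed (pairwise, non-directed, frequency-domain), a magnitude-squared coherence estimate on two synthetic "channels" could look as follows (a sketch: the sampling rate, coupling model, and band limits are assumptions, not values from the review):

        import numpy as np
        from scipy.signal import coherence

        fs = 256                                          # assumed sampling rate (Hz)
        rng = np.random.default_rng(1)
        common = rng.standard_normal(fs * 30)             # shared driver induces coupling
        x = common + 0.5 * rng.standard_normal(fs * 30)   # synthetic channel 1
        y = common + 0.5 * rng.standard_normal(fs * 30)   # synthetic channel 2

        # Welch-based magnitude-squared coherence, 0 <= Cxy(f) <= 1
        f, Cxy = coherence(x, y, fs=fs, nperseg=fs * 2)
        alpha = (f >= 8) & (f <= 13)
        print(f"mean alpha-band coherence: {Cxy[alpha].mean():.2f}")

    Directed metrics (e.g., partial directed coherence or transfer entropy) replace this symmetric measure when the direction of the interaction matters.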

    Assessment of Cardiorespiratory Interactions during Apneic Events in Sleep via Fuzzy Kernel Measures of Information Dynamics

    Apnea and other breathing-related disorders have been linked to the development of hypertension and to impairments of the cardiovascular, cognitive and metabolic systems. The combined assessment of multiple physiological signals acquired during sleep is of fundamental importance for providing additional insights about breathing disorder events and the associated impairments. In this work, we apply information-theoretic measures to describe the joint dynamics of cardiorespiratory physiological processes in a large group of patients reporting repeated episodes of hypopneas, apneas (central, obstructive, mixed) and respiratory effort-related arousals (RERAs). We analyze the heart period as the target process and the airflow amplitude as the driver, computing the predictive information, the information storage, the information transfer, the internal information and the cross information, using a fuzzy kernel entropy estimator. The analyses were performed comparing the information measures among segments during, immediately before and immediately after each respiratory event, and with control segments. Results highlight a general tendency toward a decrease of the predictive information and information storage of the heart period, as well as of the cross information and information transfer from respiration to heart period, during the breathing-disordered events. The information-theoretic measures also vary according to the type of breathing disorder, and significant changes of information transfer can be detected during RERAs, suggesting that the latter could represent a risk factor for developing cardiovascular diseases. These findings reflect the impact of different sleep breathing disorders on respiratory sinus arrhythmia, suggesting overall higher complexity of the cardiac dynamics and weaker cardiorespiratory interactions, which may have physiological and clinical relevance.
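
    The five measures are linked by the chain rule for mutual information; writing Y for heart period, X for airflow amplitude, and X_n^-, Y_n^- for their past histories, a decomposition consistent with the quantities listed above (the paper's exact notation may differ) is

    \[ P_Y = I(Y_n; Y_n^-, X_n^-) = \underbrace{I(Y_n; Y_n^-)}_{S_Y\ \text{(storage)}} + \underbrace{I(Y_n; X_n^- \mid Y_n^-)}_{T_{X \to Y}\ \text{(transfer)}} = \underbrace{I(Y_n; X_n^-)}_{C_{X \to Y}\ \text{(cross)}} + \underbrace{I(Y_n; Y_n^- \mid X_n^-)}_{S_{Y|X}\ \text{(internal)}} \]

    so a drop in predictive information during an event can be attributed either to weaker self-predictability of the heart period or to weaker respiration-to-heart coupling.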

    Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open-source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in Python. It allows for the construction of functional networks, such as climate networks in climatology or functional brain networks in neuroscience, representing the structure of statistical interrelationships in large data sets of time series, and, subsequently, for investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis (RQA), recurrence networks, visibility graphs and the construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
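
    A minimal usage sketch for the RQA side of the package (based on the documented RecurrencePlot API; treat the parameter values as illustrative assumptions):

        import numpy as np
        from pyunicorn.timeseries import RecurrencePlot

        # Noisy sine as a stand-in for a measured univariate time series
        t = np.linspace(0, 20 * np.pi, 1000)
        x = np.sin(t) + 0.1 * np.random.default_rng(2).standard_normal(t.size)

        # Recurrence plot at fixed recurrence rate; dim/tau define a time-delay embedding
        rp = RecurrencePlot(x, dim=3, tau=10, metric="euclidean", recurrence_rate=0.05)
        print("determinism:", rp.determinism(l_min=2))   # RQA: fraction on diagonal lines
        print("laminarity:", rp.laminarity(v_min=2))     # RQA: fraction on vertical lines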

    From Global to Local Functional Connectivity: Application to Listening Effort


    Discovering Causal Relations and Equations from Data

    Physics is a field of science that has traditionally used the scientific method to answer questions about why natural phenomena occur and to make testable models that explain the phenomena. Discovering equations, laws and principles that are invariant, robust and causal explanations of the world has been fundamental in physical sciences throughout the centuries. Discoveries emerge from observing the world and, when possible, performing interventional studies in the system under study. With the advent of big data and the use of data-driven methods, causal and equation discovery fields have grown and made progress in computer science, physics, statistics, philosophy, and many applied fields. All these domains are intertwined and can be used to discover causal relations, physical laws, and equations from observational data. This paper reviews the concepts, methods, and relevant works on causal and equation discovery in the broad field of Physics and outlines the most important challenges and promising future lines of research. We also provide a taxonomy for observational causal and equation discovery, point out connections, and showcase a complete set of case studies in Earth and climate sciences, fluid dynamics and mechanics, and the neurosciences. This review demonstrates that discovering fundamental laws and causal relations by observing natural phenomena is being revolutionised with the efficient exploitation of observational data, modern machine learning algorithms and the interaction with domain knowledge. Exciting times are ahead with many challenges and opportunities to improve our understanding of complex systems.
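
    As a minimal illustration of discovering a causal relation from purely observational data, in the Granger sense (this is not one of the paper's case studies; it uses statsmodels' standard test on a toy system where x demonstrably drives y):

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(3)
        n = 500
        x, y = np.zeros(n), np.zeros(n)
        for t in range(1, n):
            x[t] = 0.6 * x[t - 1] + rng.standard_normal()
            y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()  # x drives y

        # Does the past of x improve prediction of y beyond y's own past?
        data = pd.DataFrame({"y": y, "x": x})          # first column = response
        res = grangercausalitytests(data, maxlag=2, verbose=False)
        print(res[1][0]["ssr_ftest"])                  # (F-statistic, p-value, df_denom, df_num)

    Granger causality is only one entry in the paper's taxonomy; constraint-based, score-based, and structural methods address settings where temporal ordering alone is not sufficient.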

    Measuring spectrally resolved information processing in neural data

    Background: The human brain, an incredibly complex biological system comprising billions of neurons and trillions of synapses, possesses remarkable capabilities for information processing and distributed computation. Neurons, the fundamental building blocks, perform elementary operations on their inputs and collaborate extensively to execute intricate computations, giving rise to cognitive functions and behavior. Notably, distributed information processing in the brain heavily relies on rhythmic neural activity characterized by synchronized oscillations at specific frequencies. These oscillations play a crucial role in coordinating brain activity and facilitating communication between different neural circuits [1], effectively acting as temporal windows that enable efficient information exchange within specific frequency ranges. To understand distributed information processing in neural systems, breaking it down into its components, i.e., information transfer, storage, and modification, can be helpful, but this requires precise mathematical definitions for each component. Thankfully, these definitions have recently become available [2]. Information theory is a natural choice for measuring information processing, as it offers a mathematically complete description of the concepts of information and communication. The fundamental information-processing operations are considered essential prerequisites for achieving universal information processing in any system [3]. By quantifying and analyzing these operations, we gain valuable insights into the brain's complex computations and cognitive abilities. As information processing in the brain is intricately tied to rhythmic behavior, there is a need to establish a connection between information-theoretic measures and frequency components. Previous attempts to achieve frequency-resolved information-theoretic measures have mostly relied on narrowband filtering [4], which comes with known issues of phase shifting and high false-positive rates [5], or on simplifying the computation to a few variables [6], which risks missing important information in the analysed brain signals. Therefore, the current work aims to establish a frequency-resolved measure of two crucial components of information processing: information transfer and information storage. By proposing methodological advancements, this research seeks to shed light on the role of neural oscillations in information processing within the brain. Furthermore, a more comprehensive investigation was carried out on the communication between two critical brain regions responsible for motor inhibition in the frontal cortex: the right inferior frontal gyrus (rIFG) and the pre-supplementary motor cortex (pre-SMA). Here, neural oscillations in the beta band (12-30 Hz) have been proposed to have a pivotal role in response inhibition. A long-standing question in the field has been to disentangle which of these two brain areas first signals the stopping process and drives the other [7]. Furthermore, it was hypothesized that beta oscillations carry the information transfer between these regions. The present work addresses these methodological problems and investigates spectral information processing in neural data in three studies.
    Study 1 focuses on the critical role of information transfer, measured by transfer entropy, in distributed computation. Understanding the patterns of information transfer is essential for unraveling the computational algorithms in complex systems, such as the brain. As many natural systems rely on rhythmic processes for distributed computations, a frequency-resolved measure of information transfer becomes highly valuable. To address this, a novel algorithm is presented that efficiently identifies the frequencies responsible for sending and receiving information in a network. The approach utilizes the invertible maximum overlap discrete wavelet transform (MODWT) to create surrogate data for computing transfer entropy, eliminating issues associated with phase shifts and filtering. However, measuring frequency-resolved information transfer poses a partial information decomposition problem [8] that is yet to be fully resolved. The algorithm's performance is validated on simulated data and applied to human magnetoencephalography (MEG) and ferret local field potential (LFP) recordings. In human MEG, the study unveils a complex spectral configuration of cortical information transmission, showing top-down information flow from very high frequencies (above 100 Hz) to both similarly high frequencies and frequencies around 20 Hz in the temporal cortex. Contrary to the current assumption, the findings suggest that low frequencies do not solely send information to high frequencies. In the ferret LFP, the prefrontal cortex transmits information at low frequencies, specifically within the range of 4-8 Hz, while on the receiving end V1 exhibits a preference for operating at very high frequencies (>125 Hz). The spectrally resolved transfer entropy promises to deepen our understanding of rhythmic information exchange in natural systems, shedding light on the computational role of oscillations in cognitive functions.
    Study 2 focuses on the second fundamental aspect of information processing: active information storage (AIS). AIS estimates how much of the information in the next measurement of a process can be predicted from its past state. In processes that either produce little information (low entropy) or are highly unpredictable, the AIS is low, whereas processes that are predictable but visit many different states with equal probabilities exhibit high AIS [9]. Within this context, we introduce a novel spectrally resolved AIS. Utilizing intracortical recordings of neural activity in anesthetized ferrets before and after loss of consciousness (LOC), the study reveals that the modulation of AIS by anesthesia is highly specific to different frequency bands, cortical layers, and brain regions. The effects of anesthesia on AIS are prominent in the supragranular layers for the high/low gamma band, while the alpha/beta band exhibits the strongest decrease in AIS in infragranular layers, in accordance with predictive coding theory. Additionally, isoflurane impacts local information processing in a frequency-specific manner; for instance, increases in isoflurane concentration lead to a decrease in AIS in the alpha frequency range but to an increase in AIS in the delta frequency range (<2 Hz). In sum, analyzing spectrally resolved AIS provides valuable insights into changes in cortical information processing under anesthesia. With rhythmic neural activity playing a significant role in biological neural systems, the introduction of frequency-specific components in active information storage allows a deeper understanding of local information processing in different brain areas and under various conditions.
    In study 3, to further verify the pivotal role of neural oscillations in information processing, we investigated the neural network mechanisms underlying response inhibition. A long-standing debate has centered on identifying the cortical initiator of response inhibition in the beta band, with two main regions proposed: the rIFG and the pre-SMA. This third study aimed to determine which of these regions is activated first and exerts a potential information exchange on the other. Using high-temporal-resolution MEG and a relatively large cohort of subjects, a significant breakthrough is achieved by demonstrating that the rIFG is activated significantly earlier than the pre-SMA. The onset of beta-band activity in the rIFG occurred at around 140 ms after the STOP signal. Further analyses showed that the beta-band activity in the rIFG was crucial for successful stopping, as evidenced by its predictive value for stopping performance. Connectivity analysis revealed that the rIFG sends information in the beta band to the pre-SMA but not vice versa, emphasizing the rIFG's dominance in the response inhibition process. The results provide strong support for the hypothesis that the rIFG initiates stopping and utilizes beta-band oscillations for this purpose. These findings have significant implications, suggesting the possibility of spatially localized, oscillation-based interventions for response inhibition.
    Conclusion: In conclusion, the present work proposes a novel algorithm for uncovering the frequencies at which information is transferred between sources and targets in the brain, providing valuable insights into the computational dynamics of neural processes. The spectrally resolved transfer entropy was successfully applied to experimental neural data from intracranial recordings in ferrets and MEG recordings in humans. Furthermore, the study on active information storage under anesthesia revealed that the spectrally resolved AIS offers unique insights beyond traditional spectral power analysis; by examining changes in neural information processing, it demonstrates how AIS analysis can deepen the understanding of anesthesia's effects on cortical information processing. Moreover, the third study's findings provide strong evidence for the critical role of beta oscillations in information processing, particularly in response inhibition: beta oscillations in the rIFG function as the key initiator of the response inhibition process, acting as a top-down control mechanism. The identification of beta oscillations as a crucial factor in information processing opens possibilities for further research and targeted interventions in neurological disorders. Taken together, the current work highlights the role of spectrally resolved information processing in neural systems by not only introducing novel algorithms, but also successfully applying them to experimental oscillatory neural activity in relation to both low-level cortical information processing (anesthesia) and high-level processes (cognitive response inhibition).
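
    The surrogate idea at the core of study 1 can be sketched as follows, using pywt's shift-invariant stationary wavelet transform (SWT) as a stand-in for the MODWT (the published algorithm differs in detail; function and parameter names here are mine):

        import numpy as np
        import pywt

        def scale_surrogate(x, level, wavelet="db4", max_level=5, rng=None):
            # Destroy information carried at one wavelet scale: decompose, shuffle the
            # detail coefficients of that scale, and invert. Comparing transfer entropy
            # on original vs. surrogate attributes the lost transfer to that band.
            rng = rng or np.random.default_rng()
            coeffs = [list(c) for c in pywt.swt(x, wavelet, level=max_level)]
            # swt returns [(cA_n, cD_n), ..., (cA_1, cD_1)]; locate the target scale
            idx = max_level - level
            coeffs[idx][1] = rng.permutation(coeffs[idx][1])
            return pywt.iswt([tuple(c) for c in coeffs], wavelet)

        x = np.random.default_rng(4).standard_normal(1024)  # length must divide by 2**max_level
        x_surr = scale_surrogate(x, level=3)                # scale-3 information destroyed

    Transfer entropy is then estimated on the original signals and on the scale-wise surrogates of the source (or target) signal, and the drop in TE quantifies the information sent (or received) in the corresponding frequency band.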