    Distances in probability space and the statistical complexity setup

    Statistical complexity measures (SCMs) are the composition of two ingredients: (i) entropies and (ii) distances in probability space. As a consequence, SCMs provide a simultaneous quantification of the randomness and of the correlational structures present in the system under study. In this review we address important topics underlying the SCM structure, viz., (a) a good choice of probability metric space and (b) how to assess the best distance choice, which in this context is called a "disequilibrium" and is denoted by the letter Q. Q, indeed the crucial SCM ingredient, is cast in terms of an associated distance D. Since our input data consist of time series, we also discuss the best way of extracting a probability distribution P from a time series. As an illustration, we show just how these issues affect the description of the classical limit of quantum mechanics.
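    A minimal numerical sketch of how such a measure is typically assembled, assuming one common choice in this literature: the normalized Shannon entropy H as the entropic ingredient and a disequilibrium Q built from the Jensen-Shannon distance between P and the uniform (equilibrium) distribution. The function names, the normalization and the example distribution are illustrative assumptions, not details taken from the paper.

        import numpy as np

        def shannon_entropy(p):
            """Normalized Shannon entropy H[P] = S[P] / ln(N), in [0, 1]."""
            n = len(p)
            nz = p[p > 0]
            return float(-np.sum(nz * np.log(nz)) / np.log(n))

        def disequilibrium(p):
            """Q[P]: Jensen-Shannon divergence between P and the uniform
            distribution P_e, normalized by its maximum possible value."""
            n = len(p)
            pe = np.full(n, 1.0 / n)

            def s(q):
                q = q[q > 0]
                return -np.sum(q * np.log(q))

            js = s(0.5 * (p + pe)) - 0.5 * s(p) - 0.5 * s(pe)
            # Maximum of the divergence, reached when P puts all weight on one state.
            js_max = -0.5 * ((n + 1.0) / n * np.log(n + 1.0)
                             - 2.0 * np.log(2.0 * n) + np.log(n))
            return float(js / js_max)

        def statistical_complexity(p):
            """C[P] = Q[P] * H[P]: disequilibrium times normalized entropy."""
            return disequilibrium(p) * shannon_entropy(p)

        p = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.05])
        print(shannon_entropy(p), statistical_complexity(p))

    Fully ordered (all weight on one state) and fully random (uniform) distributions both give C = 0; intermediate distributions such as the example above give a positive complexity.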

    Information Length Analysis of Linear Autonomous Stochastic Processes

    When studying the behaviour of complex dynamical systems, a statistical formulation can provide useful insights. In particular, information geometry is a promising tool for this purpose. In this paper, we investigate the information length for n-dimensional linear autonomous stochastic processes, providing a basic theoretical framework that can be applied to a large set of problems in engineering and physics. A specific application is made to a harmonically bound particle system with natural oscillation frequency ω, subject to damping γ and Gaussian white noise. We explore how the information length depends on ω and γ, elucidating the role of critical damping γ = 2ω in information geometry. Furthermore, in the long-time limit, we show that the information length reflects the linear geometry associated with Gaussian statistics in a linear stochastic process.
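    A minimal numerical sketch of the quantity involved, assuming the usual definition of the information length as the time integral of the information rate, specialized to a one-dimensional Gaussian process with time-dependent mean μ(t) and variance σ²(t) (an Ornstein-Uhlenbeck-like relaxation). The parameter values are illustrative assumptions, not values from the paper.

        import numpy as np

        gamma, D = 1.0, 0.5              # damping and noise amplitude (assumed)
        mu0, var0 = 2.0, 0.1             # initial mean and variance (assumed)
        t = np.linspace(0.0, 10.0, 20001)
        dt = t[1] - t[0]

        # Exact Gaussian moments of an Ornstein-Uhlenbeck process.
        mu = mu0 * np.exp(-gamma * t)
        var = D / gamma + (var0 - D / gamma) * np.exp(-2.0 * gamma * t)
        sigma = np.sqrt(var)

        dmu = np.gradient(mu, dt)
        dsigma = np.gradient(sigma, dt)

        # Information rate for a Gaussian PDF:
        # Gamma^2(t) = (dmu/dt)^2 / sigma^2 + 2 (dsigma/dt)^2 / sigma^2
        Gamma = np.sqrt(dmu**2 / var + 2.0 * dsigma**2 / var)

        # Information length L(t): cumulative integral of Gamma over time.
        L = np.cumsum(Gamma) * dt
        print("total information length:", L[-1])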

    Continuous wavelet transform in the study of the time-scale properties of intracranial pressure in hydrocephalus

    Normal pressure hydrocephalus (NPH) encompasses a heterogeneous group of disorders generally characterised by clinical symptoms, ventriculomegaly and anomalous cerebrospinal fluid (CSF) dynamics. Lumbar infusion tests (ITs) are frequently performed in the preoperative evaluation of patients who show NPH features. The analysis of intracranial pressure (ICP) signals recorded during ITs could be useful to better understand the pathophysiology underlying NPH and to assist treatment decisions. In this study, 131 ICP signals recorded during ITs were analysed using two continuous wavelet transform (CWT)-derived parameters: Jensen divergence (JD) and spectral flux (SF). These parameters were studied in two frequency bands, associated with different components of the signal: one band (0.15–0.3 Hz), related to respiratory blood pressure oscillations, and another (0.67–2.5 Hz), related to ICP pulse waves. Statistically significant differences (p < 1.70 ∙ 10⁻…, Bonferroni-corrected Wilcoxon signed-rank tests) in pairwise comparisons between phases of ITs were found using the mean and standard deviation of JD and SF. These differences were mainly found in the pulse-wave band, where a lower irregularity and variability, together with less prominent time-frequency fluctuations, were found in the hypertension phase of ITs. Our results suggest that wavelet analysis could be useful for understanding CSF dynamics in NPH. This research was supported by 'Ministerio de Economía y Competitividad' and the 'European Regional Development Fund' (FEDER) under project TEC2014-53196-R, by the 'European Commission' and FEDER under the project 'Análisis y correlación entre el genoma completo y la actividad cerebral para la ayuda en el diagnóstico de la enfermedad de Alzheimer' ('Cooperation Programme Interreg V-A Spain-Portugal POCTEP 2014-2020'), and by 'Consejería de Educación de la Junta de Castilla y León' and FEDER under project VA037U16.
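    A rough sketch of how such time-scale descriptors can be obtained from a time-frequency decomposition, under the assumption that SF measures the frame-to-frame change of the normalized spectrum and JD is the Jensen-Shannon divergence between consecutive spectral distributions; the synthetic signal, the sliding-window spectra (used here as a stand-in for the |CWT|² scalogram) and the exact definitions are illustrative assumptions and may differ in detail from the paper.

        import numpy as np

        # Synthetic nonstationary signal standing in for an ICP recording.
        fs = 10.0                                    # sampling frequency, Hz (assumed)
        t = np.arange(0, 600, 1 / fs)
        x = (np.sin(2 * np.pi * 0.25 * t)
             + 0.5 * np.sin(2 * np.pi * 1.2 * t * (1 + 0.1 * np.sin(0.01 * t)))
             + 0.2 * np.random.default_rng(0).standard_normal(t.size))

        # Sliding-window power spectra as a stand-in for the wavelet scalogram.
        win, hop = 256, 64
        frames = [x[i:i + win] * np.hanning(win) for i in range(0, x.size - win, hop)]
        P = np.abs(np.fft.rfft(frames, axis=1)) ** 2     # shape: (n_frames, n_freqs)
        freqs = np.fft.rfftfreq(win, d=1 / fs)

        band = (freqs >= 0.67) & (freqs <= 2.5)          # ICP pulse-wave band
        Pb = P[:, band]
        Pb /= Pb.sum(axis=1, keepdims=True)              # each frame becomes a PDF

        def jensen_shannon(p, q):
            """Jensen-Shannon divergence between two spectral distributions."""
            def s(r):
                r = r[r > 0]
                return -np.sum(r * np.log(r))
            return s(0.5 * (p + q)) - 0.5 * s(p) - 0.5 * s(q)

        sf = np.sum((Pb[1:] - Pb[:-1]) ** 2, axis=1)     # spectral flux per frame
        jd = np.array([jensen_shannon(Pb[i], Pb[i + 1]) for i in range(len(Pb) - 1)])
        print("SF mean/std:", sf.mean(), sf.std())
        print("JD mean/std:", jd.mean(), jd.std())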

    Permutation entropy and its main biomedical and econophysics applications: a review

    Entropy is a powerful tool for the analysis of time series, as it allows describing the probability distribution of the possible states of a system, and therefore the information encoded in it. Nevertheless, important information may also be encoded in the temporal dynamics, an aspect which is not usually taken into account. The idea of calculating entropy based on permutation patterns (that is, permutations defined by the order relations among the values of a time series) has received a lot of attention in recent years, especially for the understanding of complex and chaotic systems. Permutation entropy directly accounts for the temporal information contained in the time series; furthermore, it has the advantages of simplicity, robustness and very low computational cost. To celebrate the tenth anniversary of the original work, here we analyze the theoretical foundations of permutation entropy, as well as its main recent applications to the analysis of economic markets and to the understanding of biomedical systems.
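    A minimal sketch of the Bandt-Pompe recipe this review builds on: slide a window of embedding dimension D (with delay τ) over the series, map each window to its ordinal pattern, estimate the pattern probabilities, and take the normalized Shannon entropy of that distribution. The values of D and τ and the test signals below are illustrative choices, not values from the review.

        import numpy as np
        from math import factorial

        def permutation_entropy(x, d=4, tau=1):
            """Normalized permutation entropy of a 1-D series (Bandt-Pompe).

            d   : embedding dimension (length of the ordinal patterns)
            tau : embedding delay between the samples of a pattern
            Returns a value in [0, 1]: near 0 for a monotone series,
            near 1 for white noise.
            """
            x = np.asarray(x, dtype=float)
            n_patterns = len(x) - (d - 1) * tau
            counts = {}
            for i in range(n_patterns):
                window = x[i:i + d * tau:tau]
                pattern = tuple(np.argsort(window, kind="stable"))  # ordinal pattern
                counts[pattern] = counts.get(pattern, 0) + 1
            p = np.array(list(counts.values()), dtype=float) / n_patterns
            return float(-np.sum(p * np.log(p)) / np.log(factorial(d)))

        rng = np.random.default_rng(1)
        print(permutation_entropy(rng.standard_normal(10000)))       # close to 1
        print(permutation_entropy(np.sin(0.05 * np.arange(10000))))  # much lower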

    Information Geometry, Fluctuations, Non-Equilibrium Thermodynamics, and Geodesics in Complex Systems

    Information theory provides an interdisciplinary method to understand important phenomena in many research fields, ranging from astrophysical and laboratory fluids/plasmas to biological systems. In particular, information geometric theory enables us to envision the evolution of non-equilibrium processes in terms of a (dimensionless) distance by quantifying how information unfolds over time as a probability density function (PDF) evolves in time. Here, we discuss some recent developments in information geometric theory focusing on time-dependent dynamic aspects of non-equilibrium processes (e.g., time-varying mean value, time-varying variance, or temperature, etc.) and their thermodynamic and physical/biological implications. We compare different distances between two given PDFs and highlight the importance of a path-dependent distance for a time-dependent PDF. We then discuss the role of the information rate Γ = dL/dt and relative entropy in non-equilibrium thermodynamic relations (entropy production rate, heat flux, dissipated work, non-equilibrium free energy, etc.), and various inequalities among them. Here, L is the information length representing the total number of statistically distinguishable states a PDF evolves through over time. We explore the implications of a geodesic solution in information geometry for self-organization and control.
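    For reference, the definitions typically used for these two quantities in the information-length literature (written here from the description in the abstract; the exact normalization may differ in the paper) are, for a time-dependent PDF p(x, t):

        % Information rate Gamma(t) and information length L(t)
        \Gamma^{2}(t) = \int \mathrm{d}x\, \frac{1}{p(x,t)} \left( \frac{\partial p(x,t)}{\partial t} \right)^{2},
        \qquad
        L(t) = \int_{0}^{t} \Gamma(t')\, \mathrm{d}t'

    so that Γ = dL/dt is the (dimensionless) speed at which the PDF traverses statistically distinguishable states, and L(t) accumulates that speed along the path.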

    A Fractional Entropy in Fractal Phase Space: Properties and Characterization
