
    A new entropy measure based on the Renyi entropy rate using Gaussian kernels

    The concept of entropy rate is well defined in dynamical systems theory, but it is impossible to apply directly to finite real-world data sets. With this in mind, Pincus developed Approximate Entropy (ApEn), which uses ideas from Eckmann and Ruelle to create a regularity measure based on entropy rate that can be used to determine the influence of chaotic behaviour in a real-world signal. However, this measure was found not to be robust, and so an improved formulation known as Sample Entropy (SampEn) was created by Richman and Moorman to address these issues. We have developed a new, related regularity measure which is not based on the theory provided by Eckmann and Ruelle and proves a better-behaved measure of complexity than the previous measures whilst still retaining a low computational cost.
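As a sketch of the family of measures this abstract discusses, sample entropy can be computed as the negative log of the ratio of template matches at lengths m + 1 and m. This is a minimal illustration of SampEn, not the authors' new Renyi-based measure:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy (SampEn): -log of the ratio of template matches
    at lengths m + 1 and m, with tolerance r under the Chebyshev
    distance and self-matches excluded."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common default tolerance
    n = len(x)

    def count_matches(k):
        # All overlapping templates of length k.
        t = np.array([x[i:i + k] for i in range(n - k)])
        count = 0
        for i in range(len(t)):
            # Chebyshev distance to later templates only (no self-matches).
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    return -np.log(count_matches(m + 1) / count_matches(m))
```

A regular signal (e.g. a sine wave) yields a much lower SampEn than white noise, which is the sense in which SampEn acts as a regularity measure.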

    Automatic method for the analysis of non-stationary acoustic signals (Método automático de análisis de señales acústicas no-estacionarias)

    This invention patent presents a method for analysing the temporal acoustic signal generated during certain events, state-change processes, or transformations in all kinds of physical systems, characterized by an arbitrary time-frequency distribution. A method based on a mathematical algorithm allows, after several intermediate stages, the unambiguous determination of the amount of information (entropy) from a sound spectrum. The procedure can be implemented on any automatic computing system and is of special interest in the biomedical sciences; it is also applicable to the detection of fractures in metallic structures and to the prevention of seismic events. The present invention constitutes an improvement over currently operational methods. Consejo Superior de Investigaciones Científicas (España). B1 patent with a report on the state of the art.

    Classification of Alzheimer’s disease from quadratic sample entropy of electroencephalogram

    Currently accepted input parameter limitations in entropy-based, non-linear signal processing methods, for example, sample entropy (SampEn), may limit the information gathered from tested biological signals. The ability of quadratic sample entropy (QSE) to identify changes in electroencephalogram (EEG) signals of 11 patients with a diagnosis of Alzheimer's disease (AD) and 11 age-matched, healthy controls is investigated. QSE measures signal regularity, where reduced QSE values indicate greater regularity. The presented method allows a greater range of QSE input parameters to produce reliable results than SampEn. QSE was lower in AD patients compared with controls, with significant differences (p < 0.01) for different parameter combinations at electrodes P3, P4, O1 and O2. Subject- and epoch-based classifications were tested with leave-one-out linear discriminant analysis. The maximum diagnostic accuracy and area under the receiver operating characteristic curve were 77.27% and more than 80%, respectively, at many parameter and electrode combinations. Furthermore, QSE results across all r values were consistent, suggesting QSE is robust for a wider range of input parameters than SampEn. The best results were obtained with input parameters outside the acceptable range for SampEn, and they can identify EEG changes between AD patients and controls. However, caution should be applied because of the small sample size.
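A minimal sketch of the QSE correction may help here. Following Lake's definition, QSE(m, r) = SampEn(m, r) + ln(2r): the added term rescales the probability-based SampEn into a density-based estimate, which is why values computed with different tolerances r become directly comparable. The helper below is an illustrative, not clinical-grade, implementation:

```python
import numpy as np

def sampen(x, m, r):
    # Compact sample entropy: -log of the ratio of template matches
    # at lengths m + 1 and m (Chebyshev distance, no self-matches).
    x = np.asarray(x, float)
    def matches(k):
        t = np.array([x[i:i + k] for i in range(len(x) - k)])
        return sum(int(np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r))
                   for i in range(len(t)))
    return -np.log(matches(m + 1) / matches(m))

def qse(x, m, r):
    # Quadratic sample entropy: SampEn plus ln(2r). The correction
    # converts the probability-based SampEn into a density-based
    # estimate, so values obtained with different tolerances r are
    # directly comparable.
    return sampen(x, m, r) + np.log(2 * r)
```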

    Entropy Information of Cardiorespiratory Dynamics in Neonates during Sleep

    Sleep is a central activity in human adults and occupies most of a newborn infant's life. During sleep, autonomic control acts to modulate heart rate variability (HRV) and respiration. The mechanisms underlying cardiorespiratory interactions in different sleep states have been studied but are not yet fully understood. Signal processing approaches have focused on cardiorespiratory analysis to elucidate this co-regulation. This manuscript proposes to analyze heart rate (HR), respiratory variability and their interrelationship in newborn infants to characterize cardiorespiratory interactions in different sleep states (active vs. quiet). We are searching for indices that could detect regulation alteration or malfunction, potentially leading to infant distress. We have analyzed inter-beat (RR) interval series and respiration in a population of 151 newborns, and followed up with 33 of them at 1 month of age. RR interval series were obtained by recognizing peaks of the QRS complex in the electrocardiogram (ECG), corresponding to ventricular depolarization. Univariate time domain, frequency domain and entropy measures were applied. In addition, Transfer Entropy was considered as a bivariate approach able to quantify the bidirectional information flow from one signal (respiration) to another (RR series). Results confirm the validity of the proposed approach. Overall, HRV is higher in active sleep, while high frequency (HF) power is more characteristic of quiet sleep. Entropy analysis provides higher indices for SampEn and Quadratic Sample Entropy (QSE) in quiet sleep. Transfer Entropy values were higher in quiet sleep and point to a major influence of respiration on the RR series. At 1 month of age, time domain parameters show an increase in HR and a decrease in variability. No entropy differences were found across ages. The parameters employed in this study help to quantify the potential for infants to adapt their cardiorespiratory responses as they mature. Thus, they could be useful as early markers of risk for infant cardiorespiratory vulnerabilities.
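Transfer entropy as used above can be sketched with a histogram estimator and history length 1: TE(X→Y) = Σ p(y₁, y₀, x₀) log[p(y₁ | y₀, x₀) / p(y₁ | y₀)]. Real cardiorespiratory analyses use more careful estimators; the binning and history length below are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, bins=4):
    """Histogram-based transfer entropy TE(source -> target) in bits,
    with history length 1, estimated from binned (quantized) series."""
    def quantize(s):
        # Equal-width binning into `bins` symbols.
        s = np.asarray(s, float)
        edges = np.linspace(s.min(), s.max(), bins + 1)[1:-1]
        return np.digitize(s, edges)

    x, y = quantize(source), quantize(target)
    y1, y0, x0 = y[1:], y[:-1], x[:-1]
    n = len(y1)

    # Empirical counts for the joint and marginal configurations.
    c_y1y0x0 = Counter(zip(y1, y0, x0))
    c_y0x0 = Counter(zip(y0, x0))
    c_y1y0 = Counter(zip(y1, y0))
    c_y0 = Counter(y0)

    te = 0.0
    for (a, b, c), cnt in c_y1y0x0.items():
        p_joint = cnt / n
        p_cond_full = cnt / c_y0x0[(b, c)]       # p(y1 | y0, x0)
        p_cond_red = c_y1y0[(a, b)] / c_y0[b]    # p(y1 | y0)
        te += p_joint * np.log2(p_cond_full / p_cond_red)
    return te
```

On a pair of series where x drives y, TE(x→y) clearly exceeds TE(y→x), which is the directionality property exploited in the respiration-to-RR analysis.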

    A Computational Framework to Support the Automated Analysis of Routine Electroencephalographic Data

    Epilepsy is a condition in which a patient has multiple unprovoked seizures that are not precipitated by another medical condition. It is a common neurological disorder that afflicts 1% of the population of the US, and it is sometimes hard to diagnose if seizures are infrequent. Routine electroencephalography (rEEG), in which the electrical potentials of the brain are recorded on the scalp of a patient, is one of the main diagnostic tools because rEEG can reveal indicators of epilepsy when patients are in a non-seizure state. Interpretation of rEEG is difficult, and studies have shown that 20-30% of patients at specialized epilepsy centers are misdiagnosed. An improved ability to interpret rEEG could decrease the misdiagnosis rate of epilepsy. The difficulty in diagnosing epilepsy from rEEG stems from the large quantity, low signal-to-noise ratio (SNR), and variability of the data. A usual point of error for a clinician interpreting rEEG data is the misinterpretation of paroxysmal EEG events (PEEs): short bursts of electrical activity of high amplitude relative to the surrounding signal, with a duration of approximately 0.1 to 2 seconds. Clinical interpretation of PEEs could be improved by the development of an automated system to detect and classify PEE activity in an rEEG dataset. Systems that have attempted to automatically classify PEEs in the past have had varying degrees of success. These efforts have been hampered to a large extent by the absence of a "gold standard" data set that EEG researchers could use. In this work we present a distributed, web-based collaborative system for collecting and creating a gold standard dataset for the purpose of evaluating spike detection software. We hope to advance spike detection research by creating a performance standard that facilitates comparisons between the approaches of disparate research groups.
    Further, this work endeavors to create a new, high-performance parallel implementation of independent component analysis (ICA), a potential preprocessing step for PEE classification. We also demonstrate tools for visualization and analysis to support the initial phases of spike detection research. These tools will first help to develop a standardized rEEG dataset of expert EEG interpreter opinion with which automated analysis can be trained and tested. Second, they will provide a new framework for interdisciplinary research that will help improve our understanding of PEEs in rEEG. These improvements could ultimately advance the nuanced art of rEEG interpretation and decrease the misdiagnosis rate that leads to patients suffering inappropriate treatment.
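For illustration only, a toy amplitude-threshold detector for candidate paroxysmal events, using the 0.1-2 s duration range quoted above. This is a hypothetical sketch (the threshold, the robust scale estimate, and the function name are assumptions), not the detection system developed in the work:

```python
import numpy as np

def detect_paroxysmal_events(signal, fs, threshold_sd=4.0,
                             min_dur=0.1, max_dur=2.0):
    """Flag candidate PEEs: contiguous runs of samples whose amplitude
    exceeds threshold_sd robust standard deviations, lasting between
    min_dur and max_dur seconds. Returns (start, end) sample indices."""
    x = np.asarray(signal, float)
    dev = np.abs(x - np.median(x))
    scale = np.median(dev) * 1.4826   # robust SD estimate via the MAD
    above = dev > threshold_sd * scale

    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            dur = (i - start) / fs
            if min_dur <= dur <= max_dur:   # keep only PEE-length bursts
                events.append((start, i))
            start = None
    if start is not None and min_dur <= (len(x) - start) / fs <= max_dur:
        events.append((start, len(x)))
    return events
```

The duration filter is what distinguishes PEE-like bursts from single-sample artifacts and from sustained amplitude shifts.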

    FEATURE EXTRACTION AND CLASSIFICATION THROUGH ENTROPY MEASURES

    Entropy is a universal concept that represents the uncertainty of a series of random events. The notion of "entropy" is understood differently in different disciplines. In physics, it is a thermodynamic state variable; in statistics it measures the degree of disorder. In computer science, on the other hand, it is used as a powerful tool for measuring the regularity (or complexity) of signals or time series. In this work, we have studied entropy-based features in the context of signal processing. The purpose of feature extraction is to select the relevant features of an entity. The type of features depends on the signal characteristics and the classification purpose. Many real-world signals are nonlinear and nonstationary, and they contain information that cannot be described by time and frequency domain parameters but may be described well by entropy. In practice, however, estimation of entropy suffers from some limitations and is highly dependent on series length. To reduce this dependence, we have proposed parametric estimation of various entropy indices and have derived analytical expressions where possible. We have then studied the feasibility of parametric estimation of entropy measures on both synthetic and real signals. The entropy-based features have finally been employed for classification problems related to clinical applications, activity recognition, and handwritten character recognition. Thus, from a methodological point of view, our study deals with feature extraction, machine learning, and classification methods. Different versions of entropy measures are found in the literature on signal analysis. Among them, approximate entropy (ApEn) and sample entropy (SampEn), followed by corrected conditional entropy (CcEn), are most used for physiological signal analysis. Recently, entropy features have also been used for image segmentation.
    A related measure is Lempel-Ziv complexity (LZC), which measures the complexity of a time series, signal, or sequence. The estimation of LZC also depends on the series length. In particular, in this study, analytical expressions have been derived for the ApEn, SampEn, and CcEn of auto-regressive (AR) models. It should be mentioned that AR models have been employed for maximum entropy spectral estimation for many years. The feasibility of parametric estimates of these entropy measures has been studied on both synthetic series and real data. In the feasibility study, the agreement between numerical estimates of entropy and estimates obtained through a number of realizations of the AR model using Monte Carlo simulations has been observed. This agreement or disagreement provides information about nonlinearity, nonstationarity, or non-Gaussianity present in the series. In some classification problems, the probability of agreement or disagreement has proved to be one of the most relevant features. After the feasibility study of the parametric entropy estimates, the entropy and related measures have been applied to heart rate and arterial blood pressure variability analysis. The entropy and related features have proved especially relevant in developing sleep classification, handwritten character recognition, and physical activity recognition systems. The novel feature extraction methods researched in this thesis give good classification or recognition accuracy, in many cases superior to the features reported in the literature of the application domains concerned, even at lower computational cost.
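The LZ76 parsing behind Lempel-Ziv complexity can be sketched directly. Complexity here is the raw phrase count; the normalizations that reduce the series-length dependence mentioned above are omitted:

```python
def lempel_ziv_complexity(s):
    """Number of distinct phrases in the LZ76 parsing of sequence s
    (a string or other sliceable sequence supporting `in`).

    Scans left to right; the current phrase grows as long as it has
    already appeared in the sequence seen so far, and a new phrase is
    counted as soon as it has not."""
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # Grow the phrase while s[i:i+k] already occurs in the prefix
        # s[:i+k-1] (overlap up to the last symbol is allowed, as in
        # the Kaspar-Schuster formulation).
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1      # one new phrase found
        i += k
    return c
```

A periodic sequence is parsed into very few phrases, while an irregular one keeps producing new phrases, which is the sense in which LZC measures complexity.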

    Nonlinear time-series analysis revisited

    In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data---typically univariate---via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems.
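The state-space reconstruction underlying these methods is usually a time-delay embedding. A minimal sketch follows; choosing the embedding dimension and lag well is part of the practical difficulty the abstract alludes to:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series into R^dim with lag tau.

    Row i of the result is (x[i], x[i+tau], ..., x[i+(dim-1)*tau]);
    by Takens' theorem the reconstructed orbit is, generically,
    diffeomorphic to the original attractor for large enough dim."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * tau   # number of complete delay vectors
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
```

Quantities such as Lyapunov exponents and fractal dimensions are then estimated from the geometry of the point cloud this function returns.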

    Automated Segmentation of Left and Right Ventricles in MRI and Classification of Myocardium Abnormalities

    A fundamental step in the diagnosis of cardiovascular diseases, automated left and right ventricle (LV and RV) segmentation in cardiac magnetic resonance imaging (MRI) is still acknowledged to be a difficult problem. Although algorithms for LV segmentation do exist, they require either extensive training or intensive user input. RV segmentation in MRI remains essentially unsolved because the RV's shape is neither symmetric nor circular, its deformations are complex and vary extensively over the cardiac phases, and it includes the papillary muscles. In this thesis, I investigate fast detection of the LV endo- and epicardium surfaces (3D) and contours (2D) in cardiac MRI via convex relaxation and distribution matching. A rapid 3D segmentation of the RV in cardiac MRI via distribution-matching constraints on segment shape and appearance is also investigated. These algorithms require only a single subject for training and a very simple user input, which amounts to one click. The solution is sought by optimizing functionals containing probability product kernel constraints on the distributions of intensity and geometric features. The formulations lead to challenging optimization problems, which are not directly amenable to convex-optimization techniques. For each functional, the problem is split into a sequence of sub-problems, each of which can be solved exactly and globally via a convex relaxation and the augmented Lagrangian method. Finally, an information-theoretic artificial neural network (ANN) is proposed for classifying LV myocardium motion as normal or abnormal. Using the LV segmentation results, the LV cavity points are estimated via a Kalman filter and a recursive dynamic Bayesian filter. However, due to the similarities between the statistical information of normal and abnormal points, differentiating between their distributions is a challenging problem.
    The problem was investigated with a global measure based on Shannon's differential entropy (SDE) and further examined with two other information-theoretic criteria, one based on Renyi entropy and the other on Fisher information. Unlike existing information-theoretic studies, the approach explicitly addresses the overlap between the distributions of normal and abnormal cases, thereby yielding competitive performance. I further propose an algorithm based on a supervised 3-layer ANN to differentiate between the distributions further. The ANN is trained and tested with five different information measures of radial distance and velocity for points on the endocardial boundary.
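As a simple illustration of a Shannon differential entropy (SDE) measure: under a Gaussian assumption, h = 0.5 ln(2πeσ²), so SDE can be estimated from the sample variance alone. The estimator below is a sketch under that assumption, not the thesis's global measure:

```python
import numpy as np

def gaussian_differential_entropy(samples):
    # Shannon differential entropy (in nats) under a Gaussian
    # assumption: h = 0.5 * ln(2 * pi * e * var), where var is the
    # sample variance of the data.
    var = np.var(np.asarray(samples, float))
    return 0.5 * np.log(2.0 * np.pi * np.e * var)
```

A more dispersed point distribution (e.g. abnormal wall-motion trajectories with larger variance) then yields a higher entropy value than a tightly clustered one, which is the kind of separation a global SDE measure exploits.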