
    Heart Rate Variability: A possible machine learning biomarker for mechanical circulatory device complications and heart recovery

    Cardiovascular disease continues to be the number one cause of death in the United States, and the number of heart failure patients is expected to exceed 8 million by 2030. Mechanical circulatory support (MCS) devices are now better able to manage acute and chronic heart failure refractory to medical therapy, both as a bridge to transplant and as destination therapy. Despite significant advances in MCS device design and surgical implantation technique, it remains difficult to predict response to device therapy. Heart rate variability (HRV), the variation in the time interval between adjacent heartbeats, is an objective diagnostic regularly recorded by various MCS devices and has been shown to have significant prognostic value for both sudden cardiac death and all-cause mortality in congestive heart failure (CHF) patients. A limited number of studies have examined HRV indices as promising risk factors and predictors of complication and recovery from left ventricular assist device therapy in end-stage CHF patients. Paired with new advances in the use of machine learning in medicine, HRV represents a potential dynamic biomarker for monitoring and predicting patient status as more patients enter the mechanotrope era of MCS devices for destination therapy.
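
    HRV itself is computed from the series of RR intervals between consecutive beats. As a generic illustration of the quantity discussed above (not the device-specific diagnostic described in the abstract), the sketch below computes a few standard time-domain HRV indices, assuming RR intervals expressed in milliseconds.

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """Basic time-domain HRV indices from an array of RR intervals in ms.

    SDNN  - standard deviation of all RR intervals
    RMSSD - root mean square of successive RR differences
    pNN50 - fraction of successive differences exceeding 50 ms
    """
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_rr": rr.mean(),
        "sdnn": rr.std(ddof=1),
        "rmssd": np.sqrt(np.mean(diff ** 2)),
        "pnn50": np.mean(np.abs(diff) > 50.0),
    }

# Example on a short synthetic RR series centred around 800 ms
rr = 800 + 25 * np.random.randn(300)
print(time_domain_hrv(rr))
```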

    On the standardization of approximate entropy: multidimensional approximate entropy index evaluated on short-term HRV time series

    Background. Nonlinear heart rate variability (HRV) indices have extended the description of autonomic nervous system (ANS) regulation of the heart. One such index is approximate entropy (ApEn), which has become a commonly used measure of the irregularity of a time series. Calculating ApEn requires an a priori choice of parameters such as the similarity threshold and the embedding dimension, a choice that has been shown to be critical for interpreting the results. A parameter-free ApEn-based index could therefore help standardize the use and interpretation of this widely applied entropy measure. Methods. A novel entropy index, termed multidimensional approximate entropy, is proposed: the contributions of the maximum approximate entropies over a wide range of embedding dimensions are summed, with the similarity threshold in each dimension selected as the one that maximizes the ApEn value. Synthetic RR interval time series with varying levels of stochasticity, generated by both MIX(P) processes and white/pink noise, were used to validate the properties of the proposed index. Aging and congestive heart failure (CHF) were characterized from RR interval time series of available databases. Results. In synthetic time series, the index was proportional to the level of randomness; i.e., it increased for higher values of P in the generated MIX(P) processes and was larger for white than for pink noise. This was a consequence of all maximum approximate entropy values increasing with the level of randomness in every embedding dimension considered, in contrast to approximate entropies computed with a fixed similarity threshold, which gave inconsistent results across embedding dimensions. Evaluation of the proposed index on the available databases revealed that aging was associated with a notable reduction in its value, whereas the index evaluated during the night period was considerably larger in CHF patients than in healthy subjects. Conclusion. A novel parameter-free multidimensional approximate entropy index is proposed and tested on synthetic data, confirming its capacity to represent a range of randomness levels in HRV time series. Index values are reduced in elderly patients, which may correspond to the reported loss of ANS adaptability in this population segment. Increased values measured in CHF patients versus healthy subjects during the night period point to greater irregularity of heart rate dynamics caused by the disease.
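
    The construction described in the Methods is straightforward to sketch. The code below implements conventional ApEn (Chebyshev distance, self-matches included) and then the summed-maxima idea: for each embedding dimension, ApEn is maximized over a grid of similarity thresholds and the maxima are summed. The threshold grid, dimension range and function names are illustrative assumptions of this sketch, not the exact settings of the paper.

```python
import numpy as np

def apen(x, m, r):
    """Approximate entropy ApEn(m, r) of a 1-D series x (r in absolute units)."""
    x = np.asarray(x, dtype=float)
    N = len(x)

    def phi(dim):
        # Embed the series into overlapping dim-dimensional template vectors
        emb = np.array([x[i:i + dim] for i in range(N - dim + 1)])
        # Chebyshev distance between all pairs of templates
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # Fraction of templates within tolerance r (self-matches included, as in ApEn)
        C = np.mean(dist <= r, axis=1)
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

def multidimensional_apen(x, dims=range(1, 6), r_grid=None):
    """Sketch of the parameter-free index described above: for each embedding
    dimension, take the maximum ApEn over a grid of similarity thresholds,
    then sum the maxima across dimensions. O(N^2) per evaluation, so this is
    intended for short RR series only."""
    x = np.asarray(x, dtype=float)
    if r_grid is None:
        r_grid = np.linspace(0.05, 1.0, 20) * np.std(x)  # illustrative grid
    return sum(max(apen(x, m, r) for r in r_grid) for m in dims)
```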

    Multi-scale Order Recurrence Plot based deterministic analysis on Heart Rate Variability in Congestive Heart Failure Assessment

    Congestive heart failure (CHF) is a cardiovascular disease associated with abnormal autonomic nervous system (ANS) function. Heart rate variability (HRV) analysis is the main method for the quantitative evaluation of autonomic function. Common HRV analysis methods include time-domain, frequency-domain, and nonlinear methods, but these generally ignore the short-term volatility of heart rate and the regularity of autonomic ganglion activity. This study therefore proposes a new HRV parameter, the determinism of a multi-scale order recurrence plot (MSORP_DET), which can analyze the HRV of heart failure patients over multiple time scales. The study analyzed the R-R intervals of 24-hour HRV recordings from 98 subjects (54 normal subjects and 44 patients with CHF). The results showed that MSORP_DET significantly distinguishes CHF patients from normal subjects (p < 0.001). Moreover, the accuracy of screening for CHF reached a maximum of 81.6% when the low-frequency/high-frequency ratio (LF/HF) was combined with MSORP_DET, compared with 78.6% when using LF/HF alone. MSORP_DET can therefore be used as a new index to screen for CHF, and it suggests that the rhythm of the ANS in heart failure patients is more complex than in normal subjects.
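
    To make the measure concrete, the sketch below computes the determinism of an order (ordinal-pattern) recurrence plot on coarse-grained copies of an RR series, which is the general construction the abstract names. The embedding dimension, scales, minimum diagonal line length and helper names are assumptions of this sketch, not the authors' exact MSORP_DET specification.

```python
import numpy as np

def ordinal_pattern(window):
    """Rank (permutation) pattern of a short window of values."""
    return tuple(np.argsort(window).tolist())

def order_recurrence_det(x, m=3, lmin=2):
    """Determinism (DET) of an order recurrence plot: points (i, j) recur when
    their length-m ordinal patterns match; DET is the fraction of recurrence
    points lying on diagonal lines of length >= lmin (main diagonal excluded)."""
    x = np.asarray(x, dtype=float)
    patterns = [ordinal_pattern(x[i:i + m]) for i in range(len(x) - m + 1)]
    n = len(patterns)
    R = np.array([[patterns[i] == patterns[j] for j in range(n)] for i in range(n)])
    total, on_lines = 0, 0
    for k in range(1, n):                       # off-main diagonals (symmetric x2)
        diag = np.diagonal(R, offset=k)
        total += 2 * int(diag.sum())
        run = 0
        for v in list(diag) + [False]:          # count diagonal line lengths
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += 2 * run
                run = 0
    return on_lines / total if total else 0.0

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale` (standard multi-scale step)."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def msorp_det(rr, scales=range(1, 6), m=3):
    """Sketch of a multi-scale order-recurrence-plot determinism profile."""
    return {s: order_recurrence_det(coarse_grain(rr, s), m=m) for s in scales}
```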

    Complex systems and the technology of variability analysis

    Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time-domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale-invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and a summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients.
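
    As a concrete example of one of the scale-invariance techniques listed above, here is a minimal detrended fluctuation analysis (DFA) sketch returning the scaling exponent alpha; the default window sizes (roughly 4 to 64 beats) are illustrative choices, not clinically validated settings.

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """Detrended fluctuation analysis: the series is integrated, split into
    windows of length n, linearly detrended within each window, and the RMS
    fluctuation F(n) is computed; alpha is the slope of log F(n) vs log n."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                           # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, 6, 12, base=2).astype(int))  # 4..64
    scales = [int(n) for n in scales if len(y) // int(n) >= 2]
    F = []
    for n in scales:
        n_win = len(y) // n
        segs = y[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
               for seg in segs]                           # local linear detrending
        F.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]    # scaling exponent alpha
```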

    Heart Rate Variability and Non-Linear Dynamics in Risk Stratification

    Time-domain measures and power spectral analysis of heart rate variability (HRV) are the classic and most widely used methods for assessing the complex regulatory interaction between the autonomic nervous system and heart rate. There are abundant scientific data on the prognostic significance of the conventional measures of HRV in patients with various conditions, particularly myocardial infarction. Some studies have suggested that newer measures describing the nonlinear dynamics of heart rate, such as fractal measures, may reveal prognostic information beyond that obtained by the conventional measures of HRV. An ideal risk indicator would specifically predict sudden arrhythmic death, since implantable cardioverter-defibrillator (ICD) therapy can prevent such events. Numerically, there are more sudden deaths among post-infarction patients with better preserved left ventricular function than among those with severe left ventricular dysfunction. Recent data support the concept that HRV measurements, when analyzed several weeks after acute myocardial infarction, predict life-threatening ventricular tachyarrhythmias in patients with moderately depressed left ventricular function. However, well-designed prospective randomized studies are needed to evaluate whether ICD therapy based on the assessment of HRV, alone or with other risk indicators, improves patients' prognosis. Several issues, such as the optimal target population, the optimal timing of HRV measurements, the optimal methods of HRV analysis, and the optimal cut-off points for different HRV parameters, need clarification before HRV analysis can become a widespread clinical tool for risk stratification.

    Review and classification of variability analysis techniques with clinical applications

    Analysis of patterns of variation of time series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made the interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each a brief description of the underlying theory and assumptions together with a summary of clinical applications. We propose a revised classification of the domains of variability techniques, comprising statistical, geometric, energetic, informational, and invariant domains. We discuss the process of calculation, which often requires a mathematical transform of the time series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to facilitate the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.

    Linear and nonlinear analysis of normal and CAD-affected heart rate signals

    Coronary Artery Disease (CAD) is one of the most dangerous cardiac diseases and often leads to sudden cardiac death. It is difficult to diagnose CAD by manual inspection of electrocardiogram (ECG) signals. To automate this detection task, in this study we extracted the Heart Rate (HR) from the ECG signals and used it as the base signal for further analysis. We then analyzed the HR signals of both normal and CAD subjects using (i) time-domain, (ii) frequency-domain and (iii) nonlinear techniques. The following nonlinear methods were used in this work: Poincaré plots, Recurrence Quantification Analysis (RQA) parameters, Shannon entropy, Approximate Entropy (ApEn), Sample Entropy (SampEn), Higher Order Spectra (HOS) methods, Detrended Fluctuation Analysis (DFA), Empirical Mode Decomposition (EMD), cumulants, and Correlation Dimension. As a result of the analysis, we present characteristic recurrence, Poincaré and HOS plots for normal and CAD subjects. We also observed significant variations in the ranges of these features between the normal and CAD classes, and we present them in this paper. The RQA parameters were higher for CAD subjects, indicating more rhythmic, repetitive dynamics: because heart rate activity in CAD subjects is reduced, similar signal patterns repeat more frequently than in normal subjects. The entropy-based parameters, ApEn and SampEn, were lower for CAD subjects, indicating lower entropy (less activity due to impairment). Almost all HOS parameters showed higher values for the CAD group, indicating the presence of higher-frequency content in the CAD signals. Our study thus provides insight into how such nonlinear features could be exploited to detect the presence of CAD effectively and reliably.
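
    Of the nonlinear descriptors listed, the Poincaré plot is the simplest to reproduce. The sketch below computes its standard SD1/SD2 descriptors from an RR series; it is a generic illustration rather than the authors' exact feature extraction.

```python
import numpy as np

def poincare_sd1_sd2(rr):
    """SD1/SD2 descriptors of the Poincaré plot (RR[n] vs RR[n+1]).

    SD1 reflects short-term (beat-to-beat) variability; SD2 reflects
    longer-term variability along the line of identity."""
    rr = np.asarray(rr, dtype=float)
    x, y = rr[:-1], rr[1:]
    sd1 = np.sqrt(np.var(y - x, ddof=1) / 2.0)
    sd2 = np.sqrt(np.var(y + x, ddof=1) / 2.0)
    return sd1, sd2, sd1 / sd2
```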

    Heart Rate Variability Dynamics for the Prognosis of Cardiovascular Risk

    Statistical, spectral, multi-resolution and nonlinear methods were applied to heart rate variability (HRV) series and linked with classification schemes for the prognosis of cardiovascular risk. A total of 90 HRV records were analyzed: 45 from healthy subjects and 45 from cardiovascular risk patients. A total of 52 features derived from all the analysis methods were evaluated using the standard two-sample Kolmogorov-Smirnov test (KS-test). The results of this statistical procedure provided the input to multi-layer perceptron (MLP) neural networks, radial basis function (RBF) neural networks and support vector machines (SVM) for data classification. These schemes showed high performance on both training and test sets for many combinations of features, with a maximum accuracy of 96.67%. In addition, breathing frequency emerged as a particularly relevant feature in the HRV analysis.
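
    A minimal sketch of the pipeline outlined above: KS-test screening of features followed by an SVM classifier, here using scikit-learn on a synthetic feature matrix. The significance level, kernel and data are placeholders, not the study's actual 52 features or tuned classifiers.

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def ks_screen(X, y, alpha=0.05):
    """Keep features whose healthy vs. at-risk distributions differ
    according to a two-sample Kolmogorov-Smirnov test."""
    return np.array([j for j in range(X.shape[1])
                     if ks_2samp(X[y == 0, j], X[y == 1, j]).pvalue < alpha])

# Hypothetical feature matrix: rows = subjects, columns = HRV features
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 52))
y = np.repeat([0, 1], 45)
X[y == 1, :5] += 1.0                                  # make a few features informative

cols = ks_screen(X, y)
X_tr, X_te, y_tr, y_te = train_test_split(X[:, cols], y, test_size=0.3,
                                          stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```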

    Electrocardiogram pattern recognition and analysis based on artificial neural networks and support vector machines: a review.

    Computer systems for electrocardiogram (ECG) analysis support the clinician in tedious tasks (e.g., Holter ECG monitoring in Intensive Care Units) and in the prompt detection of dangerous events (e.g., ventricular fibrillation). Together with clinical applications (arrhythmia detection and heart rate variability analysis), the ECG is currently being investigated for biometrics (human identification), an emerging area receiving increasing attention. Methodologies for clinical applications can have both differences and similarities with respect to biometrics. This paper reviews methods of ECG processing from a pattern recognition perspective. In particular, we focus on features commonly used for heartbeat classification. Considering the vast literature in the field and the limited space of this review, we dedicate a detailed discussion only to a few classifiers (Artificial Neural Networks and Support Vector Machines) because of their popularity; however, other techniques such as Hidden Markov Models and Kalman filtering are also mentioned.
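
    To make the heartbeat-classification setting concrete, the sketch below builds simple per-beat feature vectors (a sample window centred on each R peak plus the adjacent RR intervals) and indicates how they might feed a small multi-layer perceptron. The window length, network size and the variable names ecg, r_peaks and labels are assumptions of this illustration, not features prescribed by the review.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def beat_features(ecg, r_peaks, half_window=90):
    """Per-beat feature vectors: a window of samples centred on each interior
    R peak plus the preceding and following RR intervals (in samples)."""
    feats = []
    for k in range(1, len(r_peaks) - 1):
        r = r_peaks[k]
        window = ecg[r - half_window: r + half_window]
        rr_prev = r - r_peaks[k - 1]
        rr_next = r_peaks[k + 1] - r
        feats.append(np.concatenate([window, [rr_prev, rr_next]]))
    return np.array(feats)

# Hypothetical usage, assuming a 1-D signal `ecg`, detected R-peak indices
# `r_peaks`, and per-beat `labels` (e.g., 0 = normal, 1 = ventricular ectopic):
#
#   X = beat_features(ecg, r_peaks)      # one row per interior beat
#   y = labels[1:-1]                     # labels for those same beats
#   clf = make_pipeline(StandardScaler(),
#                       MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))
#   clf.fit(X, y)
```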