
    Use of Multiscale Entropy to Characterize Fetal Autonomic Development

    The idea that the uterine environment and adverse events during fetal development could increase the chances of disease in adulthood was first published by David Barker in 1998. Since then, investigators have employed several methods for studying and characterizing the ontological development of the fetus, e.g., fetal movement, growth, and cardiac metrics. Even with the most recent and developed methods, such as fetal magnetocardiography (fMCG), investigators are continually challenged in studying fetal development because the fetus is inaccessible. Finding metrics that realize the full capacity of characterizing fetal ontological development remains a technological challenge. In this thesis, the use and value of multiscale entropy (MSE) to characterize fetal maturation across the third trimester of gestation is studied. Using multiscale entropy obtained from participants of a clinical trial, we show that MSE can characterize increasing complexity due to maturation in the fetus, and can distinguish a growing and developing fetal system from a mature system in which loss of irregularity is due to compromised complexity from increasing physiologic load. MSE scales add a nonlinear metric that appears to accurately reflect the ontological development of the fetus and holds promise for future use to investigate the effects of maternal stress and intrauterine growth restriction, or to predict risk for sudden infant death syndrome.
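
    Multiscale entropy is conventionally computed by coarse-graining the RR-interval series at increasing time scales and evaluating sample entropy at each scale. The sketch below illustrates that general procedure only; the parameter choices (m = 2, tolerance r = 0.15 × SD, scales 1–5) and the synthetic series are assumptions for illustration, not the settings used in the thesis.

```python
# Minimal sketch of a multiscale entropy (MSE) curve for an RR-interval series.
# m, r, and the number of scales are assumed values for illustration only.
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A/B), where B and A count template matches of
    length m and m+1 within tolerance r (Chebyshev distance, no self-matches)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * np.std(x)
    n = len(x)

    def count_matches(mm):
        # Use the same n - m templates for both lengths so counts are comparable.
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(rr, max_scale=5, m=2, r=None):
    """Coarse-grain the series at scales 1..max_scale and compute sample
    entropy at each scale, giving the MSE curve."""
    rr = np.asarray(rr, dtype=float)
    if r is None:
        r = 0.15 * np.std(rr)            # fixed tolerance across all scales
    mse = []
    for tau in range(1, max_scale + 1):
        n_blocks = len(rr) // tau
        coarse = rr[:n_blocks * tau].reshape(n_blocks, tau).mean(axis=1)
        mse.append(sample_entropy(coarse, m=m, r=r))
    return mse

# Example on a synthetic RR series (white noise as a stand-in for real data).
rr = 0.8 + 0.05 * np.random.randn(1000)
print(multiscale_entropy(rr, max_scale=5))
```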

    Effects of ECG Data Length on Heart Rate Variability Among Young Healthy Adults

    The relationship between the robustness of heart rate variability (HRV) measures derived by linear and nonlinear methods and the minimum data length they require has yet to be well understood. Normal electrocardiography (ECG) data from 14 healthy volunteers were used to compute 34 HRV measures over various data lengths, which were compared with the longest segment (2000 R peaks, or 750 s) using the Mann–Whitney U test at the 0.05 level of significance. We found that SDNN, RMSSD, pNN50, normalized LF, the LF/HF ratio, and SD1 of the Poincaré plot could be adequately computed from small data sizes (60–100 R peaks). In addition, the RQA parameters did not show any significant differences between 60 s and 750 s. However, a longer data length (1000 R peaks) is recommended for most other measures, and DFA and the Lyapunov exponent might require an even longer data length to yield robust results. Conclusions: Our work suggests optimal minimum data sizes for different HRV measures, which can potentially improve efficiency and save time and effort for both patients and medical care providers.
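
    A few of the measures named above (SDNN, RMSSD, pNN50, SD1) are simple functions of the RR intervals and can be recomputed from progressively shorter segments, which is the spirit of the data-length comparison. The sketch below illustrates this idea; the synthetic RR series and the chosen segment lengths are assumptions, not the study's actual protocol.

```python
# Sketch of a few time-domain / Poincare HRV measures computed from the first
# k R peaks of an RR series; synthetic data and segment lengths are illustrative.
import numpy as np

def time_domain_hrv(rr_ms):
    """Basic time-domain and Poincare HRV measures from RR intervals in ms."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "SDNN": np.std(rr, ddof=1),                   # SD of all RR intervals
        "RMSSD": np.sqrt(np.mean(diff ** 2)),         # RMS of successive differences
        "pNN50": 100.0 * np.mean(np.abs(diff) > 50),  # % successive diffs > 50 ms
        "SD1": np.sqrt(0.5) * np.std(diff, ddof=1),   # Poincare plot short-axis SD
    }

# Compare short segments against the longest segment (here 2000 R peaks).
rr_full = 800 + 40 * np.random.randn(2000)            # synthetic RR series, in ms
for k in (60, 100, 1000, 2000):
    print(k, time_domain_hrv(rr_full[:k]))
```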

    On the standardization of approximate entropy: multidimensional approximate entropy index evaluated on short-term HRV time series

    Background. Nonlinear heart rate variability (HRV) indices have extended the description of autonomic nervous system (ANS) regulation of the heart. One of those indices is approximate entropy (ApEn), which has become a commonly used measure of the irregularity of a time series. Calculating ApEn requires a priori definition of parameters such as the similarity threshold and the embedding dimension, a choice that has been shown to be critical for the interpretation of results. Thus, a parameter-free ApEn-based index could be advantageous for standardizing the use and interpretation of this widely applied entropy measurement. Methods. A novel entropy index, termed multidimensional approximate entropy, is proposed, based on summing the contribution of maximum approximate entropies over a wide range of embedding dimensions while selecting, in each dimension, the similarity threshold that leads to the maximum ApEn value. Synthetic RR interval time series with varying levels of stochasticity, generated by both MIX(P) processes and white/pink noise, were used to validate the properties of the proposed index. Aging and congestive heart failure (CHF) were characterized from RR interval time series of available databases. Results. In synthetic time series, the values of the index were proportional to the level of randomness; i.e., they increased for higher values of P in the generated MIX(P) processes and were larger for white than for pink noise. This result was a consequence of all maximum approximate entropy values increasing with the level of randomness in all considered embedding dimensions, in contrast to the results obtained for approximate entropies computed with a fixed similarity threshold, which were inconsistent across embedding dimensions. Evaluation of the proposed index on available databases revealed that aging was associated with a notable reduction in its values. On the other hand, the index evaluated during the night period was considerably larger in CHF patients than in healthy subjects. Conclusion. A novel parameter-free multidimensional approximate entropy index is proposed and tested on synthetic data to confirm its capacity to represent a range of randomness levels in HRV time series. The index values are reduced in elderly subjects, which may correspond to the reported loss of ANS adaptability in this population segment. Increased values measured in CHF patients versus healthy subjects during the night period point to greater irregularity of heart rate dynamics caused by the disease.
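
    The construction described above, maximizing ApEn over the similarity threshold in each embedding dimension and summing across dimensions, can be sketched directly from standard ApEn. In the sketch below, the range of embedding dimensions and the grid of thresholds are assumptions for illustration, not the values used in the paper.

```python
# Sketch of the proposed index as described above: for each embedding dimension m,
# take the maximum ApEn over a grid of similarity thresholds r, then sum over m.
# The dimension range (1..4) and the r grid are illustrative assumptions.
import numpy as np

def apen(x, m, r):
    """Approximate entropy ApEn(m, r) = Phi_m(r) - Phi_{m+1}(r)."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def phi(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        # Fraction of templates within Chebyshev distance r of each template
        # (self-matches included, so every count is positive).
        c = [np.mean(np.max(np.abs(templates - t), axis=1) <= r) for t in templates]
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def multidimensional_apen(x, max_m=4, r_grid=None):
    """Sum over embedding dimensions of the maximum ApEn over the r grid."""
    x = np.asarray(x, dtype=float)
    if r_grid is None:
        r_grid = np.linspace(0.05, 1.0, 10) * np.std(x)
    return sum(max(apen(x, m, r) for r in r_grid) for m in range(1, max_m + 1))

# Example on white noise (high randomness, so the index should be relatively large).
rng = np.random.default_rng(0)
print(multidimensional_apen(rng.standard_normal(300)))
```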

    Review and classification of variability analysis techniques with clinical applications

    Analysis of patterns of variation in time series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular in critical care, efforts have focused on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made the interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification of the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant domains. We discuss the process of calculation, which often necessitates a mathematical transform of the time series. Our aims are to summarize a broad literature, promote a shared vocabulary, and thereby improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.

    A Smart Service Platform for Cost Efficient Cardiac Health Monitoring

    Aim: In this study we investigated the problem of cost-effective wireless heart health monitoring from a service design perspective. Subject and Methods: There is a great medical and economic need to support the diagnosis of a wide range of debilitating and indeed fatal non-communicable diseases, such as cardiovascular disease (CVD), atrial fibrillation (AF), diabetes, and sleep disorders. To address this need, we put forward the idea that the combination of heart rate (HR) measurements, the Internet of Things (IoT), and advanced artificial intelligence (AI) forms a Heart Health Monitoring Service Platform (HHMSP). This service platform can be used for multi-disease monitoring, where a distinct service meets the needs of patients with a specific disease. The service functionality is realized by combining common and distinct modules, which forms the technological basis for a hybrid diagnosis process in which machines and practitioners work cooperatively to improve outcomes for patients. Results: Human checks and balances on independent machine decisions maintain the safety and reliability of the diagnosis. Cost efficiency comes from efficient signal processing and from replacing manual analysis with AI-based machine classification. To show the practicality of the proposed service platform, we implemented an AF monitoring service. Conclusion: Having common modules allows us to harvest economies of scale; the fixed cost of the infrastructure is shared among a large group of customers. Distinct modules define which AI models are used and how communication with practitioners, caregivers, and patients is handled. This makes the proposed HHMSP agile enough to address the safety, reliability, and functionality needs of healthcare providers.
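
    The common/distinct module composition described above can be pictured as shared infrastructure plus a pluggable, disease-specific classifier and notification policy. The sketch below is hypothetical: every class, function, and threshold is an illustrative assumption, not the platform's actual API or AF detection logic.

```python
# Hypothetical sketch of composing common and distinct modules into one
# disease-specific monitoring service. All names and rules are illustrative.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class CommonModules:
    """Infrastructure shared by every service (the economies-of-scale part)."""
    acquire_hr: Callable[[], List[float]]             # IoT sensor ingestion
    preprocess: Callable[[List[float]], List[float]]  # filtering / artifact removal

@dataclass
class DistinctModules:
    """Disease-specific parts: AI classifier plus communication rules."""
    classify: Callable[[List[float]], str]            # e.g. an AF detection model
    notify: Callable[[str], None]                     # routing to practitioner / caregiver

class MonitoringService:
    def __init__(self, common: CommonModules, distinct: DistinctModules):
        self.common = common
        self.distinct = distinct

    def run_once(self) -> str:
        rr = self.common.preprocess(self.common.acquire_hr())
        label = self.distinct.classify(rr)   # machine decision ...
        self.distinct.notify(label)          # ... reviewed by a human downstream
        return label

# Example: an "AF monitoring" service assembled from dummy modules.
common = CommonModules(acquire_hr=lambda: [800.0, 810.0, 640.0, 900.0],
                       preprocess=lambda rr: rr)
distinct = DistinctModules(
    classify=lambda rr: "AF suspected" if max(rr) - min(rr) > 200 else "normal",
    notify=lambda label: print("notify practitioner:", label))
print(MonitoringService(common, distinct).run_once())
```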

    Heart rate variability characterization using entropy measures

    Master's thesis in Biomedical Engineering. Faculdade de Engenharia, Universidade do Porto. 200

    Heart rate variability: a fractal analysis

    Master's thesis in Biomedical Engineering. Faculdade de Engenharia, Universidade do Porto. 200