Fog Computing in Medical Internet-of-Things: Architecture, Implementation, and Applications
As the Internet of Things (IoT) tops market-segment charts in business
reports, the field of medicine stands to gain a large benefit from the
explosion of wearables and internet-connected sensors that surround us,
acquiring and communicating unprecedented data on symptoms, medication, food
intake, and daily-life activities affecting one's health and wellness.
However, IoT-driven healthcare must overcome several barriers: 1) there is an
increasing demand for data storage on cloud servers, where the analysis of
medical big data becomes increasingly complex; 2) the data, when communicated,
are vulnerable to security and privacy breaches; 3) communicating the
continuously collected data is both costly and energy hungry; and 4) operating
and maintaining the sensors directly from the cloud servers is a non-trivial
task. This book chapter defines Fog Computing in the context of medical IoT.
Conceptually, Fog
Computing is a service-oriented intermediate layer in IoT, providing the
interfaces between sensors and cloud servers to facilitate connectivity, data
transfer, and a queryable local database. The centerpiece of Fog Computing
is a low-power, intelligent, wireless, embedded computing node that carries out
signal conditioning and data analytics on raw data collected from wearables or
other medical sensors and offers efficient means to serve telehealth
interventions. We implemented and tested a fog computing system using the
Intel Edison and Raspberry Pi that allows acquisition, computing, storage, and
communication of various medical data, such as pathological speech data of
individuals with speech disorders, phonocardiogram (PCG) signals for heart
rate estimation, and electrocardiogram (ECG)-based Q, R, and S detection.
Comment: 29 pages, 30 figures, 5 tables. Keywords: Big Data, Body Area
Network, Body Sensor Network, Edge Computing, Fog Computing, Medical
Cyberphysical Systems, Medical Internet-of-Things, Telecare, Tele-treatment,
Wearable Devices. Chapter in Handbook of Large-Scale Distributed Computing in
Smart Healthcare (2017), Springer
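As a sketch of the on-node analytics such a computing node performs, the following estimates heart rate from a raw single-lead ECG segment before anything is sent to the cloud. This is a minimal illustration under our own assumptions (the function name, threshold, and refractory period are ours, not the chapter's); a deployed node would use a validated detector such as Pan-Tompkins.

```python
import numpy as np

def estimate_heart_rate(ecg, fs, threshold_frac=0.6, refractory_s=0.25):
    """Estimate heart rate (bpm) from a single-lead ECG segment.

    Emphasise the QRS complex with a squared first difference,
    threshold it, and enforce a refractory period between detections.
    """
    emphasized = np.diff(ecg) ** 2
    threshold = threshold_frac * emphasized.max()
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i, v in enumerate(emphasized):
        if v >= threshold and i - last >= refractory:
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return None
    rr = np.diff(peaks) / fs   # RR intervals in seconds
    return 60.0 / rr.mean()    # beats per minute

# Synthetic check: impulses once per second on a 250 Hz grid (60 bpm)
fs = 250
sig = np.zeros(10 * fs)
sig[::fs] = 1.0
hr = estimate_heart_rate(sig, fs)  # ~60 bpm
```

Running only this lightweight stage at the edge, and shipping summary values such as `hr` rather than raw waveforms, is exactly the cost- and energy-saving pattern the chapter motivates.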
Biosignal Generation and Latent Variable Analysis with Recurrent Generative Adversarial Networks
The effectiveness of biosignal generation and data augmentation with
biosignal generative models based on generative adversarial networks (GANs),
which are a type of deep learning technique, was demonstrated in our previous
paper. GAN-based generative models learn only the projection between a random
input distribution and the distribution of the training data. Therefore, the
relationship between input and generated data is unclear, and the
characteristics of the generated data cannot be controlled.
This study proposes a GAN-based method for generating time-series data and
explores its ability to generate biosignals with certain classes and
characteristics. Moreover, in the proposed method, latent variables are
analyzed using canonical correlation analysis (CCA) to represent the
relationship between input and generated data as canonical loadings. Using
these loadings, we can control the characteristics of the data generated by the
proposed method. The influence of class labels on generated data is analyzed by
feeding the data interpolated between two class labels into the generator of
the proposed GANs. The CCA of the latent variables is shown to be an effective
method of controlling the generated data characteristics. We are able to model
the distribution of the time-series data without requiring domain-dependent
knowledge using the proposed method. Furthermore, it is possible to control the
characteristics of these data by analyzing the model trained using the proposed
method. To the best of our knowledge, this work is the first to generate
biosignals using GANs while controlling the characteristics of the generated
data.
Abnormal ECG search in long-term electrocardiographic recordings from an animal model of heart failure
Heart failure is one of the leading causes of death in the United States; five million Americans suffer from it. Advances in portable electrocardiogram (ECG) monitoring systems and large data storage space allow the ECG to be recorded continuously for long periods. Long-term monitoring could potentially lead to better diagnosis and treatment if the progression of heart failure could be followed. The challenge is to analyze the sheer mass of data: manual analysis using classical methods is impossible. In this dissertation, a framework for the analysis of long-term ECG recordings and methods for searching for abnormal ECG are presented.

The data used in this research were collected from an animal model of heart failure. Chronic heart failure was gradually induced in rats by aldosterone infusion and a high-Na, low-Mg diet. The ECG was continuously recorded during the experimental period of 11-12 weeks through radiotelemetry, with the leads placed subcutaneously in lead-II configuration. In the end, there were 80 GB of data from five animals. Besides the massive amount of data, noise and artifacts also caused problems in the analysis.

The framework includes data preparation, ECG beat detection, EMG noise detection, baseline fluctuation removal, ECG template generation, feature extraction, and abnormal ECG search. The raw data were converted from their original format and stored in a database for retrieval. The beat detection technique was improved from the original algorithm so that it is less sensitive to baseline jumps and more sensitive to beat size variation. A method for estimating a parameter required for baseline fluctuation removal is proposed and gives good results on test signals. A new algorithm for EMG noise detection was developed using morphological filters and moving variance; its sensitivity and specificity are 94% and 100%, respectively.
A procedure for ECG template generation was proposed to capture gradual changes in ECG morphology and to manage the matching process when numerous templates are created. RR intervals and heart rate variability parameters are extracted and plotted to display progressive changes as heart failure develops. The abnormal ECG search considers premature ventricular complexes, elevated ST segments, and split-R-wave ECG. New features are extracted from ECG morphology, and Fisher linear discriminant analysis is used to classify normal versus abnormal ECG. The results give a classification rate, sensitivity, and specificity of 97.35%, 96.02%, and 98.91%, respectively.
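The moving-variance stage of the EMG noise detector can be sketched as follows; the window length and threshold factor are our own illustrative choices, not the dissertation's.

```python
import numpy as np

def moving_variance(x, window):
    """Sliding-window variance computed from cumulative sums (O(n))."""
    x = np.asarray(x, dtype=float)
    c1 = np.cumsum(np.insert(x, 0, 0.0))
    c2 = np.cumsum(np.insert(x * x, 0, 0.0))
    mean = (c1[window:] - c1[:-window]) / window
    mean_sq = (c2[window:] - c2[:-window]) / window
    return mean_sq - mean ** 2

def flag_emg_noise(ecg, fs, window_s=0.1, factor=4.0):
    """Flag windows whose variance exceeds `factor` times the median
    window variance -- a crude stand-in for the EMG-noise detector."""
    w = max(2, int(window_s * fs))
    v = moving_variance(ecg, w)
    return v > factor * np.median(v)

# Synthetic check: quiet baseline with a 1 s burst of strong noise
rng = np.random.default_rng(1)
fs = 250
sig = 0.01 * rng.standard_normal(5 * fs)
sig[2 * fs:3 * fs] += rng.standard_normal(fs)
mask = flag_emg_noise(sig, fs)
```

EMG contamination shows up as a sustained jump in local variance relative to clean ECG, which is why a median-relative threshold separates the two regimes; the dissertation combines this cue with morphological filters to reach its reported 94%/100% sensitivity/specificity.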
False alarm reduction in critical care
High false alarm rates in the ICU decrease quality of care by slowing staff response times while increasing patient delirium through noise pollution. The 2015 PhysioNet/Computing in Cardiology Challenge provides a set of 1250 multi-parameter ICU data segments associated with critical arrhythmia alarms, and challenges the general research community to address the issue of false alarm suppression using all available signals. Each data segment was 5 minutes long (for real-time analysis), ending at the time of the alarm; for retrospective analysis, we provided a further 30 seconds of data after the alarm was triggered. A total of 750 data segments were made available for training and 500 were held back for testing. Each alarm was reviewed by expert annotators, at least two of whom agreed that the alarm was either true or false.

Challenge participants were invited to submit a complete, working algorithm to distinguish true from false alarms, and received a score based on their program's performance on the hidden test set. The score was based on the percentage of alarms classified correctly, with a penalty that weights the suppression of true alarms five times more heavily than the acceptance of false alarms. We provided three example entries based on well-known, open-source signal processing algorithms to serve as a basis for comparison and as a starting point for participants to develop their own code. A total of 38 teams submitted 215 entries in this year's Challenge.

This editorial reviews the background issues for the challenge, its design, the key achievements, and the follow-up research generated as a result, published in the concurrent special issue of Physiological Measurement. Additionally, we make some recommendations for future changes in the field of patient monitoring.

Funding: National Institutes of Health (U.S.) (Grant R01-GM104987); National Institute of General Medical Sciences (U.S.) (Grant U01-EB-008577); National Institutes of Health (U.S.) (Grant R01-EB-001659).
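The scoring rule described above can be written as a short function. This is our reading of the stated description (percentage of alarms correct, with suppressed true alarms weighted five times as heavily as accepted false alarms), not the official challenge code.

```python
def challenge_score(tp, fp, fn, tn):
    """Percentage of alarms correct, with missed true alarms (fn)
    penalised five times more heavily than accepted false alarms (fp)."""
    return 100.0 * (tp + tn) / (tp + tn + fp + 5.0 * fn)

# Suppressing one true alarm costs far more than passing one false alarm:
s_missed_true = challenge_score(99, 0, 1, 100)   # ~97.5
s_passed_false = challenge_score(100, 1, 0, 99)  # 99.5
```

The asymmetry encodes the clinical priority: silencing a genuine arrhythmia alarm is far more dangerous than letting a spurious one through.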
Adaptive Models-Based Cardiac Signals Analysis and Feature Extraction
Signal modeling and feature extraction are among the most crucial steps in
stochastic signal processing. In this thesis, a general framework that employs
adaptive model-based recursive Bayesian state estimation for signal processing
and feature extraction is described. As a case study, the proposed framework
is applied to the problem of cardiac signal analysis. The main objective is to
improve the signal processing of cardiac signals by developing new techniques
based on adaptive modelling of electrocardiogram (ECG) waveforms.
Specifically, several novel and improved approaches to model-based ECG
decomposition, waveform characterization, and feature extraction are proposed
and studied in detail. For ECG decomposition and waveform characterization,
the main idea is to extend and improve the signal dynamical models (i.e.,
reducing the non-linearity of the state model with respect to previous
solutions) and combine them with a Kalman smoother to increase the accuracy of
the model, in order to split the ECG signal into its waveform components; the
Kalman filter/smoother is the minimum mean square error (MMSE)-optimal
estimator for linear dynamical systems. The framework is used in many real
applications, such as ECG component extraction, ST-segment analysis
(estimation of a possible marker of ventricular repolarization known as the
T/QRS ratio), and T-wave alternans (TWA) detection, and its extension to many
other applications is straightforward.
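The MMSE-optimality claim applies to the standard Kalman recursion for a linear-Gaussian state-space model; a minimal filter (the forward half of the filter/smoother pair) might look as follows, with all variable names our own rather than the thesis's:

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """Kalman filter for x_{k+1} = A x_k + w_k, y_k = C x_k + v_k,
    with w_k ~ N(0, Q) and v_k ~ N(0, R).  Returns the filtered means.
    (A smoother would add a backward Rauch-Tung-Striebel pass.)"""
    x, P, means = x0, P0, []
    I = np.eye(len(x0))
    for yk in y:
        # Predict
        x = A @ x
        P = A @ P @ A.T + Q
        # Update
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (yk - C @ x)
        P = (I - K @ C) @ P
        means.append(x.copy())
    return np.array(means)

# Synthetic check: track a constant state observed in noise
rng = np.random.default_rng(0)
y = 5.0 + 0.5 * rng.standard_normal((200, 1))
eye = np.eye(1)
est = kalman_filter(y, eye, eye, 1e-6 * eye, 0.25 * eye, np.zeros(1), eye)
```

In the thesis's setting, the state collects the parameters of the ECG waveform components, and keeping the state model close to linear is precisely what lets this recursion stay near its MMSE-optimal regime.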
Based on the proposed framework, a novel model for the characterization of
Atrial Fibrillation (AF) is presented that is more effective than other
methods proposed with the same aims. In this model, ventricular activity (VA)
is represented by a sum of Gaussian kernels, while a sinusoidal model is
employed for atrial activity (AA). The new model tracks AA, VA, and the
fibrillatory frequency simultaneously, in contrast to other methods, which
analyze the atrial fibrillatory waves (f-waves) only after VA cancellation.
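The observation model described above (ventricular activity as a sum of Gaussian kernels plus a sinusoidal atrial component) can be sketched directly; the parameter names and values below are our own illustrative choices, not the thesis's:

```python
import numpy as np

def af_ecg_model(t, va_params, aa_amp, aa_freq, aa_phase):
    """Illustrative AF observation model: ventricular activity (VA) as a
    sum of Gaussian kernels plus a sinusoidal atrial (f-wave) component."""
    va = sum(a * np.exp(-((t - mu) ** 2) / (2.0 * b ** 2))
             for a, mu, b in va_params)
    aa = aa_amp * np.sin(2.0 * np.pi * aa_freq * t + aa_phase)
    return va + aa

# One beat: an R-like and a T-like Gaussian, plus a 6 Hz f-wave
t = np.linspace(0.0, 1.0, 501)
beat = af_ecg_model(t, [(1.0, 0.5, 0.02), (0.25, 0.75, 0.05)],
                    0.05, 6.0, 0.0)
```

Because both components live in one generative model, a state estimator can fit the Gaussian (VA) and sinusoidal (AA) parameters jointly, which is what allows the fibrillatory frequency to be tracked without first cancelling the ventricular activity.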
Furthermore, we study a new ECG processing method for assessing the spatial
heterogeneity of ventricular repolarization (SHVR) using the V-index, and a
novel algorithm to estimate the index is presented, leading to more accurate
estimates. The proposed algorithm was used to study the diagnostic and
prognostic value of the V-index in patients with symptoms suggestive of Acute
Myocardial Infarction (AMI).
Morphological Variability Analysis of Physiologic Waveform for Prediction and Detection of Diseases
For many years it has been known that variability of the morphology of high-resolution (∼30-1000 Hz) physiological time series data provides additional prognostic value over lower resolution (≤ 1Hz) derived averages such as heart rate (HR), breathing rate (BR) and blood pressure (BP). However, the field has remained rather ad hoc, based on hand-crafted features.
Using a model-based approach, we explore the nature of these features and their sensitivity to variability introduced by changes in both the sampling period (HR) and the observational reference frame (through breathing). HR and BR are shown to have a statistically significant confounding effect on the morphological variability (MV) evaluated in high-resolution physiological time series data, identifying an important gap in previous studies that ignored the effects of HR and BR when measuring MV. We build a best-in-class open-source toolbox for exploring MV that accounts for these confounding factors. We demonstrate the toolbox's utility in three domains on three different signals: arterial BP in sepsis; the photoplethysmogram in coarctation of the aorta; and the electrocardiogram (ECG) in post-traumatic stress disorder (PTSD). In each of the three case studies, incorporating features that capture MV while controlling for BR and/or HR improved disease classification performance compared with previously established methods that used features from lower-resolution time series data.
Using the PTSD example, we then introduce a deep learning approach that significantly improves our ability to identify the effects of PTSD on ECG morphology. In particular, we show that pre-training the algorithm on a database of over 70,000 ECGs covering a set of 25 rhythms boosted performance from an area under the receiver operating characteristic curve (AUROC) of 0.61 to 0.85. This novel approach to identifying morphology indicates that there is much more to morphological variability during stressful PTSD-related events than simple periodic modulation of the T-wave amplitude. Future work should focus on identifying the etiology of the dynamic ECG features that provided such a large boost in performance, since this may reveal novel underlying mechanisms of the influence of PTSD on the myocardium.