Deep Learning in Cardiology
The medical field is creating large amount of data that physicians are unable
to decipher and use efficiently. Moreover, rule-based expert systems are
inefficient in solving complicated medical tasks or for creating insights using
big data. Deep learning has emerged as a more accurate and effective technology
in a wide range of medical problems such as diagnosis, prediction and
intervention. Deep learning is a representation learning method that consists
of layers that transform the data non-linearly, thus, revealing hierarchical
relationships and structures. In this review we survey deep learning
application papers that use structured data, signal and imaging modalities from
cardiology. We discuss the advantages and limitations of applying deep learning
in cardiology that also apply in medicine in general, while proposing certain
directions as the most viable for clinical use.
Comment: 27 pages, 2 figures, 10 tables
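The "layers that transform the data non-linearly" described in this abstract can be sketched minimally. The two-layer forward pass below, with invented weights and a tanh activation, only illustrates the stacking of non-linear transforms; it is not any model surveyed in the review.

```python
import math

def dense_tanh(x, W, b):
    # One representation-learning layer: an affine map followed by a
    # non-linearity, so that stacked layers compose non-linear transforms.
    return [math.tanh(sum(wij * xj for wij, xj in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

# Toy 3-feature input (e.g. three normalized clinical measurements).
x = [0.2, -0.5, 1.0]

# Hypothetical weights; a real model would learn these from data.
W1 = [[0.5, -0.3, 0.8], [0.1, 0.9, -0.4]]   # layer 1: 3 -> 2
b1 = [0.0, 0.1]
W2 = [[1.2, -0.7]]                           # layer 2: 2 -> 1
b2 = [0.05]

h = dense_tanh(x, W1, b1)   # hidden representation
y = dense_tanh(h, W2, b2)   # final non-linear feature
```

Each layer's output is a new representation of its input, which is what makes the hierarchy of relationships the abstract refers to learnable.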
Systems and methods for physiological signal enhancement and biometric extraction using non-invasive optical sensors
A system and method for signal processing to remove unwanted noise components including: (i) wavelength-independent motion artifacts such as tissue, bone and skin effects, and (ii) wavelength-dependent motion artifact/noise components such as venous blood pulsation and movement due to various sources including muscle pump, respiratory pump and physical perturbation. Disclosed are methods, analytics, and their uses for reliable perfusion monitoring, arterial oxygen saturation monitoring, heart rate monitoring during daily activities and in hospital settings and for extraction of physiological parameters such as respiration information, hemodynamic parameters, venous capacity, and fluid responsiveness. The system and methods disclosed are extendable to include monitoring platforms for perfusion, hypoxia, arrhythmia detection, airway obstruction detection and sleep disorders including apnea.
Board of Regents, University of Texas System
Real Time QRS Detection Based on M-ary Likelihood Ratio Test on the DFT Coefficients
This paper presents an adaptive statistical test for QRS detection in electrocardiogram (ECG) signals. The method is based on an M-ary generalized likelihood ratio test (LRT) defined over a multiple observation window in the Fourier domain. The motivation for proposing another detection algorithm based on maximum a posteriori (MAP) estimation lies in the high complexity of the signal models proposed in previous approaches, which (i) makes them computationally unfeasible or unsuitable for real-time applications such as intensive care monitoring, and (ii) makes their overall performance dependent on parameter selection. We therefore propose an alternative model based on the independent Gaussian properties of the discrete Fourier transform (DFT) coefficients, which allows a simplified MAP probability function to be defined. In addition, the proposed approach defines an adaptive MAP statistical test in which a global hypothesis is built from particular hypotheses over the multiple observation window. The observation interval is modeled as a discontinuous-transmission discrete-time stochastic process, avoiding the inclusion of parameters that constrain the morphology of the QRS complexes.
This work received research funding from the Spanish government (www.micinn.es) under project TEC2012-34306 (DiagnoSIS, Diagnosis by means of Statistical Intelligent Systems, 70K€) and projects P09-TIC-4530 (300K€) and P11-TIC-7103 (156K€) from the Andalusian government (http://www.juntadeandalucia.es/organismos/economiainnovacioncienciayempleo.html)
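As a rough illustration of the idea (not the paper's actual statistic): if the DFT coefficients of a window are modeled as independent zero-mean Gaussians whose variance is larger when a QRS complex is present, the log-likelihood ratio reduces to a thresholded energy statistic. The variances and threshold below are invented for the sketch.

```python
import math

def dft_mag2(x):
    # Naive DFT magnitude-squared (O(N^2)); fine for short windows.
    N = len(x)
    out = []
    for k in range(N):
        re = sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
        im = -sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
        out.append(re * re + im * im)
    return out

def log_lrt(window, sigma0=1.0, sigma1=5.0):
    # H0 (no QRS): DFT coefficients ~ N(0, sigma0^2);
    # H1 (QRS present): ~ N(0, sigma1^2).  With independent Gaussian
    # coefficients the log-likelihood ratio is an affine function of
    # the window energy (Parseval).
    c = 0.5 * (1.0 / sigma0 ** 2 - 1.0 / sigma1 ** 2)
    N = len(window)
    energy = sum(dft_mag2(window)) / N
    return c * energy + N * math.log(sigma0 / sigma1)

def qrs_present(window, threshold=0.0):
    return log_lrt(window) > threshold

quiet = [0.0] * 16
burst = [5.0 * math.sin(2 * math.pi * 3 * n / 16) for n in range(16)]
```

The paper's contribution lies in making the per-window hypotheses and their combination adaptive, which this static two-variance sketch omits.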
A Real-Time Automated Point-Process Method for the Detection and Correction of Erroneous and Ectopic Heartbeats
The presence of recurring arrhythmic events (also known as cardiac dysrhythmia or irregular heartbeats), as well as erroneous beat detection due to low signal quality, significantly affects estimation of both time and frequency domain indices of heart rate variability (HRV). A reliable, real-time classification and correction of ECG-derived heartbeats is a necessary prerequisite for an accurate online monitoring of HRV and cardiovascular control. We have developed a novel point-process-based method for real-time R-R interval error detection and correction. Given an R-wave event, we assume that the length of the next R-R interval follows a physiologically motivated, time-varying inverse Gaussian probability distribution. We then devise an instantaneous automated detection and correction procedure for erroneous and arrhythmic beats by using the information on the probability of occurrence of the observed beat provided by the model. We test our algorithm over two datasets from the PhysioNet archive. The Fantasia normal rhythm database is artificially corrupted with known erroneous beats to test both the detection procedure and correction procedure. The benchmark MIT-BIH Arrhythmia database is further considered to test the detection procedure of real arrhythmic events and compare it with results from previously published algorithms. Our automated algorithm represents an improvement over previous procedures, with best specificity for the detection of correct beats, as well as highest sensitivity to missed and extra beats, artificially misplaced beats, and for real arrhythmic events. A near-optimal heartbeat classification and correction, together with the ability to adapt to time-varying changes of heartbeat dynamics in an online fashion, may provide a solid base for building a more reliable real-time HRV monitoring device.
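A heavily simplified sketch of the core check: score each new R-R interval under an inverse Gaussian density and flag beats whose probability is implausibly low. The parameter values are invented, and the distribution is held fixed here, whereas the paper's model is time-varying.

```python
import math

def inv_gauss_pdf(x, mu, lam):
    # Inverse Gaussian density; mu = mean R-R interval (s),
    # lam = shape parameter controlling spread.
    return math.sqrt(lam / (2.0 * math.pi * x ** 3)) * \
        math.exp(-lam * (x - mu) ** 2 / (2.0 * mu ** 2 * x))

def classify_beat(rr, mu=0.8, lam=50.0, p_min=0.05):
    # Flag an R-R interval whose density under the model falls below
    # p_min as erroneous or ectopic; a correction step could then
    # replace it with the model's expectation mu.
    return "ok" if inv_gauss_pdf(rr, mu, lam) >= p_min else "suspect"
```

With these made-up settings, a plausible 0.8 s interval passes, while a 0.2 s interval (extra beat) or a 1.6 s interval (missed beat) is flagged.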
Support vector method for robust ARMA system identification
This paper presents a new approach to auto-regressive and moving average (ARMA) modeling based on the support vector method (SVM) for identification applications. A statistical analysis of the characteristics of the proposed method is carried out. An analytical relationship between residuals and SVM-ARMA coefficients allows the linking of the fundamentals of SVM with several classical system identification methods. Additionally, the effect of outliers can be cancelled. Application examples show the performance of the SVM-ARMA algorithm when it is compared with other system identification methods.
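For context, the classical baseline such methods build on can be sketched as an ordinary least-squares AR(2) fit via the normal equations. The paper's contribution is to replace the quadratic loss with the SVM's ε-insensitive loss so that outliers are cancelled, which this plain least-squares sketch does not do; coefficients and data below are invented.

```python
def fit_ar2_ls(x):
    # Ordinary least-squares AR(2) fit, x[n] ~ a1*x[n-1] + a2*x[n-2],
    # solved via the 2x2 normal equations.
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(x)):
        u, v, y = x[n - 1], x[n - 2], x[n]
        s11 += u * u; s12 += u * v; s22 += v * v
        b1 += u * y;  b2 += v * y
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (b2 * s11 - b1 * s12) / det
    return a1, a2

# Noise-free AR(2) data with known coefficients (0.6, 0.2).
x = [1.0, 0.5]
for n in range(2, 40):
    x.append(0.6 * x[-1] + 0.2 * x[-2])

a1, a2 = fit_ar2_ls(x)
```

On clean data the quadratic loss recovers the true coefficients exactly; a single large outlier in x would pull these estimates away, which is precisely the failure mode the ε-insensitive loss is designed to suppress.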
Blind Source Separation for the Processing of Contact-Less Biosignals
(Spatio-temporal) Blind Source Separation (BSS) provides large potential for processing distorted multichannel biosignal measurements in the context of novel contact-less recording techniques, separating distortions from the cardiac signal of interest. This potential can only be practically utilized (1) if a BSS model is applied that matches the complexity of the measurement, i.e. the signal mixture, and (2) if permutation indeterminacy is solved among the BSS output components, i.e. the component of interest can be practically selected. The present work first designs a framework to assess the efficacy of BSS algorithms in the context of the camera-based photoplethysmogram (cbPPG) and characterizes multiple BSS algorithms accordingly.
Algorithm selection recommendations for certain mixture characteristics are derived. Second, the present work develops and evaluates concepts to solve permutation indeterminacy for BSS outputs of contact-less electrocardiogram (ECG) recordings. The novel approach based on sparse coding is shown to outperform the existing concepts of higher-order moments and frequency-domain features.
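As a toy illustration of the permutation-indeterminacy problem (using the higher-order-moment heuristic that this work improves on, not its sparse-coding method): pick the BSS output with the highest excess kurtosis, since a spiky ECG-like component is far more super-Gaussian than smooth distortions. The signals below are synthetic.

```python
import math

def excess_kurtosis(s):
    # Fourth standardized moment minus 3 (zero for a Gaussian).
    n = len(s)
    m = sum(s) / n
    var = sum((v - m) ** 2 for v in s) / n
    return sum((v - m) ** 4 for v in s) / (n * var ** 2) - 3.0

def select_component(components):
    # Resolve permutation indeterminacy by choosing the most
    # super-Gaussian (spiky, ECG-like) BSS output component.
    return max(range(len(components)),
               key=lambda i: excess_kurtosis(components[i]))

# Synthetic BSS outputs: a smooth sinusoidal distortion vs. a sparse
# spike train mimicking R peaks over a small residual oscillation.
sine = [math.sin(2 * math.pi * n / 50) for n in range(500)]
spikes = [1.0 if n % 100 == 0 else 0.02 * math.sin(n) for n in range(500)]
```

Here the spike train wins by a wide kurtosis margin; the thesis shows that sparse-coding-based selection is more reliable than such moment-based criteria on real contact-less recordings.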
Wearable Technologies and AI at the Far Edge for Chronic Heart Failure Prevention and Management: A Systematic Review and Prospects
Smart wearable devices enable personalized at-home healthcare by unobtrusively collecting patient health data and facilitating the development of intelligent platforms to support patient care and management. The accurate analysis of data obtained from wearable devices is crucial for interpreting and contextualizing health data and facilitating the reliable diagnosis and management of critical and chronic diseases. The combination of edge computing and artificial intelligence has provided real-time, time-critical, and privacy-preserving data analysis solutions. However, based on the envisioned service, evaluating the additive value of edge intelligence to the overall architecture is essential before implementation. This article aims to comprehensively analyze the current state of the art on smart health infrastructures implementing wearable and AI technologies at the far edge to support patients with chronic heart failure (CHF). In particular, we highlight the contribution of edge intelligence in supporting the integration of wearable devices into IoT-aware technology infrastructures that provide services for patient diagnosis and management. We also offer an in-depth analysis of open challenges and provide potential solutions to facilitate the integration of wearable devices with edge AI solutions to provide innovative technological infrastructures and interactive services for patients and doctors
Wavelet diagnosis of ECG signals with kaiser based noise diminution
The evaluation of distortion diagnosis using wavelet functions for electrocardiogram (ECG), electroencephalogram (EEG) and phonocardiography (PCG) signals is not novel. However, some of the technological and economic issues remain challenging. The work in this paper focuses on reducing noise interferences and analyzing different kinds of ECG signals. Furthermore, a physiological monitoring system with a programming model for ECG filtration is presented. A Kaiser-window-based Finite Impulse Response (FIR) filter is used for noise reduction and identification of R peaks based on a Peak Detection Algorithm (PDA). Two approaches are implemented for detecting the R peaks: Amplitude Threshold Value (ATV) and Peak Prediction Technique (PPT). The Daubechies wavelet transform is applied to analyze ECG signals from drivers under stress as well as arrhythmia and sudden cardiac arrest signals. From the obtained results, PPT proved more effective and efficient than ATV in detecting the R peaks.
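A minimal sketch of the amplitude-threshold (ATV) idea on an already-filtered signal, with made-up threshold and refractory settings; the paper's Kaiser FIR filtering and PPT stages are omitted.

```python
def detect_r_peaks_atv(sig, fs, threshold=0.5, refractory_s=0.2):
    # Mark local maxima above an amplitude threshold, then skip a
    # refractory window after each detection so a single QRS complex
    # is not counted twice.
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for n in range(1, len(sig) - 1):
        if (sig[n] > threshold and sig[n] >= sig[n - 1]
                and sig[n] > sig[n + 1] and n - last >= refractory):
            peaks.append(n)
            last = n
    return peaks

# Synthetic filtered ECG: unit-height spikes at known sample indices.
fs = 250
sig = [0.0] * 800
for p in (100, 350, 600):
    sig[p] = 1.0
```

A fixed threshold like this is exactly what makes ATV fragile under baseline wander or low-amplitude QRS complexes, which is why the paper finds the prediction-based PPT more reliable.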