2,225 research outputs found

    QRS detection based on medical knowledge and cascades of moving average filters

    Heartbeat detection is the first step in automatic analysis of the electrocardiogram (ECG). For mobile and wearable devices, the detection process should be both accurate and computationally efficient. In this paper, we present a QRS detection algorithm based on moving average filters, affording a simple yet robust signal processing technique. The decision logic considers the rhythmic and morphological features of the QRS complex. QRS enhancement is performed with channel-specific moving average cascades selected from a pool of derivative systems we designed. We measured the effectiveness of our algorithm on well-known benchmark databases, reporting F1 scores, sensitivity on abnormal beats and processing time. We also evaluated other available detectors with the same criteria for a direct comparison. The algorithm we propose achieved performance on par with or higher than the other QRS detectors. Although the results we report are not the highest published so far, our approach to QRS detection improves computational efficiency while maintaining high accuracy.
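    The abstract does not reproduce the processing chain, but the general pattern it builds on, QRS enhancement via a derivative and squaring followed by a cascade of moving averages and an adaptive comparison against a slower moving average, can be sketched as below. The window lengths, refractory period and helper names are illustrative assumptions, not the authors' selected cascades:

```python
import numpy as np

def moving_average(x, width):
    """Moving average implemented as a simple convolution."""
    return np.convolve(x, np.ones(width) / width, mode="same")

def detect_qrs(ecg, fs):
    """Toy QRS detector: derivative -> squaring -> moving-average cascade
    -> comparison against a slower moving average as adaptive threshold."""
    slope = np.diff(ecg, prepend=ecg[0])                 # emphasize steep QRS slopes
    energy = slope ** 2                                  # rectify
    qrs_level = moving_average(energy, int(0.10 * fs))   # ~QRS-width window
    beat_level = moving_average(energy, int(0.60 * fs))  # slower baseline trend
    rising = np.flatnonzero(np.diff((qrs_level > beat_level).astype(int)) == 1)
    peaks, refractory = [], int(0.25 * fs)               # suppress double detections
    for start in rising:
        end = min(start + int(0.15 * fs), len(ecg))
        peak = start + int(np.argmax(np.abs(ecg[start:end])))
        if not peaks or peak - peaks[-1] > refractory:
            peaks.append(peak)
    return np.array(peaks)
```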

    The hidden waves in the ECG uncovered: a sound automated interpretation method

    A novel approach for analysing cardiac rhythm data is presented in this paper. Heartbeats are decomposed into the five fundamental P, Q, R, S and T waves plus an error term to account for artefacts in the data, which provides a meaningful, physical interpretation of the heart's electrical system. The morphology of each wave is concisely described using four parameters that allow all the different patterns in heartbeats to be characterized and thus differentiated. This multi-purpose approach addresses such questions as the extraction of interpretable features, the detection of the fiducial marks of the fundamental waves, the generation of synthetic data and the denoising of signals. Yet the greatest benefit of this new approach will be the automatic diagnosis of heart anomalies, as well as other clinical uses, with great advantages compared to the rigid, vulnerable and black-box machine learning procedures widely used in medical devices. The paper shows the enormous potential of the method in practice; specifically, the capabilities to discriminate subjects, characterize morphologies and detect the fiducial marks (reference points) are validated numerically using simulated and real data, showing that it outperforms its competitors.
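    The four-parameter wave description referenced here corresponds to the authors' FMM (frequency-modulated Möbius) formulation, in which each wave has an amplitude A, location alpha, shape beta and width omega. A minimal sketch of that waveform model follows; the parameter values are illustrative, not fitted:

```python
import numpy as np

def fmm_wave(t, A, alpha, beta, omega):
    """One FMM wave over the beat phase t in [0, 2*pi):
    A * cos(beta + 2*arctan(omega * tan((t - alpha) / 2)))."""
    return A * np.cos(beta + 2 * np.arctan(omega * np.tan((t - alpha) / 2)))

def fmm_heartbeat(t, wave_params, intercept=0.0):
    """A heartbeat as the sum of the five P, Q, R, S, T waves plus an intercept."""
    return intercept + sum(fmm_wave(t, *p) for p in wave_params)

t = np.linspace(0, 2 * np.pi, 500, endpoint=False)
illustrative = [             # (A, alpha, beta, omega) per wave -- not fitted values
    (0.2, 1.6, -2.5, 0.25),  # P
    (-0.3, 2.9, 3.0, 0.05),  # Q
    (1.0, 3.1, -0.2, 0.03),  # R
    (-0.4, 3.3, 0.5, 0.05),  # S
    (0.3, 5.0, -2.8, 0.20),  # T
]
beat = fmm_heartbeat(t, illustrative)
```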

    FMM-Head: Enhancing Autoencoder-based ECG anomaly detection with prior knowledge

    Detecting anomalies in electrocardiogram data is crucial for identifying deviations from normal heartbeat patterns and providing timely intervention to at-risk patients. Various autoencoder (AE) models have been proposed to tackle the anomaly detection task with ML. However, these models do not consider the specific patterns of ECG leads and are unexplainable black boxes. In contrast, we replace the decoding part of the AE with a reconstruction head (namely, FMM-Head) based on prior knowledge of the ECG shape. Our model consistently achieves higher anomaly detection capability than state-of-the-art models, up to a 0.31 increase in area under the ROC curve (AUROC), with as little as half the original model size and explainable extracted features. The processing time of our model is four orders of magnitude lower than solving an optimization problem to obtain the same parameters, making it suitable for real-time ECG parameter extraction and anomaly detection.
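    The replaced-decoder idea can be sketched as follows: an ordinary encoder compresses the beat, and the head decodes the latent vector into per-wave FMM parameters that are rendered analytically, so the reconstruction error doubles as an anomaly score. The encoder, layer sizes and parameter constraints below are assumptions for illustration, not the paper's architecture:

```python
import torch
import torch.nn as nn

class FMMHead(nn.Module):
    """Decodes a latent vector into 4 parameters per wave (A, alpha, beta,
    omega) and renders the beat analytically, replacing a learned decoder."""
    def __init__(self, latent_dim, beat_len=500, n_waves=5):
        super().__init__()
        self.n_waves = n_waves
        self.to_params = nn.Linear(latent_dim, 4 * n_waves)
        self.register_buffer("t", torch.linspace(0.0, 2 * torch.pi, beat_len))

    def forward(self, z):
        p = self.to_params(z).view(-1, self.n_waves, 4)
        A, alpha, beta = p[..., 0:1], p[..., 1:2], p[..., 2:3]
        omega = torch.sigmoid(p[..., 3:4])              # keep the width in (0, 1)
        phase = 2 * torch.atan(omega * torch.tan((self.t - alpha) / 2))
        return (A * torch.cos(beta + phase)).sum(dim=1)  # sum of the 5 waves

encoder = nn.Sequential(nn.Flatten(), nn.Linear(500, 64), nn.ReLU(),
                        nn.Linear(64, 16))
model = nn.Sequential(encoder, FMMHead(latent_dim=16))
# anomaly score = reconstruction error, e.g. ((model(x) - x) ** 2).mean(dim=-1)
```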

    Novel hybrid extraction systems for fetal heart rate variability monitoring based on non-invasive fetal electrocardiogram

    This study focuses on the design, implementation and subsequent verification of a new type of hybrid extraction system for non-invasive fetal electrocardiogram (NI-fECG) processing. The system designed combines the advantages of individual adaptive and non-adaptive algorithms. The pilot study reviews two innovative hybrid systems, called ICA-ANFIS-WT and ICA-RLS-WT: combinations of independent component analysis (ICA) with either an adaptive neuro-fuzzy inference system (ANFIS) or a recursive least squares (RLS) algorithm, followed by a wavelet transform (WT). The study was conducted on clinical practice data (the extended ADFECGDB database and the PhysioNet Challenge 2013 database) from the perspective of non-invasive fetal heart rate variability monitoring, based on the overall probability of correct detection (ACC), sensitivity (SE), positive predictive value (PPV) and the harmonic mean of SE and PPV (F1). System functionality was verified against a relevant reference obtained invasively with a scalp electrode (ADFECGDB database) or a relevant reference obtained from annotations (PhysioNet Challenge 2013 database). The study showed that the ICA-RLS-WT hybrid system achieves better results than ICA-ANFIS-WT. In the experiment on the ADFECGDB database, the ICA-RLS-WT hybrid system reached ACC > 80 % on 9 recordings out of 12, while the ICA-ANFIS-WT hybrid system reached ACC > 80 % on only 6 recordings out of 12. In the experiment on the PhysioNet Challenge 2013 database, the ICA-RLS-WT hybrid system reached ACC > 80 % on 13 recordings out of 25, while the ICA-ANFIS-WT hybrid system reached ACC > 80 % on only 7 recordings out of 25. Both hybrid systems achieved demonstrably better results than the individual algorithms tested in previous studies.
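    The four evaluation metrics named here are standard in beat-detection studies. Assuming the usual definitions in terms of true positives (TP), false positives (FP) and false negatives (FN), where ACC is typically computed without true negatives because every sample is a potential beat, they reduce to:

```python
def detection_metrics(tp, fp, fn):
    """SE, PPV, F1 and ACC from matched/missed/spurious beat counts."""
    se = tp / (tp + fn)                   # sensitivity
    ppv = tp / (tp + fp)                  # positive predictive value
    f1 = 2 * se * ppv / (se + ppv)        # harmonic mean of SE and PPV
    acc = tp / (tp + fp + fn)             # overall probability of correct detection
    return acc, se, ppv, f1

print(detection_metrics(tp=950, fp=30, fn=50))   # illustrative counts only
```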

    Multiscale Cohort Modeling of Atrial Electrophysiology: Risk Stratification for Atrial Fibrillation through Machine Learning on Electrocardiograms

    Patients with atrial fibrillation face a five-fold increased risk of ischaemic stroke. Early detection and diagnosis of the arrhythmia would allow timely intervention to prevent potential comorbidities. Left atrial enlargement and fibrotic atrial tissue are risk markers for atrial fibrillation, since they provide the substrate necessary to sustain the chaotic electrical depolarization in the atrium. Machine learning techniques could automatically identify fibrosis and left atrial enlargement from P waves of the 12-lead electrocardiogram recorded in sinus rhythm. This could form the basis for non-invasive risk stratification of new-onset atrial fibrillation episodes, selecting susceptible patients for preventive screening. To this end, we investigated whether simulated atrial electrocardiogram data added to the clinical training set of a machine learning model could improve the classification of the above diseases on clinical data. Two virtual cohorts characterized by anatomical and functional variability were generated and served as the basis for simulating large P wave datasets with precisely determinable annotations of the underlying pathology. In this way, the simulated data fulfill the prerequisites for developing a machine learning algorithm, which distinguishes them from clinical data, which are usually neither available in large numbers nor in evenly distributed classes, and whose annotations may be compromised by insufficient expert labelling. A feature-based neural network was developed to estimate the volume fraction of left atrial fibrotic tissue. Compared to training the model on clinical data alone, training on a hybrid dataset reduced the error from an average of 17.5 % fibrotic volume to 16.5 %, evaluated on a purely clinical test set. A long short-term memory network developed to distinguish healthy P waves from those of enlarged left atria achieved an accuracy of 0.95 when trained on a hybrid dataset, 0.91 when trained only on clinical data annotated with 100 % certainty, and 0.83 when trained on a clinical dataset containing all signals regardless of the certainty of the expert annotation. Considering the results of this work, electrocardiogram data derived from electrophysiological modeling and simulation of virtual patient cohorts, covering relevant aspects of variability consistent with real-world observations, can be a valuable data source for improving automated risk stratification of atrial fibrillation. In this way, the drawbacks of clinical datasets for the development of machine learning models can be counteracted. This ultimately contributes to early detection of the arrhythmia, enabling timely selection of suitable treatment strategies and thus reducing the stroke risk of affected patients.
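    The LSTM classifier described above is not specified in detail in the abstract; a minimal sketch of such a binary P-wave classifier might look as follows, with layer sizes and input formatting as illustrative assumptions rather than the thesis's configuration:

```python
import torch
import torch.nn as nn

class PWaveLSTM(nn.Module):
    """Binary P-wave classifier: healthy vs. enlarged left atrium.
    Hidden size and input layout are illustrative assumptions."""
    def __init__(self, n_leads=12, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_leads, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, time steps, leads)
        _, (h, _) = self.lstm(x)          # final hidden state summarizes the wave
        return self.head(h[-1])           # logit for "enlarged left atrium"

model = PWaveLSTM()
logits = model(torch.randn(8, 120, 12))   # 8 P-wave segments, 120 samples each
```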

    A Cluster-Based Opposition Differential Evolution Algorithm Boosted by a Local Search for ECG Signal Classification

    Electrocardiogram (ECG) signals, which capture the heart's electrical activity, are used to diagnose and monitor cardiac problems. The accurate classification of ECG signals, particularly for distinguishing among various types of arrhythmias and myocardial infarctions, is crucial for the early detection and treatment of heart-related diseases. This paper proposes a novel approach based on an improved differential evolution (DE) algorithm to enhance ECG signal classification performance. In the initial stages of our approach, a preprocessing step is followed by the extraction of several significant features from the ECG signals. These extracted features are then provided as inputs to an enhanced multi-layer perceptron (MLP). While MLPs are still widely used for ECG signal classification, gradient-based training methods, the most widely used approach to training them, have significant disadvantages, such as the possibility of becoming stuck in local optima. This paper instead employs an enhanced DE algorithm, one of the most effective population-based algorithms, for the training process. To this end, we improved DE with a clustering-based strategy, opposition-based learning, and a local search. The clustering-based strategy acts as a crossover operator, while the goal of the opposition operator is to improve the exploration of the DE algorithm. The weights and biases found by the improved DE algorithm are then fed into six gradient-based local search algorithms; in other words, the weights found by DE are employed as an initialization point. We therefore introduce six different training algorithms, one per local search algorithm. In an extensive set of experiments, we show that our proposed training algorithm provides better results than conventional training algorithms.
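    To make the training scheme concrete, here is a minimal sketch of classic DE/rand/1/bin with an opposition-based initialization step, where `loss` evaluates a flattened MLP weight vector on the extracted ECG features. The paper's clustering-based crossover and the six gradient-based local searches are deliberately omitted; the hyperparameter values are illustrative assumptions:

```python
import numpy as np

def opposition(pop, lo, hi):
    """Opposition-based learning: reflect candidates across the search range
    to widen exploration."""
    return lo + hi - pop

def de_train(loss, dim, pop_size=30, iters=200, lo=-1.0, hi=1.0,
             F=0.5, CR=0.9, seed=0):
    """DE/rand/1/bin with opposition-based initialization (a sketch)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    pop = np.vstack([pop, opposition(pop, lo, hi)])
    pop = pop[np.argsort([loss(x) for x in pop])][:pop_size]  # keep best half
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            trial = np.where(rng.random(dim) < CR, a + F * (b - c), pop[i])
            if loss(trial) < loss(pop[i]):            # greedy selection
                pop[i] = trial
    best = min(pop, key=loss)
    return best   # would then seed a gradient-based local search (e.g. L-BFGS)
```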

    Neural architecture search for 1D CNNs - Different approaches, tests and measurements

    In the field of sensors, in areas such as industry, clinical practice, or the environment, it is common to find one-dimensional (1D) data (e.g., electrocardiogram, temperature, power consumption). A very promising technique for modelling this information is the use of one-dimensional convolutional neural networks (1D CNNs), which introduces a new challenge, namely how to define the best architecture for a 1D CNN. This manuscript addresses the concept of one-dimensional neural architecture search (1D NAS), an approach that automates the search for the best combination of neural network hyperparameters (model architecture), including both structural and training hyperparameters, for optimising 1D CNNs. This work includes the implementation of search processes for 1D CNN architectures based on five strategies, greedy, random, Bayesian, hyperband, and genetic approaches, together with the collection and analysis of the results obtained by each strategy. For the analysis, we conducted 125 experiments, followed by a thorough evaluation from multiple perspectives, including the best-performing model in terms of accuracy, consistency, variability, total running time, and computational resource consumption. Finally, by presenting the optimised 1D CNN architecture, results for the manuscript's research question (a real-life clinical case) were provided.
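    Of the five strategies, random search is the simplest to state, and sketching it clarifies what a 1D NAS loop iterates over. The search space below is an illustrative assumption, not the manuscript's grid, and `build_and_score` stands in for training a candidate 1D CNN and returning its validation accuracy:

```python
import random

# Illustrative 1D-CNN search space (structural + training hyperparameters):
SPACE = {
    "n_conv_layers": [1, 2, 3, 4],
    "filters":       [8, 16, 32, 64],
    "kernel_size":   [3, 5, 7, 11],
    "dropout":       [0.0, 0.25, 0.5],
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "batch_size":    [16, 32, 64],
}

def random_search(build_and_score, budget=25, seed=0):
    """Sample architectures at random and keep the most accurate one."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(budget):
        cfg = {k: rng.choice(v) for k, v in SPACE.items()}
        score = build_and_score(cfg)   # train the candidate, return accuracy
        if score > best_score:
            best, best_score = cfg, score
    return best, best_score
```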

    Combining Synthesis of Cardiorespiratory Signals and Artifacts with Deep Learning for Robust Vital Sign Estimation

    Healthcare has been remarkably morphing on account of Big Data. As machine learning (ML) consolidates its place in simpler clinical chores, more complex deep learning (DL) algorithms have struggled to keep up, despite their superior capabilities. This is mainly attributed to the need for large amounts of training data, which the scientific community is unable to satisfy. The number of promising DL algorithms is considerable, but solutions directly targeting the shortage of data are lacking. Currently, dynamical generative models are the best bet, but they focus on single, classical modalities and tend to grow significantly more complicated with the number of physiological effects they can simulate. This thesis aims at providing and validating a framework specifically addressing the data deficit in the scope of cardiorespiratory signals. Firstly, a multimodal statistical synthesizer was designed to generate large, annotated artificial signals. By expressing data through coefficients of pre-defined, fitted functions and describing their dependence with Gaussian copulas, inter- and intra-modality associations were learned. Thereafter, new coefficients are sampled to generate artificial, multimodal signals with the original physiological dynamics. Moreover, normal and pathological beats along with artifacts were included by employing Markov models. Secondly, a convolutional neural network (CNN) was conceived with a novel sensor-fusion architecture and trained with synthesized data under real-world experimental conditions to evaluate how its performance is affected. Both the synthesizer and the CNN not only performed at state-of-the-art level but also innovated, with multiple types of generated data and detection error improvements, respectively. Cardiorespiratory data augmentation corrected performance drops when not enough data was available, and enhanced the CNN's ability to perform on noisy signals and to carry out new tasks when introduced to otherwise unavailable types of data. Ultimately, the framework was successfully validated, showing potential to carry future DL research on cardiology into clinical standards.
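    The Gaussian-copula step, learning the dependence between fitted waveform coefficients and then sampling new coefficient vectors that preserve both the marginals and the dependence, can be sketched as follows. The empirical-CDF treatment of the marginals is an assumption for illustration, not necessarily the thesis's exact construction:

```python
import numpy as np
from scipy import stats

def fit_gaussian_copula(coeffs):
    """coeffs: (n_beats, n_coefficients) matrix of fitted waveform coefficients.
    Maps each coefficient to uniforms via its empirical CDF, then to normals,
    and learns the correlation in that latent normal space."""
    u = (stats.rankdata(coeffs, axis=0) - 0.5) / len(coeffs)
    z = stats.norm.ppf(u)
    return np.corrcoef(z, rowvar=False), np.sort(coeffs, axis=0)

def sample_gaussian_copula(corr, sorted_coeffs, n, seed=0):
    """Draw new coefficient vectors preserving marginals and dependence."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(corr.shape[0]), corr, size=n)
    u = stats.norm.cdf(z)
    idx = (u * (len(sorted_coeffs) - 1)).astype(int)  # inverse empirical CDF
    return np.take_along_axis(sorted_coeffs, idx, axis=0)
```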

    Electrocardiogram pattern recognition and analysis based on artificial neural networks and support vector machines: a review.

    Computer systems for Electrocardiogram (ECG) analysis support the clinician in tedious tasks (e.g., Holter ECG monitoring in Intensive Care Units) or in prompt detection of dangerous events (e.g., ventricular fibrillation). Together with clinical applications (arrhythmia detection and heart rate variability analysis), ECG is currently being investigated in biometrics (human identification), an emerging area receiving increasing attention. Methodologies for clinical applications can have both differences and similarities with respect to biometrics. This paper reviews methods of ECG processing from a pattern recognition perspective. In particular, we focus on features commonly used for heartbeat classification. Considering the vast literature in the field and the limited space of this review, we dedicate a detailed discussion only to a few classifiers (Artificial Neural Networks and Support Vector Machines) because of their popularity; however, other techniques, such as Hidden Markov Models and Kalman Filtering, are also mentioned.
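    As a concrete instance of the pattern-recognition pipeline the review surveys, a heartbeat classifier typically combines hand-crafted morphology and RR-interval features with an SVM. The feature choices and model settings below are illustrative assumptions, not a method from the review:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def beat_features(beat, rr_prev, rr_next):
    """Typical hand-crafted features: down-sampled beat morphology around
    the R peak plus RR-interval context (illustrative choices)."""
    morph = beat[::4]
    return np.concatenate([morph, [rr_prev, rr_next, rr_prev / rr_next]])

# X: (n_beats, n_features) feature matrix, y: beat class labels
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
# clf.fit(X_train, y_train); y_pred = clf.predict(X_test)
```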

    An efficient instantaneous ECG delineation algorithm

    An efficient electrocardiogram (ECG) delineation algorithm is proposed to instantaneously delineate the ECG characteristic points, such as the peak, onset and offset points of the QRS complex and the P and T waves. It is essential to delineate the ECG characteristic waves accurately and precisely, as this ensures the performance of ECG analysis and diagnosis. The proposed delineation algorithm is based on discrete wavelet transform (DWT) and moving window average (MWA) techniques. It is evaluated and assessed against the annotation data of the QT database in terms of accuracy, sensitivity and positive predictive value. With the only 13 available QT database records containing modified Lead II data, the proposed algorithm achieved P peak, R peak, T peak and T offset delineation accuracies of 95.34%, 99.80%, 90.82% and 86.33% respectively when evaluated against the q1c annotation file. The mean difference between the detected and annotated T offset is 13 ms based on q1c and 3.6 ms based on q2c. Delineating a 15-minute ECG record required only 74.702 seconds. In conclusion, the proposed ECG delineation algorithm based on DWT and MWA techniques has proven simple, efficient and accurate in delineating the significant ECG characteristic points.
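    The DWT-plus-MWA combination can be sketched as follows for the R-peak case: keep only the DWT detail levels where QRS energy concentrates, square the reconstruction, smooth it with a moving window average, and threshold. The wavelet, detail levels, window length and threshold are assumptions for illustration, not the paper's configuration (the appropriate levels depend on the sampling rate):

```python
import numpy as np
import pywt

def delineate_r_peaks(ecg, fs, wavelet="db4", level=4, keep=(3, 4)):
    """Sketch: DWT band selection -> squaring -> MWA -> threshold.
    Assumes the record does not start mid-QRS."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    kept = [np.zeros_like(coeffs[0])] + [
        c if (level - i) in keep else np.zeros_like(c)   # keep QRS-scale details
        for i, c in enumerate(coeffs[1:])
    ]
    band = pywt.waverec(kept, wavelet)[: len(ecg)] ** 2
    w = max(1, int(0.10 * fs))                           # ~QRS-width window
    mwa = np.convolve(band, np.ones(w) / w, mode="same")
    above = mwa > 0.3 * mwa.max()                        # simple global threshold
    onsets = np.flatnonzero(np.diff(above.astype(int)) == 1)
    offsets = np.flatnonzero(np.diff(above.astype(int)) == -1)
    peaks = [s + int(np.argmax(np.abs(ecg[s:e]))) for s, e in zip(onsets, offsets)]
    return np.array(peaks), onsets, offsets
```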