
    A 12-lead Electrocardiogram Database for Arrhythmia Research Covering More Than 10,000 Patients

    This newly inaugurated research database of 12-lead electrocardiogram signals was created under the auspices of Chapman University and Shaoxing People's Hospital (Shaoxing Hospital Zhejiang University School of Medicine) and aims to enable the scientific community to conduct new studies on arrhythmia and other cardiovascular conditions. Certain types of arrhythmia, such as atrial fibrillation, have a pronounced negative impact on public health, quality of life, and medical expenditures. As a non-invasive test, long-term ECG monitoring is a major and vital diagnostic tool for detecting these conditions. This practice, however, generates large amounts of data, the analysis of which requires considerable time and effort from human experts. Modern machine learning and statistical tools can be trained on large, high-quality data to achieve exceptional levels of automated diagnostic accuracy. We therefore collected and disseminated this novel database, which contains 12-lead ECGs of 10,646 patients sampled at 500 Hz and featuring 11 common rhythms and 67 additional cardiovascular conditions, all labeled by professional experts. The dataset consists of 10-second, 12-lead ECG records together with rhythm and condition labels for each subject. It can be used to design, compare, and fine-tune new and classical statistical and machine learning techniques in studies focused on arrhythmia and other cardiovascular conditions.
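
    The description above implies that each record is a 10-second, 12-lead trace sampled at 500 Hz, i.e. a 5000 x 12 array per subject. The sketch below shows how such a record might be loaded; the CSV layout, file name, and lead ordering are illustrative assumptions, not the database's published distribution format.

```python
# Minimal sketch of reading one 10-second, 12-lead recording sampled at 500 Hz.
# The file name, CSV layout, and column order are assumptions for illustration,
# not the published distribution format of the database.
import numpy as np

FS = 500                      # sampling rate in Hz
DURATION_S = 10               # each record is 10 seconds long
N_SAMPLES = FS * DURATION_S   # 5000 samples per lead
LEADS = ["I", "II", "III", "aVR", "aVL", "aVF",
         "V1", "V2", "V3", "V4", "V5", "V6"]

def load_record(path: str) -> np.ndarray:
    """Load one record as a (5000, 12) array: rows are samples, columns are leads."""
    ecg = np.loadtxt(path, delimiter=",")
    assert ecg.shape == (N_SAMPLES, len(LEADS)), ecg.shape
    return ecg

# ecg = load_record("subject_0001.csv")   # hypothetical file name
# lead_II = ecg[:, LEADS.index("II")]
```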

    Optimal Multi-Stage Arrhythmia Classification Approach

    Arrhythmia is a problem with the rate or rhythm of the heartbeat, and early diagnosis is essential for the timely initiation of successful treatment. We have jointly optimized an entire multi-stage arrhythmia classification scheme based on 12-lead surface ECGs that attains the accuracy of professional cardiologists. The new approach comprises a three-step noise reduction stage, a novel feature extraction method, and an optimal classification model with finely tuned hyperparameters. We carried out an exhaustive study comparing thousands of competing classification algorithms trained on our proprietary, large, expertly labeled dataset of 12-lead ECGs from 40,258 patients with four arrhythmia classes: atrial fibrillation, general supraventricular tachycardia, sinus bradycardia, and sinus rhythm including sinus irregularity rhythm. Our results show that the optimal approach, consisting of a low band-pass filter, robust LOESS, non-local means smoothing, a proprietary feature extraction method based on percentiles of the empirical distribution of ratios of interval lengths and of the magnitudes of peaks and valleys, and an extreme gradient boosting tree classifier, achieved an F1-score of 0.988 on patients without additional cardiac conditions. The same noise reduction and feature extraction methods combined with a gradient boosting tree classifier achieved an F1-score of 0.97 on patients with additional cardiac conditions. Our method achieved the highest classification accuracy (average 10-fold cross-validation F1-score of 0.992) on external validation data, the MIT-BIH arrhythmia database. The proposed optimal multi-stage arrhythmia classification approach can dramatically benefit automatic ECG data analysis by providing cardiologist-level accuracy and robust compatibility with various ECG data sources.
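
    The sketch below illustrates the general shape of such a pipeline on a single lead: a low-pass filtering stage, percentile features computed from ratios of peak-to-peak interval lengths and peak magnitudes, and a gradient boosted tree classifier. The paper's noise-reduction settings and feature extraction method are proprietary, so the peak detection, percentile grid, and all parameters below are illustrative assumptions rather than the authors' implementation.

```python
# Rough sketch of the pipeline shape described above; all parameters are
# illustrative assumptions, not the authors' proprietary settings.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks
from xgboost import XGBClassifier   # assumes the xgboost package is installed

FS = 500  # sampling rate in Hz

def lowpass(x, cutoff_hz=40.0, order=4):
    # Low-pass Butterworth filter as a stand-in for the "low band-pass filter" stage.
    b, a = butter(order, cutoff_hz / (FS / 2), btype="low")
    return filtfilt(b, a, x)

def interval_ratio_features(x, percentiles=(5, 25, 50, 75, 95)):
    # Percentiles of the empirical distribution of ratios of consecutive
    # peak-to-peak interval lengths and of peak magnitudes (illustrative only).
    peaks, props = find_peaks(x, distance=int(0.3 * FS), height=0.0)
    intervals = np.diff(peaks)
    heights = props["peak_heights"]
    ratio_intervals = intervals[1:] / intervals[:-1]
    ratio_heights = heights[1:] / heights[:-1]
    return np.concatenate([np.percentile(ratio_intervals, percentiles),
                           np.percentile(ratio_heights, percentiles)])

# X = np.stack([interval_ratio_features(lowpass(sig)) for sig in signals])
# clf = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.1)
# clf.fit(X, y)   # y: AFIB / GSVT / SB / SR class labels
```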

    An improved extreme-point symmetric mode decomposition method and its application to rolling bearing fault diagnosis

    The Hilbert-Huang transform (HHT), which consists of empirical mode decomposition (EMD) followed by the Hilbert transform (HT), is currently the most widely used time-frequency analysis technique for rolling element bearing fault diagnosis; however, the accuracy with which it extracts fault characteristic information is usually limited by the problem of mode mixing in EMD. Extreme-point symmetric mode decomposition (ESMD) is a recent development of HHT that promises to alleviate this limitation and has been applied successfully in several fields, but its application to rolling bearing fault diagnosis has rarely been reported in the literature. In this paper, ESMD is applied to extract bearing fault characteristics for rolling bearing fault detection, and the results show that ESMD achieves a better diagnostic effect than EMD with HT. Furthermore, to improve the accuracy of fault characteristic extraction from rolling bearing vibration signals, a sifting scheme is proposed for selecting the sensitive, fault-related intrinsic mode functions (IMFs) generated by ESMD: a weighted kurtosis index is introduced to automatically select and reconstruct the fault-related IMFs, and the Hilbert transform is then applied to the original and reconstructed vibration signals to diagnose incipient rolling bearing faults. ESMD combined with the proposed sifting scheme is applied to simulated and experimental signals, and the results confirm that sifting-scheme-based ESMD is superior to conventional methods for rolling bearing fault diagnosis.
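
    The abstract does not define the weighted kurtosis index, so the sketch below assumes a common variant that weights each IMF's kurtosis by the magnitude of its correlation with the raw signal, keeps the IMFs scoring above the mean index, and then computes a Hilbert envelope spectrum of the reconstruction. The ESMD decomposition itself is taken as given (for example from an external implementation); everything else here is an assumption for illustration.

```python
# Sketch of IMF selection by a weighted kurtosis index and envelope analysis.
# The index definition below (kurtosis weighted by correlation with the raw
# signal) and the above-mean selection rule are assumptions, not the paper's.
import numpy as np
from scipy.stats import kurtosis, pearsonr
from scipy.signal import hilbert

def weighted_kurtosis_index(imf, signal):
    # Kurtosis of the IMF, weighted by |correlation| with the original signal.
    return kurtosis(imf, fisher=False) * abs(pearsonr(imf, signal)[0])

def select_and_reconstruct(imfs, signal):
    """imfs: list of 1-D arrays from ESMD; returns the reconstructed fault signal."""
    scores = np.array([weighted_kurtosis_index(imf, signal) for imf in imfs])
    keep = scores >= scores.mean()          # keep the most fault-sensitive IMFs
    return np.sum(np.array(imfs)[keep], axis=0)

def envelope_spectrum(x, fs):
    # Hilbert envelope spectrum used to reveal bearing fault characteristic frequencies.
    env = np.abs(hilbert(x - x.mean()))
    spec = np.abs(np.fft.rfft(env - env.mean()))
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    return freqs, spec
```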

    Biomedical Signal and Image Processing

    Written for senior-level and first-year graduate students in biomedical signal and image processing, this book describes fundamental signal and image processing techniques used to process biomedical information. The book also discusses the application of these techniques to some of the main biomedical signals and images, such as EEG, ECG, MRI, and CT. New features of this edition include technical updates to each chapter along with the addition of many more examples, the majority of which are MATLAB based.

    A Better Looking Brain: Image Pre-Processing Approaches for fMRI Data

    Researchers in the field of functional neuroimaging have faced a long-standing problem in pre-processing low spatial resolution data without losing the meaningful details within. Commonly, brain function is recorded by a technique known as echo-planar imaging, which represents the measure of blood flow (the BOLD signal) through a particular location in the brain as an array of intensity values changing over time. This approach to recording a movie of blood flow in the brain is known as fMRI. Neural activity is then studied from the temporal correlation patterns existing within the fMRI time series. However, the resulting images are noisy and contain low spatial detail, making it imperative to pre-process them appropriately to derive meaningful activation patterns. Two of the several standard preprocessing steps employed just before the analysis stage are denoising and normalization. Fundamentally, it is difficult to remove noise perfectly from an image without making assumptions about the signal and noise distributions. A convenient and commonly used alternative is to smooth the image with a Gaussian filter, but this method suffers from obvious drawbacks, primarily loss of spatial detail. A greater challenge arises when we attempt to derive average activation patterns from fMRI images acquired from a group of individuals. The brain of one individual differs from others in a structural sense as well as in a functional sense. Commonly, inter-individual differences in anatomical structure are compensated for by co-registering each subject's data to a common normalization space, a step known as spatial normalization. However, there are no existing methods to compensate for differences in the functional organization of the brain. This work presents first steps towards data-driven, robust algorithms for fMRI image denoising and multi-subject image normalization that exploit the information inherent in fMRI data. In addition, a new validation approach based on the spatial shape of the activation regions is presented to quantify the effects of preprocessing and to serve as a tool for recording differences in activation patterns between individual subjects or between two groups, such as healthy controls and patients with mental illness. Qualitative and quantitative results of the proposed framework compare favorably against existing and widely used model-driven approaches such as Gaussian smoothing and structure-based spatial normalization. This work is intended to provide neuroscience researchers with tools to derive more meaningful activation patterns, to accurately identify imaging biomarkers for various neurodevelopmental diseases, and to maximize the specificity of a diagnosis.
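
    For reference, the Gaussian smoothing baseline mentioned above amounts to convolving each volume of the 4-D series with a fixed-width spatial kernel. The sketch below shows that operation; the 3 mm voxel size and 6 mm FWHM are illustrative assumptions, not values taken from this work.

```python
# Sketch of the Gaussian smoothing baseline: smoothing each volume of a 4-D
# fMRI series with a fixed FWHM kernel. Voxel size and FWHM are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_fmri(data, fwhm_mm=6.0, voxel_mm=3.0):
    """data: 4-D array (x, y, z, time); returns the spatially smoothed series."""
    # Convert FWHM in millimetres to a Gaussian sigma in voxel units.
    sigma_vox = (fwhm_mm / voxel_mm) / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    # Smooth only the three spatial axes; leave the time axis untouched.
    return gaussian_filter(data, sigma=(sigma_vox, sigma_vox, sigma_vox, 0.0))
```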

    Intelligent Biosignal Analysis Methods

    This book describes recent efforts to improve intelligent systems for automatic biosignal analysis. It focuses on machine learning and deep learning methods used to classify different organism states and disorders based on biomedical signals such as EEG, ECG, HRV, and others.

    Echocardiography

    The book "Echocardiography - New Techniques" brings worldwide contributions from highly acclaimed clinical and imaging science investigators, and representatives from academic medical centers. Each chapter is designed and written to be accessible to those with a basic knowledge of echocardiography. Additionally, the chapters are meant to be stimulating and educational to the experts and investigators in the field of echocardiography. This book is aimed primarily at cardiology fellows on their basic echocardiography rotation, fellows in general internal medicine, radiology and emergency medicine, and experts in the arena of echocardiography. Over the last few decades, the rate of technological advancements has developed dramatically, resulting in new techniques and improved echocardiographic imaging. The authors of this book focused on presenting the most advanced techniques useful in today's research and in daily clinical practice. These advanced techniques are utilized in the detection of different cardiac pathologies in patients, in contributing to their clinical decision, as well as follow-up and outcome predictions. In addition to the advanced techniques covered, this book expounds upon several special pathologies with respect to the functions of echocardiography

    Electrocardiogram Signal Denoising Using Extreme-Point Symmetric Mode Decomposition and Nonlocal Means

    Electrocardiogram (ECG) signals contain a great deal of essential information that physicians can use to diagnose heart disease. Unfortunately, ECG signals are inevitably corrupted by noise, which severely affects the accuracy of cardiovascular disease diagnosis. Existing ECG signal denoising methods based on wavelet shrinkage, empirical mode decomposition, and nonlocal means (NLM) cannot provide sufficient noise reduction while preserving detail well, especially under heavy noise corruption. To address this problem, we propose a hybrid ECG signal denoising scheme that combines extreme-point symmetric mode decomposition (ESMD) with NLM. In the proposed method, the noisy ECG signal is first decomposed into several intrinsic mode functions (IMFs) and an adaptive global mean using ESMD. Then, the first several IMFs are filtered by the NLM method according to their frequency content, with the QRS complex detected from these IMFs as the dominant feature of the ECG signal, while the remaining IMFs are left unprocessed. The denoised and unprocessed IMFs are combined to produce the final denoised ECG signal. Experiments on both simulated ECG signals and real ECG signals from the MIT-BIH database demonstrate that the proposed method suppresses noise in ECG signals effectively while preserving details very well, and that it outperforms several state-of-the-art ECG signal denoising methods in terms of signal-to-noise ratio (SNR), root mean squared error (RMSE), percent root mean square difference (PRD), and mean opinion score (MOS) error index.
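
    The sketch below illustrates the structure of such a hybrid scheme: a plain 1-D non-local means filter is applied only to the first few (noise-dominated) IMFs, and the filtered and untouched components are summed back together with the residual. The ESMD decomposition is taken as given, and the fixed number of filtered IMFs, patch size, search window, and bandwidth are illustrative assumptions; the paper's actual rule uses the IMF frequency content and the detected QRS complex.

```python
# Sketch of a hybrid ESMD + NLM denoising scheme. The ESMD step is assumed to
# be done elsewhere; patch, search, h, and n_high are illustrative assumptions.
import numpy as np

def nlm_1d(x, patch=5, search=50, h=0.1):
    # Simple 1-D non-local means: each sample is replaced by a weighted average
    # of samples whose surrounding patches look similar (slow O(N * search) form).
    n = len(x)
    pad = np.pad(x, patch, mode="reflect")
    out = np.empty(n)
    for i in range(n):
        ref = pad[i:i + 2 * patch + 1]
        lo, hi = max(0, i - search), min(n, i + search + 1)
        d = np.array([np.mean((pad[j:j + 2 * patch + 1] - ref) ** 2)
                      for j in range(lo, hi)])
        w = np.exp(-d / (h ** 2))
        out[i] = np.dot(w, x[lo:hi]) / w.sum()
    return out

def hybrid_denoise(imfs, residual, n_high=3):
    """imfs: IMFs from ESMD, highest frequency first; residual: adaptive global mean.
    Only the first few (noise-dominated) IMFs are NLM-filtered, per the scheme above."""
    cleaned = [nlm_1d(imf) if k < n_high else imf for k, imf in enumerate(imfs)]
    return np.sum(cleaned, axis=0) + residual
```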