
    Optimal ECG Signal Denoising Using DWT with Enhanced African Vulture Optimization

    Cardiovascular diseases (CVDs) are the world's leading cause of death; consequently, cardiac health has been an active research topic for decades. The electrocardiogram (ECG) signal is a comprehensive, non-invasive method for determining cardiac health, and health practitioners use it to ascertain critical information about the human heart. In this paper, a denoising scheme is proposed in which the noisy ECG signal is processed with the Discrete Wavelet Transform (DWT), whose threshold is optimized by an Enhanced African Vulture Optimization (EAVO) algorithm, followed by an adaptive switching mean filter (ASMF). Initially, the input ECG signals are obtained from the MIT-BIH ARR dataset and white Gaussian noise is added to them. The corrupted ECG signals are then denoised using the DWT, in which the threshold is optimized with the EAVO algorithm to obtain the optimum value. The AVO algorithm is enhanced using the Whale Optimization Algorithm (WOA), and the ASMF is likewise tuned by the EAVO algorithm. Experiments conducted on the MIT-BIH dataset show that the proposed filter built with the EAVO algorithm attains a significant improvement over existing algorithms such as PSO, AOA, and MVO in terms of SNR, mean difference (MD), mean square error (MSE), normalized root mean squared error (NRMSE), peak reconstruction error (PRE), maximum error (ME), and normalized root mean error (NRME)
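    As a minimal sketch of the DWT-thresholding stage described above, the snippet below soft-thresholds the wavelet detail coefficients of a synthetic noisy signal and selects the threshold by a plain grid search standing in for the EAVO metaheuristic; the 'sym8' wavelet, decomposition level, and toy signal are assumptions for illustration only.

```python
# Sketch: DWT soft-threshold denoising with a grid-searched threshold (stand-in for EAVO).
import numpy as np
import pywt

def dwt_denoise(noisy, threshold, wavelet="sym8", level=4):
    """Soft-threshold the detail coefficients and reconstruct the signal."""
    coeffs = pywt.wavedec(noisy, wavelet, level=level)
    shrunk = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)[: len(noisy)]

def snr_db(clean, estimate):
    noise = clean - estimate
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

# Toy example: an "ECG-like" pulse train plus white Gaussian noise.
t = np.linspace(0, 10, 3600)
clean = np.sin(2 * np.pi * 1.2 * t) ** 21            # crude stand-in for QRS peaks
noisy = clean + 0.2 * np.random.randn(len(t))

# Stand-in for the EAVO search: pick the candidate threshold that maximizes output SNR.
candidates = np.linspace(0.05, 0.6, 40)
best = max(candidates, key=lambda th: snr_db(clean, dwt_denoise(noisy, th)))
print(f"selected threshold={best:.3f}, SNR={snr_db(clean, dwt_denoise(noisy, best)):.1f} dB")
```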

    Denoising ECG Signal Using DWT with EAVO

    Cardiovascular diseases are the leading cause of death across the world, and detecting them early is critical for effective treatment, yet traditional methods for determining cardiac health can be invasive, expensive, and lacking in accessibility and accuracy. Noisy ECG signals make it difficult for health practitioners to read recordings accurately and determine heart health, and unreliable readings can lead to misdiagnosis and needless expense. Despite the importance of ECG analysis, traditional signal denoising methods are inefficient and can produce inaccurate results, while obtaining accurate ECG signals and interpreting them quickly demands considerable time, skill, and specialized software that many practitioners do not have. In this work, the Enhanced African Vulture Optimization (EAVO) algorithm is used to optimize Discrete Wavelet Transform (DWT) thresholding, combined with an adaptive switching mean filter (ASMF), to provide accurate denoising of the ECG signal so that medical professionals can diagnose patients quickly and reliably. Evaluated on the MIT-BIH dataset, the proposed filter built using EAVO attains significant enhancements in reliable parameters and accurate testing results in terms of SNR, MD, MSE and NRMSE

    Using the redundant convolutional encoder–decoder to denoise QRS complexes in ECG signals recorded with an armband wearable device

    Long-term electrocardiogram (ECG) recordings while performing normal daily routines are often corrupted with motion artifacts, which in turn, can result in the incorrect calculation of heart rates. Heart rates are important clinical information, as they can be used for analysis of heart-rate variability and detection of cardiac arrhythmias. In this study, we present an algorithm for denoising ECG signals acquired with a wearable armband device. The armband was worn on the upper left arm by one male participant, and we simultaneously recorded three ECG channels for 24 h. We extracted 10-s sequences from armband recordings corrupted with added noise and motion artifacts. Denoising was performed using the redundant convolutional encoder–decoder (R-CED), a fully convolutional network. We measured the performance by detecting R-peaks in clean, noisy, and denoised sequences and by calculating signal quality indices: signal-to-noise ratio (SNR), ratio of power, and cross-correlation with respect to the clean sequences. The percent of correctly detected R-peaks in denoised sequences was higher than in sequences corrupted with either added noise (70–100% vs. 34–97%) or motion artifacts (91.86% vs. 61.16%). There was notable improvement in SNR values after denoising for signals with noise added (7–19 dB), and when sequences were corrupted with motion artifacts (0.39 dB). The ratio of power for noisy sequences was significantly lower when compared to both clean and denoised sequences. Similarly, cross-correlation between noisy and clean sequences was significantly lower than between denoised and clean sequences. Moreover, we tested our denoising algorithm on 60-s sequences extracted from recordings from the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database and obtained improvement in SNR values of 7.08 ± 0.25 dB (mean ± standard deviation (sd)). These results from a diverse set of data suggest that the proposed denoising algorithm improves the quality of the signal and can potentially be applied to most ECG measurement devices
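    The sketch below shows, under stated assumptions, what a fully convolutional encoder-decoder for 1-D ECG sequences can look like; the layer counts, kernel widths, and channel numbers are illustrative and do not reproduce the exact R-CED architecture used in the study.

```python
# Sketch: a fully convolutional encoder-decoder for denoising 1-D ECG sequences.
import torch
import torch.nn as nn

class ConvEncoderDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: grow the channel count while keeping the temporal length fixed
        # ("redundant" in the sense that no pooling/striding discards samples).
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(),
        )
        # Decoder: mirror the encoder back down to a single output channel.
        self.decoder = nn.Sequential(
            nn.Conv1d(32, 16, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(16, 1, kernel_size=9, padding=4),
        )

    def forward(self, x):                        # x: (batch, 1, samples)
        return self.decoder(self.encoder(x))

# Example: 10-s sequences at 360 Hz (3600 samples), as in the MIT-BIH experiments.
model = ConvEncoderDecoder()
noisy = torch.randn(8, 1, 3600)                  # batch of noisy sequences
clean = torch.randn(8, 1, 3600)                  # matching clean targets
loss = nn.MSELoss()(model(noisy), clean)         # train by minimizing reconstruction error
loss.backward()
```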

    ECG signal denoising using a novel approach of adaptive filters for real-time processing

    The electrocardiogram (ECG) is considered the main signal for diagnosing different kinds of diseases related to the human heart. During the recording process, it is usually contaminated with different kinds of noise, including power-line interference, baseline wander, and muscle contraction. In order to clean the ECG signal, several noise removal techniques have been used, such as adaptive filters, empirical mode decomposition, the Hilbert-Huang transform, wavelet-based algorithms, discrete wavelet transforms, the modulus maxima of the wavelet transform, patch-based methods, and many more. Unfortunately, these methods cannot be used for online processing since they take a long time to clean the ECG signal. The current research presents a unique method for ECG denoising using a novel adaptive-filter approach. The suggested method was tested on a simulated signal in MATLAB software under different scenarios. Instead of using a separate reference signal for ECG denoising, the presented model uses a unit delay and the primary ECG signal itself. Least mean square (LMS), normalized least mean square (NLMS), and leaky LMS were used as adaptation algorithms in this paper
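    A minimal sketch of the delayed-reference idea is given below: the adaptive filter takes the primary ECG delayed by one sample as its reference and an LMS update tracks the correlated component; the filter order and step size are assumptions chosen only for illustration, and the NLMS and leaky-LMS variants would change only the weight-update line.

```python
# Sketch: LMS adaptive denoising using a unit-delayed copy of the primary ECG as reference.
import numpy as np

def lms_denoise(ecg, order=8, mu=0.01, delay=1):
    n = len(ecg)
    w = np.zeros(order)                                     # adaptive filter weights
    y = np.zeros(n)                                         # filter output (denoised estimate)
    ref = np.concatenate([np.zeros(delay), ecg[:-delay]])   # unit-delayed copy as "reference"
    for k in range(order, n):
        x = ref[k - order:k][::-1]                          # most recent 'order' delayed samples
        y[k] = np.dot(w, x)                                 # predicted (correlated) component
        e = ecg[k] - y[k]                                   # prediction error
        w += 2 * mu * e * x                                 # LMS weight update
    return y

# Toy usage: sinusoidal "ECG" corrupted by white noise.
t = np.linspace(0, 10, 3600)
noisy = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(len(t))
denoised = lms_denoise(noisy)
```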

    A robust ECG denoising technique using variable frequency complex demodulation

    Background and Objective: The electrocardiogram (ECG) is widely used for the detection and diagnosis of cardiac arrhythmias such as atrial fibrillation. Most computer-based automatic cardiac abnormality detection algorithms require accurate identification of ECG components such as QRS complexes in order to provide a reliable result. However, ECGs are often contaminated by noise and artifacts, especially if they are obtained using wearable sensors, so accurate identification of QRS complexes often becomes challenging. Most existing denoising methods were validated using simulated noise added to a clean ECG signal and did not consider authentically noisy ECG signals. Moreover, many of them are model-dependent and sampling-frequency-dependent and require a large amount of computational time. Methods: This paper presents a novel ECG denoising technique using the variable frequency complex demodulation (VFCDM) algorithm, which considers noise from a variety of sources. We used the sub-band decomposition of the noise-contaminated ECG signals via VFCDM to remove the noise components so that better-quality ECGs could be reconstructed. An adaptive automated masking is proposed in order to preserve the QRS complexes while removing the unnecessary noise components. Finally, the ECG was reconstructed using a dynamic reconstruction rule based on automatic identification of the severity of the noise contamination. The ECG signal quality was further improved by removing baseline drift and smoothing via adaptive mean filtering. Results: Evaluation results on the standard MIT-BIH Arrhythmia database suggest that the proposed denoising technique provides superior denoising performance compared to studies in the literature. Moreover, the proposed method was validated using real-life noise sources collected from the noise stress test database (NSTDB) and data from an armband ECG device which contains significant muscle artifacts. Results from both the wearable armband ECG data and the NSTDB data suggest that the proposed denoising method provides significantly better performance in terms of accurate QRS complex detection and signal-to-noise ratio (SNR) improvement when compared to some recent existing denoising algorithms. Conclusions: The detailed qualitative and quantitative analysis demonstrated that the proposed denoising method is robust in filtering the varieties of noise present in the ECG. The QRS detection performance of the denoised armband ECG signals indicates that the proposed denoising method has the potential to increase the amount of usable armband ECG data; thus, the armband device with the proposed denoising method could be used for long-term monitoring of atrial fibrillation
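    As a hedged illustration of only the final post-processing step mentioned above (baseline-drift removal and smoothing by mean filtering), the sketch below detrends a toy ECG with a moving-mean filter; it does not implement the VFCDM sub-band decomposition or the adaptive QRS masking, and the window lengths are assumptions.

```python
# Sketch: baseline-drift removal and light smoothing with moving-mean filters.
import numpy as np

def remove_baseline(ecg, fs, baseline_win_s=0.6, smooth_win_s=0.02):
    def moving_mean(x, win):
        win = max(1, int(win))
        kernel = np.ones(win) / win
        return np.convolve(x, kernel, mode="same")
    baseline = moving_mean(ecg, baseline_win_s * fs)   # slow trend approximates baseline wander
    detrended = ecg - baseline
    return moving_mean(detrended, smooth_win_s * fs)   # mild smoothing of residual noise

fs = 360                                               # MIT-BIH sampling rate
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 21 + 0.5 * np.sin(2 * np.pi * 0.3 * t)  # beats + drift
cleaned = remove_baseline(ecg, fs)
```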

    A novel wavelet-based filtering strategy to remove powerline interference from electrocardiograms with atrial fibrillation

    This is an author-created, un-copyedited version of an article published in Physiological Measurement. IOP Publishing Ltd is not responsible for any errors or omissions in this version of the manuscript or any version derived from it. The Version of Record is available online at http://doi.org/10.1088/1361-6579/aae8b1
    Objective: The electrocardiogram (ECG) is currently the most widely used recording to diagnose cardiac disorders, including atrial fibrillation (AF), the most common supraventricular arrhythmia. However, different types of electrical disturbances, among which power-line interference (PLI) is a major problem, can mask and distort the original ECG morphology. This is a significant issue in the context of AF, because accurate characterization of fibrillatory waves (f-waves) is required to improve current knowledge about its mechanisms. This work introduces a new algorithm able to reduce high levels of PLI while preserving the original ECG morphology. Approach: The method is based on stationary wavelet transform shrinking and makes use of a new thresholding function designed to work successfully in a wide variety of scenarios. It has been validated in a general context with 48 ECG recordings obtained from pathological and non-pathological conditions, as well as in the particular context of AF, where 380 synthesized and 20 long-term real ECG recordings were analyzed. Main results: In both situations, the algorithm reported notably better performance than common methods designed for the same purpose. Moreover, its effectiveness proved to be optimal for dealing with ECG recordings affected by AF, since f-waves remained almost intact after removing very high levels of noise. Significance: The proposed algorithm may facilitate a reliable characterization of the f-waves, preventing them from being masked by the PLI or distorted by an unsuitable filtering applied to ECG recordings with AF.
    Research supported by grants DPI2017-83952-C3 MINECO/AEI/FEDER, UE and SBPLY/17/180501/000411 from Junta de Comunidades de Castilla-La Mancha. García, M.; Martínez, M.; Ródenas, J.; Rieta, J. J.; Alcaraz, R. (2018). A novel wavelet-based filtering strategy to remove powerline interference from electrocardiograms with atrial fibrillation. Physiological Measurement, 39(11):1-15. https://doi.org/10.1088/1361-6579/aae8b1
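    A minimal sketch of stationary wavelet transform shrinkage for power-line interference is shown below using PyWavelets; the paper's custom thresholding function is not reproduced, and plain soft thresholding, the 'db4' wavelet, and a 4-level decomposition are assumptions for illustration.

```python
# Sketch: stationary wavelet transform (SWT) shrinkage of 50 Hz power-line interference.
import numpy as np
import pywt

def swt_denoise(ecg, wavelet="db4", level=4, threshold=0.1):
    pad = (-len(ecg)) % (2 ** level)                 # SWT needs a length divisible by 2**level
    x = np.pad(ecg, (0, pad), mode="edge")
    coeffs = pywt.swt(x, wavelet, level=level)
    shrunk = [(cA, pywt.threshold(cD, threshold, mode="soft")) for cA, cD in coeffs]
    return pywt.iswt(shrunk, wavelet)[: len(ecg)]

fs = 500
t = np.arange(0, 4, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 21              # crude beat-like signal
noisy = ecg + 0.2 * np.sin(2 * np.pi * 50 * t)       # 50 Hz power-line interference
filtered = swt_denoise(noisy)
```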

    Characterization, Classification, and Genesis of Seismocardiographic Signals

    Seismocardiographic (SCG) signals are the acoustic and vibration signals induced by cardiac activity and measured non-invasively at the chest surface. These signals may offer a method for diagnosing and monitoring heart function. Successful classification of SCG signals in health and disease depends on accurate signal characterization and feature extraction. In this study, SCG signal features were extracted in the time, frequency, and time-frequency domains. Different methods for estimating time-frequency features of SCG were investigated. Results suggested that the polynomial chirplet transform outperformed the wavelet and short-time Fourier transforms. Many factors may contribute to increasing intrasubject SCG variability, including subject posture and respiratory phase. In this study, the effect of respiration on SCG signal variability was investigated. Results suggested that SCG waveforms can vary with lung volume, respiratory flow direction, or a combination of these criteria. SCG events were classified into groups belonging to these different respiration phases using classifiers including artificial neural networks, support vector machines, and random forests. Categorizing SCG events into groups containing similar events allows more accurate estimation of SCG features. SCG feature points were also identified from simultaneous measurements of SCG and other well-known physiologic signals, including electrocardiography, phonocardiography, and echocardiography. Future work may use this information to gain more insight into the genesis of SCG
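    The sketch below illustrates the feature-then-classify pipeline on synthetic data: a time-frequency feature vector per SCG event feeds a random forest that labels the respiration phase. The study found the polynomial chirplet transform superior, but the readily available STFT is used here, and the feature choice, sampling rate, and labels are toy assumptions.

```python
# Sketch: STFT-based features per SCG event, classified by a random forest.
import numpy as np
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier

def stft_features(event, fs=200):
    f, t, Sxx = spectrogram(event, fs=fs, nperseg=64, noverlap=48)
    return Sxx.mean(axis=1)                  # average spectrum over the event as a feature vector

rng = np.random.default_rng(0)
events = rng.standard_normal((40, 400))      # 40 synthetic SCG events, 2 s at 200 Hz
labels = rng.integers(0, 2, size=40)         # 0 = inspiration, 1 = expiration (toy labels)
X = np.array([stft_features(e) for e in events])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
```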

    Motion Artifact Processing Techniques for Physiological Signals

    The combination of reducing birth rate and increasing life expectancy continues to drive the demographic shift toward an ageing population, and this is placing an ever-increasing burden on our healthcare systems. The urgent need to address this so-called healthcare "time bomb" has led to a rapid growth in research into ubiquitous, pervasive and distributed healthcare technologies, where recent advances in signal acquisition, data storage and communication are helping such systems become a reality. However, similar to recordings performed in the hospital environment, artifacts continue to be a major issue for these systems. The magnitude and frequency of artifacts can vary significantly depending on the recording environment, with one of the major contributions due to the motion of the subject or the recording transducer. As such, this thesis addresses the challenge of removing motion artifact from various physiological signals. The preliminary investigations focus on artifact identification and the tagging of physiological signal streams with measures of signal quality. A new method for quantifying signal quality is developed based on the use of inexpensive accelerometers, which facilitates the appropriate use of artifact processing methods as needed. These artifact processing methods are thoroughly examined as part of a comprehensive review of the most commonly applicable methods. This review forms the basis for the comparative studies subsequently presented. Then, a simple but novel experimental methodology for the comparison of artifact processing techniques is proposed, designed and tested for algorithm evaluation. The method is demonstrated to be highly effective for the type of artifact challenges common in a connected health setting, particularly those concerned with brain activity monitoring. This research primarily focuses on applying the techniques to functional near infrared spectroscopy (fNIRS) and electroencephalography (EEG) data due to their high susceptibility to contamination by subject-motion-related artifact. Using the novel experimental methodology, complemented with simulated data, a comprehensive comparison of a range of artifact processing methods is conducted, allowing the identification of the set of best performing methods. A novel artifact removal technique is also developed, namely ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA), which provides the best results when applied on fNIRS data under particular conditions. Four of the best performing techniques were then tested on real ambulatory EEG data contaminated with movement artifacts comparable to those observed during in-home monitoring. It was determined that when analysing EEG data, the Wiener filter is consistently the best performing artifact removal technique. However, when employing the fNIRS data, the best technique depends on a number of factors, including: 1) the availability of a reference signal and 2) whether or not the form of the artifact is known. It is envisaged that the use of physiological signal monitoring for patient healthcare will grow significantly over the next number of decades and it is hoped that this thesis will aid in the progression and development of artifact removal techniques capable of supporting this growth
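    As a hedged sketch of one of the compared techniques, the snippet below applies a spectral Wiener filter whose per-frequency gain is estimated from the power spectra of a reference (artifact-free) segment and an artifact segment; the segment choices, FFT parameters, and synthetic EEG are assumptions, and this is not the thesis's exact implementation.

```python
# Sketch: spectral Wiener filtering of a movement-artifact-contaminated EEG channel.
import numpy as np
from numpy.fft import rfft, irfft, rfftfreq
from scipy.signal import welch

def wiener_denoise(contaminated, clean_segment, artifact_segment, fs, nperseg=256):
    f_ref, p_clean = welch(clean_segment, fs=fs, nperseg=nperseg)
    _, p_art = welch(artifact_segment, fs=fs, nperseg=nperseg)
    gain = p_clean / (p_clean + p_art + 1e-12)        # Wiener gain per frequency bin
    spectrum = rfft(contaminated)
    f_full = rfftfreq(len(contaminated), 1 / fs)
    gain_full = np.interp(f_full, f_ref, gain)        # map coarse gain onto the full FFT grid
    return irfft(spectrum * gain_full, n=len(contaminated))

fs = 250
t = np.arange(0, 8, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t)                      # toy 10 Hz alpha-like activity
artifact = 2 * np.sin(2 * np.pi * 1.0 * t) * (t > 4)  # low-frequency movement artifact
cleaned = wiener_denoise(eeg + artifact, eeg[: 2 * fs], artifact[4 * fs:], fs)
```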

    BEMDEC: An Adaptive and Robust Methodology for Digital Image Feature Extraction

    The intriguing study of feature extraction, and edge detection in particular, has, as a result of the increased use of imagery, drawn even more attention, not just from the field of computer science but also from a variety of scientific fields. However, various challenges persist surrounding the formulation of a feature extraction operator, particularly for edges, that satisfies the necessary properties of low probability of error (i.e., failure to mark true edges), accuracy, and a consistent response to a single edge. Moreover, most of the work in the area of feature extraction has focused on improving existing approaches rather than devising or adopting new ones. In the image processing subfield, where the needs constantly change, we must equally change the way we think. In this digital world, where the use of images for a variety of purposes continues to increase, researchers who are serious about addressing the aforementioned limitations must be able to think outside the box and step away from the usual in order to overcome these challenges. In this dissertation, we propose an adaptive and robust, yet simple, digital image feature detection methodology using bidimensional empirical mode decomposition (BEMD), a sifting process that decomposes a signal into its two-dimensional (2D) bidimensional intrinsic mode functions (BIMFs). The method is further extended to detect corners and curves, and is thus dubbed BEMDEC, indicating its ability to detect edges, corners and curves. In addition to the application of BEMD, a unique combination of a flexible envelope estimation algorithm, stopping criteria and boundary adjustment made the realization of this multi-feature detector possible. Further application of two morphological operators, binarization and thinning, adds to the quality of the operator
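    As a hedged sketch of only the final morphological step mentioned above, the snippet below binarizes an edge-response map and thins it to one-pixel-wide contours with scikit-image; the BEMD sifting stage itself is not implemented here, and the Sobel response and Otsu threshold merely stand in for the BIMF-derived features.

```python
# Sketch: binarization and thinning of an edge-response map (post-processing step only).
import numpy as np
from skimage import data
from skimage.filters import sobel, threshold_otsu
from skimage.morphology import thin

image = data.camera().astype(float) / 255.0      # sample grayscale image shipped with skimage
response = sobel(image)                          # stand-in for a BIMF-derived edge response
binary = response > threshold_otsu(response)     # morphological operator 1: binarization
edges = thin(binary)                             # morphological operator 2: thinning
print(edges.sum(), "edge pixels retained")
```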

    Sensors for Vital Signs Monitoring

    Sensor technology for monitoring vital signs is an important topic for various service applications, such as entertainment and personalization platforms and Internet of Things (IoT) systems, as well as for traditional medical purposes, such as judging and predicting disease indications. Vital signs to be monitored include respiration and heart rates, body temperature, blood pressure, oxygen saturation, the electrocardiogram, blood glucose concentration, brain waves, etc. Gait and walking length can also be regarded as vital signs because they can indirectly indicate human activity and status. Sensing technologies include contact sensors such as electrocardiogram (ECG), electroencephalogram (EEG), and photoplethysmogram (PPG) sensors; non-contact sensors such as ballistocardiography (BCG); and invasive/non-invasive sensors for diagnosing variations in blood characteristics or body fluids. Radar, vision, and infrared sensors can also be useful for detecting vital signs from the movement of humans or organs. Signal processing, extraction, and analysis techniques are important in industrial applications, along with hardware implementation techniques. Battery management and wireless power transmission technologies, the design and optimization of low-power circuits, and systems for continuous monitoring and data collection/transmission should also be considered alongside sensor technologies. In addition, machine-learning-based diagnostic technology can be used for extracting meaningful information from continuous monitoring data