
    ECG Noise Filtering Using Online Model-Based Bayesian Filtering Techniques

    The electrocardiogram (ECG) is a time-varying electrical signal that represents the electrical activity of the heart. It is obtained by a non-invasive technique known as surface electrocardiography, used widely in hospitals. There are many clinical contexts in which ECGs are used, such as medical diagnosis, physiological therapy and arrhythmia monitoring. In medical diagnosis, medical conditions are interpreted by examining information and features in ECGs. Physiological therapy involves the control of some aspect of a patient's physiological effort, such as the use of a pacemaker to regulate the beating of the heart. Arrhythmia monitoring involves observing and detecting life-threatening conditions in a patient, such as myocardial infarction (heart attack). ECG signals are usually corrupted with various types of unwanted interference, such as muscle artifacts, electrode artifacts, power line noise and respiration interference, and are distorted in such a way that it can be difficult to perform medical diagnosis, physiological therapy or arrhythmia monitoring. Consequently, signal processing of ECGs is required to remove noise and interference for successful clinical application. Existing signal processing techniques can remove some of the noise in an ECG signal, but are typically inadequate for extracting weak ECG components contaminated with background noise and for retaining various subtle features of the ECG. For example, noise from the electromyogram (EMG) usually overlaps the fundamental ECG cardiac components in the frequency domain, in the range of 0.01 Hz to 100 Hz, and simple filters are inadequate to remove noise that overlaps the cardiac components. Sameni et al. have proposed a Bayesian filtering framework to resolve these problems, and it gives results clearly superior to those obtained by applying conventional signal processing methods to the ECG. However, a drawback of this Bayesian filtering framework is that it must run offline, which is undesirable for clinical applications such as arrhythmia monitoring and physiological therapy, both of which require online operation in near real-time. To resolve this problem, in this thesis we propose a dynamical model that permits the Bayesian filtering framework to function online. The framework with the proposed dynamical model shows less than 4% loss in performance compared to the previous (offline) version of the framework. The proposed dynamical model is based on theory from fixed-lag smoothing.
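
    The thesis's ECG dynamical model (following Sameni et al.) is not reproduced in the abstract; the sketch below only illustrates the fixed-lag smoothing idea on a generic scalar random-walk model. The noise variances `q` and `r` and the 25-sample lag are assumptions for illustration. Each incoming sample is Kalman-filtered, and a short backward (RTS-style) pass over the most recent window emits an estimate smoothed with a bounded delay, which is what permits online operation.

```python
# Toy fixed-lag smoother for x_t = x_{t-1} + w_t, y_t = x_t + v_t.
# Illustrative only -- not the thesis's ECG model; q, r, lag are assumed.
import numpy as np

def fixed_lag_smooth(y, q=1e-4, r=1e-1, lag=25):
    """Kalman-filter y online; emit estimates smoothed over a fixed lag."""
    n = len(y)
    xf = np.zeros(n)            # filtered means
    pf = np.zeros(n)            # filtered variances
    out = np.zeros(n)
    x, p = y[0], 1.0
    for t in range(n):
        # Kalman predict + update for the random-walk state
        p_pred = p + q
        k = p_pred / (p_pred + r)
        x = x + k * (y[t] - x)
        p = (1.0 - k) * p_pred
        xf[t], pf[t] = x, p
        # Backward (RTS-style) pass over the last `lag` samples; the
        # estimate emitted at t - lag has seen `lag` future samples.
        s = t - lag
        if s >= 0:
            xs, ps = xf[t], pf[t]
            for j in range(t - 1, s - 1, -1):
                g = pf[j] / (pf[j] + q)          # smoother gain
                xs = xf[j] + g * (xs - xf[j])
                ps = pf[j] + g * g * (ps - (pf[j] + q))
            out[s] = xs
    tail = max(n - lag, 0)
    out[tail:] = xf[tail:]      # tail not yet smoothed; fall back to filtered
    return out
```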

    The Impact of Torso Signal Processing on Noninvasive Electrocardiographic Imaging Reconstructions

    Goal: To evaluate state-of-the-art signal processing methods for epicardial potential-based noninvasive electrocardiographic imaging reconstructions of single-site pacing data. Methods: Experimental data were obtained from two torso-tank setups in which Langendorff-perfused hearts (n = 4) were suspended and potentials recorded simultaneously from the torso and epicardial surfaces. 49 different signal processing methods were applied to the torso potentials, grouped as: i) high-frequency noise removal (HFR) methods; ii) baseline drift removal (BDR) methods; and iii) combined HFR+BDR. The inverse problem was solved, and the reconstructed electrograms and activation maps were compared to those directly recorded. Results: HFR showed no difference compared to not filtering in terms of absolute differences in reconstructed electrogram amplitudes or median correlation in QRS waveforms (p > 0.05). However, correlation and mean absolute error of activation times and pacing site localization were improved with all methods except a notch filter. HFR applied post-reconstruction produced no differences compared to pre-reconstruction. BDR and BDR+HFR significantly improved absolute and relative difference, and correlation, in electrograms (p < 0.05). While combined BDR+HFR improved activation time and pacing site detection, BDR alone produced significantly lower correlation and higher localization errors (p < 0.05). Conclusion: BDR improves reconstructed electrogram morphologies and amplitudes due to a reduction in the lambda (regularization) value selected for the inverse problem. The simplest method (resetting the isoelectric point) is sufficient to obtain these improvements. HFR does not impact electrogram accuracy, but does impact post-processing to extract features such as activation times. Removal of line noise alone is insufficient to see these changes. HFR should be applied post-reconstruction to ensure over-filtering does not occur.
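
    As a minimal sketch of the simplest BDR method named in the conclusion, "resetting the isoelectric point": each beat is shifted so that its pre-onset level sits at zero. The beat-onset indices, the isoelectric window length, and the use of a median are assumptions, not details taken from the paper.

```python
# Baseline correction by resetting the isoelectric point (sketch).
# `onsets` (beat/pacing onset indices) and `iso_win` are hypothetical inputs.
import numpy as np

def reset_isoelectric(sig, onsets, iso_win=20):
    """Shift each beat so its pre-onset (isoelectric) level sits at zero."""
    sig = np.asarray(sig, dtype=float)
    corrected = sig.copy()
    bounds = list(onsets) + [len(sig)]
    for start, end in zip(bounds[:-1], bounds[1:]):
        lo = max(0, start - iso_win)
        # estimate the baseline from the assumed isoelectric window
        level = np.median(sig[lo:start]) if start > lo else sig[start]
        corrected[start:end] -= level
    return corrected
```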

    An algorithm for extracting the PPG Baseline Drift in real-time

    Photoplethysmography (PPG) is an optical technique for measuring the perfusion of blood in skin and tissue arterial vessels. Due to its simplicity, accessibility and the abundance of information it provides on an individual's cardiovascular system, it has been a popular topic of research in recent years. With these benefits, however, come many challenges concerning the processing and conditioning of the signal so that information can be extracted. One such challenge is removing the baseline drift of the signal, which is caused by respiration, muscle tremor and physiological changes within the body in response to various stimuli. Over the years many methods have been developed to condition the signal, such as wavelet transforms, cubic spline interpolation, morphological operators and Fourier-based filtering techniques, each with its own benefits and drawbacks. Typically a method is either unsuitable for real-time use because of the computational power it requires, or achieves real-time operation at the cost of deforming the signal, which is undesirable for accurate analysis. This thesis explores these techniques in order to develop an algorithm that conditions the signal against baseline drift in real-time, while achieving good computational efficiency and preserving the signal's form.
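
    The abstract does not specify the thesis's final algorithm; the sketch below only illustrates the real-time constraint it targets: O(1) work per sample and no look-ahead. A one-pole exponential moving average tracks the slow drift and is subtracted from the incoming stream; the smoothing factor `alpha` is an assumed tuning parameter.

```python
# Streaming baseline removal sketch (assumed, not the thesis's method).
def remove_baseline_stream(samples, alpha=0.005):
    """Yield drift-corrected PPG samples as they arrive."""
    baseline = None
    for x in samples:
        # exponential moving average tracks the slow baseline wander
        baseline = x if baseline is None else baseline + alpha * (x - baseline)
        yield x - baseline

# usage: corrected = list(remove_baseline_stream(ppg_stream))
```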

    Bottom-up design of artificial neural network for single-lead electrocardiogram beat and rhythm classification

    Performance improvement in computerized electrocardiogram (ECG) classification is vital to improving reliability in this life-saving technology. The non-linearly overlapping nature of the ECG classification task prevents statistical and syntactic procedures from reaching maximum performance. A newer approach, the neural network-based classification scheme, has been applied to clinical ECG problems with much success. The focus, however, has been on narrow clinical problem domains, the implementations have lacked engineering precision, and optimal utilization of frequency information has been missing. This dissertation attempts to improve the accuracy of neural network-based single-lead (lead-II) ECG beat and rhythm classification. A bottom-up approach, defined in terms of perfecting individual sub-systems to improve the overall system performance, is used. Sub-systems include pre-processing, QRS detection and fiducial point estimation, feature calculation, and pattern classification. Inaccuracies in time-domain fiducial point estimation are overcome by deriving features in the frequency domain. Feature extraction in the frequency domain is based on a spectral estimation technique (a combination of simulation and subtraction of a normal beat). Auto-regressive spectral estimation methods yield a highly sensitive spectrum, providing several local features with information on beat classes such as flutter, fibrillation, and noise. A total of 27 features are calculated, including 16 in the time domain and 11 in the frequency domain. The entire data set and problem are divided into four major groups, each with inter-related beat classes. Classification of each group into related sub-classes is performed using smaller feed-forward neural networks. The input feature sub-set and the structure of each network are optimized using an iterative process. Optimal implementations of feed-forward neural networks provide high accuracy in beat classification. Associated neural networks are used for the more deterministic rhythm-classification task. An accuracy of more than 85% is achieved for all 13 classes included in this study. The system shows graceful degradation in performance with increasing noise, as a result of considering noise in the design of every sub-system. The results indicate that a neural network-based, bottom-up design of single-lead ECG classification can provide very high accuracy, even in the presence of noise, flutter, and fibrillation.
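
    As a sketch of one of the smaller per-group classifiers described above: a feed-forward network over 27-dimensional feature vectors (16 time-domain + 11 frequency-domain). The hidden-layer size, the scikit-learn implementation, and the placeholder data are assumptions; the dissertation optimizes each network's input feature sub-set and structure iteratively.

```python
# Minimal feed-forward beat classifier sketch on placeholder data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 27))       # placeholder 27-feature vectors
y = rng.integers(0, 4, size=500)     # placeholder beat-class labels for one group

clf = make_pipeline(
    StandardScaler(),                # normalize features before the network
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
)
clf.fit(X, y)
print(clf.score(X, y))               # training accuracy on the toy data
```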

    The severity of stages estimation during hemorrhage using error correcting output codes method

    As a beneficial component with critical impact, computer-aided decision making systems have infiltrated many fields, such as economics, medicine, architecture and agriculture. Their latent capability for facilitating human work propels the high-speed development of such systems. Effective decisions provided by such systems greatly reduce the expense of labor, energy, budget, etc. The computer-aided decision making system for traumatic injuries is one such system, supplying suggestive opinions when dealing with injuries resulting from accidents, battle, or illness. Its functions may involve judging the type of illness, triaging the wounded according to battle injuries, deciding the severity of symptoms for illness or injuries, and managing resources in the context of traumatic events. The proposed computer-aided decision making system aims at estimating the severity of blood volume loss. Specifically, severe hemorrhage, a potentially life-threatening condition that accompanies many traumatic injuries and requires immediate treatment, is a significant ongoing loss of blood volume resulting in decreased blood and oxygen perfusion of vital organs. Hemorrhage and blood loss can occur at different levels, such as mild, moderate, or severe. The proposed system will assist physicians by estimating information such as the severity of blood volume loss and hemorrhage, so that timely measures can be taken not only to save lives but also to reduce long-term complications as well as the cost caused by mismatched operations and treatments. The general framework of the proposed research contains three tasks, and many novel and transformative concepts are integrated into the system. First is the preprocessing of the raw signals: adaptive filtering is adopted and customized to filter noise, and two detection algorithms (QRS complex detection and systolic/diastolic wave detection) are designed. The second task is feature extraction: the proposed system combines features from the time domain, frequency domain, nonlinear analysis, and multi-model analysis to better represent the patterns that appear when hemorrhage happens. Third, a machine learning algorithm is designed for pattern classification: a novel version of the error correcting output code (ECOC) algorithm is designed and investigated for high accuracy and real-time decision making. The features and characteristics of this machine learning method are essential for the proposed computer-aided trauma decision making system. The proposed system is tested against the Lower Body Negative Pressure (LBNP) dataset, and the results indicate the accuracy and reliability of the proposed system.
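
    The paper's ECOC variant is novel and not specified in the abstract; the sketch below shows the standard error-correcting output code scheme it builds on, using scikit-learn's OutputCodeClassifier. Each class is assigned a binary codeword, one binary learner is trained per code bit, and prediction picks the class whose codeword is nearest to the vector of bit outputs. The features, labels and base learner here are placeholders.

```python
# Standard ECOC classification sketch (the paper's novel variant differs).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OutputCodeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))       # placeholder features (e.g. LBNP-derived)
y = rng.integers(0, 3, size=300)     # placeholder severity levels: mild/moderate/severe

ecoc = OutputCodeClassifier(LogisticRegression(max_iter=500),
                            code_size=2.0, random_state=0)
ecoc.fit(X, y)
print(ecoc.predict(X[:5]))           # predicted severity levels
```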

    An Improved Firefly Optimization Algorithm for Analysis of Arrhythmia Types

    An irregular heartbeat rhythm is the result of arrhythmia, a condition that can be life-threatening if not treated at an early stage. It is necessary to know the type of arrhythmia in order to treat the patient appropriately. Traditional methods are complex, and an efficient algorithm is required for diagnosis. An improved firefly optimization algorithm is therefore proposed to analyze arrhythmia types. Four performance measures confirm the model's effectiveness: experimental evaluation shows that it achieves a sensitivity of 86.27%, accuracy of 86.14%, precision of 87.52%, and specificity of 87.37% in arrhythmia-type classification. The algorithm can effectively classify arrhythmia types with high accuracy and specificity.
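
    The abstract does not describe the specific improvements; the following is a sketch of the standard firefly algorithm that such variants modify. In each iteration, every firefly moves toward each brighter one with an attractiveness that decays with squared distance, plus a small random step. The objective function, population size and all coefficients are assumed placeholders.

```python
# Standard firefly optimization sketch (minimization; lower = brighter).
import numpy as np

def firefly_minimize(objective, dim=4, n=15, iters=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, size=(n, dim))
    light = np.array([objective(p) for p in pos])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:              # j is brighter than i
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)   # distance-decayed attraction
                    pos[i] += beta * (pos[j] - pos[i]) \
                              + alpha * rng.normal(size=dim)
                    light[i] = objective(pos[i])
    best = int(np.argmin(light))
    return pos[best], light[best]

# e.g. firefly_minimize(lambda p: np.sum(p ** 2))
```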

    Structural health monitoring meets data mining

    With the development of sensing and data processing techniques, monitoring physical systems in the field with a sensor network is becoming a feasible option for many domains. Such monitoring systems are referred to as Structural Health Monitoring (SHM) systems. By definition, SHM is the process of implementing a damage detection and characterisation strategy for engineering structures, involving data collection, damage-sensitive feature extraction and statistical analysis. Most of the SHM process can be addressed with techniques from the data mining domain, so I conduct this research by combining the two fields. The monitoring system employed in this research is a sensor network installed on a Dutch highway bridge, which aims to monitor the dynamic health aspects of the bridge and its long-term degradation. I have explored the specific focus of each sensor type at multiple scales, and analysed the dependencies between sensor types. Based on landmarks and constraints, I have proposed a novel predefined pattern detection method to select traffic events for modal analysis. I have analysed the influence of temperature and traffic mass on natural frequencies, verifying that natural frequencies decrease as temperature increases and that the influence of traffic mass is weaker than that of temperature.
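
    The bridge measurements themselves are not available here; as an illustration of the kind of analysis described (natural frequency versus temperature), the sketch below fits a least-squares line to synthetic monitoring samples generated with an assumed negative trend, recovering the reported direction of the effect.

```python
# Natural frequency vs. temperature trend sketch on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
temp = rng.uniform(-5.0, 30.0, size=200)                   # deg C
freq = 2.40 - 0.004 * temp + rng.normal(0.0, 0.01, 200)    # Hz (assumed trend)

slope, intercept = np.polyfit(temp, freq, 1)
print(f"fitted slope: {slope:.4f} Hz per deg C")  # negative: freq falls as T rises
```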