5 research outputs found

    A Signal Decomposition Model-Based Bayesian Framework for ECG Components Separation

    The paper introduces an improved signal decomposition model-based Bayesian framework (EKS6). While it can be employed for multiple purposes, such as denoising and feature extraction, it is particularly suited for extracting electrocardiogram (ECG) waveforms from ECG recordings. In this framework, the ECG is represented as the sum of several components, each describing a specific wave (i.e., P, Q, R, S, and T), with a corresponding term in the dynamical model. Characteristic Waveforms (CWs) of the ECG components are taken as hidden state variables and estimated separately, sample by sample, using a Kalman smoother. The CWs can then be analyzed individually, according to the specific application. The new dynamical model no longer depends on the amplitude of the Gaussian kernels, so it can separate ECG components even when sudden changes in the CWs appear (e.g., an ectopic beat). Results obtained on synthetic signals with different levels of noise showed that the proposed method is more effective at separating the ECG components than another recently introduced framework with the same aims (EKS4). The proposed approach can serve many applications; in this paper, we verified it for T/QRS ratio calculation by applying it to 288 signals from the PhysioNet PTB Diagnostic ECG Database. The RMSE values obtained show that the T/QRS ratio computed on the components extracted from ECGs corrupted by broadband noise is closer to the original T/QRS ratio values (RMSE = 0.025 for EKS6 versus 0.17 for EKS4).
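The sum-of-Gaussian-components representation and the T/QRS calculation described above can be sketched as follows. This is a minimal phase-domain illustration with invented kernel parameters and an illustrative T/QRS definition; the actual EKS6 state model, Kalman smoother, and evaluation protocol are not reproduced here.

```python
import numpy as np

# Hypothetical kernel parameters (amplitude a, width b, phase center mu in
# radians), loosely in the spirit of Gaussian-kernel ECG dynamical models;
# the actual EKS6 parameterization is not reproduced here.
WAVES = {
    "P": (0.12, 0.25, -np.pi / 3),
    "Q": (-0.10, 0.10, -np.pi / 12),
    "R": (1.00, 0.10, 0.0),
    "S": (-0.12, 0.10, np.pi / 12),
    "T": (0.30, 0.40, np.pi / 2),
}

def wave_component(theta, a, b, mu):
    """One characteristic waveform: a Gaussian kernel in the phase domain."""
    d = np.angle(np.exp(1j * (theta - mu)))   # wrap phase difference to (-pi, pi]
    return a * np.exp(-d ** 2 / (2 * b ** 2))

theta = np.linspace(-np.pi, np.pi, 501)       # one beat in the phase domain
components = {w: wave_component(theta, *p) for w, p in WAVES.items()}
ecg = sum(components.values())                # ECG = sum of the five components

# T/QRS ratio from the separated components (illustrative definition:
# peak T amplitude over peak-to-peak QRS amplitude).
qrs = components["Q"] + components["R"] + components["S"]
t_qrs = components["T"].max() / (qrs.max() - qrs.min())
```

Working in the phase domain (one cardiac cycle mapped to [-pi, pi]) is what lets each wave be described by a fixed kernel center regardless of heart-rate variations.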

    ADAPTIVE MODELS-BASED CARDIAC SIGNALS ANALYSIS AND FEATURE EXTRACTION

    Signal modeling and feature extraction are among the most crucial steps in stochastic signal processing. In this thesis, a general framework that employs adaptive model-based recursive Bayesian state estimation for signal processing and feature extraction is described. As a case study, the proposed framework is applied to the problem of cardiac signal analysis. The main objective is to improve the signal processing of cardiac signals by developing new techniques based on adaptive modeling of electrocardiogram (ECG) waveforms. Specifically, several novel and improved approaches to model-based ECG decomposition, waveform characterization, and feature extraction are proposed and studied in detail. For ECG decomposition and waveform characterization, the main idea is to extend and improve the signal dynamical models (i.e., to reduce the non-linearity of the state model with respect to previous solutions) and to combine them with a Kalman smoother, which is the optimal minimum mean square error (MMSE) estimator for linear dynamical systems, in order to split the ECG signal into its waveform components. The framework is used in several real applications, such as ECG component extraction, ST-segment analysis (estimation of a possible marker of ventricular repolarization known as the T/QRS ratio), and T-wave Alternans (TWA) detection, and its extension to many other applications is straightforward. Based on the proposed framework, a novel model for the characterization of Atrial Fibrillation (AF) is presented that is more effective than other methods proposed with the same aims. In this model, ventricular activity (VA) is represented by a sum of Gaussian kernels, while a sinusoidal model is employed for atrial activity (AA). This new model is able to track AA, VA, and the fibrillatory frequency simultaneously, unlike other methods, which analyze the atrial fibrillatory waves (f-waves) only after VA cancellation. Furthermore, we study a new ECG processing method for assessing the spatial heterogeneity of ventricular repolarization (SHVR) using the V-index, and a novel algorithm to estimate the index is presented, leading to more accurate estimates. The proposed algorithm was used to study the diagnostic and prognostic value of the V-index in patients with symptoms suggestive of Acute Myocardial Infarction (AMI).
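Both of the preceding abstracts rest on the Kalman filter/smoother being the optimal MMSE estimator for linear-Gaussian dynamics. A minimal scalar sketch of a forward Kalman filter followed by a Rauch-Tung-Striebel smoother on a random-walk state is shown below; it is purely illustrative (the thesis models are non-linear and use extended Kalman variants, and all parameter values here are made up).

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar random-walk state with noisy observations:
#   x[k] = x[k-1] + w,  w ~ N(0, q)
#   y[k] = x[k] + v,    v ~ N(0, r)
q, r, n = 0.01, 0.5, 200
x = np.cumsum(rng.normal(0.0, np.sqrt(q), n))   # true hidden state
y = x + rng.normal(0.0, np.sqrt(r), n)          # observations

# Forward Kalman filter.
xf = np.zeros(n); Pf = np.zeros(n)              # filtered mean / variance
xp = np.zeros(n); Pp = np.zeros(n)              # one-step predictions
m, P = 0.0, 1.0                                 # prior
for k in range(n):
    xp[k], Pp[k] = m, P + q                     # predict
    K = Pp[k] / (Pp[k] + r)                     # Kalman gain
    m = xp[k] + K * (y[k] - xp[k])              # measurement update
    P = (1 - K) * Pp[k]
    xf[k], Pf[k] = m, P

# Backward Rauch-Tung-Striebel smoother: refines each filtered estimate
# using all future observations; its variance never exceeds the filter's.
xs = xf.copy(); Ps = Pf.copy()
for k in range(n - 2, -1, -1):
    G = Pf[k] / Pp[k + 1]                       # smoother gain
    xs[k] = xf[k] + G * (xs[k + 1] - xp[k + 1])
    Ps[k] = Pf[k] + G ** 2 * (Ps[k + 1] - Pp[k + 1])
```

The guaranteed variance reduction of the smoother over the filter is why the decomposition frameworks above prefer smoothing when the full recording is available offline.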

    Advances in Waveform and Photon Counting Lidar Processing for Forest Vegetation Applications

    Full waveform (FW) and photon counting LiDAR (PCL) data have garnered greater attention due to increasing data availability, the wealth of information they contain, and promising prospects for large-scale vegetation mapping. However, many factors, such as complex processing steps and scarce non-proprietary tools, preclude extensive and practical use of these data for vegetation characterization. Therefore, the overall goal of this study is to develop algorithms to process FW and PCL data and to explore their potential in real-world applications. Study I explored classical waveform decomposition methods such as Gaussian decomposition, Richardson–Lucy (RL) deconvolution, and a newly introduced optimized Gold deconvolution to process FW LiDAR data. Results demonstrated the advantages of the deconvolution and decomposition methods; all three approaches generated satisfactory results, while the best performer varied with the evaluation criteria used. Building upon Study I, Study II applied Bayesian non-linear modeling concepts to waveform decomposition and quantified the propagation of error and uncertainty along the processing steps. The performance evaluation and uncertainty analysis at the parameter, derived point cloud, and surface model levels showed that the Bayesian decomposition can enhance the credibility of decomposition results in a probabilistic sense, capture the true error of estimates, and trace the uncertainty propagation along the processing steps. In Study III, we exploited FW LiDAR data to classify tree species by integrating machine learning methods (Random Forests (RF) and Conditional Inference Forests (CF)) with a Bayesian inference method. Classification accuracy results highlighted that the Bayesian method was a superior alternative to the machine learning methods and gives users more confidence when interpreting and applying classification results to real-world tasks such as forest inventory. Study IV focused on developing a framework to derive terrain elevation and vegetation canopy height from test-bed sensor data and to pre-validate the capacity of the upcoming Ice, Cloud and Land Elevation Satellite-2 (ICESat-2) mission. The methodology developed in this study illustrates plausible ways of processing data that are structurally similar to expected ICESat-2 data and holds the potential to be a benchmark for further method adjustment once genuine ICESat-2 data are available.
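Of the decomposition approaches named above, Richardson–Lucy deconvolution is the simplest to sketch. The toy example below (a synthetic two-return waveform blurred by a Gaussian system pulse; all parameters invented, and neither the optimized Gold nor the Bayesian variants from the studies) recovers sharpened return positions from the blurred waveform.

```python
import numpy as np

def richardson_lucy(y, psf, n_iter=200):
    """Plain 1-D, non-blind Richardson-Lucy deconvolution."""
    psf = psf / psf.sum()
    x = np.full_like(y, y.mean())               # flat non-negative initial guess
    psf_rev = psf[::-1]                         # correlation kernel
    for _ in range(n_iter):
        blurred = np.convolve(x, psf, mode="same")
        ratio = y / np.maximum(blurred, 1e-12)  # guard against divide-by-zero
        x *= np.convolve(ratio, psf_rev, mode="same")
    return x

# Simulate a waveform: two discrete returns blurred by a Gaussian system pulse.
t = np.arange(200, dtype=float)
truth = np.zeros_like(t)
truth[[60, 90]] = [1.0, 0.6]                    # e.g. canopy + ground returns
pulse = np.exp(-0.5 * (np.arange(-20, 21) / 4.0) ** 2)
waveform = np.convolve(truth, pulse / pulse.sum(), mode="same")

restored = richardson_lucy(waveform, pulse)
```

The multiplicative update keeps estimates non-negative, which is physically sensible for return energy; in practice iteration count must be chosen carefully because RL amplifies noise if run too long.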

    Extraction and Detection of Fetal Electrocardiograms from Abdominal Recordings

    The non-invasive fetal ECG (NIFECG), derived from abdominal surface electrodes, offers novel diagnostic possibilities for prenatal medicine. Despite its straightforward applicability, NIFECG signals are usually corrupted by many interfering sources, most significantly by the maternal ECG (MECG), whose amplitude usually exceeds that of the fetal ECG (FECG) several times over. The presence of additional noise sources (e.g., muscular/uterine noise, electrode motion, etc.) further degrades the signal-to-noise ratio (SNR) of the FECG. These interfering sources, which typically show strongly non-stationary behavior, render FECG extraction and fetal QRS (FQRS) detection demanding signal processing tasks. In this thesis, several of the challenges of NIFECG signal analysis were addressed. To improve NIFECG extraction, the dynamic model of a Kalman filter approach was extended, providing a more adequate representation of the mixture of FECG, MECG, and noise. In addition, novel metrics for FECG signal quality assessment were proposed and evaluated. These quality metrics were then applied to improve FQRS detection and fetal heart rate estimation, based on an innovative evolutionary algorithm and Kalman filtering signal fusion, respectively. The elaborated methods were characterized in depth using both simulated and clinical data produced throughout this thesis. To stress-test extraction algorithms under ideal circumstances, a comprehensive benchmark protocol was created and contributed to an extensively improved NIFECG simulation toolbox. The developed toolbox and a large simulated dataset were released under an open-source license, allowing researchers to compare results in a reproducible manner. Furthermore, to validate the developed approaches under more realistic and challenging conditions, a clinical trial was performed in collaboration with the University Hospital of Leipzig.
    Aside from serving as a test set for the developed algorithms, the clinical trial enabled exploratory research, providing a better understanding of the pathophysiological variables and measurement-setup configurations that lead to changes in the abdominal signal's SNR. With such a broad scope, this dissertation addresses many of the current aspects of NIFECG analysis and provides suggestions for establishing NIFECG in clinical settings.

    Outline: Abstract; Acknowledgment; Contents; List of Figures; List of Tables; List of Abbreviations; List of Symbols
    (1) Introduction: 1.1 Background and Motivation; 1.2 Aim of this Work; 1.3 Dissertation Outline; 1.4 Collaborators and Conflicts of Interest
    (2) Clinical Background: 2.1 Physiology (2.1.1 Changes in the maternal circulatory system; 2.1.2 Intrauterine structures and feto-maternal connection; 2.1.3 Fetal growth and presentation; 2.1.4 Fetal circulatory system; 2.1.5 Fetal autonomic nervous system; 2.1.6 Fetal heart activity and underlying factors); 2.2 Pathology (2.2.1 Premature rupture of membrane; 2.2.2 Intrauterine growth restriction; 2.2.3 Fetal anemia); 2.3 Interpretation of Fetal Heart Activity (2.3.1 Summary of clinical studies on FHR/FHRV; 2.3.2 Summary of studies on heart conduction); 2.4 Chapter Summary
    (3) Technical State of the Art: 3.1 Prenatal Diagnostic and Measuring Technique (3.1.1 Fetal heart monitoring; 3.1.2 Related metrics); 3.2 Non-Invasive Fetal ECG Acquisition (3.2.1 Overview; 3.2.2 Commercial equipment; 3.2.3 Electrode configurations; 3.2.4 Available NIFECG databases; 3.2.5 Validity and usability of the non-invasive fetal ECG); 3.3 Non-Invasive Fetal ECG Extraction Methods (3.3.1 Overview on the non-invasive fetal ECG extraction methods; 3.3.2 Kalman filtering basics; 3.3.3 Nonlinear Kalman filtering; 3.3.4 Extended Kalman filter for FECG estimation); 3.4 Fetal QRS Detection (3.4.1 Merging multichannel fetal QRS detections; 3.4.2 Detection performance); 3.5 Fetal Heart Rate Estimation (3.5.1 Preprocessing the fetal heart rate; 3.5.2 Fetal heart rate statistics); 3.6 Fetal ECG Morphological Analysis; 3.7 Problem Description; 3.8 Chapter Summary
    (4) Novel Approaches for Fetal ECG Analysis: 4.1 Preliminary Considerations; 4.2 Fetal ECG Extraction by means of Kalman Filtering (4.2.1 Optimized Gaussian approximation; 4.2.2 Time-varying covariance matrices; 4.2.3 Extended Kalman filter with unknown inputs; 4.2.4 Filter calibration); 4.3 Accurate Fetal QRS and Heart Rate Detection (4.3.1 Multichannel evolutionary QRS correction; 4.3.2 Multichannel fetal heart rate estimation using Kalman filters); 4.4 Chapter Summary
    (5) Data Material: 5.1 Simulated Data (5.1.1 The FECG Synthetic Generator (FECGSYN); 5.1.2 The FECG Synthetic Database (FECGSYNDB)); 5.2 Clinical Data (5.2.1 Clinical NIFECG recording; 5.2.2 Scope and limitations of this study; 5.2.3 Data annotation: signal quality and fetal amplitude; 5.2.4 Data annotation: fetal QRS annotation); 5.3 Chapter Summary
    (6) Results for Data Analysis: 6.1 Simulated Data (6.1.1 Fetal QRS detection; 6.1.2 Morphological analysis); 6.2 Own Clinical Data (6.2.1 FQRS correction using the evolutionary algorithm; 6.2.2 FHR correction by means of Kalman filtering)
    (7) Discussion and Prospective: 7.1 Data Availability (7.1.1 New measurement protocol); 7.2 Signal Quality; 7.3 Extraction Methods; 7.4 FQRS and FHR Correction Algorithms
    (8) Conclusion; References; (A) Appendix A - Signal Quality Annotation; (B) Appendix B - Fetal QRS Annotation; (C) Appendix C - Data Recording GU
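As a point of reference for the extraction problem described above, the classic maternal-template-subtraction baseline can be sketched on fully synthetic data. Everything below is invented for illustration (toy one-spike "ECG" morphologies, an assumed 250 Hz sampling rate); the thesis itself extends a Kalman filter model of the FECG/MECG/noise mixture rather than using this baseline.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                                        # assumed sampling rate (Hz)
t = np.arange(10 * fs) / fs                     # 10 s of abdominal data

def toy_ecg(t, hr_bpm, amp, width=0.02):
    """Toy ECG: one Gaussian 'QRS' spike per beat (not realistic morphology)."""
    period = 60.0 / hr_bpm
    phase = np.mod(t, period)
    return amp * np.exp(-0.5 * ((phase - period / 2) / width) ** 2)

mecg = toy_ecg(t, hr_bpm=75, amp=1.0)           # dominant maternal ECG
fecg = toy_ecg(t, hr_bpm=140, amp=0.15)         # much weaker fetal ECG
abdominal = mecg + fecg + rng.normal(0.0, 0.01, t.size)

# 1) Detect maternal beats: local maxima above an amplitude threshold
#    (only the maternal spikes exceed 0.5 in this toy mixture).
half = int(0.1 * fs)                            # 100 ms half-window
peaks = [k for k in range(half, t.size - half)
         if abdominal[k] > 0.5
         and abdominal[k] == abdominal[k - half:k + half + 1].max()]

# 2) Average an MECG template around the detected beats and subtract it,
#    leaving (mostly) the fetal component plus noise.
template = np.mean([abdominal[k - half:k + half] for k in peaks], axis=0)
cleaned = abdominal.copy()
for k in peaks:
    cleaned[k - half:k + half] -= template
```

This baseline fails exactly where the abstract says NIFECG is hard: non-stationary maternal morphology and fetal/maternal beat overlap, which is what motivates the model-based Kalman approach.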
