Development of a sub-miniature acoustic sensor for wireless monitoring of heart rate
This thesis presents the development of a non-invasive, wireless, low-power phonocardiographic (PCG), or heart sound, sensor platform suitable for long-term monitoring of heart function. The core of this development process involves a study of the feasibility of the conceptual system and the development of a prototype mixed-signal integrated circuit (IC) to form the integral component of the proposed sensor.
The feasibility study of the proposed long-term monitoring sensor is divided into two main parts. The first part investigates the technological aspect of the conceptual system via a system-level design. This proves the technological, or operational, feasibility of the system: it can be built entirely from discrete, off-the-shelf electronic components while satisfying the size, power consumption, battery life and operational requirements of the sensor platform. The second part concentrates on the post-processing of the recorded heart sounds and murmurs (the PCG data). Here, a number of different de-noising algorithms are studied and their relative performance compared when applied to a variety of noisy heart sound signals of the kind likely to be acquired by the proposed sensor in everyday life. This demonstrates the functional feasibility of the proposed system: the ambient acoustic noise in the recorded PCG data can be effectively suppressed, so that meaningful analysis of heart function, e.g. heart rate, can be performed on the data.
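As an illustration of how such de-noising algorithms can be compared, the sketch below contrasts two generic techniques, DFT soft-thresholding and moving-average smoothing (stand-ins chosen for brevity, not the specific algorithms studied in the thesis), on a synthetic heart-sound-like signal, scoring each by output SNR:

```python
import numpy as np

def snr_db(clean, estimate):
    """Signal-to-noise ratio of an estimate against a clean reference, in dB."""
    noise = clean - estimate
    return 10 * np.log10(np.sum(clean**2) / np.sum(noise**2))

def soft_threshold_denoise(x, k=3.0):
    """Denoise by soft-thresholding DFT coefficients; the threshold is k times
    the median coefficient magnitude (a crude noise-floor proxy)."""
    X = np.fft.rfft(x)
    mag = np.abs(X)
    thr = k * np.median(mag)
    out = np.zeros_like(X)
    keep = mag > thr
    out[keep] = (1 - thr / mag[keep]) * X[keep]   # shrink retained coefficients
    return np.fft.irfft(out, n=len(x))

def moving_average_denoise(x, w=9):
    """Baseline: simple moving-average smoothing."""
    kernel = np.ones(w) / w
    return np.convolve(x, kernel, mode="same")

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 4000)
# gated 30 Hz tone bursts, loosely mimicking S1/S2 heart sounds
clean = np.sin(2 * np.pi * 30 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)
noisy = clean + 0.3 * rng.standard_normal(t.size)

for name, est in [("soft-threshold", soft_threshold_denoise(noisy)),
                  ("moving average", moving_average_denoise(noisy))]:
    print(f"{name}: SNR = {snr_db(clean, est):.1f} dB")
```

Both methods should improve on the raw noisy SNR here; a real comparison, as in the thesis, would use recorded PCG data and realistic ambient noise.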
After the feasibility of the conceptual system has been demonstrated, the final part of this thesis discusses the synthesis and testing of a prototype mixed analogue-digital IC in a 0.35 μm CMOS technology, miniaturising part of the sensor platform outlined in the earlier system-level design to achieve the objective specifications in terms of size and power consumption. A new implementation of the multi-tanh triplet transconductor is introduced to construct a pair of 100 nW analogue 4th-order Gm-C signal conditioning filters. Furthermore, a 7 μW digital circuit was designed to drive the analog-to-digital conversion cycle of the Linear Technology LTC1288 ADC and synchronise the ADC's output to generate Manchester encoded data compatible with the Holt Integrated Circuits HI-15530 Manchester Encoder/Decoder.
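Manchester encoding itself is straightforward to express in software. A minimal sketch follows, using the IEEE 802.3 convention (0 → high-then-low, 1 → low-then-high); note the HI-15530 implements the MIL-STD-1553-style Manchester II code, whose polarity convention may differ:

```python
def manchester_encode(bits):
    """Manchester-encode a bit sequence into half-bit chips
    (IEEE 802.3 convention: 0 -> [1, 0], 1 -> [0, 1])."""
    chips = []
    for b in bits:
        chips += ([0, 1] if b else [1, 0])
    return chips

def manchester_decode(chips):
    """Recover bits from chip pairs; every valid symbol has a mid-bit transition."""
    bits = []
    for first, second in zip(chips[::2], chips[1::2]):
        if first == second:
            raise ValueError("invalid Manchester pair: no mid-bit transition")
        bits.append(1 if (first, second) == (0, 1) else 0)
    return bits

data = [1, 0, 1, 1, 0]
encoded = manchester_encode(data)
assert manchester_decode(encoded) == data
```

The mid-bit transition in every symbol is what makes the code self-clocking, which is why it suits a low-power wireless link with no separate clock line.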
Generative Adversarial Network for Photoplethysmography Reconstruction
Photoplethysmography (PPG) is an optical measurement method for blood pulse wave monitoring. The method has been widely applied in both clinical and wearable devices to collect physiological parameters, such as heart rate (HR) and heart rate variability (HRV). Unfortunately, PPG signals are highly vulnerable to motion artifacts caused by the inevitable movements of human users. To obtain reliable results from PPG-based monitoring, methods to denoise the PPG signals are necessary.
Methods proposed in the literature, including signal decomposition, time-series analysis, and deep-learning based methods, reduce the effect of noise in PPG signals. However, their performance is insufficient for low signal-to-noise ratio PPG signals, or limited to noise from certain types of activities. Therefore, the aim of this study is to develop a method to remove the motion artifacts and reconstruct noisy PPG signals without any prior knowledge about the noise.
In this thesis, a deep convolutional generative adversarial network (DC-GAN) based method is proposed to reconstruct PPG signals corrupted by real-world motion artifacts. The proposed method leverages the temporal information in the distorted signal and its preceding data points to obtain the clean PPG signal. A GAN-based model is trained to generate succeeding clean PPG samples from previous data points. A sliding window, moving at a fixed step over the noisy signal, is used to select and update the input to the trained model using information within the noisy signal. A PPG dataset collected by smartwatches in a health monitoring study is used to train, validate, and test the method. A noisy dataset generated with real-world motion artifacts of different noise levels and lengths is used to evaluate the proposed and baseline methods. Three state-of-the-art PPG reconstruction methods are compared with our method. Two metrics, maximum peak-to-peak error and RMSSD error, are extracted from the original and reconstructed signals to estimate the reconstruction error for HR and HRV.
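The sliding-window update described above can be sketched as follows. The generator here is a hypothetical stand-in for the trained DC-GAN (a toy mean-continuation predictor, so the sketch runs end to end); window length, step, and the artifact mask are illustrative assumptions:

```python
import numpy as np

def reconstruct(noisy, noise_mask, generator, ctx_len=100, step=25):
    """Sliding-window reconstruction sketch: where the mask flags motion
    artifacts, replace samples with the generator's prediction from the
    preceding context; elsewhere keep the measured signal.
    `generator` maps a context array to the next `step` samples."""
    out = noisy.copy()
    for start in range(ctx_len, len(noisy), step):
        window = slice(start, min(start + step, len(noisy)))
        if noise_mask[window].any():
            pred = generator(out[start - ctx_len:start])
            out[window] = pred[:window.stop - window.start]
    return out

# toy stand-in generator: continue the signal with the recent local mean
toy_gen = lambda ctx: np.full(25, ctx[-25:].mean())

t = np.arange(400) / 100.0
clean = np.sin(2 * np.pi * t)
noisy = clean.copy()
mask = np.zeros(400, dtype=bool)
mask[200:250] = True
noisy[mask] += 5.0          # simulated motion artifact
rec = reconstruct(noisy, mask, toy_gen)
```

Because each reconstructed window becomes part of the context for the next one, a good generator can bridge artifacts longer than a single window, which is the property exploited for the 5- to 15-second noise segments evaluated below.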
Our method outperforms the state-of-the-art methods, achieving the lowest values of the two evaluation metrics at all noise levels and lengths. The proposed method achieves 0.689, 1.352 and 1.821 seconds of maximum peak-to-peak error for 5-second, 10-second, and 15-second noise at the highest noise level, respectively, and 0.021, 0.048 and 0.067 seconds of RMSSD error for the same noise cases.
Consequently, our method performs best in reconstructing distorted PPG signals and provides reliable estimates of both HR and HRV.
Development of a Novel Dataset and Tools for Non-Invasive Fetal Electrocardiography Research
This PhD thesis presents the development of a novel open multi-modal dataset for advanced studies on fetal cardiological assessment, along with a set of signal processing tools for its exploitation. The Non-Invasive Fetal Electrocardiography (ECG) Analysis (NInFEA) dataset features multi-channel electrophysiological recordings characterized by high sampling frequency and digital resolution, a maternal respiration signal, synchronized fetal trans-abdominal pulsed-wave Doppler (PWD) recordings, and clinical annotations provided by expert clinicians at the time of signal collection. To the best of our knowledge, no similar dataset is available.
The signal processing tools target both the PWD and the non-invasive fetal ECG, exploiting the recorded dataset. Regarding the former, the study focuses on preparing the signal for the automatic measurement of relevant morphological features already adopted in clinical practice for cardiac assessment. To this aim, a key step is the automatic identification of complete and measurable cardiac cycles in the PWD videos: a rigorous methodology was deployed to analyse the different processing steps involved in the automatic delineation of the PWD envelope, and different approaches were then implemented for the supervised classification of the cardiac cycles, discriminating complete and measurable cycles from malformed or incomplete ones. Finally, preliminary measurement algorithms were also developed to extract clinically relevant parameters from the PWD.
Regarding the fetal ECG, the thesis concentrates on a systematic analysis of adaptive filter performance for non-invasive fetal ECG extraction, identified as the reference tool throughout the thesis. Two further studies are then reported: one on the wavelet-based denoising of the extracted fetal ECG and another on fetal ECG quality assessment based on analysis of the raw abdominal recordings.
Overall, the thesis represents an important milestone in the field, promoting the open-data approach and introducing automated analysis tools that could easily be integrated into future medical devices.
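The adaptive-filtering approach analysed for fetal ECG extraction can be illustrated with a minimal normalized-LMS noise canceller. This is a standard scheme, not the thesis's exact configuration; the synthetic signals and all parameter values below are purely illustrative:

```python
import numpy as np

def nlms_cancel(abdominal, maternal_ref, order=8, mu=0.5, eps=1e-3):
    """Normalized LMS adaptive noise canceller: the filter learns to predict
    the maternal component of the abdominal lead from a maternal reference
    channel; the residual error is the fetal ECG estimate."""
    n = len(abdominal)
    w = np.zeros(order)
    residual = np.zeros(n)
    for i in range(order, n):
        x = maternal_ref[i - order:i][::-1]     # most recent samples first
        y = w @ x                               # predicted maternal component
        e = abdominal[i] - y                    # residual = fetal estimate
        w += mu * e * x / (x @ x + eps)         # normalized LMS update
        residual[i] = e
    return residual

rng = np.random.default_rng(1)
t = np.arange(2000) / 500.0
maternal = np.sin(2 * np.pi * 1.2 * t) ** 15     # sharp maternal "R waves"
fetal = 0.2 * np.sin(2 * np.pi * 2.1 * t) ** 15  # weaker, faster fetal beats
abdominal = maternal + fetal + 0.01 * rng.standard_normal(t.size)
fetal_est = nlms_cancel(abdominal, maternal)
```

After the filter converges, the residual tracks the fetal component far more closely than the raw abdominal lead does; in practice the reference channel is a thoracic maternal lead rather than the clean maternal signal assumed here.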
ECG analysis and classification using CSVM, MSVM and SIMCA classifiers
Reliable ECG classification can potentially lead to better detection methods and more
accurate diagnosis of arrhythmia, thus improving quality of care. This thesis investigated the
use of two novel classification algorithms: CSVM and SIMCA, and assessed their
performance in classifying ECG beats. The project aimed to introduce a new way to
interactively support patient care in and out of the hospital and develop new classification
algorithms for arrhythmia detection and diagnosis. Wave (P-QRS-T) detection was performed
using the WFDB Software Package and multiresolution wavelets. Fourier coefficients and
principal components (PCs) were selected as time-frequency features of the ECG signal; these
provided the input to the classifiers in the form of DFT and PCA coefficients. ECG beat
classification was performed using binary SVM, MSVM, CSVM, and SIMCA; these were subsequently used for
simultaneously classifying either four or six types of cardiac conditions. Binary SVM
classification with 100% accuracy was achieved when applied on feature-reduced ECG
signals from well-established databases using PCA. The CSVM algorithm and MSVM were
used to classify four ECG beat types: NORMAL, PVC, APC, and FUSION or PFUS; these
were from the MIT-BIH arrhythmia database (precordial lead group and limb lead II).
Different numbers of Fourier coefficients were considered in order to identify the optimal
number of features to be presented to the classifier. SMO was used to compute hyper-plane
parameters and threshold values for both MSVM and CSVM during the classifier training
phase. The best classification accuracy was achieved using fifty Fourier coefficients. With the
new CSVM classifier framework, accuracies of 99%, 100%, 98%, and 99% were obtained
using datasets from one, two, three, and four precordial leads, respectively. In addition, using
CSVM it was possible to classify four types of ECG beat signals extracted from
limb lead II simultaneously with 97% accuracy, a significant improvement on the 83% accuracy
achieved using the MSVM classification model. In addition, further analysis of the following
four beat types was made: NORMAL, PVC, SVPB, and FUSION. These signals were
obtained from the European ST-T Database. Accuracies between 86% and 94% were obtained
for MSVM and CSVM classification, respectively, using 100 Fourier coefficients for
reconstructing individual ECG beats. Further analysis presented an effective ECG arrhythmia
classification scheme consisting of PCA as a feature reduction method and a SIMCA
classifier to differentiate between either four or six different types of arrhythmia. In separate
studies, six and four types of beats (including NORMAL, PVC, APC, RBBB, LBBB, and
FUSION beats) with time domain features were extracted from the MIT-BIH arrhythmia
database and the St Petersburg INCART 12-lead Arrhythmia Database (incartdb), respectively.
Between 10 and 30 PC coefficients were selected for reconstructing individual ECG beats in
the feature selection phase. The average classification accuracy of the proposed scheme was
98.61% and 97.78% using the limb lead and precordial lead datasets, respectively. In addition,
using MSVM and SIMCA classifiers with four ECG beat types achieved an average
classification accuracy of 76.83% and 98.33% respectively. The effectiveness of the proposed
algorithms was finally confirmed by successfully classifying both the six-beat and four-beat
signal types with high accuracy.
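As a toy illustration of the Fourier-feature pipeline described above, the sketch below extracts the magnitudes of the first 50 DFT coefficients from synthetic beats and classifies them. A nearest-centroid rule stands in for the SVM-family classifiers used in the thesis, purely to keep the example dependency-free; the beat shapes, labels, and train/test split are invented:

```python
import numpy as np

def dft_features(beat, n_coeffs=50):
    """Feature vector: magnitudes of the first n_coeffs DFT coefficients,
    mirroring the Fourier-coefficient representation used in the thesis."""
    return np.abs(np.fft.rfft(beat))[:n_coeffs]

class NearestCentroid:
    """Minimal stand-in classifier (the thesis uses SVM variants)."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# synthetic "beats": two morphologies standing in for, e.g., NORMAL vs PVC
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 200)
def make_beat(cls):
    width = 0.03 if cls == 0 else 0.10      # narrow vs broad complex
    return np.exp(-((t - 0.5) / width) ** 2) + 0.05 * rng.standard_normal(t.size)

X = np.array([dft_features(make_beat(c)) for c in [0, 1] * 50])
y = np.array([0, 1] * 50)
clf = NearestCentroid().fit(X[:80], y[:80])
acc = (clf.predict(X[80:]) == y[80:]).mean()
```

DFT magnitudes discard phase, so beats that differ only by a time shift map to similar features, which is one reason Fourier coefficients work well as beat descriptors here.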
In-situ health monitoring for wind turbine blade using acoustic wireless sensor networks at low sampling rates
The development of in-situ structural health monitoring (SHM) techniques represents a challenge for offshore wind turbines (OWTs), driven by the need to reduce the cost of operation and maintenance (O&M) of safety-critical components and systems. This thesis proposes an in-situ wireless SHM system based on acoustic emission (AE) techniques. The proposed wireless system of AE sensor networks is not without its own challenges, amongst which are requirements for high sampling rates and limitations in communication bandwidth, memory space, and power resources. This work is part of the HEMOW-FP7 Project, 'The Health Monitoring of Offshore Wind Farms'.
The present study investigates solutions to the abovementioned challenges. Two related topics are considered: implementing a novel in-situ wireless SHM technique for wind turbine blades (WTBs), and developing an appropriate signal processing algorithm to detect, localise, and classify different AE events. The major contributions of this study can be summarised as follows: 1) investigating the possibility of employing sampling rates lower than the Nyquist rate in the data acquisition operation, together with content-based features (envelope and time-frequency data analysis) for data analysis; 2) proposing techniques to overcome drawbacks associated with lowering sampling rates, such as information loss and low spatial resolution; 3) showing that the time-frequency domain is effective for analysing the aliased signals, and that an envelope-based wavelet transform cross-correlation algorithm, developed in the course of this study, can enhance the estimation accuracy of wireless acoustic source localisation; 4) investigating the implementation of a novel in-situ wireless SHM technique with field deployment on the WTB structure, and developing a constraint model and approaches for the localisation of AE sources and environmental monitoring, respectively. Finally, the system has been experimentally evaluated with consideration of the localisation and classification of different AE events as well as changes in environmental conditions. The study concludes that the in-situ wireless SHM platform developed in the course of this research represents a promising technique for reliable SHM of OWTBs, in which solutions for major challenges, e.g., sampling rates lower than the Nyquist rate in the acquisition operation and the resource constraints of WSNs in terms of communication bandwidth and memory space, are presented.
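The envelope-correlation idea behind contribution 3) can be sketched as follows. This is a simplified stand-in using a moving-average envelope and plain cross-correlation rather than the wavelet-based version developed in the thesis; the burst shape, noise level, and 40-sample delay are illustrative:

```python
import numpy as np

def tdoa_samples(sig_a, sig_b):
    """Time difference of arrival between two sensor signals, estimated from
    the cross-correlation of their smoothed envelopes. Positive result means
    the event reaches sensor A later than sensor B."""
    kernel = np.ones(16) / 16
    env_a = np.convolve(np.abs(sig_a), kernel, mode="same")
    env_b = np.convolve(np.abs(sig_b), kernel, mode="same")
    env_a = env_a - env_a.mean()
    env_b = env_b - env_b.mean()
    xc = np.correlate(env_a, env_b, mode="full")
    return xc.argmax() - (len(env_b) - 1)

# simulated AE burst arriving 40 samples later at sensor A than at sensor B
rng = np.random.default_rng(3)
n = 1024
burst = np.sin(2 * np.pi * 0.2 * np.arange(80)) * np.hanning(80)
sensor_a = 0.02 * rng.standard_normal(n)
sensor_b = 0.02 * rng.standard_normal(n)
sensor_b[300:380] += burst
sensor_a[340:420] += burst
lag = tdoa_samples(sensor_a, sensor_b)
```

Correlating envelopes rather than raw waveforms is what makes the estimate usable on aliased, sub-Nyquist recordings: the envelope varies far more slowly than the AE carrier, so its delay survives the low sampling rate.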
Survey of FPGA applications in the period 2000 – 2015 (Technical Report)
Romoth J, Porrmann M, Rückert U. Survey of FPGA applications in the period 2000 – 2015 (Technical Report); 2017.
Since their introduction, FPGAs have appeared in more and more fields of application. Their key advantage is the combination of software-like flexibility with performance otherwise typical of dedicated hardware. Nevertheless, every application field imposes particular requirements on the computational architecture used. This paper provides an overview of the different topics FPGAs have been used for in the last 15 years of research, and why they have been chosen over other processing units such as CPUs.
Sensor Signal and Information Processing II
In the current age of information explosion, newly invented technological sensors and software are tightly integrated with our everyday lives. Many sensor processing algorithms have incorporated some form of computational intelligence as part of their core framework for problem solving. These algorithms have the capacity to generalize, discover knowledge for themselves, and learn new information whenever unseen data are captured. The primary aim of sensor processing is to develop techniques to interpret, understand, and act on the information contained in the data. The interest of this book is in developing intelligent signal processing in order to pave the way for smart sensors. This involves mathematical advancement of nonlinear signal processing theory and its applications, extending far beyond traditional techniques. It bridges the boundary between theory and application, developing novel theoretically inspired methodologies targeting both longstanding and emergent signal processing applications. The topics range from phishing detection to the integration of terrestrial laser scanning, and from fault diagnosis to bio-inspired filtering. The book will appeal to established practitioners, along with researchers and students in the emerging field of smart sensor processing.