12 research outputs found
ARRHYTHMIA DETECTION BASED ON HERMITE POLYNOMIAL EXPANSION AND MULTILAYER PERCEPTRON ON SYSTEM-ON-CHIP IMPLEMENTATION
ABSTRACT As the number of health issues caused by heart problems rises worldwide, there is a growing need for an efficient, portable device for detecting cardiac arrhythmia. This work proposes a detection system for Premature Ventricular Contraction (PVC), one of the most common arrhythmias, based on Hermite Polynomial Expansion and an Artificial Neural Network algorithm. The algorithm is implemented as a System-on-Chip on an Altera DE2-115 FPGA board to form a portable, lightweight and cost-effective biomedical embedded system for arrhythmia screening and monitoring. The complete PVC classification computation includes pre-processing, segmentation, morphological feature extraction based on Hermite Polynomial Expansion, and classification based on a Multilayer Perceptron. The MIT-BIH Database, containing 48 patients' ECG records, was used for training and testing, and Multilayer Perceptron training was performed using the backpropagation algorithm. Results show that the algorithm can detect PVC arrhythmia across the 48 patients with 92.1% accuracy.
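The feature-extraction step described above — projecting each beat segment onto a small set of Hermite basis functions and feeding the resulting coefficients to the MLP — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names, grid width and number of basis functions are assumptions.

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval

def hermite_basis(t, n_funcs, sigma=1.0):
    """Orthonormal Hermite functions phi_n(t) sampled on the grid t."""
    x = t / sigma
    cols = []
    for n in range(n_funcs):
        c = np.zeros(n + 1)
        c[n] = 1.0  # select the physicists' Hermite polynomial H_n
        norm = math.sqrt(sigma * (2.0 ** n) * math.factorial(n) * math.sqrt(math.pi))
        cols.append(hermval(x, c) * np.exp(-x ** 2 / 2.0) / norm)
    return np.stack(cols, axis=1)  # shape: (len(t), n_funcs)

def hermite_coefficients(beat, n_funcs=6, width=3.0):
    """Least-squares projection of one beat segment onto the Hermite basis.

    The returned coefficient vector is the compact morphological feature
    vector that would be fed to the MLP classifier.
    """
    t = np.linspace(-width, width, len(beat))
    B = hermite_basis(t, n_funcs)
    coef, *_ = np.linalg.lstsq(B, beat, rcond=None)
    return coef
```

Because the lowest-order Hermite function is itself Gaussian-shaped, a QRS-like hump is captured almost entirely by the first few coefficients, which is what makes this a compact morphology descriptor.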
ECG analysis and classification using CSVM, MSVM and SIMCA classifiers
Reliable ECG classification can potentially lead to better detection methods and more
accurate diagnosis of arrhythmia, thus improving quality of care. This thesis investigated the
use of two novel classification algorithms: CSVM and SIMCA, and assessed their
performance in classifying ECG beats. The project aimed to introduce a new way to
interactively support patient care in and out of the hospital and develop new classification
algorithms for arrhythmia detection and diagnosis. Wave (P-QRS-T) detection was performed
using the WFDB Software Package and multiresolution wavelets. Fourier coefficients and
principal components (PCs) were selected as time-frequency features of the ECG signal; these
provided the input to the classifiers in the form of DFT and PCA coefficients. ECG beat classification was performed
using binary SVM, MSVM, CSVM, and SIMCA; these were subsequently used for
simultaneously classifying either four or six types of cardiac conditions. Binary SVM
classification with 100% accuracy was achieved when applied to feature-reduced ECG
signals from well-established databases using PCA. The CSVM algorithm and MSVM were
used to classify four ECG beat types: NORMAL, PVC, APC, and FUSION or PFUS; these
were from the MIT-BIH arrhythmia database (precordial lead group and limb lead II).
Different numbers of Fourier coefficients were considered in order to identify the optimal
number of features to be presented to the classifier. SMO was used to compute hyper-plane
parameters and threshold values for both MSVM and CSVM during the classifier training
phase. The best classification accuracy was achieved using fifty Fourier coefficients. With the
new CSVM classifier framework, accuracies of 99%, 100%, 98%, and 99% were obtained
using datasets from one, two, three, and four precordial leads, respectively. In addition, using
CSVM it was possible to successfully classify four types of ECG beat signals extracted from
limb lead simultaneously with 97% accuracy, a significant improvement on the 83% accuracy
achieved using the MSVM classification model. In addition, further analysis of the following
four beat types was made: NORMAL, PVC, SVPB, and FUSION. These signals were
obtained from the European ST-T Database. Accuracies of 86% and 94% were obtained
with MSVM and CSVM classification, respectively, using 100 Fourier coefficients for
reconstructing individual ECG beats. Further analysis presented an effective ECG arrhythmia
classification scheme consisting of PCA as a feature reduction method and a SIMCA
classifier to differentiate between either four or six different types of arrhythmia. In separate
studies, six and four types of beats (including NORMAL, PVC, APC, RBBB, LBBB, and
FUSION beats) with time domain features were extracted from the MIT-BIH arrhythmia
database and the St Petersburg INCART 12-lead Arrhythmia Database (incartdb) respectively.
Between 10 and 30 PC coefficients were selected for reconstructing individual ECG beats in
the feature selection phase. The average classification accuracy of the proposed scheme was
98.61% and 97.78% using the limb lead and precordial lead datasets, respectively. In addition,
using MSVM and SIMCA classifiers with four ECG beat types achieved an average
classification accuracy of 76.83% and 98.33%, respectively. The effectiveness of the proposed
algorithms was finally confirmed by successfully classifying both the six-beat and the
four-beat signal types with high accuracy.
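The feature pipeline the thesis describes — DFT coefficients of each beat as time-frequency features, handed to a classifier — can be sketched on synthetic data. This is a minimal illustration under stated assumptions: the beat generator is hypothetical, and a nearest-centroid rule stands in for the SVM/CSVM training stage (SMO and the kernel machinery are omitted).

```python
import numpy as np

def fourier_features(beat, n_coefs=50):
    """First n_coefs DFT magnitudes of a beat -- the Fourier feature vector."""
    spectrum = np.fft.rfft(beat)
    return np.abs(spectrum[:n_coefs])

def fit_centroids(X, y):
    """Per-class mean feature vector (a stand-in for SVM training)."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def predict(X, centroids):
    """Assign each row of X to the class with the nearest centroid."""
    labels = list(centroids)
    d = np.stack([np.linalg.norm(X - centroids[l], axis=1) for l in labels], axis=1)
    return np.array(labels)[np.argmin(d, axis=1)]

# Hypothetical synthetic "beats": two classes with different dominant frequency,
# mimicking morphologically distinct beat types.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256, endpoint=False)

def make_beat(freq):
    return np.sin(2 * np.pi * freq * t) + 0.05 * rng.standard_normal(t.size)

X = np.stack([fourier_features(make_beat(f)) for f in [5] * 20 + [12] * 20])
y = np.array([0] * 20 + [1] * 20)
centroids = fit_centroids(X, y)
```

The choice of 50 coefficients mirrors the thesis's finding that fifty Fourier coefficients gave the best accuracy; in this sketch it simply truncates the spectrum to the low-frequency bins where the beat energy lives.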
Proper orthogonal decomposition with interpolation-based real-time modelling of the heart
Several studies have been carried out recently with the aim of achieving cardiac modelling of the whole heart for a full heartbeat. However, within the context of the Galerkin method, those simulations require high computational demand, ranging from 16 to 200 CPUs, and long calculation times, lasting from 1 h to 50 h. To solve this problem, this research proposes to make use of a Reduced Order Method (ROM) called the Proper Orthogonal Decomposition with Interpolation method (PODI) to achieve real-time modelling with an adequate level of solution accuracy. The idea behind this method is to first construct a database of pre-computed full-scale solutions using the Element-free Galerkin method (EFG) and then project a selected subset of these solutions to a low-dimensional space. Using the Moving Least Squares (MLS) method, an interpolation is carried out for the problem at hand, before the resulting coefficients are projected back to the original high-dimensional solution space. The aim of this project is to tackle real-time modelling of a patient-specific heart for a full heartbeat in different stages, namely: modelling (i) the diastolic filling with variations of material properties, (ii) the isovolumetric contraction (IVC), ejection and isovolumetric relaxation (IVR) with arbitrary time evolutions, and (iii) variations in heart anatomy. For the diastolic filling, computations are carried out on a bi-ventricle model (BV) to investigate the performance and accuracy for varying material parameters. The PODI calculations of the LV are completed within 14 s on a normal desktop machine with a relative L₂-error norm of 6x10⁻³. These calculations are about 2050 times faster than EFG, with each displacement step generated at a calculation frequency of 1074 Hz. An error sensitivity analysis is consequently carried out to find the most sensitive parameter and the optimum dataset to be selected for the PODI calculation.
In the second phase of the research, a so-called "time standardisation scheme" is adopted to model a full heartbeat cycle. This is because the simulation of the IVC, ejection, and IVR phases is carried out using a displacement-driven calculation method which does not use uniform simulation steps across datasets. Generated results are accurate, with the PODI calculations being 2200 times faster than EFG. The PODI method is, in the third phase of this work, extended to deal with arbitrary heart meshes by developing a method called "Degrees of freedom standardisation" (DOFS). DOFS consists of using a template mesh over which all dataset result fields are projected. Once the result fields are standardised, they are used for the PODI calculation, before the PODI solution is projected back to the mesh of the problem at hand. The first template mesh to be considered is a cube mesh. However, it is found to produce results with high errors and non-physical behaviour. The second template mesh used is a heart template. In this case, a preprocessing step is required in which a non-rigid transformation based on the coherent point drift method is used to transform all dataset hearts onto the heart template. The heart template approach generated a PODI solution of higher accuracy at a relatively low computational time. Following these encouraging results, a final investigation is carried out where the PODI method is coupled with a computationally expensive gradient-based optimisation method, the Levenberg-Marquardt method (PODI-LVM). It is then compared against the full-scale approach in which EFG is used with the Levenberg-Marquardt method (EFG-LVM). In this case, the PODI-LVM simulations are 1025 times faster than EFG-LVM, while the error is less than 1%. It is also observed that, since the PODI database is built using EFG simulations, PODI-LVM behaves similarly to EFG-LVM.
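The PODI workflow described above — build a snapshot database of full-order solutions, project onto a low-dimensional POD basis, interpolate the reduced coefficients at the new parameter value, and project back — can be sketched on a toy problem. This is a minimal illustration under stated assumptions: a 1-D synthetic field stands in for the EFG heart solutions, and plain linear interpolation replaces the Moving Least Squares step.

```python
import numpy as np

# Hypothetical snapshot database: full-order solutions u(x; mu) for a set of
# training parameter values mu (in the thesis these are EFG simulations).
x = np.linspace(0.0, np.pi, 200)
mus = np.linspace(0.0, 1.0, 6)
snapshots = np.stack([m * np.sin(x) + (1 - m) * np.cos(x) for m in mus], axis=1)

# POD: the SVD of the snapshot matrix yields an orthonormal reduced basis Phi.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 2  # two modes capture this toy field exactly
Phi = U[:, :r]

# Reduced coordinates of every snapshot: a_i = Phi^T u_i.
A = Phi.T @ snapshots  # shape (r, n_snapshots)

def podi_predict(mu):
    """Interpolate the reduced coefficients at mu, then lift back to full space."""
    a = np.array([np.interp(mu, mus, A[k]) for k in range(r)])
    return Phi @ a
```

The speed-up reported in the thesis comes from exactly this structure: the expensive full-order solves happen offline when the database is built, while the online step is only an r-dimensional interpolation plus one matrix-vector product.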
Learning Biosignals with Deep Learning
The healthcare system, which is ubiquitously recognized as one of the most influential
systems in society, has been facing new challenges since the start of the decade. The myriad
of physiological data generated by individuals, namely in the healthcare system, places
a burden on physicians and reduces the effectiveness of patient data collection. Information
systems and, in particular, novel deep learning (DL) algorithms have been pointing a
way to tackle this problem.
This thesis aims to make an impact on biosignal research and industry by
presenting DL solutions that can empower this field. For this purpose, an extensive study
of how to incorporate and implement Convolutional Neural Networks (CNN), Recurrent
Neural Networks (RNN) and Fully Connected Networks in biosignal studies is discussed.
Different architecture configurations were explored for signal processing and decision
making and were implemented in three different scenarios: (1) Biosignal learning and
synthesis; (2) Electrocardiogram (ECG) biometric systems, and; (3) Electrocardiogram
(ECG) anomaly detection systems. In (1), an RNN-based architecture was able to
autonomously replicate three types of biosignals with a high degree of confidence. As for (2),
three CNN-based architectures and an RNN-based architecture (the same used in (1)) were applied
to biometric identification, reaching accuracies above 90% for electrode-based datasets
(Fantasia, ECG-ID and MIT-BIH) and 75% for the off-the-person dataset (CYBHi), and to biometric
authentication, achieving Equal Error Rates (EER) of near 0% for Fantasia and MIT-BIH
and below 4% for CYBHi. As for (3), an abstraction of the healthy, clean ECG signal
and detection of deviations from it were built and tested in two different scenarios: presence of
noise, using an autoencoder and a fully connected network (reaching 99% accuracy for binary
classification and 71% for multi-class); and arrhythmia events, by adding an RNN to the
previous architecture (57% accuracy and 61% sensitivity).
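The anomaly-detection idea in (3) — learn a compact representation of healthy signals and flag inputs whose reconstruction error is abnormally high — can be sketched with a linear "autoencoder" built from PCA. This is a hypothetical stand-in for the thesis's neural autoencoder: the data, the 4-dimensional latent size and the threshold rule are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 128, endpoint=False)

def healthy_beat():
    # Hypothetical "healthy" signal: a fixed template plus small noise.
    return np.sin(2 * np.pi * 3 * t) + 0.02 * rng.standard_normal(t.size)

# Train on healthy signals only: a PCA basis acts as a linear autoencoder.
train = np.stack([healthy_beat() for _ in range(100)])
mean = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
code = Vt[:4]  # assumed 4-dimensional latent space

def reconstruction_error(beat):
    """Encode then decode; signals unlike the training data reconstruct poorly."""
    z = code @ (beat - mean)      # encode into the latent space
    recon = mean + code.T @ z     # decode back to signal space
    return np.linalg.norm(beat - recon)

# Assumed decision rule: flag anything well above the worst training error.
threshold = max(reconstruction_error(b) for b in train) * 1.5
```

A neural autoencoder generalises this by making the encode/decode maps nonlinear, but the anomaly score — reconstruction error against a model fitted only to healthy data — is the same.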
In sum, these systems are shown to be capable of producing novel results. The incorporation
of several AI systems into one could prove to be the next generation of
preventive medicine: because the machines have access to different physiological and anatomical
states, they could produce better-informed solutions for the issues one may face in the
future, increasing the performance of autonomous prevention systems that could be used
in everyday life in remote places where access to medicine is limited. These systems will also help the study of signal behaviour and of how signals arise in real-life contexts,
as explainable AI could trigger this perception and link the inner states of a network with
the biological traits.
19th Conference of the Association of Christians in the Mathematical Sciences
Association of Christians in the Mathematical Sciences 19th Biennial Conference Proceedings, May 29 - June 1, 2011, Bethel University
SPICA: revealing the hearts of galaxies and forming planetary systems: approach and US contributions
How did the diversity of galaxies we see in the modern Universe come to be? When and where did stars within them forge the heavy elements that give rise to the complex chemistry of life? How do planetary systems, the Universe's home for life, emerge from interstellar material? Answering these questions requires techniques that penetrate dust to reveal the detailed contents and processes in obscured regions. The ESA-JAXA Space Infrared Telescope for Cosmology and Astrophysics (SPICA) mission is designed for this, with a focus on sensitive spectroscopy in the 12 to 230 micron range. SPICA offers massive sensitivity improvements with its 2.5-meter primary mirror actively cooled to below 8 K. SPICA is one of three candidates for ESA's Cosmic Vision M5 mission, and JAXA is committed to its portion of the collaboration. ESA will provide the silicon-carbide telescope, science instrument assembly, satellite integration and testing, and the spacecraft bus. JAXA will provide the passive and active cooling system (supporting the
The Apertif Surveys: The First Six Months
Apertif is a new phased-array feed for the Westerbork Synthesis Radio Telescope (WSRT), greatly increasing its field of view and turning it into a natural survey instrument. In July 2019, the Apertif legacy surveys commenced; these are a time-domain survey and a two-tiered imaging survey, with a shallow and medium-deep component. The time-domain survey searches for new (millisecond) pulsars and fast radio bursts (FRBs). The imaging surveys provide neutral hydrogen (HI), radio continuum and polarization data products. With a bandwidth of 300 MHz, Apertif can detect HI out to a redshift of 0.26. The key science goals to be accomplished by Apertif include localization of FRBs (including real-time public alerts), the role of environment and interaction on galaxy properties and gas removal, finding the smallest galaxies, connecting cold gas to AGN, understanding the faint radio population, and studying magnetic fields in galaxies. After a proprietary period, survey data products will be publicly available through the Apertif Long Term Archive (ALTA, https://alta.astron.nl). I will review the progress of the surveys and present the first results from the Apertif surveys, including highlighting the currently available public data
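The quoted HI redshift limit follows directly from the 21-cm rest frequency and the receiver band. A back-of-the-envelope check, under the simplifying (hypothetical) assumption that the 300 MHz band extends down from the rest frequency itself — the real band edges differ slightly, which is why the survey quotes z ≈ 0.26:

```python
# Maximum HI redshift: the 21-cm line shifts to lower frequency with
# distance, so the lowest observable frequency sets the redshift limit.
F_HI = 1420.405751   # MHz, rest frequency of the neutral-hydrogen 21-cm line
bandwidth = 300.0    # MHz, Apertif bandwidth

f_min = F_HI - bandwidth   # assumed band edge (top of band at rest frequency)
z_max = F_HI / f_min - 1   # from 1 + z = f_rest / f_obs

print(round(z_max, 2))  # prints 0.27 under this simplification
```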