32 research outputs found
Identification of myocardial infarction using consumer smartwatch ECG measurement
The goal of this thesis is to detect and classify acute myocardial infarctions from smartwatch ECG data. As smartwatches have grown in number, many new models are capable of recording ECG data. This study aims to answer the question of whether ECG data from smartwatches can be used to detect acute myocardial infarctions.
To answer this question, an existing database was used in tandem with ECG data gathered from two different smartwatches. Five machine learning models were used to detect and classify the ECG data. The best-performing model was Extra Trees, which achieved an accuracy of 90.84% using Leave-One-Out Cross-Validation.
These results show that ECG data from smartwatches could be used to detect infarctions. Measuring ECG with a smartwatch is much easier than using clinical ECG measurement devices, meaning that ECG measurement could reach a much wider audience than it has been able to reach before.
Further research could include gathering a larger database of smartwatch ECG data, as well as examining the ownership of smartwatch data and the other medical and biological data that companies collect.
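The evaluation described above — an Extra Trees classifier scored with Leave-One-Out Cross-Validation — can be sketched with scikit-learn. The feature matrix, labels, and hyperparameters below are synthetic stand-ins, not the thesis's actual ECG pipeline:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Illustrative stand-in for ECG-derived features: rows are recordings,
# columns are extracted features (e.g. interval and amplitude measures).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 12))
y = rng.integers(0, 2, size=40)  # 0 = normal, 1 = infarction (synthetic)

# Leave-One-Out Cross-Validation: each recording is held out once,
# so the number of folds equals the number of recordings.
clf = ExtraTreesClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"LOOCV accuracy: {scores.mean():.2%}")
```

On real data, each row of `X` would hold features extracted from one smartwatch ECG recording; LOOCV is a natural choice when, as here, the dataset is small.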
A primer in artificial intelligence in cardiovascular medicine
Driven by recent developments in computational power, algorithms and web-based storage resources, machine learning (ML)-based artificial intelligence (AI) has quickly gained ground as the solution for many technological and societal challenges. AI education has become very popular and is oversubscribed at Dutch universities. Major investments were made in 2018 to develop and build the first AI-driven hospitals to improve patient care and reduce healthcare costs. AI has the potential to greatly enhance traditional statistical analyses in many domains and has been demonstrated to allow the discovery of 'hidden' information in highly complex datasets. As such, AI can also be of significant value in the diagnosis and treatment of cardiovascular disease, and the first applications of AI in the cardiovascular field are promising. However, many professionals in the cardiovascular field involved in patient care, education or science are unaware of the basics behind AI and the existing and expected applications in their field. In this review, we aim to introduce the broad cardiovascular community to the basics of modern ML-based AI and explain several of the commonly used algorithms. We also summarise their initial and future applications relevant to the cardiovascular field.
Novelty, distillation, and federation in machine learning for medical imaging
The practical application of deep learning methods in the medical domain has many challenges. Pathologies are diverse and very few examples may be available for rare cases. Where data is collected it may lie in multiple institutions and cannot be pooled for practical and ethical reasons. Deep learning is powerful for image segmentation problems but ultimately its output must be interpretable at the patient level. Although clearly not an exhaustive list, these are the three problems tackled in this thesis.
To address the rarity of pathology I investigate novelty detection algorithms to find outliers from normal anatomy. The problem is structured as first finding a low-dimension embedding and then detecting outliers in that embedding space. I evaluate for speed and accuracy several unsupervised embedding and outlier detection methods. Data consist of Magnetic Resonance Imaging (MRI) for interstitial lung disease for which healthy and pathological patches are available; only the healthy patches are used in model training.
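The two-stage structure described above — embed, then detect outliers in the embedding space, training on healthy examples only — can be sketched as follows. The choice of PCA and Isolation Forest, the patch dimensions, and the data are illustrative assumptions; the thesis compares several embedding and outlier detection methods:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest

# Synthetic stand-ins for flattened image patches (not the thesis data).
rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, size=(200, 64))    # training: healthy only
pathological = rng.normal(4.0, 1.0, size=(20, 64))  # held-out outliers

# Stage 1: learn a low-dimensional embedding from healthy patches alone.
pca = PCA(n_components=8).fit(healthy)

# Stage 2: fit an outlier detector in the embedding space.
detector = IsolationForest(random_state=1).fit(pca.transform(healthy))

# Patches far from the healthy distribution are predicted as outliers (-1).
preds = detector.predict(pca.transform(pathological))
print(f"flagged as outliers: {(preds == -1).mean():.0%}")
```

Keeping pathological patches out of training is the defining property of novelty detection: the model learns only what "normal" looks like.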
I then explore the clinical interpretability of a model output. I take related work by the Canon team — a model providing voxel-level detection of acute ischemic stroke signs — and deliver the Alberta Stroke Programme Early CT Score (ASPECTS, a measure of stroke severity). The data are acute head computed tomography volumes of suspected stroke patients. I convert from the voxel level to the brain region level and then to the patient level through a series of rules. Due to the real world clinical complexity of the problem, there are at each level — voxel, region and patient — multiple sources of "truth"; I evaluate my results appropriately against these truths.
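The voxel-to-region-to-patient aggregation can be illustrated with a simplified rule set. ASPECTS scores ten brain regions, starting at 10 and subtracting one point per affected region; the voxel-fraction threshold and per-region counts below are assumptions for illustration, not the thesis's actual rules:

```python
# Simplified illustration of rule-based aggregation from voxel-level
# detections to a patient-level ASPECTS score (threshold is assumed).
VOXEL_FRACTION_THRESHOLD = 0.1  # region counts as affected above this

def region_affected(ischemic_voxels: int, total_voxels: int) -> bool:
    """Voxel level -> region level: fraction of flagged voxels."""
    return ischemic_voxels / total_voxels > VOXEL_FRACTION_THRESHOLD

def aspects(regions: dict) -> int:
    """Region level -> patient level: 10 minus the affected-region count."""
    affected = sum(region_affected(i, t) for i, t in regions.values())
    return 10 - affected

# The ten ASPECTS regions; (ischemic, total) voxel counts are synthetic.
scan = {r: (5, 1000) for r in
        ["C", "L", "IC", "I", "M1", "M2", "M3", "M4", "M5", "M6"]}
scan["M1"] = (400, 1000)  # one clearly affected region
print(aspects(scan))  # -> 9
```

In practice each level has its own sources of "truth", so thresholds like the one above would need to be validated against expert annotations at that level.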
Finally, federated learning is used to train a model on data that are divided between multiple institutions. I introduce a novel evolution of this algorithm — dubbed "soft federated learning" — that avoids the central coordinating authority, and takes into account domain shift (covariate shift) and dataset size. I first demonstrate the key properties of these two algorithms on a series of MNIST (handwritten digits) toy problems. Then I apply the methods to the BraTS medical dataset, which contains MRI brain glioma scans from multiple institutions, to compare these algorithms in a realistic setting.
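The dataset-size weighting mentioned above can be sketched with the standard federated averaging step, in which each institution's parameters contribute in proportion to its local data. This is the baseline algorithm only; the thesis's "soft federated learning" variant, which drops the central coordinator and accounts for covariate shift, is not reproduced here:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Average client model parameters, weighted by local dataset size."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()  # normalized contribution per client
    return sum(c * w for c, w in zip(coeffs, client_weights))

# Three institutions with different amounts of local data (synthetic
# two-parameter "models" standing in for full network weights).
params = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [100, 300, 600]
print(federated_average(params, sizes))  # -> [4. 5.]
```

In a full federated round, each institution would train locally on its own data, send only the updated parameters to the aggregator, and receive the averaged model back — the raw patient data never leaves the institution.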
The Application of Computer Techniques to ECG Interpretation
This book presents some of the latest available information on automated ECG analysis, written by many of the leading researchers in the field. It contains a historical introduction, an outline of the latest international standards for signal processing and communications, and then an exciting variety of studies on electrophysiological modelling, ECG imaging, artificial intelligence applied to resting and ambulatory ECGs, body surface mapping, big data in ECG-based prediction, enhanced reliability of patient monitoring, and atrial abnormalities on the ECG. It provides an extremely valuable contribution to the field.
Depth Analysis of Anesthesia Using EEG Signals via Time Series Feature Extraction and Machine Learning
The term "anesthetic depth" refers to the extent to which a general anesthetic agent sedates the central nervous system at the specific strength and concentration at which it is delivered. The depth of anesthesia plays a crucial role in determining surgical complications, and it is imperative to keep anesthetic depth under control to perform a successful surgery. This study used electroencephalography (EEG) signals to predict the depth levels of anesthesia. Traditional preprocessing methods such as signal decomposition and model building using deep learning have been used to classify anesthetic depth levels. This paper proposed a novel approach to classifying anesthesia levels based on time series feature extraction, by finding the relation between EEG signals and the Bispectral Index over a period of time. Time series features were extracted on the basis of scalable hypothesis tests by analyzing the relation between the EEG signals and the Bispectral Index, and machine learning models such as the support vector classifier, XGBoost classifier, gradient boosting classifier, decision trees and random forest classifier were trained on the features to predict the depth level of anesthesia. The best-trained model was random forest, which gives an accuracy of 83%. This provides a platform for further research into time series-based feature extraction in this area.
Data Availability Statement: Data presented in the paper are available on request from the corresponding author J.-S.S. Copyright © 2023 by the authors. Funding: Ministry of Science and Technology, Taiwan (grant number: MOST 107-2221-E-155-009-MY2).
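The pipeline described above — summarize each EEG window as time-series features, then train tree-based classifiers — can be sketched as follows. The hand-rolled statistics below stand in for the paper's hypothesis-test-based feature extraction, and the signals and depth labels are synthetic:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(signal: np.ndarray) -> np.ndarray:
    """Summarize one EEG window as simple time-series statistics."""
    return np.array([signal.mean(), signal.std(), signal.min(),
                     signal.max(), np.abs(np.diff(signal)).mean()])

rng = np.random.default_rng(2)
# 100 windows of 256 samples each; labels stand in for depth classes
# (e.g. light / moderate / deep) derived from the Bispectral Index.
signals = rng.normal(size=(100, 256))
labels = rng.integers(0, 3, size=100)

X = np.array([extract_features(s) for s in signals])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print(clf.score(X, labels))  # training accuracy on synthetic data
```

Libraries such as tsfresh automate exactly this step, generating hundreds of candidate features per window and filtering them with scalable hypothesis tests before classification.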
Systematic evaluation of immune regulation and modulation
Cancer immunotherapies are showing promising clinical results in a variety of malignancies. Monitoring the immune as well as the tumor response following these therapies has led to significant advancements in the field. Moreover, the identification and assessment of both predictive and prognostic biomarkers has become a key component to advancing these therapies. Thus, it is critical to develop systematic approaches to monitor the immune response and to interpret the data obtained from these assays. In order to address these issues and make recommendations to the field, the Society for Immunotherapy of Cancer reconvened the Immune Biomarkers Task Force. As a part of this Task Force, Working Group 3 (WG3), consisting of multidisciplinary experts from industry, academia, and government, focused on the systematic assessment of immune regulation and modulation. In this review, the tumor microenvironment, microbiome, bone marrow, and adoptively transferred T cells will be used as examples to discuss the type and timing of sample collection. In addition, potential types of measurements, assays, and analyses will be discussed for each sample. Specifically, these recommendations will focus on the unique collection and assay requirements for the analysis of various samples as well as the high-throughput assays to evaluate potential biomarkers.
Big data analytics for preventive medicine
© 2019, Springer-Verlag London Ltd., part of Springer Nature.
Medical data is one of the most rewarding and yet most complicated data to analyze. How can healthcare providers use modern data analytics tools and technologies to analyze and create value from complex data? Data analytics promises to efficiently discover valuable patterns by analyzing large amounts of unstructured, heterogeneous, non-standard and incomplete healthcare data. It not only forecasts but also helps in decision making, and is increasingly seen as a breakthrough whose goal is to improve the quality of patient care and reduce healthcare costs. The aim of this study is to provide a comprehensive and structured overview of extensive research on the advancement of data analytics methods for disease prevention. This review first introduces disease prevention and its challenges, followed by traditional prevention methodologies. We summarize state-of-the-art data analytics algorithms used for classification of disease, clustering (identifying unusually high incidence of a particular disease), anomaly detection (detection of disease) and association, as well as their respective advantages, drawbacks and guidelines for selecting a specific model, followed by a discussion of recent developments and successful applications of disease prevention methods. The article concludes with open research challenges and recommendations.