
    Automated myocardial infarction diagnosis from ECG

    In this dissertation, an automated neural network-based ECG diagnosis system was designed to detect the presence of myocardial infarction, based on the hypothesis that an artificial neural network-based ECG interpretation system may improve the clinical diagnosis of myocardial infarction. 137 patients were included: 122 had myocardial infarction and the remaining 15 were normal. The sensitivity and specificity of the present system were 92.2% and 50.7%, respectively. The sensitivity was consistent with related research; the relatively low specificity results from ripple introduced by the low-pass filtering. We conclude that a neural network-based system is a promising aid for myocardial infarction diagnosis.
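
    For concreteness, sensitivity and specificity as reported here are simple functions of a classifier's confusion matrix. The following minimal Python sketch (my own illustration, not the dissertation's code) shows how they are computed for binary MI/normal labels.

        import numpy as np

        def sensitivity_specificity(y_true, y_pred):
            """Sensitivity and specificity for binary labels: 1 = MI, 0 = normal."""
            y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
            tp = np.sum((y_true == 1) & (y_pred == 1))  # infarcts correctly flagged
            fn = np.sum((y_true == 1) & (y_pred == 0))  # infarcts missed
            tn = np.sum((y_true == 0) & (y_pred == 0))  # normals correctly cleared
            fp = np.sum((y_true == 0) & (y_pred == 1))  # normals falsely flagged
            return tp / (tp + fn), tn / (tn + fp)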

    Nonlinear Stochastic Modeling and Analysis of Cardiovascular System Dynamics - Diagnostic and Prognostic Applications

    The purpose of this investigation is to develop monitoring, diagnostic, and prognostic schemes for cardiovascular diseases by studying the nonlinear stochastic dynamics underlying the complex heart system. Nonlinear stochastic analysis combined with wavelet representations can extract effective cardiovascular features that are more sensitive to pathological dynamics than to extraneous noise, whereas conventional statistical and linear systems approaches have limitations in capturing signal variations resulting from changes in cardiovascular system states. The research methodology includes signal representation using optimal wavelet function design, feature extraction using nonlinear recurrence analysis, and local recurrence modeling for state prediction.
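
    As a rough sketch of such a pipeline, the fragment below combines a wavelet representation with a simple recurrence measure; the wavelet choice, embedding parameters, and threshold are illustrative assumptions, not the dissertation's actual design.

        import numpy as np
        import pywt  # PyWavelets

        def delay_embed(x, dim=3, tau=2):
            """Time-delay embedding of a 1-D series into dim-dimensional states."""
            n = len(x) - (dim - 1) * tau
            return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

        def recurrence_rate(signal, wavelet="db4", level=4, eps=0.1):
            # Wavelet representation: approximation coefficients as a denoised view
            approx = pywt.wavedec(signal, wavelet, level=level)[0]
            states = delay_embed(approx)
            # Pairwise distances between reconstructed states
            d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
            # Recurrence rate: fraction of state pairs closer than eps * max distance
            return float(np.mean(d < eps * d.max()))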

    The Application of Computer Techniques to ECG Interpretation

    This book presents some of the latest available information on automated ECG analysis, written by many of the leading researchers in the field. It contains a historical introduction, an outline of the latest international standards for signal processing and communications, and then an exciting variety of studies on electrophysiological modelling, ECG imaging, artificial intelligence applied to resting and ambulatory ECGs, body surface mapping, big data in ECG-based prediction, enhanced reliability of patient monitoring, and atrial abnormalities on the ECG. It provides an extremely valuable contribution to the field.

    An artificial neural network model for the prediction of child physical abuse recurrences

    All 50 states have passed some form of mandatory reporting law to qualify for funding under the Child Abuse Prevention and Treatment Act of 1974 (P.L. 93-247). Consequently, child protective service (CPS) agencies have experienced a dramatic increase in reports of abuse and neglect without corresponding increases in funding over the past several years. In response, many CPS agencies have turned to formal risk assessment systems to aid caseworkers in making various decisions. Various methodological obstacles have impeded efforts to predict child abuse. The present study explored the potential of an artificial neural network to improve the prediction of recurrences of child physical abuse. Using an electronic data file compiled by the U.S. Air Force's central registry of child abuse reports, selected variables pertaining to all child physical abuse reports received from 1990-2000 (N=5612) were examined. Thirteen predictor variables and five interaction terms were identified for analysis. It was concluded that both binary logistic regression (BLR) and artificial neural networks (ANNs) offer powerful tools for future efforts to build abuse prediction models. When applied to the present data, BLR was more useful.
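
    A minimal scikit-learn sketch of the comparison described (synthetic data standing in for the registry file; model settings are assumptions) might look as follows.

        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        # Synthetic stand-in: 5612 reports, 13 predictors, imbalanced outcome
        X, y = make_classification(n_samples=5612, n_features=13,
                                   weights=[0.8], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        models = {"BLR": LogisticRegression(max_iter=1000),
                  "ANN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                       random_state=0)}
        for name, model in models.items():
            model.fit(X_tr, y_tr)
            auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
            print(f"{name}: test AUC = {auc:.3f}")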

    Deep Risk Prediction and Embedding of Patient Data: Application to Acute Gastrointestinal Bleeding

    Acute gastrointestinal bleeding is a common and costly condition, accounting for over 2.2 million hospital days and 19.2 billion dollars in medical charges annually. Risk stratification is a critical part of the initial assessment of patients with acute gastrointestinal bleeding. Although all national and international guidelines recommend the use of risk-assessment scoring systems, they are not commonly used in practice, have sub-optimal performance, may be applied incorrectly, and are not easily updated. With the advent of widespread electronic health record adoption, longitudinal clinical data captured during the clinical encounter are now available. However, these data are often noisy, sparse, and heterogeneous. Unsupervised machine learning algorithms may be able to identify structure within electronic health record data while accounting for key issues with the data generation process: measurements missing not at random and information captured in unstructured clinical note text. Deep learning tools can create electronic health record-based models that perform better than clinical risk scores for gastrointestinal bleeding and are well suited to learning from new data. Furthermore, these models can be used to predict risk trajectories over time, leveraging the longitudinal nature of the electronic health record.

    The foundation of creating relevant tools is the definition of a relevant outcome measure; in acute gastrointestinal bleeding, a composite outcome of red blood cell transfusion, hemostatic intervention, and all-cause 30-day mortality is a relevant, actionable outcome that reflects the need for hospital-based intervention. However, epidemiological trends may affect the relevance and effectiveness of the outcome measure when applied across multiple settings and patient populations. Understanding the trends in practice, potential areas of disparities, and the value proposition for using risk stratification in patients presenting to the Emergency Department with acute gastrointestinal bleeding is important in understanding how best to implement a robust, generalizable risk stratification tool. Key findings include a decrease in the rate of red blood cell transfusion since 2014 and disparities in access to upper endoscopy for patients with upper gastrointestinal bleeding by race/ethnicity across urban and rural hospitals. Projected accumulated savings from consistent implementation of risk stratification tools for upper gastrointestinal bleeding total approximately $1 billion five years after implementation.

    Most current risk scores were designed for use based on the location of the bleeding source: upper or lower gastrointestinal tract. However, the location of the bleeding source is not always clear at presentation. I develop and validate electronic health record-based deep learning and machine learning tools for patients presenting with symptoms of acute gastrointestinal bleeding (e.g., hematemesis, melena, hematochezia), which is more relevant and useful in clinical practice. I show that they outperform the leading clinical risk scores for upper and lower gastrointestinal bleeding, the Glasgow-Blatchford score and the Oakland score. While the best-performing gradient boosted decision tree model has overall performance equivalent to the fully connected feedforward neural network model, at the very low risk threshold of 99% sensitivity the deep learning model identifies more very low risk patients.

    Using another deep learning model that can model longitudinal risk, the long short-term memory (LSTM) recurrent neural network, the need for red blood cell transfusion can be predicted at 4-hour intervals over the first 24 hours of the intensive care unit stay for high-risk patients with acute gastrointestinal bleeding. Finally, for implementation it is important to find patients with symptoms of acute gastrointestinal bleeding in real time and to characterize patients by risk using available data in the electronic health record. A decision-rule-based electronic health record phenotype has performance, as measured by positive predictive value, equivalent to deep learning and natural language processing-based models, and after live implementation appears to have increased the use of the Acute Gastrointestinal Bleeding Clinical Care pathway. Patients with acute gastrointestinal bleeding can be differentiated from patients with other groups of disease concepts by directly mapping unstructured clinical text to a common ontology and treating the vector of concepts as signals on a knowledge graph; these patients can be differentiated using unbalanced diffusion earth mover's distances on the graph. For electronic health record data with values missing not at random, MURAL, an unsupervised random forest-based method, handles data with missing values and generates visualizations that characterize patients with gastrointestinal bleeding.

    This thesis forms a basis for understanding the potential of machine learning and deep learning tools to characterize risk for patients with acute gastrointestinal bleeding. In the future, these tools may be critical in implementing integrated risk assessment to keep low-risk patients out of the hospital and to guide resuscitation and timely endoscopic procedures for patients at higher risk of clinical decompensation.
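
    On the very-low-risk cutoff mentioned above: a threshold can be chosen on a validation set as the highest risk score that still preserves 99% sensitivity, and models then compared by how many patients fall below it. A minimal sketch (names are mine, not the thesis's):

        import numpy as np

        def very_low_risk_cutoff(y_true, scores, target_sens=0.99):
            """Highest cutoff keeping sensitivity >= target_sens; patients
            scoring below it are labelled very low risk."""
            y_true, scores = np.asarray(y_true), np.asarray(scores)
            pos = np.sort(scores[y_true == 1])
            k = int(np.floor((1 - target_sens) * len(pos)))  # positives allowed below
            cutoff = pos[k]
            return cutoff, int(np.sum(scores < cutoff))  # size of very-low-risk group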

    What is the added value of using non-linear models to explore complex healthcare datasets?

    Health care is a complex system, and it is therefore expected to behave in a non-linear manner. It is important for the delivery of health interventions to patients that the best possible analysis of the available data is undertaken. Many of the conventional models used for health care data are linear. This research compares the performance of linear models with non-linear models on two health care data sets of complex interventions. Logistic regression, latent class analysis, and a classification artificial neural network were each used to model outcomes for patients using data from a randomised controlled trial of a cognitive behavioural complex intervention for non-specific low back pain. A Cox proportional hazards model and an artificial neural network were used to model survival and the hazards for different sub-groups of patients using an observational study of a cardiovascular rehabilitation complex intervention. The artificial neural network and an ordinary logistic regression were more accurate in classifying patient recovery from back pain than a logistic regression on latent class membership. The most sensitive models were the artificial neural network and the latent class logistic regression. The best overall performance was achieved by the artificial neural network, which provided both sensitivity and accuracy. Survival was modelled equally well by the Cox model and the artificial neural network, when compared to the empirical Kaplan-Meier survival curve. Long-term survival for the cardiovascular patients was strongly associated with secondary prevention medications, and fitness was also important. Moreover, improvement in fitness during the rehabilitation period to a fairly modest 'high fitness' category was as advantageous for long-term survival as having achieved that same level of fitness by the beginning of the rehabilitation period. Having adjusted for fitness, BMI was not a predictor of long-term survival after a cardiac event or procedure. The Cox proportional hazards model was constrained by its assumptions to produce hazard trajectories proportional to the baseline hazard. The artificial neural network model produced hazard trajectories that vary, giving rise to hypotheses about how the predictors of survival interact in their influence on the hazard. The artificial neural network, an exemplar non-linear model, has been shown to match or exceed the capability of conventional models in the analysis of complex health care data sets.
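
    A minimal sketch of the survival-modelling comparison using the lifelines library (with a bundled example dataset standing in for the cardiac rehabilitation cohort; the study's own data and covariates are not reproduced here):

        from lifelines import CoxPHFitter, KaplanMeierFitter
        from lifelines.datasets import load_rossi  # stand-in data, not the cardiac cohort

        df = load_rossi()  # 'week' = duration, 'arrest' = event indicator

        # Semi-parametric Cox model: hazards proportional to a shared baseline
        cph = CoxPHFitter()
        cph.fit(df, duration_col="week", event_col="arrest")
        cph.print_summary()

        # Empirical Kaplan-Meier curve as the non-parametric benchmark
        kmf = KaplanMeierFitter()
        kmf.fit(df["week"], event_observed=df["arrest"])
        print(kmf.survival_function_.tail())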

    Feature Selection and Non-Euclidean Dimensionality Reduction: Application to Electrocardiology.

    Heart disease has been the leading cause of human death for decades. To improve the treatment of heart disease, algorithms for reliable computer diagnosis using electrocardiogram (ECG) data have become an area of active research. This thesis utilizes well-established methods from cluster analysis, classification, and localization to cluster and classify ECG data, and aims to help clinicians diagnose and treat heart disease. The power of these methods is enhanced by state-of-the-art feature selection and dimensionality reduction. The specific contributions of this thesis are as follows. First, a unique combination of ECG feature selection and mixture model clustering is introduced to classify the sites of origin of ventricular tachycardias. Second, we apply a restricted Boltzmann machine (RBM) to learn sparse representations of ECG signals and to build an enriched classifier from patient data. Third, a novel manifold learning algorithm, called Quaternion Laplacian Information Maps (QLIM), is introduced and applied to visualize high-dimensional ECG signals. These methods are applied to the design of an automated supervised classification algorithm to help a physician identify the origin of ventricular arrhythmias (VAs) directly from a patient's ECG data. The algorithm is trained on a large database of ECGs and catheter positions collected during electrophysiology (EP) pace-mapping procedures. The proposed algorithm is demonstrated to have a correct classification rate of over 80% for the difficult task of classifying VAs having epicardial or endocardial origins.
    PhD, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/113303/1/dyjung_1.pd
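
    The representation-then-classify pattern described for the RBM contribution can be sketched with scikit-learn's BernoulliRBM (a generic illustration on a stand-in dataset, not the thesis's ECG pipeline):

        from sklearn.datasets import load_digits  # stand-in for windowed ECG beats
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import BernoulliRBM
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import minmax_scale

        X, y = load_digits(return_X_y=True)
        X = minmax_scale(X)  # BernoulliRBM expects inputs in [0, 1]

        # Unsupervised RBM features feeding a simple supervised classifier
        model = Pipeline([
            ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05,
                                 n_iter=20, random_state=0)),
            ("clf", LogisticRegression(max_iter=1000)),
        ])
        model.fit(X, y)
        print(f"training accuracy: {model.score(X, y):.3f}")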

    Strategies for neural networks in ballistocardiography with a view towards hardware implementation

    Get PDF
    A thesis submitted for the degree of Doctor of Philosophy at the University of Luton. The work described in this thesis is based on the results of a clinical trial conducted by the research team at the Medical Informatics Unit of the University of Cambridge, which show that the ballistocardiogram (BCG) has prognostic value in detecting impaired left ventricular function before it becomes clinically overt as myocardial infarction leading to sudden death. The objective of this study is to develop and demonstrate a framework for realising an on-line BCG signal classification model in a portable device that would have the potential to find pathological signs as early as possible for home health care. Two new on-line automatic models for time-domain BCG classification are proposed. Both systems are based on a two-stage process: input feature extraction followed by a neural classifier. One system uses a principal component analysis neural network, and the other a discrete wavelet transform, to reduce the input dimensionality. Results of the classification, dimensionality reduction, and comparison are presented. The combined wavelet transform and MLP system has more reliable performance than the combined neural networks system in situations where the data available to determine the network parameters is limited. Moreover, the wavelet transform requires no prior knowledge of the statistical distribution of the data samples, and the computational complexity and training time are reduced. Overall, a methodology for realising an automatic BCG classification system for a portable instrument is presented. A fully parallel neural network design for a low-cost platform using field programmable gate arrays (Xilinx's XC4000 series) is explored. This addresses the potential speed requirements in the biomedical signal processing field. It also demonstrates a flexible hardware design approach, so that an instrument's parameters can be updated as data expands over time. To reduce the hardware design complexity and to increase system performance, a hybrid learning algorithm using random optimisation and the backpropagation rule is developed to achieve an efficient weight update mechanism in low-weight-precision learning. The simulation results show that the hybrid learning algorithm is effective in solving the network paralysis problem, and convergence is much faster than with the standard backpropagation rule. The hidden and output layer nodes have been mapped onto Xilinx FPGAs with automatic placement and routing tools. The static timing analysis results suggest that the proposed network implementation could achieve a performance of 2.7 billion connections per second.
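
    The thesis does not spell out the hybrid update here, but the general idea of combining backpropagation with accept-if-better random perturbations (to escape network paralysis) can be sketched as follows; all names and constants are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def hybrid_step(loss_fn, w, grad, lr=0.1, sigma=0.05):
            """Try a gradient (backpropagation) step; if it does not reduce the
            loss (e.g. saturated units giving near-zero gradients), fall back to
            random optimisation: keep a Gaussian perturbation only if it helps."""
            base = loss_fn(w)
            candidate = w - lr * grad            # standard backprop move
            if loss_fn(candidate) < base:
                return candidate
            trial = w + rng.normal(0.0, sigma, size=w.shape)
            return trial if loss_fn(trial) < base else w

        # Toy usage on a quadratic loss (illustrative only)
        loss = lambda v: float(np.sum(v ** 2))
        w = rng.normal(size=4)
        for _ in range(200):
            w = hybrid_step(loss, w, grad=2 * w)
        print(f"final loss: {loss(w):.6f}")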