
    ALEC: Active learning with ensemble of classifiers for clinical diagnosis of coronary artery disease

    Invasive angiography is the reference standard for coronary artery disease (CAD) diagnosis but is expensive and associated with certain risks. Machine learning (ML) using clinical and noninvasive imaging parameters can be used for CAD diagnosis to avoid the side effects and cost of angiography. However, ML methods require labeled samples for efficient training. The scarcity of labeled data and high labeling costs can be mitigated by active learning, which selectively queries challenging samples for labeling. To the best of our knowledge, active learning has not yet been used for CAD diagnosis. An Active Learning with Ensemble of Classifiers (ALEC) method is proposed for CAD diagnosis, consisting of four classifiers. Three of these classifiers determine whether a patient's three main coronary arteries are stenotic; the fourth predicts whether the patient has CAD. ALEC is first trained using labeled samples. For each unlabeled sample, if the outputs of the classifiers are consistent, the sample along with its predicted label is added to the pool of labeled samples. Inconsistent samples are manually labeled by medical experts before being added to the pool. Training is then performed once more using all samples labeled so far. These interleaved phases of labeling and training are repeated until all samples are labeled. Compared with 19 other active learning algorithms, ALEC combined with a support vector machine classifier attained superior performance, with 97.01% accuracy. Our method is also justified mathematically. We further provide a comprehensive analysis of the CAD dataset used in this paper: pairwise feature correlations are computed; the top 15 features contributing to CAD and to stenosis of the three main coronary arteries are determined; the relationship between stenosis of the main arteries is presented using conditional probabilities; and the effect of considering the number of stenotic arteries on sample discrimination is investigated. The discriminative power over dataset samples is visualized by taking each of the three main coronary arteries in turn as the sample label and treating the two remaining arteries as sample features.
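The labeling loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the nearest-centroid ensemble members, function names, and simulated oracle are hypothetical stand-ins (the paper pairs ALEC with an SVM), intended only to show how consistent predictions are auto-labeled while disagreements are routed to an expert.

```python
import numpy as np

# Illustrative sketch of an ALEC-style round: an ensemble labels the
# unlabeled samples it agrees on and routes disagreements to an expert.
# The nearest-centroid members and the oracle callback are hypothetical
# stand-ins for the classifiers used in the paper.

def train_ensemble(X, y, n_members=3, rng=None):
    """Train a tiny ensemble of nearest-centroid classifiers on bootstraps."""
    if rng is None:
        rng = np.random.default_rng(0)
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), len(X))      # bootstrap resample
        Xb, yb = X[idx], y[idx]
        members.append({c: Xb[yb == c].mean(axis=0) for c in np.unique(yb)})
    return members

def predict_one(member, x):
    """Assign x to the class with the nearest centroid."""
    classes = list(member)
    dists = [np.linalg.norm(x - member[c]) for c in classes]
    return classes[int(np.argmin(dists))]

def alec_round(members, X_unlab, oracle):
    """One labeling round: auto-label consistent samples, query the rest."""
    auto, queried = [], []
    for i, x in enumerate(X_unlab):
        preds = {predict_one(m, x) for m in members}
        if len(preds) == 1:              # all classifiers agree
            auto.append((i, preds.pop()))
        else:                            # disagreement -> expert labels it
            queried.append((i, oracle(i)))
    return auto, queried
```

In the full method, the ensemble is retrained on the grown labeled pool and the round repeats until no unlabeled samples remain.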

    Unsupervised Similarity-Based Risk Stratification for Cardiovascular Events Using Long-Term Time-Series Data

    In medicine, one often bases decisions upon a comparative analysis of patient data. In this paper, we build upon this observation and describe similarity-based algorithms to risk-stratify patients for major adverse cardiac events. We evolve the traditional approach of comparing patient data in two ways. First, we propose similarity-based algorithms that compare patients in terms of their long-term physiological monitoring data. Symbolic mismatch identifies functional units in long-term signals and measures changes in the morphology and frequency of these units across patients. Second, we describe similarity-based algorithms that are unsupervised and do not require comparisons to patients with known outcomes for risk stratification. This is achieved by using an anomaly detection framework to identify patients who are unlike others in a population and may potentially be at elevated risk. We demonstrate the potential utility of our approach by showing how symbolic mismatch-based algorithms can classify patients as being at high or low risk of major adverse cardiac events by comparing their long-term electrocardiograms to those of a large population. We describe how symbolic mismatch can be used in three different existing methods: one-class support vector machines, nearest-neighbor analysis, and hierarchical clustering. When evaluated on a population of 686 patients with available long-term electrocardiographic data, symbolic mismatch-based comparative approaches were able to identify patients at roughly a two-fold increased risk of major adverse cardiac events in the 90 days following acute coronary syndrome. These results were consistent even after adjusting for other clinical risk variables. National Science Foundation (U.S.) (CAREER award 1054419)
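The unsupervised nearest-neighbor variant can be sketched from a precomputed pairwise dissimilarity matrix standing in for symbolic mismatch between long-term ECGs. The function names, neighbor count k, and quantile threshold below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def knn_anomaly_scores(D, k=2):
    """Score each patient by mean dissimilarity to its k nearest neighbors.

    D is a symmetric pairwise dissimilarity matrix (e.g. symbolic mismatch
    between patients' long-term ECGs). A high score means the patient is
    unlike most of the population, the anomaly-detection premise above.
    """
    n = D.shape[0]
    scores = np.empty(n)
    for i in range(n):
        d = np.delete(D[i], i)            # exclude the self-distance
        scores[i] = np.sort(d)[:k].mean() # mean over k nearest neighbors
    return scores

def stratify(D, k=2, quantile=0.9):
    """Flag patients above the chosen score quantile as potentially high risk."""
    s = knn_anomaly_scores(D, k)
    return s >= np.quantile(s, quantile)
```

The same matrix D could equally be fed to a one-class SVM or hierarchical clustering, the other two methods the paper combines with symbolic mismatch.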

    Application of Dual-Energy Computed Tomography to the Evaluation of Coronary Atherosclerotic Plaque

    Atherosclerotic coronary artery disease is responsible for around 50% of cardiovascular deaths in the USA. Early detection and characterization of coronary artery atherosclerotic plaque could help prevent cardiac events. Computed tomography (CT) is an excellent modality for imaging calcifications and has higher spatial resolution than other common non-invasive modalities (e.g., MRI), making it more suitable for coronary plaque detection. However, attenuation-based classification of non-calcified plaques as fibrous or lipid is difficult with conventional CT, which relies on a single x-ray energy. Dual-energy CT (DECT) may provide additional attenuation data for the identification and discrimination of plaque components. The purpose of this research was to evaluate the feasibility of DECT imaging for coronary plaque characterization and, further, to explore the limits of CT for non-invasive plaque analysis. DECT techniques were applied to plaque classification using a clinical CT system. Saline-perfused coronary arteries from autopsies were scanned at 80 and 140 kVp, prior to and during injection of iodinated contrast. Plaque attenuation was measured from CT images and matched to histology. Measurements were compared to assess differences among plaque types. Although calcified and non-calcified plaques could be identified and differentiated with DECT, further characterization of non-calcified plaques was not possible. The results also demonstrated that calcified plaque and iodine could be discriminated. The limits of x-ray-based non-calcified plaque discrimination were assessed using microCT, a pre-clinical x-ray-based high-spatial-resolution modality. Phantoms and tissues of different composition were scanned using different tube voltages (i.e., different energies) and the resulting attenuation values were compared. Better vessel wall visualization and an increase in tissue contrast resolution were observed with decreasing x-ray energy.
The feasibility of calcium quantification from contrast-enhanced scans by creating virtual non-contrast images was also investigated.
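The core dual-energy idea, recovering two basis-material contributions from attenuation measured at two tube voltages, can be sketched as a small linear system. The attenuation coefficients below are made-up placeholders for illustration, not calibrated values for any scanner or material.

```python
import numpy as np

# Hypothetical basis-material attenuation matrix:
# rows are the two tube voltages (80 kVp, 140 kVp),
# columns are two basis materials (calcium-like, iodine-like).
# These numbers are illustrative placeholders, not measured coefficients.
MU = np.array([[0.25, 0.18],
               [0.20, 0.15]])

def decompose(mu_low, mu_high):
    """Solve mu_measured = MU @ fractions for the two basis-material fractions.

    This is the textbook two-material decomposition that lets DECT separate
    calcified plaque from iodinated contrast, as described in the abstract.
    """
    return np.linalg.solve(MU, np.array([mu_low, mu_high]))
```

In practice the basis matrix is calibrated per scanner and energy pair, and noise makes the inversion ill-conditioned when the two energy spectra overlap heavily, one reason further characterization of non-calcified plaques remained out of reach.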


    Multiple Shape Registration using Constrained Optimal Control

    Lagrangian particle formulations of the large deformation diffeomorphic metric mapping (LDDMM) algorithm only allow for the study of a single shape. In this paper, we introduce and discuss both a theoretical and a practical setting for the simultaneous study of multiple shapes that are either stitched to one another or slide along a submanifold. The method is described within the optimal control formalism, and optimality conditions are given, together with the equations needed to implement augmented Lagrangian methods. Experimental results are provided for stitched and sliding surfaces.
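The augmented Lagrangian scheme mentioned above can be sketched generically for an equality-constrained problem. The quadratic objective in the usage below is a toy stand-in for the actual LDDMM matching energy with stitching or sliding constraints; all names, step sizes, and iteration counts are illustrative.

```python
import numpy as np

# Generic augmented-Lagrangian sketch for: min f(x) subject to c(x) = 0.
# The inner loop approximately minimizes
#     L(x) = f(x) + lam * c(x) + (mu / 2) * c(x)**2
# by gradient descent, and the outer loop applies the standard
# multiplier (dual) update lam <- lam + mu * c(x).

def augmented_lagrangian(f_grad, c, c_grad, x0, mu=10.0, outer=50, inner=200):
    """Minimize f subject to the scalar constraint c(x) = 0."""
    x, lam = np.asarray(x0, dtype=float), 0.0
    for _ in range(outer):
        for _ in range(inner):
            g = f_grad(x) + (lam + mu * c(x)) * c_grad(x)
            x = x - 0.01 * g             # fixed-step gradient descent
        lam = lam + mu * c(x)            # multiplier update
    return x, lam
```

For example, minimizing x^2 + y^2 subject to x + y = 1 should converge toward the constrained optimum (0.5, 0.5), with the multiplier approaching -1 by the stationarity condition grad f + lam * grad c = 0.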

    A CNN-LSTM for predicting mortality in the ICU

    An accurate mortality prediction is crucial to healthcare, as it provides an empirical risk estimate for prognostic decision making, patient stratification, and hospital benchmarking. Current prediction methods in practice are severity-of-disease scoring systems that usually involve a fixed set of admission attributes and summarized physiological data. These systems are prone to bias and require substantial manual effort, which motivates an updated approach that can address these shortcomings. Clinical observation notes allow for recording highly subjective data on the patient that can facilitate higher discrimination. Moreover, deep learning models can automatically extract and select features without human input. This thesis investigates the potential of combining a deep learning model with notes for predicting mortality with higher accuracy. A custom architecture, called CNN-LSTM, is conceptualized for mapping multiple notes compiled in a hospital stay to a mortality outcome. It employs both convolutional and recurrent layers, with the former capturing semantic relationships in individual notes independently and the latter capturing temporal relationships between successive notes in a hospital stay. This approach is compared to three severity-of-disease scoring systems in a case study on the MIMIC-III dataset. Experiments are set up to assess the CNN-LSTM for predicting mortality using only the notes from the first 24, 12, and 48 hours of a patient stay. The model is trained using k-fold cross-validation with k=5, and the mortality probability calculated by the three severity scores on the held-out set is used as the baseline. It is found that the CNN-LSTM outperforms the baseline on all experiments, which serves as a proof of concept of how notes and deep learning can improve outcome prediction.
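The two-level encoding, a convolution over each note followed by a recurrence over the stay, can be sketched as a forward pass. The gated update below is a simplified stand-in for a full LSTM cell, and all dimensions, weights, and function names are illustrative, not the thesis's actual configuration.

```python
import numpy as np

# Sketch of the CNN-LSTM idea: a 1-D convolution with ReLU and
# max-over-time pooling encodes each note from its word embeddings,
# and a simple gated recurrence (standing in for an LSTM) aggregates
# the note sequence into one mortality probability.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def encode_note(E, W):
    """E: (T, d) word embeddings of one note; W: (k, d, h) conv filters."""
    k, d, h = W.shape
    T = E.shape[0]
    # Convolve each window of k embeddings with all h filters.
    feats = np.stack([np.tensordot(E[t:t + k], W, axes=([0, 1], [0, 1]))
                      for t in range(T - k + 1)])       # (T-k+1, h)
    return np.maximum(feats, 0).max(axis=0)             # ReLU + max pooling

def predict_mortality(notes, W_conv, W_rec, w_out):
    """Encode each note, recur over the stay, output P(mortality)."""
    s = np.zeros(W_rec.shape[0])                        # recurrent state
    for E in notes:                                     # notes in time order
        v = encode_note(E, W_conv)
        g = sigmoid(W_rec @ s + v)                      # simplified gate
        s = g * s + (1 - g) * v                         # gated state update
    return sigmoid(w_out @ s)
```

The convolutional stage treats each note independently, matching the description of semantic relationships captured per note, while the recurrence carries information across the notes of one hospital stay.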