36 research outputs found

    Decision fusion in healthcare and medicine : a narrative review

    Objective: To provide an overview of the decision fusion (DF) technique and describe its applications in healthcare and medicine at the prevention, diagnosis, treatment and administrative levels. Background: The rapid development of technology over the past 20 years has led to an explosion in data growth in various industries, including healthcare. Big data analysis within healthcare systems is essential for arriving at value-based decisions over time. Diversity and uncertainty in big data analytics have made it impossible to analyze such data with conventional data mining techniques, so alternative solutions are required. DF is a data fusion technique that can increase the accuracy of diagnosis and facilitate the interpretation, summarization and sharing of information. Methods: We conducted a review of articles published between January 1980 and December 2020 from various databases, including Google Scholar, IEEE, PubMed, Science Direct, Scopus and Web of Science, using the keywords decision fusion (DF), information fusion, healthcare, medicine and big data. A total of 141 articles were included in this narrative review. Conclusions: Given the importance of big data analysis in reducing costs and improving the quality of healthcare, along with the potential role of DF in big data analysis, the full potential of this technique, including its advantages, challenges and applications, should be understood before its use. Future studies should focus on describing the methodology and types of data used for its applications within the healthcare sector.
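The simplest form of decision-level fusion described above is majority voting over the outputs of several independent classifiers. The sketch below is a minimal, hypothetical illustration of that idea (the class labels and the three-model scenario are invented; real DF systems may use weighted voting, Bayesian combination, or evidence theory instead):

```python
# Minimal sketch of decision fusion by majority voting.
# Labels and the three-classifier setup are illustrative only.
from collections import Counter

def majority_vote(decisions):
    """Fuse class labels from independent classifiers by majority vote."""
    counts = Counter(decisions)
    label, _ = counts.most_common(1)[0]
    return label

# e.g. three diagnostic models vote on one patient case
print(majority_vote(["disease", "healthy", "disease"]))  # -> disease
```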

    Fusion of Higher Order Spectra and Texture Extraction Methods for Automated Stroke Severity Classification with MRI Images

    This paper presents a scientific foundation for automated stroke severity classification. We have constructed and assessed a system which extracts diagnostically relevant information from Magnetic Resonance Imaging (MRI) images. The design was based on 267 brain images from individual subjects after stroke. They were labeled as either Lacunar Syndrome (LACS), Partial Anterior Circulation Syndrome (PACS), or Total Anterior Circulation Stroke (TACS). The labels indicate different physiological processes which manifest themselves in distinct image texture. The processing system was tasked with extracting texture information that could be used to classify a brain MRI image from a stroke survivor into either LACS, PACS, or TACS. We analyzed 6475 features that were obtained with Gray-Level Run Length Matrix (GLRLM), Higher Order Spectra (HOS), as well as a combination of Discrete Wavelet Transform (DWT) and Gray-Level Co-occurrence Matrix (GLCM) methods. The resulting features were ranked based on the p-value extracted with the Analysis Of Variance (ANOVA) algorithm. The ranked features were used to train and test four types of Support Vector Machine (SVM) classification algorithms according to the rules of 10-fold cross-validation. We found that SVM with Radial Basis Function (RBF) kernel achieves: Accuracy (ACC) = 93.62%, Specificity (SPE) = 95.91%, Sensitivity (SEN) = 92.44%, and Dice-score = 0.95. These results indicate that computer-aided stroke severity diagnosis support is possible. Such systems might lead to progress in stroke diagnosis by enabling healthcare professionals to improve diagnosis and management of stroke patients with the same resources.
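The shape of the pipeline described above (ANOVA-based feature ranking followed by an RBF-kernel SVM under 10-fold cross-validation) can be sketched with scikit-learn. The data here are synthetic, and the texture extractors (GLRLM, HOS, DWT+GLCM) and the reported accuracies are not reproduced; this is only a structural illustration:

```python
# Sketch of an ANOVA-ranked-features + RBF-SVM pipeline under
# 10-fold cross-validation. Synthetic data stand in for the
# 6475 texture features of the original study.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# 267 samples mirrors the study's image count; features are synthetic
X, y = make_classification(n_samples=267, n_features=200,
                           n_informative=20, n_classes=3,
                           random_state=0)
clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=50),  # ANOVA-ranked features
                    SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y,
                         cv=StratifiedKFold(10, shuffle=True, random_state=0))
print(f"mean 10-fold accuracy: {scores.mean():.3f}")
```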

    The matter of white and gray matter in cognitive impairment

    Cognitive impairment spans from minor subjective cognitive impairment to disabling dementia. Many biomarkers have been developed to monitor different aspects of cognitive impairment. Magnetic resonance imaging is the most widely used neuroimaging biomarker in research and can measure gray matter (GM) and white matter (WM) changes. Although there is a consensus that atrophy in GM is a marker for neuronal loss, there is little evidence assessing the role of WM changes. The aim of this thesis is first to develop a tool to reliably measure changes in WM in the form of white matter hyperintensities (WMH), and second to evaluate the role of WM and GM changes in the early stages of cognitive decline. In Study I and Study II, a fully automated method for segmentation of WMH was developed and validated. Validation results indicated that the WMH segmentation was performed with high similarity to manual delineation and with excellent reproducibility. In Study III, coronary heart disease (CHD) and hypertension, which are known to contribute to WM damage, were examined and their effect on GM and WM changes was investigated in a group of 69 individuals with 30-year follow-up. We showed that CHD and hypertension indeed affect GM volume and thickness, and that the effect of CHD is partially independent of hypertension. However, the results indicate no significant effect on WMH, which we believe is due to the fact that WMH were measured as a crude total volume. In Study IV, a pipeline was developed to isolate the WM tract connecting each GM region to the rest of the brain and to measure the burden of WMH on each tract, hereinafter tract-based WMH. We used a cohort of 257 cognitively normal (CTL), 87 subjective cognitive impairment (SCI) and 124 mild cognitive impairment (MCI) subjects and examined their GM volume, tract-based WMH and cognitive performance. Our results indicated that the fraction of variance in GM volume that can be explained by tract-based WMH in SCI subjects is significantly higher than in both CTL and MCI subjects. The results also showed that in subjects with high and low cognitive performance, tract-based WMH can barely explain any GM volume change. However, in subjects with slight cognitive impairment, tract-based WMH can explain the changes in GM volume. In summary, we investigated different ways of measuring the damage of WMH and showed that the role of WMH is more pronounced when measured in relation to the WM tract they affect. The effect of WMH on GM has been shown to be mainly in the earlier stages of cognitive impairment.
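The "fraction of variance in GM volume explained by tract-based WMH" statistic used above is the R² of a regression of GM volume on WMH burden. A minimal sketch with simulated data (the cohort measurements themselves are not available here, and the effect size below is invented):

```python
# Illustration of the variance-explained (R^2) statistic relating
# tract-based WMH burden to GM volume. All data are simulated.
import numpy as np

rng = np.random.default_rng(0)
wmh = rng.random(100)                              # tract-based WMH burden
gm = -0.8 * wmh + 0.2 * rng.standard_normal(100)   # simulated GM volume

# R^2 from a one-predictor least-squares fit
slope, intercept = np.polyfit(wmh, gm, 1)
residuals = gm - (slope * wmh + intercept)
r2 = 1 - residuals.var() / gm.var()
print(f"fraction of GM variance explained by WMH: {r2:.2f}")
```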

    Computed tomography image analysis for the detection of obstructive lung diseases

    Damage to the small airways resulting from direct lung injury or associated with many systemic disorders is not easy to identify. Non-invasive techniques such as chest radiography or conventional tests of lung function often cannot reveal the pathology. On Computed Tomography (CT) images, the signs suggesting the presence of obstructive airways disease are subtle, and inter- and intra-observer variability can be considerable. The goal of this research was to implement a system for the automated analysis of CT data of the lungs. Its function is to help clinicians establish a confident assessment of specific obstructive airways diseases and increase the precision of investigation of structure/function relationships. To help resolve the ambiguities of the CT scans, the main objectives of our system were to provide a functional description of the raster images, extract semi-quantitative measurements of the extent of obstructive airways disease, and propose a clinical diagnosis aid using a priori knowledge of CT image features of the diseased lungs. The diagnostic process presented in this thesis involves the extraction and analysis of multiple findings. Several novel low-level computer vision feature extractors and image processing algorithms were developed for extracting the extent of the hypo-attenuated areas, textural characterisation of the lung parenchyma, and morphological description of the bronchi. The fusion of the results of these extractors was achieved with a probabilistic network combining a priori knowledge of lung pathology. Creating a CT lung phantom allowed for the initial validation of the proposed methods. Performance of the techniques was then assessed with clinical trials involving other diagnostic tests and expert chest radiologists. The results of the proposed system for diagnostic decision-support demonstrated the feasibility and importance of information fusion in medical image interpretation.
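The probabilistic fusion of independent findings described above can be illustrated with a naive-Bayes-style combination of likelihood ratios. This is only a toy stand-in for the thesis's probabilistic network: the prior and the per-finding likelihood ratios below are invented, and the independence assumption is a simplification:

```python
# Toy naive-Bayes fusion of independent CT findings.
# Prior and likelihood ratios are hypothetical, not from the thesis.
def fuse_findings(prior, likelihood_ratios):
    """Combine a disease prior with per-finding likelihood ratios
    P(finding | disease) / P(finding | healthy), assuming the
    findings are conditionally independent."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# e.g. hypo-attenuated areas, parenchymal texture, bronchial morphology
posterior = fuse_findings(prior=0.1, likelihood_ratios=[4.0, 2.5, 1.5])
print(f"posterior probability of obstructive disease: {posterior:.2f}")
```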

    Brain Tumor Diagnosis Support System: A Decision Fusion Framework

    An important factor in providing effective and efficient therapy for brain tumors is early and accurate detection, which can increase survival rates. Current image-based tumor detection and diagnosis techniques are heavily dependent on interpretation by neuro-specialists and/or radiologists, making the evaluation process time-consuming and prone to human error and subjectivity. In addition, the widespread use of MR spectroscopy requires specialized processing and assessment of the data, along with clear and rapid presentation of the results as images or maps for the routine medical interpretation of an exam. Automatic brain tumor detection and classification have the potential to offer greater efficiency and more accurate predictions. However, the performance accuracy of automatic detection and classification techniques tends to be dependent on the specific image modality and is well known to vary from technique to technique. For this reason, it is prudent to examine the variations in performance of these methods in order to obtain consistently high levels of accuracy. The goal of the proposed framework is to design, implement, and evaluate classification software for discriminating between various brain tumor types on magnetic resonance imaging (MRI) using textural features. This thesis introduces a brain tumor detection support system that involves the use of a variety of tumor classifiers. The system is designed as a decision fusion framework that enables these multiple classifiers to analyze medical images, such as those obtained from magnetic resonance imaging (MRI). The fusion procedure is grounded in Dempster-Shafer evidence theory. Numerous experimental scenarios have been implemented to validate the efficiency of the proposed framework. Compared with alternative approaches, the outcomes show that the methodology developed in this thesis achieves higher accuracy and higher computational efficiency.
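The core of Dempster-Shafer fusion is Dempster's rule of combination, which merges two classifiers' basic probability assignments and renormalizes away conflicting mass. A minimal sketch (the tumor-type hypotheses and mass values are invented for illustration; the thesis's actual classifiers are not reproduced):

```python
# Minimal Dempster-Shafer combination of two classifiers' mass
# functions over tumor-type hypotheses (frozensets of labels).
# The mass assignments below are illustrative only.
def dempster_combine(m1, m2):
    """Dempster's rule: fuse two basic probability assignments."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2            # mass on disjoint hypotheses
    norm = 1.0 - conflict                      # renormalize remaining mass
    return {h: w / norm for h, w in combined.items()}

glioma = frozenset({"glioma"})
meningioma = frozenset({"meningioma"})
either = glioma | meningioma                   # uncommitted (ignorance) set
m1 = {glioma: 0.6, either: 0.4}
m2 = {glioma: 0.5, meningioma: 0.3, either: 0.2}
fused = dempster_combine(m1, m2)
print(f"fused belief in glioma: {fused[glioma]:.3f}")
```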

    Tracking the Temporal-Evolution of Supernova Bubbles in Numerical Simulations

    The study of low-dimensional, noisy manifolds embedded in a higher-dimensional space has been extremely useful in many applications, from the chemical analysis of multi-phase flows to simulations of galactic mergers. Building a probabilistic model of the manifolds has helped in describing their essential properties and how they vary in space. However, when the manifold is evolving through time, a joint spatio-temporal modelling is needed in order to fully comprehend its nature. We propose a first-order Markovian process that propagates the spatial probabilistic model of a manifold at a fixed time to its adjacent temporal stages. The proposed methodology is demonstrated using a particle simulation of an interacting dwarf galaxy to describe the evolution of a cavity generated by a supernova.

    Dynamics under Uncertainty: Modeling, Simulation and Complexity

    The dynamics of systems have proven to be very powerful tools in understanding the behavior of different natural phenomena throughout the last two centuries. However, the attributes of natural systems are observed to deviate from their classical states due to the effect of different types of uncertainties. Randomness and impreciseness are the two major sources of uncertainty in natural systems. Randomness is modeled by different stochastic processes, while impreciseness can be modeled by fuzzy sets, rough sets, Dempster-Shafer theory, etc.
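The distinction drawn above can be illustrated concretely: whereas randomness assigns probabilities to crisp outcomes, a fuzzy set assigns each outcome a degree of membership. A toy sketch using a triangular membership function (the breakpoints below are invented for illustration):

```python
# Toy illustration of impreciseness via a triangular fuzzy set:
# degree of membership of a temperature reading in the fuzzy set
# "high fever". Breakpoints (38, 39.5, 41) are hypothetical.
def triangular(x, a, b, c):
    """Triangular membership function: rises on [a, b], falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

print(triangular(39.0, 38.0, 39.5, 41.0))  # partial membership in (0, 1)
```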