7 research outputs found

    Arrhythmia detection using resampling and deep learning methods on unbalanced data

    Cardiovascular diseases cause millions of deaths around the world each year. One way to detect abnormal heart conditions is through analysis of the electrocardiogram (ECG) signal. This paper's goal is to use machine learning and deep learning methods such as Support Vector Machines (SVM), Random Forests, Light Gradient Boosting Machine (LightGBM), Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM) and Bidirectional Long Short-Term Memory (BLSTM) to classify arrhythmias, with particular interest in the rare disease classes. To deal with the imbalance in the dataset, we used resampling methods such as SMOTE Tomek-Links and SMOTE ENN to improve the representation ratio of the minority classes. Although the machine learning models did not improve much when trained on the resampled dataset, the deep learning models showed more impressive results. In particular, the LSTM model fitted on the dataset resampled with the SMOTE ENN method provides the best precision-recall trade-off for the minority classes Supraventricular beat and Fusion of ventricular and normal beat, with recall of 83% and 88% and precision of 74% and 66% for the two classes respectively, whereas the macro-weighted recall is 92% and precision is 82%. The authors would like to acknowledge the use of the University of Oxford Advanced Research Computing (ARC) facility in carrying out this work: http://dx.doi.org/10.5281/zenodo.22558. Specifications: https://www.arc.ox.ac.uk/arc-systems
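
    As a rough, illustrative sketch of the resampling step described above (not the paper's exact pipeline: the placeholder data, the hyperparameters and the use of a simple tree-based classifier in place of the deep models are all assumptions), SMOTE ENN from the imbalanced-learn library can be applied to a training split like this:

```python
# Minimal sketch, assuming a fixed-length beat matrix X and integer class labels y.
# SMOTE ENN rebalances only the training split; the test split keeps its natural
# imbalance so precision/recall still reflect the original class distribution.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from imblearn.combine import SMOTEENN

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 187))                            # placeholder beat segments
y = rng.choice([0, 1, 2], size=2000, p=[0.90, 0.07, 0.03])  # heavily imbalanced labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Combined over-sampling (SMOTE) and cleaning (ENN) of the training data only.
X_res, y_res = SMOTEENN(random_state=0).fit_resample(X_train, y_train)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_res, y_res)
print(classification_report(y_test, clf.predict(X_test), digits=3))
```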

    Advancing prognostic precision in pulmonary embolism: A clinical and laboratory-based artificial intelligence approach for enhanced early mortality risk stratification

    Background: Acute pulmonary embolism (PE) is a critical medical emergency that necessitates prompt identification and intervention. Accurate prognostication of early mortality is vital for recognizing patients at elevated risk for unfavourable outcomes and administering suitable therapy. Machine learning (ML) algorithms hold promise for enhancing the precision of early mortality prediction in PE patients. Objective: To devise an ML algorithm for early mortality prediction in PE patients by employing clinical and laboratory variables. Methods: This study utilized diverse oversampling techniques to improve the performance of various machine learning models, including ANN, SVM, DT, RF, and AdaBoost, for early mortality prediction. Appropriate oversampling methods were chosen for each model based on algorithm characteristics and dataset properties. Predictor variables included four lab tests, eight physiological time series indicators, and two general descriptors. Evaluation used metrics such as accuracy, F1 score, precision, recall, Area Under the Curve (AUC) and Receiver Operating Characteristic (ROC) curves, providing a comprehensive view of the models' predictive abilities. Results: The findings indicated that the RF model with random oversampling exhibited superior performance among the five models assessed, achieving elevated accuracy and precision alongside high recall for predicting the death class. The oversampling approaches effectively equalized the sample distribution among the classes and enhanced the models' performance. Conclusions: The suggested ML technique can efficiently prognosticate mortality in patients afflicted with acute PE. The RF model with random oversampling can aid healthcare professionals in making well-informed decisions regarding the treatment of patients with acute PE. The study underscores the significance of oversampling methods in managing imbalanced data and emphasizes the potential of ML algorithms in refining early mortality prediction for PE patients.
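
    Mirroring the best-performing configuration reported above (Random Forest with random oversampling), the following minimal sketch uses imbalanced-learn's pipeline so that oversampling is fitted only inside each cross-validation training fold; the toy data, feature count and hyperparameters are placeholders, not the study's clinical dataset:

```python
# Minimal sketch: random oversampling + Random Forest, evaluated with cross-validated
# ROC AUC. The imbalanced-learn pipeline resamples only within each training fold,
# so validation folds keep the original class balance.
import numpy as np
from imblearn.over_sampling import RandomOverSampler
from imblearn.pipeline import Pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(1500, 14))            # placeholder: 4 lab tests + 8 physiological indicators + 2 descriptors
y = (rng.random(1500) < 0.08).astype(int)  # rare early-mortality class

model = Pipeline(steps=[
    ("oversample", RandomOverSampler(random_state=1)),
    ("rf", RandomForestClassifier(n_estimators=300, random_state=1)),
])

print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```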

    Intelligent Planning for Refractive Surgeries: A Modelling and Visualisation-based Approach

    Laser refractive surgeries are commonly used in ophthalmic operations. Considerable research has been carried out and encouraging progress made in recent years, covering the properties of the cornea and the behaviour of tissue in different parts of the eye, the topography and material expression of individual patients' eyes, and prediction using finite element (FE) analysis to estimate the change in corneal shape and refractive power. Further effort is still required to advance the research to aid decision making for laser refractive surgeries. This study comprehensively reviews the latest techniques of refractive surgery and research on computational analysis and modelling techniques and their applications, especially the current prediction and planning techniques for laser refractive surgeries. The aim of this study is to develop an intelligent assistant tool for laser refractive surgeries with prediction and visualisation functions. For this aim, two objectives will be achieved: prediction with the clinical dataset and human vision simulation. Owing to the nature of clinical statistics, the clinical dataset is often incomplete, imbalanced, and sparse. Three methods are proposed to predict surgery parameters and outcomes using the clinical dataset. A multiple imputation method, based on multiple regression, is proposed for imputing the missing data. For the imbalance of data distribution in the clinical dataset, an over-sampling method for the minority data is proposed; the accuracy of predictions for the minority data is close to that for the majority data. Finally, an ensemble learning method optimised by a genetic algorithm is proposed to improve the accuracy of the prediction results with a sparse dataset. According to the distribution of the samples in the clinical data, the percentage of unacceptable results is 23.02%. The methods in this study can identify possible unacceptable cases with an accuracy of 79.02%, that is, they can reduce the percentage of unacceptable results from 23.02% to 4.82%. In human vision simulation, the study focuses on how the human vision simulation can be determined and obtained accurately within a required timeframe. The ray tracing technique can provide more precise results than the rasterisation technique, especially for the simulation of light reflection and refraction in the human eyeball. However, the thin lens assumption affects the accuracy of pathological vision simulation with the ray tracing technique. An improved schematic human eye model is proposed to obtain a numerical model predicting the size of the defocus blur for pathological vision, which wraps the shape of the ray intersection area. To generalise this model to other healthy and pathological vision, an intelligent blur range derivation method is proposed. On the other hand, ray tracing scene rendering requires repeated iterative computation, which takes a significant amount of time. A GPU-based ray tracing computing method is proposed to accelerate and optimise the rendering of scenes. With this method, the scene rendering speed is about 75 times faster than using the CPU.
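
    For the missing-data step described above, regression-based multiple imputation can be sketched with scikit-learn's IterativeImputer; the column names, toy values and pooling-by-averaging below are illustrative assumptions rather than the thesis's actual implementation:

```python
# Minimal sketch: regression-based imputation of missing clinical fields with
# IterativeImputer, drawing several imputations and averaging them.
# Column names and values are invented placeholders.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (activates IterativeImputer)
from sklearn.impute import IterativeImputer

df = pd.DataFrame({
    "corneal_thickness_um": [545.0, np.nan, 512.0, 530.0, np.nan],
    "sphere_dioptre":       [-3.25, -5.00, np.nan, -2.75, -6.50],
    "cylinder_dioptre":     [-0.75, np.nan, -1.25, np.nan, -0.50],
})

# sample_posterior=True draws from the predictive distribution, so different seeds
# yield different plausible imputations, which are then pooled by averaging.
imputations = [
    pd.DataFrame(
        IterativeImputer(sample_posterior=True, random_state=seed).fit_transform(df),
        columns=df.columns,
    )
    for seed in range(5)
]
print((sum(imputations) / len(imputations)).round(2))
```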

    Advances in Data Mining Knowledge Discovery and Applications

    Advances in Data Mining Knowledge Discovery and Applications aims to help data miners, researchers, scholars, and PhD students who wish to apply data mining techniques. The primary contribution of this book is highlighting frontier fields and implementations of knowledge discovery and data mining. It may seem that the same things are repeated, but in general the same approaches and techniques can help us in different fields and areas of expertise. This book presents knowledge discovery and data mining applications in two different sections. As is well known, data mining draws on statistics, machine learning, data management and databases, pattern recognition, artificial intelligence, and other areas. In this book, most of these areas are covered with different data mining applications. The eighteen chapters have been classified into two parts: Knowledge Discovery and Data Mining Applications.

    Personalisation of heart failure care using clinical trial data

    Heart failure is a common, debilitating and life-limiting disease, resulting in a large burden for both the individual patient and healthcare provision. Therefore, optimisation of treatments for these patients is of prime importance. Heart failure with reduced ejection fraction has a large evidence base of effective treatments, and more recently effective treatments have started to be identified for those with preserved ejection fraction. The effectiveness of these treatments is calculated at a population level, and there is a great deal of interest in identifying whether different patients may benefit more from certain treatments. In addition, we wish to understand more about different phenotypes in heart failure, to help understand what the patient might expect for the trajectory of their illness and potentially to develop targeted treatments. To explore these issues further, this thesis presents several approaches using heart failure clinical trial data to further understand the patient journey and explore how treatment may be delivered in a more personalised fashion. The first analyses look at the patterns of heart failure hospitalisations, including the timing of admissions, and the relationship with different modes of death. This was examined in heart failure with both preserved and reduced ejection fraction. The accepted trajectory of recurrent admissions falling closer together over time was confirmed, and admissions closer together were linked to a higher risk of cardiovascular death, particularly due to progressive pump failure. Sudden death did appear to be truly sudden and not strongly linked to hospitalisations. The next approach was to perform latent class analysis to identify clusters of patients, or phenotypes, within heart failure with preserved and reduced ejection fraction separately, using a data-driven method. Phenotypes were identified with consistency across different data and using different approaches, and these phenotypes were clinically recognisable. Identifying phenotypes in this way may be a route to looking for differential responses to treatments. Lastly, supervised machine learning methods were used to predict outcomes in patients with heart failure and reduced ejection fraction. These techniques provide more analytical flexibility, but did not show a performance benefit compared with prognostic models based on survival analysis methods; overall, the predictive abilities were modest. In conclusion, several avenues were explored to help understand the patient journey in heart failure, aiming to give more detail about the expected patient trajectory and exploring methods to examine differential treatment responses in phenotypes of patients with heart failure.
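
    The phenotyping step above uses latent class analysis on trial variables. As a loose illustration of the model-based clustering idea only (latent class models for categorical indicators are not part of scikit-learn, so a Gaussian mixture on continuous placeholder features stands in here, and is not the thesis's actual method), the number of latent classes can be chosen by BIC, as is common in such analyses:

```python
# Minimal sketch of model-based clustering for phenotype discovery: fit Gaussian
# mixtures with different numbers of components and keep the one with the lowest BIC.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = StandardScaler().fit_transform(rng.normal(size=(800, 6)))  # placeholder baseline variables

fits = {k: GaussianMixture(n_components=k, random_state=2).fit(X) for k in range(2, 7)}
best_k = min(fits, key=lambda k: fits[k].bic(X))   # lowest BIC wins
labels = fits[best_k].predict(X)
print(best_k, np.bincount(labels))                 # chosen class count and cluster sizes
```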

    Advanced Process Monitoring for Industry 4.0

    This book reports recent advances in Process Monitoring (PM) to cope with the many challenges raised by the new production systems, sensors and “extreme data” conditions that emerged with Industry 4.0. Concepts such as digital twins and deep learning are brought to the PM arena, pushing forward the capabilities of existing methodologies to handle more complex scenarios. The evolution of classical paradigms such as Latent Variable modeling, Six Sigma and FMEA is also covered. Applications span a wide range of domains such as microelectronics, semiconductors, chemicals, materials and agriculture, as well as the monitoring of rotating equipment, combustion systems and membrane separation processes.