5 research outputs found

    A novel automated tower graph based ECG signal classification method with hexadecimal local adaptive binary pattern and deep learning

    Electrocardiography (ECG) signal recognition is a popular research topic in machine learning. In this paper, a novel transformation called the tower graph transformation is proposed to classify ECG signals with high accuracy. It employs a tower graph that combines minimum, maximum, and average pooling to generate new signals for feature extraction. To extract meaningful features, we present a novel one-dimensional hexadecimal pattern. To select distinctive and informative features, an iterative ReliefF and Neighborhood Component Analysis (NCA) based feature selection is used. Together, these methods form a novel ECG signal classification approach. In the preprocessing phase, the tower graph-based pooling transformation is applied to each signal. The proposed one-dimensional hexadecimal adaptive pattern extracts 1536 features from each node of the tower graph. The extracted features are fused into a 15,360-dimensional feature vector, and the 142 most discriminative features are selected by the ReliefF and iterative NCA (RFINCA) feature selection approach. These selected features are fed to an artificial neural network and a deep neural network, yielding classification accuracies of 95.70% and 97.10%, respectively. These results demonstrate the success of the proposed tower graph-based method.
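    The abstract above describes a pooling-based preprocessing step in which the nodes of a "tower graph" are progressively shorter signals obtained by minimum, maximum, and average pooling. The following is a minimal sketch of that idea only, assuming non-overlapping windows and a small tower depth; the function names, window size, and depth are illustrative assumptions rather than the authors' published implementation, and the sketch does not reproduce the hexadecimal pattern or the RFINCA selector.

```python
# Hedged sketch: generate pooled "tower node" signals from a 1D ECG segment.
import numpy as np

def pool_1d(signal, window=2, mode="avg"):
    """Non-overlapping 1D pooling of a signal (min, max, or average)."""
    n = len(signal) // window
    blocks = signal[: n * window].reshape(n, window)
    if mode == "min":
        return blocks.min(axis=1)
    if mode == "max":
        return blocks.max(axis=1)
    return blocks.mean(axis=1)

def tower_graph_nodes(signal, depth=2):
    """Collect the original signal plus pooled signals at each tower level."""
    nodes = [signal]
    level = [signal]
    for _ in range(depth):
        next_level = []
        for s in level:
            if len(s) < 2:
                continue
            for mode in ("min", "max", "avg"):
                pooled = pool_1d(s, mode=mode)
                next_level.append(pooled)
                nodes.append(pooled)
        level = next_level
    return nodes

if __name__ == "__main__":
    ecg = np.random.randn(1024)                 # placeholder for a real ECG segment
    nodes = tower_graph_nodes(ecg, depth=2)
    print(len(nodes), [len(n) for n in nodes])  # number of nodes and their lengths
```

    In the paper, a fixed-length feature vector would then be extracted from each node and the resulting vectors fused before feature selection; that stage is omitted here.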

    Detection of Myocardial Infarction using ECG and Multi-Scale Feature Concatenate

    Diverse computer-aided diagnosis systems based on convolutional neural networks have been applied to automate the detection of myocardial infarction (MI) in the electrocardiogram (ECG) for early diagnosis and prevention. However, issues such as overfitting and underfitting have not been taken into account; in other words, it is unclear whether the network structure is too simple or too complex. Toward this end, the proposed models were developed starting from the simplest structure: a multi-lead features-concatenate narrow network (N-Net) in which only two convolutional layers are included in each lead branch. Additionally, multi-scale features-concatenate networks (MSN-Net) were implemented, in which larger-scale features are extracted by pooling the signals. The best structure was obtained by tuning both the number of filters in the convolutional layers and the number of input signal scales. As a result, the N-Net reached 95.76% accuracy in the MI detection task, whereas the MSN-Net reached 61.82% accuracy in the MI locating task. Both networks give a higher average accuracy than the state-of-the-art, with a significant difference of p < 0.001 evaluated by the U test. The models are also smaller in size and are thus suitable for wearable devices for offline monitoring. In conclusion, testing across both simple and complex network structures is indispensable. However, the handling of the class imbalance problem and the quality of the extracted features remain to be discussed.
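    As a rough illustration of the multi-lead features-concatenate idea described above, the sketch below builds a narrow network with two 1D convolutional layers per lead branch and concatenates the branch outputs before a classifier head. It is a minimal PyTorch sketch under assumed settings: the number of leads, filters, kernel widths, and pooling choices are illustrative and are not the published N-Net/MSN-Net configuration.

```python
# Hedged sketch: per-lead branches of two Conv1d layers, concatenated features.
import torch
import torch.nn as nn

class LeadBranch(nn.Module):
    def __init__(self, filters=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, filters, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(filters, filters, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # one feature vector per branch
        )

    def forward(self, x):              # x: (batch, 1, samples)
        return self.net(x).flatten(1)  # (batch, filters)

class NNet(nn.Module):
    def __init__(self, n_leads=12, filters=8, n_classes=2):
        super().__init__()
        self.branches = nn.ModuleList(LeadBranch(filters) for _ in range(n_leads))
        self.classifier = nn.Linear(n_leads * filters, n_classes)

    def forward(self, x):              # x: (batch, n_leads, samples)
        feats = [branch(x[:, i : i + 1]) for i, branch in enumerate(self.branches)]
        return self.classifier(torch.cat(feats, dim=1))

if __name__ == "__main__":
    model = NNet()
    dummy = torch.randn(4, 12, 500)    # 4 examples, 12 leads, 500 samples each
    print(model(dummy).shape)          # torch.Size([4, 2])
```

    A multi-scale variant in the spirit of MSN-Net would additionally feed downsampled (pooled) copies of each lead through their own branches before concatenation.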

    Intelligent Biosignal Analysis Methods

    This book describes recent efforts to improve intelligent systems for automatic biosignal analysis. It focuses on machine learning and deep learning methods used to classify different organism states and disorders based on biomedical signals such as EEG, ECG, HRV, and others.

    Enabling cardiovascular multimodal, high dimensional, integrative analytics

    While the understanding of cardiovascular morbidity has traditionally relied on the acquisition and interpretation of health data, advances in health technologies have enabled us to collect far larger amounts of health data. This thesis explores the application of advanced analytics that use powerful mechanisms for integrating health data across different modalities and dimensions into a single, holistic environment to better understand different diseases, with a focus on cardiovascular conditions. Different statistical methodologies are applied across a number of case studies, supported by a novel methodology to integrate and simplify data collection. The work culminates in the different dataset modalities explaining different effects on morbidity: blood biomarkers, electrocardiogram recordings, RNA-Seq measurements, and different population effects piece together the understanding of a person's morbidity. More specifically, explainable artificial intelligence methods were employed on structured datasets from patients with atrial fibrillation to improve screening for the disease. Omics datasets, including RNA-sequencing and genotype datasets, were examined and new biomarkers were discovered, allowing a better understanding of atrial fibrillation. Electrocardiogram signal data were used to assess early risk prediction of heart failure, enabling clinicians to use this novel approach to estimate future incidence. Population-level data were applied to identify associations and temporal trajectories of diseases to better understand disease dependencies in different clinical cohorts.
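    The thesis abstract above mentions explainable models for atrial fibrillation screening on structured patient data. The sketch below shows one generic way such explanations can be produced, assuming a gradient-boosted classifier and permutation importance on synthetic data; the feature names, model choice, and labels are hypothetical and are not taken from the thesis.

```python
# Hedged sketch: rank tabular features by how much shuffling each one degrades accuracy.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = ["age", "bmi", "systolic_bp", "nt_probnp", "hr_variability"]  # hypothetical names
X = rng.normal(size=(500, len(features)))
# Synthetic labels for illustration only (not clinical data).
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance on the held-out split gives a simple model-agnostic explanation.
result = permutation_importance(clf, X_test, y_test, n_repeats=20, random_state=0)
for name, score in sorted(zip(features, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name:>15}: {score:.3f}")
```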