
    Fog Computing in Medical Internet-of-Things: Architecture, Implementation, and Applications

    In an era when the Internet of Things (IoT) tops the market-segment charts of various business reports, the field of medicine is expected to gain a large benefit from the explosion of wearables and internet-connected sensors that surround us, acquiring and communicating unprecedented data on symptoms, medication, food intake, and daily-life activities that affect one's health and wellness. However, IoT-driven healthcare has to overcome many barriers: 1) the demand for cloud storage keeps growing while the analysis of medical big data becomes increasingly complex; 2) the data, when communicated, are vulnerable to security and privacy breaches; 3) transmitting the continuously collected data is not only costly but also energy hungry; 4) operating and maintaining the sensors directly from the cloud servers is a non-trivial task. This book chapter defines Fog Computing in the context of medical IoT. Conceptually, Fog Computing is a service-oriented intermediate layer in IoT that provides the interface between the sensors and cloud servers, facilitating connectivity, data transfer, and a queryable local database. The centerpiece of Fog Computing is a low-power, intelligent, wireless, embedded computing node that carries out signal conditioning and data analytics on raw data collected from wearables or other medical sensors and offers an efficient means to serve telehealth interventions. We implemented and tested a fog computing system using the Intel Edison and Raspberry Pi that allows acquisition, computing, storage, and communication of various medical data, such as pathological speech data of individuals with speech disorders, phonocardiogram (PCG) signals for heart rate estimation, and electrocardiogram (ECG)-based Q, R, S detection.
    Comment: 29 pages, 30 figures, 5 tables. Keywords: Big Data, Body Area Network, Body Sensor Network, Edge Computing, Fog Computing, Medical Cyberphysical Systems, Medical Internet-of-Things, Telecare, Tele-treatment, Wearable Devices. Chapter in Handbook of Large-Scale Distributed Computing in Smart Healthcare (2017), Springer
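    As a rough illustration of the on-node analytics such a fog layer performs, the sketch below (in Python, not the chapter's actual code) band-pass filters a short ECG segment, detects R-peaks, estimates heart rate, and logs the summary in a local SQLite table that a cloud service could later query. The sampling rate, filter band, thresholds, device name, and table schema are assumptions for illustration only.

        import sqlite3
        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        FS = 360  # assumed sampling rate (Hz)

        def detect_r_peaks(ecg, fs=FS):
            # Band-pass 5-15 Hz to emphasise the QRS complex, then pick peaks
            b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
            filtered = filtfilt(b, a, ecg)
            peaks, _ = find_peaks(np.abs(filtered),
                                  distance=int(0.25 * fs),             # ~250 ms refractory period
                                  height=0.5 * np.max(np.abs(filtered)))
            return peaks

        def heart_rate(ecg, fs=FS):
            rr = np.diff(detect_r_peaks(ecg, fs)) / fs   # RR intervals in seconds
            return float(60.0 / np.mean(rr)) if rr.size else float("nan")

        def store_locally(device_id, bpm, db_path="fog_node.db"):
            # Queryable local database on the fog node; schema is illustrative only
            con = sqlite3.connect(db_path)
            con.execute("CREATE TABLE IF NOT EXISTS hr_summary (device TEXT, bpm REAL)")
            con.execute("INSERT INTO hr_summary VALUES (?, ?)", (device_id, bpm))
            con.commit()
            con.close()

        if __name__ == "__main__":
            t = np.arange(0, 10, 1 / FS)
            synthetic_ecg = np.sin(2 * np.pi * 1.2 * t) ** 21   # stand-in for a sensor stream
            store_locally("edison-01", heart_rate(synthetic_ecg))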

    2-D ECG Compression Using Optimal Sorting and Mean Normalization

    Abstract. In this paper, we propose an effective compression method for electrocardiogram (ECG) signals. 1-D ECG signals are rearranged into 2-D ECG data by period and complexity sorting schemes so that image compression techniques can exploit the increased inter- and intra-beat correlation. The proposed method adds block division and mean period normalization on top of conventional 2-D ECG compression methods. JPEG 2000 is chosen for compression of the 2-D ECG data. The standard MIT-BIH arrhythmia database is used for evaluation and experiments. The results show that the proposed method outperforms the most recent literature, especially at high compression rates.
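    A minimal sketch of this kind of 1-D-to-2-D pipeline is shown below, assuming beats have already been delimited by R-peak indices: beats are sorted by period, stretched to a common length (mean-period normalization in spirit), stacked into an 8-bit image, and handed to a JPEG 2000 codec (here Pillow built with OpenJPEG). The target length, bit depth, and compression ratio are assumptions; the paper's block division and exact sorting metric are not reproduced.

        import numpy as np
        from PIL import Image                 # needs Pillow with OpenJPEG for .jp2 output
        from scipy.signal import resample

        def beats_to_image(ecg, r_peaks, target_len=256):
            # Cut beat-to-beat segments and remember their original periods
            segments = [ecg[a:b] for a, b in zip(r_peaks[:-1], r_peaks[1:])]
            periods = np.array([len(s) for s in segments])
            order = np.argsort(periods)        # period sorting: similar rows end up adjacent
            rows = [resample(segments[i], target_len) for i in order]   # normalise period
            img = np.vstack(rows)
            img = (255 * (img - img.min()) / (img.max() - img.min() + 1e-12)).astype(np.uint8)
            return img, periods[order]         # periods must be kept to undo the normalisation

        def compress_jp2(img, path="beats.jp2", ratio=20):
            # Lossy JPEG 2000 at a nominal 20:1 ratio (illustrative settings)
            Image.fromarray(img).save(path, quality_mode="rates", quality_layers=[ratio])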

    Algorithms design for improving homecare using Electrocardiogram (ECG) signals and Internet of Things (IoT)

    Due to rapid population growth, many hospitals are crowded by a huge number of patient visits. Moreover, during COVID-19 many patients preferred to stay at home to minimize the spread of the virus, so providing care to patients at home has become essential. The Internet of Things (IoT) is widely known and used across many fields, and IoT-based homecare can help reduce the burden on hospitals. Combining IoT with homecare brings several benefits, such as reduced human effort, economic savings, and improved efficiency and effectiveness. One important requirement of a homecare system is accuracy, because such systems deal with human health, which is sensitive and demands a high degree of accuracy. Moreover, these systems handle huge amounts of data due to continuous sensing, which must be processed well to provide a fast diagnostic response at minimum cost. The heart is one of the most important organs in the human body and requires a high level of care. Monitoring heart status can diagnose disease at an early stage and help health experts find the best medication plan, but continuous monitoring and diagnosis of the heart can exhaust caregivers. An IoT heart monitoring model at home is the solution to this problem. Electrocardiogram (ECG) signals are used to track heart condition through their waves and peaks, and accurate, efficient IoT ECG monitoring at home can detect heart diseases and save lives. Consequently, an IoT ECG homecare monitoring model is designed in this thesis for detecting cardiac arrhythmia and diagnosing heart diseases. Two databases of ECG signals are used: an older, limited online database and a huge, unique database collected from real patients in hospital. The raw ECG signal of each patient is passed through the implemented low-pass filter and Savitzky-Golay filter to remove noise and any external interference. The cleaned signal is then passed through a feature extraction stage that finds peaks and waves and derives a number of features based on medical metrics and information; these features are saved in the local database for classification. For diagnosis, a classification stage is built in three ways, threshold values, machine learning, and deep learning, to increase accuracy. The threshold-value technique works from medical reference values and borderlines: whenever a feature goes above or below these ranges, a warning message appears with the expected heart disease. The second type of classification uses machine learning to minimize human effort: a Support Vector Machine (SVM) algorithm is run on the features extracted from both databases, giving classification accuracies of 91.67% and 94% for the online and hospital databases, respectively. Due to the non-linearity of the decision boundary, a third classification approach using deep learning is presented: a full Multilayer Perceptron (MLP) neural network is implemented to improve accuracy and reduce errors, bringing the error down to 0.019 and 0.006 on the online and hospital databases. Because the hospital database is huge, a technique is needed to reduce the amount of data, so a novel adaptive amplitude threshold compression algorithm is proposed.
    This algorithm can diagnose heart disease directly from the compressed ECG signals, at reduced size, with high accuracy and low cost. The features extracted from the compressed and original signals are similar, with only slight differences of 1%, 2%, and 3%, and machine learning and deep learning classification accuracies are unaffected without the need for any reconstruction. With data compression, throughput improves by 43% and storage space is reduced by 57%. Moreover, to achieve a fast response, the amount of data should be reduced further to provide fast data transmission, so a compressive sensing based cardiac homecare system is presented, giving the channel between sender and receiver the ability to carry only a small amount of data. Experimental results reveal that the proposed models are more accurate in the classification of cardiac arrhythmia and in the diagnosis of heart diseases, while ensuring fast diagnosis and minimum cost requirements. Based on the experiments on classification accuracy, number of errors, and false alarms, the compressive sensing dictionary size is selected to be 900. As a result, this thesis provides three different scenarios for IoT homecare cardiac monitoring to assist further research on designing homecare cardiac monitoring systems. The experimental results reveal that these scenarios produce better results with a high level of accuracy, in addition to minimizing data and cost requirements.
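    A compact sketch of the preprocessing and machine-learning classification steps described above is given below in Python (scipy and scikit-learn). The cutoff frequency, smoothing window, beat-level feature set, and SVM hyperparameters are assumptions for illustration and do not reproduce the thesis' clinical feature set or tuned models.

        import numpy as np
        from scipy.signal import butter, filtfilt, savgol_filter, find_peaks
        from sklearn.svm import SVC

        FS = 360  # assumed sampling rate (Hz)

        def preprocess(ecg, fs=FS):
            # Low-pass filter to suppress high-frequency noise, then Savitzky-Golay smoothing
            b, a = butter(4, 40 / (fs / 2), btype="low")
            return savgol_filter(filtfilt(b, a, ecg), window_length=15, polyorder=3)

        def extract_features(ecg, fs=FS):
            # Simple beat-level features from R-peaks; a stand-in for the thesis' richer set
            peaks, props = find_peaks(ecg, distance=int(0.25 * fs), prominence=0.4)
            rr = np.diff(peaks) / fs
            return np.array([60.0 / np.mean(rr),             # mean heart rate (bpm)
                             np.std(rr),                      # RR-interval variability
                             np.mean(props["prominences"])])  # average R-peak prominence

        def train_classifier(X, y):
            # X: one feature vector per record, y: arrhythmia / normal labels
            clf = SVC(kernel="rbf", C=1.0, gamma="scale")
            clf.fit(X, y)
            return clf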

    Non-linear dynamical analysis of biosignals

    Biosignals are physiological signals that are recorded from various parts of the body. Some of the major biosignals are electromyograms (EMG), electroencephalograms (EEG) and electrocardiograms (ECG). These signals are of great clinical and diagnostic importance, and are analysed to understand their behaviour and to extract maximum information from them. However, they tend to be random and unpredictable in nature (non-linear), and conventional linear methods of analysis are insufficient. Hence, analysis using non-linear and dynamical system theory, chaos theory and fractal dimensions is proving to be very beneficial. In this project, ECG signals are of interest. Changes in the normal rhythm of a human heart may result in different cardiac arrhythmias, which may be fatal or cause irreparable damage to the heart when sustained over long periods of time. Hence the ability to identify arrhythmias from ECG recordings is important for clinical diagnosis and treatment, and also for understanding the electrophysiological mechanism of arrhythmias. To achieve this aim, algorithms were developed with the help of MATLAB® software. The classical logic of correlation was used in the development of algorithms to place signals into the various categories of cardiac arrhythmias. A sample set of 35 known ECG signals was obtained from the Physionet website for testing purposes. Later, 5 unknown ECG signals were used to determine the efficiency of the algorithms. A peak detection algorithm was written to detect the QRS complex. This complex is the most prominent waveform within an ECG signal, and its shape, duration and time of occurrence provide valuable information about the current state of the heart. The peak detection algorithm, developed using classical linear techniques, gave excellent results with very good accuracy for all the downloaded ECG signals. Later, a peak detection algorithm using the discrete wavelet transform (DWT) was implemented. This code was developed using non-linear techniques and was amenable to implementation; the time required for execution was also reduced, making it well suited to real-time processing. Finally, algorithms were developed to calculate the Kolmogorov complexity and Lyapunov exponent, which are non-linear descriptors that enable the randomness and chaotic nature of ECG signals to be estimated. These measures of randomness and chaotic nature enable correct interrogative methods to be applied to the signal to extract maximum information. The codes developed gave fair results, and it was possible to differentiate between normal ECGs and ECGs with ventricular fibrillation. The results show that the Kolmogorov complexity measure increases with increasing pathology, from approximately 12.90 for normal ECGs to between 13.87 and 14.39 for ECGs with ventricular fibrillation and ventricular tachycardia. Similar results were obtained for the Lyapunov exponent, with a notable difference between normal ECG (0 – 0.0095) and ECG with ventricular fibrillation (0.1114 – 0.1799). However, it was difficult to differentiate between different types of arrhythmias.
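    A minimal sketch of the DWT-based QRS peak detection described above is given below, using PyWavelets rather than the project's MATLAB code. It assumes a sampling rate of about 360 Hz so that detail levels D3 and D4 of a db4 decomposition roughly cover the QRS band; the threshold and refractory period are likewise assumptions.

        import numpy as np
        import pywt
        from scipy.signal import find_peaks

        def qrs_peaks_dwt(ecg, fs=360, wavelet="db4", level=5):
            # Keep only the detail scales where QRS energy concentrates
            coeffs = pywt.wavedec(ecg, wavelet, level=level)
            kept = [np.zeros_like(c) for c in coeffs]
            # coeffs layout: [cA5, cD5, cD4, cD3, cD2, cD1] -> keep cD4 and cD3
            kept[2], kept[3] = coeffs[2], coeffs[3]
            qrs_band = pywt.waverec(kept, wavelet)[: len(ecg)]
            energy = qrs_band ** 2
            peaks, _ = find_peaks(energy,
                                  height=0.3 * np.max(energy),   # assumed threshold
                                  distance=int(0.25 * fs))        # ~250 ms refractory period
            return peaks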

    Identification of cardiac signals in ambulatory ECG data

    The electrocardiogram (ECG) is the primary tool for monitoring heart function. ECG signals contain vital information about the heart which informs diagnosis and treatment of cardiac conditions. The diagnosis of many cardiac arrhythmias requires long-term and continuous ECG data, often while the participant engages in activity. Wearable ambulatory ECG (AECG) systems, such as the common Holter system, allow heart monitoring for hours or days. The technological trajectory of AECG systems aims towards continuous monitoring during a wide range of activities, with data processed locally in real time and transmitted to a monitoring centre for further analysis. Furthermore, hierarchical decision systems will allow wearable systems to produce alerts or even interventions. These functions could be integrated into smartphones. A fundamental limitation of this technology is the ability to identify heart signal characteristics in ECG signals contaminated with high-amplitude and non-stationary noise. Noise becomes more severe as activity levels increase, which is also when many heart problems are present. This thesis focuses on the identification of heart signals in AECG data recorded during participant activity. In particular, it explores ECG filters to identify major heart conditions in noisy AECG data. Gold standard methods use Extended Kalman filters with extrapolation based on sum-of-Gaussian models. New methods are developed using linear Kalman filtering and extrapolation based on a sum of Principal Component basis signals. Unlike the gold standard methods, extrapolation proceeds heart cycle by heart cycle. Several variants are explored, where basis signals span one or two heart cycles, and are applied to single- or multi-channel ECG data. The proposed methods are extensively tested against standard databases of normal and abnormal ECG data and the performance is compared to gold standard methods. Two performance metrics are used: improvement in signal-to-noise ratio and the observability of clinically important features in the heart signal. In all tests the proposed method performs better, and often significantly better, than the gold standard methods. It is demonstrated that abnormal ECG signals can be identified in noisy AECG data.
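    To make the proposed approach concrete, the sketch below implements a linear Kalman filter whose state is the vector of Principal Component coefficients of each beat-aligned heart cycle, updated cycle by cycle under a random-walk model. It is a simplified illustration in Python, not the thesis' formulation: the beat alignment, state model, and noise covariances q and r are assumptions.

        import numpy as np

        def pca_basis(clean_beats, k=8):
            # clean_beats: (num_beats, beat_len) matrix of aligned training heart cycles
            mean = clean_beats.mean(axis=0)
            _, _, vt = np.linalg.svd(clean_beats - mean, full_matrices=False)
            return mean, vt[:k].T                      # mean beat and (beat_len, k) basis

        def kalman_denoise(noisy_beats, mean, B, q=1e-3, r=1e-1):
            # Random-walk model on the PCA coefficients x:
            #   x_t = x_{t-1} + w_t,   y_t = mean + B x_t + v_t
            k, m = B.shape[1], B.shape[0]
            x, P = np.zeros(k), np.eye(k)              # state estimate and covariance
            Q, R = q * np.eye(k), r * np.eye(m)        # process and observation noise
            denoised = []
            for y in noisy_beats:                      # one noisy, aligned beat per step
                P = P + Q                              # predict (random-walk state)
                S = B @ P @ B.T + R                    # innovation covariance
                K = np.linalg.solve(S, B @ P).T        # Kalman gain = P B^T S^{-1}
                x = x + K @ (y - mean - B @ x)         # update coefficients
                P = (np.eye(k) - K @ B) @ P
                denoised.append(mean + B @ x)
            return np.array(denoised)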

    Biomedical Applications of the Discrete Wavelet Transform


    Wearable Wireless Devices

    No abstract available

    Compression of an ECG Signal Using Mixed Transforms

    The electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern ECG monitoring devices are increasingly used and generate vast amounts of data requiring huge storage capacity. In order to decrease storage costs and make ECG signals suitable for transmission through common communication channels, the ECG data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals in which different transforms are used to compress the data. First, the 1-D ECG data were segmented and aligned into a 2-D data array; a 2-D mixed transform was then applied to compress the ECG data in this 2-D form. The compression algorithms were implemented and tested using multiwavelet, wavelet and slantlet transforms to form the proposed method based on mixed transforms. Vector quantization was then employed on the mixed-transform coefficients. Selected records from the MIT-BIH arrhythmia database were tested comparatively, and the performance of the proposed methods was analyzed and evaluated using the MATLAB package. Simulation results showed that the proposed methods gave a high compression ratio (CR) for ECG signals compared with other available methods. For example, compressing one record (record 100) yielded a CR of 24.4 with a percent root mean square difference (PRD) of 2.56%.
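    The two figures of merit quoted here, CR and PRD, are straightforward to compute; the sketch below does so for a simple single-wavelet thresholding compressor (PyWavelets), which stands in for the paper's mixed multiwavelet/wavelet/slantlet scheme with vector quantization. The wavelet, decomposition level, and fraction of retained coefficients are assumptions.

        import numpy as np
        import pywt

        def prd(original, reconstructed):
            # Percent root mean square difference between original and reconstructed signals
            return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2) /
                                   np.sum(original ** 2))

        def wavelet_compress(ecg, wavelet="db4", level=5, keep_ratio=0.05):
            # Keep only the largest-magnitude coefficients; CR is taken as the
            # ratio of total to retained coefficients (entropy coding is ignored)
            coeffs = pywt.wavedec(ecg, wavelet, level=level)
            flat, slices = pywt.coeffs_to_array(coeffs)
            n_keep = max(1, int(keep_ratio * flat.size))
            threshold = np.sort(np.abs(flat))[-n_keep]
            flat_kept = np.where(np.abs(flat) >= threshold, flat, 0.0)
            cr = flat.size / np.count_nonzero(flat_kept)
            rec = pywt.waverec(pywt.array_to_coeffs(flat_kept, slices,
                                                    output_format="wavedec"),
                               wavelet)[: len(ecg)]
            return rec, cr

        # Usage: rec, cr = wavelet_compress(ecg); quality = prd(ecg, rec)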