7 research outputs found

    Classification of fundus images for diabetic retinopathy using artificial neural network

    Get PDF
    People with diabetes may develop an eye disease called Diabetic Retinopathy (DR), caused by damage to the blood vessels of the light-sensitive tissue at the back of the eye (the retina). Fundus images obtained from a fundus camera are often imperfect, typically low in contrast and blurry, which makes it difficult to classify diabetic retinopathy accurately. This study focuses on classifying fundus images with or without signs of DR using an artificial neural network (NN), namely a Multi-layered Perceptron (MLP) trained with Levenberg-Marquardt (LM) and with Bayesian Regularization (BR). Nineteen features were extracted from each fundus image and used as the neural network inputs for classification. For the analysis, evaluations were made using different numbers of hidden nodes. The MLP trained with BR provided better classification performance, 72.11% (training) and 67.47% (testing), compared with LM. This finding indicates the potential of BR for other artificial neural network models.
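
    A minimal sketch of this classification setup is shown below, assuming a generic 19-feature input; note that scikit-learn's MLPClassifier does not implement the Levenberg-Marquardt or Bayesian Regularization training used in the study (those are available, for example, as MATLAB's trainlm and trainbr), so the snippet only illustrates the overall input/output structure, not the paper's method.

# Minimal sketch (assumptions): an MLP classifier on 19 hand-crafted fundus
# features. The LM/BR training algorithms from the study are not available in
# scikit-learn; this only mirrors the general setup, not the paper's method.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 19))      # placeholder for 19 extracted image features
y = rng.integers(0, 2, size=500)    # 1 = signs of DR present, 0 = absent

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The number of hidden nodes is a free hyperparameter; the study compares several.
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                                    random_state=0))
model.fit(X_tr, y_tr)
print("training accuracy:", model.score(X_tr, y_tr))
print("testing accuracy:", model.score(X_te, y_te))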

    A Self-Attention Deep Neural Network Regressor for real time blood glucose estimation in paediatric population using physiological signals

    Get PDF
    With the advent of modern digital technology, physiological signals (such as the electrocardiogram) are being acquired from portable wearable devices and used for non-invasive chronic disease management (such as Type 1 Diabetes). Diabetes management requires real-time assessment of blood glucose, which is cumbersome for the paediatric population due to clinical complexity and invasiveness. Real-time non-invasive blood glucose estimation is therefore pivotal for effective diabetes management. In this paper, we propose a Self-Attention Deep Neural Network Regressor for real-time non-invasive blood glucose estimation in the paediatric population based on automatically extracted beat morphology. The first stage is a Morphological Extractor, based on Self-Attention Long Short-Term Memory driven by a Convolutional Neural Network, which highlights local features based on temporal context. The second stage is a Morphological Regressor, driven by a multilayer perceptron with dropout and batch normalization to avoid overfitting. We performed feature selection via a logit model followed by Spearman's correlation among features to avoid feature redundancy. We trained and tested our model on the publicly available MIT-BIH/Physionet databases and on physiological signals acquired from a T1D paediatric population. We evaluated the model via Clarke Error Grid analysis to assess estimation accuracy over a range of blood glucose values under different glycaemic conditions. The results show that our tool outperformed existing regression models, with 89% accuracy within the clinically acceptable range. The proposed model based on beat morphology significantly outperformed models based on HRV features.
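
    A rough sketch of this two-stage architecture is given below in PyTorch; the layer sizes, beat length, and attention formulation are illustrative assumptions and are not taken from the paper.

# Rough sketch (assumptions): CNN -> LSTM -> self-attention morphological
# extractor, followed by an MLP regressor with dropout and batch normalization.
# Layer sizes and sequence length are illustrative, not the paper's values.
import torch
import torch.nn as nn

class GlucoseRegressor(nn.Module):
    def __init__(self, beat_len=200, hidden=64):
        super().__init__()
        # Stage 1: local morphology via 1-D convolutions, temporal context via LSTM.
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)        # simple additive self-attention scores
        # Stage 2: MLP regressor with dropout and batch normalization.
        self.head = nn.Sequential(
            nn.Linear(hidden, 64), nn.BatchNorm1d(64), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(64, 1),
        )

    def forward(self, x):                       # x: (batch, beat_len) single-lead beats
        h = self.cnn(x.unsqueeze(1))            # (batch, 32, beat_len)
        h, _ = self.lstm(h.transpose(1, 2))     # (batch, beat_len, hidden)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over time steps
        context = (w * h).sum(dim=1)            # weighted summary of beat morphology
        return self.head(context).squeeze(-1)   # estimated blood glucose

model = GlucoseRegressor()
beats = torch.randn(8, 200)                     # placeholder ECG beats
print(model(beats).shape)                       # torch.Size([8])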

    Neural network approach for non-invasive detection of hyperglycemia using electrocardiographic signals

    Full text link
    © 2014 IEEE. Hyperglycemia, or a high blood glucose (sugar) level, is a common and dangerous complication among patients with Type 1 diabetes mellitus (T1DM). If left untreated, hyperglycemia can cause serious health problems such as heart disease, stroke, and vision and nerve damage. Based on electrocardiographic (ECG) parameters, we identify hyperglycemic and normoglycemic states in T1DM patients. In this study, a classification unit based on a feed-forward multi-layer neural network is introduced to detect hyperglycemic/normoglycemic episodes using ECG parameters as inputs. A practical experiment is conducted using real T1DM patient data sets collected from the Department of Health, Government of Western Australia. Experimental results show that the proposed ECG parameters contributed significantly to good hyperglycemia detection performance in terms of sensitivity, specificity and geometric mean (70.59%, 65.38%, and 67.94%, respectively). These results show that hyperglycemic events in T1DM can be detected non-invasively and effectively using ECG signals and an ANN approach.
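
    The evaluation reported above can be reproduced in outline as shown below; the classifier, its hyperparameters, and the synthetic ECG-parameter matrix are assumptions for illustration, and only the sensitivity/specificity/geometric-mean computation follows the abstract.

# Sketch (assumptions): a feed-forward NN on ECG-derived parameters, evaluated
# with sensitivity, specificity and their geometric mean as in the abstract.
import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 6))            # placeholder ECG parameters
y = rng.integers(0, 2, size=400)         # 1 = hyperglycemic, 0 = normoglycemic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=1).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
sensitivity = tp / (tp + fn)             # true positive rate on hyperglycemic episodes
specificity = tn / (tn + fp)             # true negative rate on normoglycemic episodes
geometric_mean = np.sqrt(sensitivity * specificity)
print(sensitivity, specificity, geometric_mean)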

    EDMON - Electronic Disease Surveillance and Monitoring Network: A Personalized Health Model-based Digital Infectious Disease Detection Mechanism using Self-Recorded Data from People with Type 1 Diabetes

    Get PDF
    Through time, we as a society have been tested by infectious disease outbreaks of different magnitudes, which often pose major public health challenges. To mitigate these challenges, research has focused on early detection mechanisms: identifying potential data sources, modes of data collection and transmission, and case and outbreak detection methods. Driven by the ubiquity of smartphones and wearables, the current endeavor targets individualizing the surveillance effort through a personalized health model, where case detection is realized by exploiting self-collected physiological data from wearables and smartphones. This dissertation aims to demonstrate the concept of a personalized health model as a case detector for outbreak detection by utilizing self-recorded data from people with type 1 diabetes. The results show that infection onset triggers substantial deviations, i.e. prolonged hyperglycemia despite higher insulin injections and reduced carbohydrate consumption. Per the findings, key parameters such as blood glucose level, insulin, carbohydrate, and insulin-to-carbohydrate ratio are found to carry high discriminative power. A personalized health model devised on the basis of a one-class classifier and an unsupervised method using the selected parameters achieved promising detection performance. Experimental results show the superior performance of the one-class classifier approach, with models such as the one-class support vector machine, k-nearest neighbor and k-means achieving better performance. Further, the results also reveal the effect of input parameters, data granularity, and sample sizes on model performance. The presented results have practical significance for understanding the effect of infection episodes among people with type 1 diabetes, and the potential of a personalized health model in outbreak detection settings. The added benefit of the personalized health model concept introduced in this dissertation lies in its usefulness beyond surveillance, i.e. for devising decision support tools and learning platforms that help the patient manage infection-induced crises.
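
    A minimal sketch of the one-class case-detection idea described above is shown below; the feature values and the OneClassSVM hyperparameters are assumptions for illustration, not values from the dissertation.

# Sketch (assumptions): a personalized one-class model trained on an individual's
# "healthy" self-recorded diabetes data; deviating days (e.g. during infection)
# are flagged as outliers. Feature values and hyperparameters are illustrative.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
# Daily features: mean blood glucose, total insulin, total carbohydrate,
# insulin-to-carbohydrate ratio.
healthy = rng.normal([7.5, 40.0, 180.0, 0.22], [1.0, 6.0, 25.0, 0.03], size=(120, 4))
new_days = np.array([[7.8, 41.0, 175.0, 0.23],    # ordinary day
                     [11.5, 55.0, 120.0, 0.46]])  # infection-like deviation

detector = make_pipeline(StandardScaler(), OneClassSVM(nu=0.05, gamma="scale"))
detector.fit(healthy)
print(detector.predict(new_days))  # +1 = normal, -1 = flagged as potential infection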

    Contributions of machine learning techniques to cardiology: prediction of restenosis after coronary stent implantation

    Get PDF
    Background: Few current topics compare with the possibility that today's technology may develop the same capabilities as a human being, even in medicine. This capacity of machines or computer systems to simulate human intelligence processes is what we know today as artificial intelligence. One of the fields of artificial intelligence with the greatest application in medicine today is prediction, recommendation and diagnosis, where machine learning techniques are applied. There is also growing interest in precision medicine, where machine learning techniques can offer individualized care to each patient. Percutaneous coronary intervention (PCI) with stent implantation has become standard practice for revascularizing coronary vessels with significant obstructive atherosclerotic disease. PCI is also the gold-standard treatment in patients with acute myocardial infarction, reducing rates of death and recurrent ischaemia compared with medical treatment. The long-term success of the procedure is limited by in-stent restenosis, a pathological process that causes recurrent arterial narrowing at the PCI site. Identifying which patients will develop restenosis is an important clinical challenge, since restenosis can manifest as a new acute myocardial infarction or force a new revascularization of the affected vessel, and recurrent restenosis represents a therapeutic challenge. Objectives: After reviewing artificial intelligence techniques applied to medicine and, in greater depth, machine learning techniques applied to cardiology, the main objective of this doctoral thesis was to develop a machine learning model to predict the occurrence of restenosis in patients with acute myocardial infarction undergoing PCI with stent implantation. Secondary objectives were to compare the machine learning model with the classical restenosis risk scores used to date, and to develop software that makes it simple to translate this contribution into daily clinical practice. To obtain an easily applicable model, predictions were made without any variables beyond those collected in routine practice. Material: The data set, obtained from the GRACIA-3 trial, consisted of 263 patients with demographic, clinical and angiographic characteristics; 23 of them presented restenosis 12 months after stent implantation. All development was carried out in Python using cloud computing, specifically AWS (Amazon Web Services). Methods: A methodology suited to small, imbalanced data sets was used; a nested cross-validation scheme was key, as was the use of precision-recall (PR) curves, in addition to ROC curves, for interpreting the models. The algorithms most common in the literature were trained in order to select the best-performing one. Results: The best-performing model was built with an extremely randomized trees classifier, which, with an area under the ROC curve of 0.77, significantly outperformed the three classical clinical scores: PRESTO-1 (0.58), PRESTO-2 (0.58) and TLR (0.62). The precision-recall curves gave a more accurate picture of the performance of the extremely randomized trees model, showing an efficient algorithm (0.96) for non-restenosis, with high precision and recall. At a threshold considered optimal, out of 1,000 patients undergoing stent implantation our machine learning model would correctly predict 181 (18%) more cases than the best classical risk score (TLR). The most important variables, ranked by their contribution to the predictions, were diabetes, coronary disease in 2 or more vessels, post-PCI TIMI flow, abnormal platelets, post-PCI thrombus and abnormal cholesterol. Finally, a calculator was developed to bring the model into clinical practice. The calculator estimates each patient's individual risk and places it in a risk zone, helping the physician decide on the appropriate follow-up. Conclusions: Applied immediately after stent implantation, a machine learning model distinguishes patients who will or will not develop restenosis better than the current classical discriminators.
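
    A compact sketch of the kind of evaluation pipeline described above (extremely randomized trees under nested cross-validation, scored with ROC and precision-recall AUC) is shown below; the synthetic data, fold counts, and hyperparameter grid are assumptions for illustration, not the thesis settings.

# Sketch (assumptions): nested cross-validation of an extremely randomized trees
# classifier on a small, imbalanced data set, scored with ROC AUC and PR AUC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score

# Placeholder data roughly mimicking the class imbalance (23 restenosis / 263 patients).
X, y = make_classification(n_samples=263, n_features=20, weights=[0.91, 0.09],
                           random_state=0)

inner = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

search = GridSearchCV(
    ExtraTreesClassifier(class_weight="balanced", random_state=0),
    param_grid={"n_estimators": [200, 500], "max_depth": [3, 5, None]},
    scoring="average_precision",   # PR AUC is more informative on imbalanced data
    cv=inner,
)

roc_auc = cross_val_score(search, X, y, cv=outer, scoring="roc_auc")
pr_auc = cross_val_score(search, X, y, cv=outer, scoring="average_precision")
print("nested-CV ROC AUC:", roc_auc.mean(), "PR AUC:", pr_auc.mean())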

    Application of machine learning and NIR spectroscopy for monitoring patients on hemodialysis

    Get PDF
    The concept of dialysis adequacy is focused on achieving the complete rehabilitation and stabilization of the patient's condition, not merely the elimination of uremic symptoms. In standard clinical practice, the delivered dose of hemodialysis is determined from the urea concentration in patients' blood samples collected before and after a hemodialysis treatment once a month or even less frequently. Previous research has focused mainly on the detection and elimination of uremic toxins, chiefly urea and creatinine, usually based on scanning the waste dialysate with one of the optical methods (primarily UV light sources). There are currently no data on monitoring the patient's blood glucose level, degree of anemia or uremia, or electrolyte levels during dialysis based on the NIR spectral information of the waste dialysate. Monitoring several parameters that reflect the patient's condition would provide a more complete picture and come closer to the concept of optimal dialysis monitoring. The research was conducted using UV-VIS-NIR spectrometry (Lambda 950, Perkin Elmer). The measurements of waste dialysate spectra were performed at the Faculty of Mechanical Engineering, University of Belgrade, within the NanoLab laboratory. In addition to UV-VIS-NIR spectrometry, conventional clinical laboratory methods were used to obtain reference blood analysis values. The software solution devised for monitoring waste dialysate parameters involved the implementation of machine learning methods, which determine the relationship between the spectral characteristics of the dialysate and the parameters measured in the blood. Spectrum acquisition and the machine learning algorithms were written in Python and Matlab®. Classification and regression of the parameters defining hyperglycemia, anemia and uremia were performed. Hyperglycemia was monitored through blood glucose concentration, with an AUC of 0.91 and a correlation coefficient of 0.93. Anemia was reflected through the concentrations of erythrocytes, hematocrit, hemoglobin, MCV, MCHC and serum iron; the average AUC was higher than 0.9, and the regression correlation coefficient was also higher than 0.9. Uremia was reflected through the concentrations of creatinine, urea and uric acid; the average AUC across all parameters was higher than 0.85, and the correlation coefficient averaged 0.91 across all parameters. NIR data contain a huge amount of information, usually of very high dimension, which lends itself to the successful application of machine learning methods. Machine learning is a set of methods that can automatically detect patterns in data and then use the detected patterns to make predictions on future data relevant to the hemodialysis process.
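
    An outline of how such a spectra-to-blood-parameter model might look is sketched below; the synthetic spectra, the partial least squares regressor, and the wavelength grid are assumptions for illustration and are not the methods or settings reported in the thesis.

# Sketch (assumptions): regressing a blood parameter (e.g. glucose) from NIR
# spectra of waste dialysate, and reporting the correlation coefficient as in
# the abstract. Spectra and model choice are illustrative placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_samples, n_wavelengths = 150, 600        # placeholder NIR wavelength grid
spectra = rng.normal(size=(n_samples, n_wavelengths))
glucose = spectra[:, 100:110].sum(axis=1) + rng.normal(scale=0.5, size=n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, glucose, test_size=0.3,
                                          random_state=3)
model = PLSRegression(n_components=8).fit(X_tr, y_tr)
pred = model.predict(X_te).ravel()
r = np.corrcoef(y_te, pred)[0, 1]          # correlation between reference and predicted
print("correlation coefficient:", round(r, 3))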