
    Artificial Intelligence for Detecting Preterm Uterine Activity in Gynaecology and Obstetric Care

    Preterm birth brings considerable emotional and economic costs to families and society. Despite extensive research into understanding the risk factors, the underlying mechanisms and improvements to obstetric practice, the UK National Health Service still spends more than £2.95 billion annually on this issue. Diagnosis of labour in normal pregnancies is important for minimising unnecessary hospitalisations, interventions and expenses. Moreover, accurate identification of spontaneous preterm labour would allow clinicians to start necessary treatment early in women in true labour, and to avert unnecessary treatment and hospitalisation for women who are simply having preterm contractions but are not in true labour. In this research, Electrohysterography (EHG) signals have been used to detect preterm births, because EHG signals provide a strong basis for objective prediction and diagnosis of preterm birth. This has been achieved using an open dataset containing 262 records for women who delivered at term and 38 for women who delivered prematurely. Three different machine learning algorithms were used to classify these records. The results illustrate that the Random Forest performed best, with a sensitivity of 97%, a specificity of 85%, an Area Under the Receiver Operating Characteristic curve (AUROC) of 94% and a mean square error rate of 14%.
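
    The abstract above does not give the feature-extraction or evaluation details, so the following is only a minimal sketch, assuming tabular features already derived from the EHG records, of how a Random Forest might be trained and the reported metrics (sensitivity, specificity, AUROC, mean square error) computed with scikit-learn; all variable names and data below are illustrative placeholders, not the authors' pipeline.

        # Minimal sketch (not the authors' code): Random Forest on pre-extracted EHG features,
        # reporting sensitivity, specificity, AUROC and mean squared error as in the abstract.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import confusion_matrix, roc_auc_score, mean_squared_error

        # X: stand-in for features derived from EHG signals; y: 1 = preterm, 0 = term (placeholders)
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 10))
        y = (rng.random(300) < 0.13).astype(int)  # roughly 38 preterm out of 300, as in the dataset

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_train, y_train)

        y_pred = clf.predict(X_test)
        y_prob = clf.predict_proba(X_test)[:, 1]

        tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        auroc = roc_auc_score(y_test, y_prob)
        mse = mean_squared_error(y_test, y_prob)

        print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} AUROC={auroc:.2f} MSE={mse:.2f}")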

    Exploring the Lower Usage of the E-Learning Management System during the COVID-19 Pandemic

    During the COVID-19 pandemic, several universities have found it difficult to provide and use online and e-learning systems. Blackboard, for example, is an e-learning system with a range of features that would be useful during the COVID-19 pandemic. However, knowing the acceptance variables, as well as the primary problems that contemporary e-learning technologies confront, is crucial for efficient utilisation. The growing number of students attending different educational organisations has increased the volume of material required from both the academic and professional workforce, and because learning management systems and e-learning represent the future of the university, many more universities and colleges have adopted them. The purpose of this study is to analyse the most popular e-learning system, the Blackboard system, and the authors suggest a learning management control system to accommodate major e-learning features. A Blackboard system brings together academic perspectives, research, ideas, theories and affective responses to the virtual learning environment. To evaluate student acceptance of it, the technology acceptance model in times of crisis (TAMTC) has been developed. The existing literature demonstrates that the field of information administration is constantly changing due to the effect of learning technologies such as the Blackboard system. Despite students' reduced utilisation of the system, the data reveal a high level of student acceptance. The conclusions of this study provide important recommendations for policymakers, managers, developers and academics, allowing them to further understand the key factors for successfully using an e-learning system during the COVID-19 pandemic.

    The Utilisation of Composite Machine Learning Models for the Classification of Medical Datasets for Sickle Cell Disease

    The increasing growth of health information systems has provided a significant way to deliver great change in medical domains. To date, the majority of medical centres and hospitals continue to use manual approaches for determining the correct medication dosage for sickle cell disease. Such methods depend completely on the experience of medical consultants to determine accurate medication dosages, which can be slow to analyse, time consuming and stressful. The aim of this paper is to provide a robust approach to various applications of machine learning in medical domain problems. The initial case study addressed in this paper considers the classification of medication dosage levels for the treatment of sickle cell disease. This study is based on different machine learning architectures in order to maximise accuracy and performance. The leading motivation for such automated dosage analysis is to enable healthcare organisations to provide accurate therapy recommendations based on previous data. The results obtained from a range of models during our experiments show that a composite model, comprising a Neural Network learner trained using the Levenberg-Marquardt algorithm combined with a Random Forest learner, produced the best results when compared to other models, with an Area Under the Curve of 0.995.
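
    As a rough illustration of the composite idea described above (and not the authors' implementation), the sketch below combines a neural network learner with a Random Forest learner via soft voting in scikit-learn; Levenberg-Marquardt training is not available in scikit-learn's MLP, so a standard solver stands in, and the dosage features and labels are assumed placeholders.

        # Minimal sketch: a composite (ensemble) model combining a neural network and a Random Forest.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier, VotingClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        # Illustrative placeholders for the dosage dataset (features X, dosage-level labels y)
        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 8))
        y = rng.integers(0, 3, size=400)  # e.g. low / medium / high dosage classes (assumed)

        nn = make_pipeline(StandardScaler(),
                           MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0))
        rf = RandomForestClassifier(n_estimators=200, random_state=0)

        # Soft voting averages the class probabilities of the two learners
        composite = VotingClassifier(estimators=[("nn", nn), ("rf", rf)], voting="soft")
        scores = cross_val_score(composite, X, y, cv=5)
        print("cross-validated accuracy:", scores.mean())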

    A Framework to Support Ubiquitous Healthcare Monitoring and Diagnostic for Sickle Cell Disease

    Recent technological advances based on smart devices have improved medical facilities and become increasingly popular in connection with real-time health monitoring and remote/personal healthcare. Healthcare organisations are still required to pay more attention to improvements in cost-effectiveness and efficiency, and to avoid unnecessary hospital admissions. Sickle cell disease (SCD) is one of the most challenging chronic diseases facing healthcare, affecting large numbers of people from early childhood. Currently, the vast majority of hospitals and healthcare sectors use a manual approach that depends completely on patient input, which can be slow to analyse, time consuming and stressful. This work proposes an alert system that sends instant information to doctors once a serious condition is detected in the collected patient data. In addition, this work offers a system that can analyse datasets automatically in order to reduce the error rate. A machine learning algorithm was applied to perform the classification process. Two experiments were conducted to distinguish SCD patients from normal patients, in which 99% classification accuracy was achieved using the Instance-based learning algorithm.
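
    A hedged sketch of the two ingredients described above, an instance-based classifier and a simple alert rule, is given below; the features, labels and alert threshold are assumptions for illustration, not the authors' system.

        # Minimal sketch: instance-based (k-nearest neighbour) classification of SCD vs. non-SCD
        # records, plus a trivial alert rule that flags a record when the predicted probability
        # of a serious condition exceeds a threshold (threshold value is an assumption).
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import train_test_split

        # Placeholder monitoring features (e.g. vital signs / patient-reported measures) and labels
        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 6))
        y = rng.integers(0, 2, size=200)  # 1 = SCD patient, 0 = normal (illustrative)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)

        knn = KNeighborsClassifier(n_neighbors=5)  # instance-based learner
        knn.fit(X_train, y_train)
        print("test accuracy:", knn.score(X_test, y_test))

        ALERT_THRESHOLD = 0.8  # assumed cut-off for notifying a clinician
        probs = knn.predict_proba(X_test)[:, 1]
        alerts = np.where(probs >= ALERT_THRESHOLD)[0]
        print("records flagged for review:", alerts.tolist())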

    Eccentric connectivity index of some chemical trees

    Let G = (V, E) be a simple connected molecular graph, in which vertices represent atoms and edges represent chemical bonds; the sets of vertices and edges are denoted by V(G) and E(G), respectively. The distance d(u, v) between vertices u, v ∈ V(G) is defined as the length of a shortest path connecting them. The eccentric connectivity index of a molecular graph G is then defined as ζ(G) = Σ_{v ∈ V(G)} deg(v)·ec(v), where deg(v) is the degree of a vertex v ∈ V(G), defined as the number of vertices adjacent to v, and ec(v) is the eccentricity of v ∈ V(G), defined as the maximum distance from v to any other vertex of G. In this paper, we establish general formulas for the eccentric connectivity index of some classes of chemical trees.
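
    As a small worked example of the definition above (not taken from the paper), the snippet below computes the eccentric connectivity index of a path graph using networkx.

        # Worked example: eccentric connectivity index ζ(G) = Σ_{v∈V(G)} deg(v)·ec(v) for a small tree.
        import networkx as nx

        def eccentric_connectivity_index(G):
            """Sum of deg(v) * ec(v) over all vertices of a connected graph G."""
            ecc = nx.eccentricity(G)                     # ec(v): max distance from v to any other vertex
            return sum(G.degree(v) * ecc[v] for v in G)  # deg(v): number of neighbours of v

        # Path graph on 4 vertices: degrees 1, 2, 2, 1 and eccentricities 3, 2, 2, 3
        P4 = nx.path_graph(4)
        print(eccentric_connectivity_index(P4))  # 1*3 + 2*2 + 2*2 + 1*3 = 14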

    Training Neural Networks for Experimental Models: Classifying Biomedical Datasets for Sickle Cell Disease

    This paper presents the use of various types of neural network architectures for the classification of medical data. Extensive research has indicated that neural networks generate significant improvements when used for the pre-processing of medical time-series data signals and have assisted in obtaining high accuracy in the classification of medical data. To date, most hospitals and healthcare sectors in the United Kingdom use a manual approach for analysing patient input for sickle cell disease, which depends on clinicians' experience and can be time consuming and stressful for patients. The results obtained from a range of models during our experiments show that the proposed back-propagation trained feed-forward neural network classifier generated significantly better outcomes than the other classifiers. Using the area under the ROC curve, the experiments produced the following outcomes for our models, in order of best to worst: back-propagation trained feed-forward neural network classifier: 0.989; Functional Link neural network: 0.972; Radial Basis Function neural network classifier: 0.875; and Voted Perceptron classifier: 0.766. A Linear Neural Network was used as a baseline classifier to illustrate the importance of the previous models, producing an area of 0.849, followed by a random guessing model with an area of 0.524.
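
    For illustration only, the sketch below compares a few classifiers on the same data by area under the ROC curve, mirroring the evaluation described above; the specialised architectures in the paper (functional link, radial basis function and voted perceptron networks) are not available in scikit-learn, so standard stand-ins and synthetic data are used.

        # Minimal sketch: ranking several classifiers on one dataset by ROC AUC.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.linear_model import Perceptron
        from sklearn.dummy import DummyClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 12))                                          # placeholder features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)    # synthetic labels

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        models = {
            "feed-forward MLP": MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
            "perceptron (linear baseline)": Perceptron(random_state=0),
            "random guessing": DummyClassifier(strategy="uniform", random_state=0),
        }

        for name, model in models.items():
            model.fit(X_tr, y_tr)
            # use class probabilities where available, otherwise the decision function, to rank examples
            if hasattr(model, "predict_proba"):
                scores = model.predict_proba(X_te)[:, 1]
            else:
                scores = model.decision_function(X_te)
            print(f"{name}: AUC = {roc_auc_score(y_te, scores):.3f}")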

    A Data Science Methodology Based on Machine Learning Algorithms for Flood Severity Prediction

    In this paper, a novel application of machine learning algorithms, including a Neural Network architecture, is presented for the prediction of flood severity. Floods are natural disasters that cause wide-scale devastation to the areas affected. Flooding is commonly caused by runoff from rivers and by precipitation, specifically during periods of extremely high rainfall. Due to the concerns surrounding global warming and extreme ecological effects, flooding is considered a serious problem that has a negative impact on infrastructure and humankind. This paper attempts to address the issue of flood mitigation through the presentation of a new flood dataset, comprising 2000 annotated flood events, where the severity of the outcome is categorised into three target classes representing the respective severities of floods. The paper also presents various types of machine learning algorithms for predicting flood severity and classifying outcomes into three classes: normal, abnormal and high-risk floods. Extensive research indicates that artificial intelligence algorithms can produce enhancements when utilised for the pre-processing of flood data, and these approaches helped in acquiring better accuracy in the classification techniques. Neural network architectures generally produce good outcomes in many applications; however, our experimental results illustrate that the Random Forest classifier yields the best results in comparison with the benchmarked models.
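
    A minimal sketch of the three-class severity classification described above is given below, comparing a Random Forest with a neural network under cross-validation; the features and labels are synthetic placeholders, not the paper's flood dataset.

        # Minimal sketch: three-class flood-severity classification (normal / abnormal / high-risk),
        # comparing a Random Forest with a neural network, as in the benchmarking described above.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import cross_val_score

        # Placeholder features (e.g. rainfall, river level, runoff) and severity labels 0/1/2
        rng = np.random.default_rng(42)
        X = rng.normal(size=(2000, 5))
        y = rng.integers(0, 3, size=2000)

        for name, model in [
            ("Random Forest", RandomForestClassifier(n_estimators=200, random_state=42)),
            ("Neural Network", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=42)),
        ]:
            acc = cross_val_score(model, X, y, cv=5).mean()
            print(f"{name}: mean CV accuracy = {acc:.3f}")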

    Stem cell-based approaches in cardiac tissue engineering: controlling the microenvironment for autologous cells

    Cardiovascular disease is one of the leading causes of mortality worldwide. Cardiac tissue engineering strategies focusing on biomaterial scaffolds incorporating cells and growth factors are emerging as highly promising for cardiac repair and regeneration. Stem cells used within cardiac microengineered tissue constructs have an inherent ability to differentiate into cell types of the human heart. Stem cells derived from various tissues, including bone marrow, dental pulp, adipose tissue and umbilical cord, can be used for this purpose. Approaches ranging from stem cell injections, stem cell spheroids and cell encapsulation in a suitable hydrogel to the use of prefabricated scaffolds and bioprinting technology are at the forefront of the field of cardiac tissue engineering. The stem cell microenvironment plays a key role in the maintenance of stemness and/or differentiation into cardiac-specific lineages. This review provides a detailed overview of recent advances in the microengineering of autologous stem cell-based tissue engineering platforms for the repair of damaged cardiac tissue. Particular emphasis is given to the roles played by the extracellular matrix (ECM) in regulating the physiological response of stem cells within cardiac tissue engineering platforms.