38 research outputs found

    Towards Clinical Decision Support for Veteran Mental Health Crisis Events using Tree Algorithm

    This research focuses on establishing a psychological treatment system for Milwaukee-based veterans outside the traditional clinical environment of Veterans Affairs (VA). As part of this process, a 12-week intervention was conducted, and data were collected on different health aspects and psychological measurements. With the help of expert veterans and psychologists, we defined early warning signs, acute crises, and long-term crises from this dataset. We used different algorithms to predict long-term crises from acute crises and early warning signs. Finally, we established a clinical decision-making rule to assist peer-mentor veterans in helping their fellow mentee veterans, especially those suffering from PTSD.
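
The abstract does not give the learned decision rule, so the sketch below only illustrates the general shape of a tree-style rule over early-warning-sign and acute-crisis counts; the function name and all thresholds are hypothetical, not the study's values.

```python
def long_term_crisis_risk(early_warning_signs: int, acute_crises: int) -> str:
    """Toy tree-style rule over counts from a 12-week observation window.
    Thresholds are illustrative only, not the study's learned values."""
    if acute_crises >= 2:
        return "high"
    if acute_crises == 1:
        # A single acute crisis plus several warning signs escalates the risk.
        return "high" if early_warning_signs >= 3 else "moderate"
    return "moderate" if early_warning_signs >= 5 else "low"
```

A rule of this form is easy for a peer mentor to apply by hand, which is presumably why a tree algorithm was chosen over less interpretable models.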

    Junior Class Preparedness Classification Faces A National Exam Using C4.5 Algorithm with A Particle Swarm Optimization Approach

    This study responds to a trend of falling student graduation rates on the national exam, which stems from inaccurate ways in which students assess their readiness for national tests. The study applies a hybrid of the C4.5 algorithm and Particle Swarm Optimization to produce an accurate classification of students' exam readiness. The results show that the hybrid C4.5 and Particle Swarm Optimization method achieves an accuracy of 97.13%, a precision of 96.58%, and a recall of 100%. The model was then implemented as a web-based prototype application using the JavaScript programming language.
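
The abstract does not detail how PSO is coupled to C4.5 (commonly it tunes parameters or selects attributes), so the sketch below shows only the PSO component in isolation, minimizing a toy objective; all parameter values are conventional defaults, not the paper's settings.

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization minimizing f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Inertia plus pulls toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy usage: minimize the sphere function; the swarm converges near the origin.
best, val = pso(lambda x: sum(v * v for v in x))
```

In a hybrid setup like the paper's, `f` would instead score a candidate feature subset or parameter vector by the cross-validated accuracy of the resulting C4.5 tree.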

    DECISION TREE SIMPLIFICATION THROUGH FEATURE SELECTION APPROACH IN SELECTING FISH FEED SELLERS

    Feed is a crucial variable because it can determine the success of fish farming. Breeders can use two types of artificial feed, namely alternative feed and pellets. Many cultivators rely on pellets as the main feed for the fish they are cultivating, because pellets contain a composition adjusted to the fishes' needs based on their type and age. However, cultivators currently face the problem of high fish-pellet prices on the market. Therefore, a classification analysis of fish feed seller selection is needed, based on several criteria such as the number of feed types, price, ordering, delivery, and availability of discounts. This study conducted a classification analysis with feature simplification for selecting fish feed sellers in Kendal Regency, comparing a model with feature selection against one without, using the C4.5 Decision Tree method. The decision tree with the best performance was C4.5 with feature selection, which achieved an accuracy of 92%, while C4.5 without feature selection achieved an accuracy of 86.8%. These results indicate that C4.5 with feature selection outperforms C4.5 without it, so it can be applied to the selection of freshwater fish feed sellers in Kendal Regency.
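
The split criterion at the heart of C4.5 is information gain over attribute-value partitions (C4.5 proper normalizes this into a gain ratio). A minimal stdlib sketch, with a made-up "price" attribute standing in for the paper's seller criteria:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Parent entropy minus the size-weighted entropy of each
    attribute-value partition (the quantity C4.5 normalizes into gain ratio)."""
    total = entropy(labels)
    n = len(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(y)
    return total - sum(len(part) / n * entropy(part)
                       for part in by_value.values())

# Toy usage: "price" perfectly separates the classes, so the gain is 1 bit.
rows = [{"price": "low"}, {"price": "low"}, {"price": "high"}, {"price": "high"}]
labels = ["yes", "yes", "no", "no"]
gain = information_gain(rows, labels, "price")
```

Feature selection as described in the abstract amounts to dropping criteria whose gain contributes little, which simplifies the resulting tree.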

    Mind the Gap: Technology as a Lifeline for Pro Se Child Custody Appeals

    As the justice gap continues to grow, and because there is no federal constitutional right to counsel in civil cases, there is an ongoing need to develop solutions to assist those who cannot afford attorneys to navigate the difficult procedural issues associated with their legal matters. Appellate procedure is difficult to comply with even when a person has legal training, and for the pro se litigant it can be particularly difficult to articulate a meritorious claim and draft the documents required to initiate an appeal. Failure to comply with the procedural requirements for an appeal can result in the appellate court finding waiver or even dismissing the case prior to it being heard on the merits. Artificial intelligence systems and technology have been identified as a means to help close the justice gap. Through a case vignette, this article will explore the need for additional options to help close the justice gap and will exemplify how technology can assist by presenting an application designed to help pro se litigants create the initiating documents for Pennsylvania child custody appeals.

    Data-Driven Approaches Tests on A Laboratory Drilling System

    In recent years, considerable resources have been invested to exploit the vast amounts of data collected during exploration, drilling, and production of oil and gas. Data-related digital technologies could become a game changer for the industry, reducing costs by increasing operational efficiency and avoiding accidents, and improving health, safety, and the environment by strengthening situational awareness. Machine learning, an application of artificial intelligence that gives systems and processes self-learning and self-driving ability, has been around for decades. In the last five to ten years, increased computational power along with heavily digitized control and monitoring systems has made machine learning algorithms more available, powerful, and accurate. Considering the state-of-the-art technologies that exist today and the significant resources being invested into the technologies of tomorrow, the idea of intelligent and automated drilling systems that select the best decisions or provide good recommendations based on the available information is becoming closer to a reality. This study presents the results of our research on drilling automation and digitalization. The main objective is to test the developed machine learning algorithms for formation classification and drilling operation identification on a laboratory drilling system. In this paper, an algorithm to develop data-driven models based on laboratory data collected in many scenarios (for instance, drilling different formation samples with varying drilling operational parameters and running different operations) is presented. Moreover, a testing algorithm based on data-driven models for new formation detection and confirmation is proposed. In the case study, results of multiple experiments conducted to test and validate the developed machine learning methods are illustrated and discussed.
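
The abstract does not specify the models used, so the following is only a schematic illustration of the two ideas it names, under assumed details: formation classification via nearest-centroid matching on drilling features (e.g. weight-on-bit, ROP), and "new formation detection" as an out-of-distribution check against a distance threshold. All names and the threshold value are hypothetical.

```python
from math import dist

def train_centroids(samples):
    """samples: {formation_name: [feature vectors]} -> per-formation mean
    vector. Features might be scaled drilling parameters such as WOB and ROP."""
    return {name: [sum(col) / len(col) for col in zip(*vecs)]
            for name, vecs in samples.items()}

def classify(x, centroids, threshold=2.0):
    """Return the nearest known formation, or 'unknown' (a candidate new
    formation) when no centroid lies within `threshold` (illustrative value)."""
    name, d = min(((n, dist(x, c)) for n, c in centroids.items()),
                  key=lambda t: t[1])
    return name if d <= threshold else "unknown"

# Toy usage with two formations described by two features each.
cent = train_centroids({"granite": [[1, 1], [1, 3]],
                        "sandstone": [[8, 8], [10, 8]]})
```

The "confirmation" step described in the paper would then accumulate several consecutive "unknown" verdicts before declaring a new formation, to avoid reacting to single noisy samples.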

    Energy consumption modelling using deep learning technique — a case study of EAF

    Energy consumption is a global issue which governments are taking measures to reduce. A steel plant can manage its energy better once its energy consumption can be modelled and predicted. The purpose of this study is to establish an energy value prediction model for an electric arc furnace (EAF) through a data-driven approach, using a large amount of real-world data collected from the melt shop of an established steel plant. Data pre-processing and feature selection are carried out, and several data mining algorithms are used separately to build the prediction model. The results show that the predictive performance of the deep learning model is better than that of conventional machine learning models, e.g., linear regression, support vector machine, and decision tree.
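
The deep model itself is not reproducible from the abstract, but the linear regression it is benchmarked against is simple enough to sketch. A minimal single-feature ordinary-least-squares baseline, with charge weight as a purely hypothetical predictor of EAF energy use:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ≈ a*x + b (single-feature baseline of the
    kind the deep model is compared against). Returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Toy usage: points lying exactly on y = 2x + 1 recover slope 2, intercept 1.
a, b = fit_linear([1, 2, 3], [3, 5, 7])
```

In practice the study's feature-selection step would pick several such predictors, and the comparison is then multivariate; this one-feature form only fixes the idea of the baseline.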

    Understanding cost-utility analysis studies in the trauma and orthopaedic surgery literature.

    Cost-utility analysis (CUA) studies are becoming increasingly important due to the need to reduce healthcare spending, especially in the field of trauma and orthopaedics. There is an increasing need for trauma and orthopaedic surgeons to understand these economic evaluations to ensure informed cost-effective decisions can be made to benefit the patient and funding body. This review discusses the fundamental principles required to understand CUA studies in the literature, including a discussion of the different methods employed to assess the health outcomes associated with different management options and the various approaches used to calculate the costs involved. Different types of model design may be used to conduct a CUA, which can be broadly categorized into real-life clinical studies and computer-simulated modelling. We discuss the main types of study designs used within each category. We also cover the different types of sensitivity analysis used to quantify uncertainty in these studies and the commonly employed instruments used to assess the quality of CUAs. Finally, we discuss some of the important limitations of CUAs that need to be considered. This review outlines the main concepts required to understand the CUA literature and provides a basic framework for their future conduct. Cite this article: EFORT Open Rev 2021;6:305-315. DOI: 10.1302/2058-5241.6.200115
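
The central quantity in a CUA is the incremental cost-effectiveness ratio (ICER): the extra cost of one strategy over another divided by the QALYs gained. A one-line sketch with invented figures:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained when
    switching from the old strategy to the new one."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical example: a new implant costing £12,000 and yielding 8.0 QALYs,
# versus the standard at £9,000 and 7.5 QALYs.
ratio = icer(12000, 8.0, 9000, 7.5)  # £3,000 extra for 0.5 extra QALYs
```

The resulting cost per QALY is then compared against a willingness-to-pay threshold set by the funding body to decide whether the new option is cost-effective.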

    Segmentation of PMSE data using random forests

    EISCAT VHF radar data are used for observing, monitoring, and understanding Earth’s upper atmosphere. This paper presents an approach to segmenting Polar Mesospheric Summer Echoes (PMSE) in datasets obtained from the EISCAT VHF radar. The data consist of 30 observation days, corresponding to 56,250 data samples. We manually labeled the data into three categories: PMSE, ionospheric background, and background noise. For segmentation, we employed random forests on a set of simple features: the altitude derivative, time derivative, mean, median, standard deviation, minimum, and maximum values over neighborhood sizes ranging from 3 by 3 to 11 by 11 pixels. Next, in order to reduce the model bias and variance, we employed a method that decreases the weight applied to pixel labels with large uncertainty. Our results indicate, first, that it is possible to segment PMSE from the data using random forests and, second, that the weighted-down labels technique improves the performance of the random forests method.
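
The per-pixel statistics in the feature set above are straightforward to compute over a sliding window. A stdlib sketch of the mean/median/std/min/max part for one pixel (the derivative features are omitted, and edge handling by clipping is an assumption, not stated in the abstract):

```python
from statistics import mean, median, pstdev

def window_features(img, r, c, size=3):
    """Statistics of the size×size neighborhood around pixel (r, c) of a 2-D
    list, clipped at the image border. One such dict per pixel and per window
    size (3×3 up to 11×11) would feed the random forest."""
    h = size // 2
    vals = [img[i][j]
            for i in range(max(0, r - h), min(len(img), r + h + 1))
            for j in range(max(0, c - h), min(len(img[0]), c + h + 1))]
    return {"mean": mean(vals), "median": median(vals),
            "std": pstdev(vals), "min": min(vals), "max": max(vals)}

# Toy usage on a 3×3 image: the centre pixel sees all nine values.
img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
f = window_features(img, 1, 1)
```

In production one would vectorize this (e.g. with array convolutions) rather than loop per pixel, but the feature definitions are the same.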

    DATA ANALYSIS AND PREDICTIVE MODEL GENERATION FOR DELAYS IN NAVY CONSTRUCTION PROJECTS

    Currently, Naval Facilities Engineering Command (NAVFAC) records all data on the process from application to awarding of Military Construction (MILCON) projects. This data is not utilized to address the poor performance and lack of timely results in the completion of MILCON projects. The poor performance leads to delays in delivering important facilities, delays in warship deployment, and degradation of warfighting capabilities. NAVFAC currently has personnel investigating methods to improve project timelines and minimize delays. The majority of the delays occur during the pre-award phase of the projects, with the post-award phase causing additional delays. The purpose of this thesis is to analyze projects across multiple fiscal years from project initiation to contract award. To accomplish this, data was acquired from NAVFAC’s eProjects database and analyzed using machine learning techniques as well as statistical analysis to determine correlations between possible causes and the delays that occurred, in order to develop a predictive model for analyzing future project contract delays. This work will potentially assist NAVFAC in focusing ongoing improvements. Reducing delays in project awarding will reduce the overall time required to complete MILCON projects. This will shorten the amount of time that ships are in the shipyard, further enhancing the Navy’s undersea warfare capabilities with more submarines and other assets deployed.

    Lieutenant, United States Navy. Approved for public release; distribution is unlimited.

    A review of the enabling methodologies for knowledge discovery from smart grids data

    The large-scale deployment of pervasive sensors and decentralized computing in modern smart grids is expected to exponentially increase the volume of data exchanged by power system applications. In this context, research into scalable and flexible methodologies that support rapid decisions in a data-rich but information-limited environment is a relevant issue to address. To this aim, this paper investigates the role of Knowledge Discovery from massive Datasets in smart grid computing, exploring its various application fields by considering the data available to power system stakeholders and their knowledge extraction needs. The aim of this paper is twofold. In the first part, the authors summarize the most recent activities developed in this field by the Task Force on “Enabling Paradigms for High-Performance Computing in Wide Area Monitoring Protective and Control Systems” of the IEEE PSOPE Technologies and Innovation Subcommittee. In the second part, the authors propose a data-driven forecasting methodology modeled on the fundamental principles of the Knowledge Discovery Process data workflow. Furthermore, the described methodology is applied to solve the load forecasting problem for a complex user case, in order to emphasize the potential role of knowledge discovery in supporting post-processing analysis in data-rich environments, as feedback for improving the forecasting performance.
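
The abstract does not describe the proposed forecasting model itself, so the sketch below shows only the kind of naive baseline such load forecasts are usually judged against: a recursive moving-average predictor. The window and horizon values are arbitrary illustrations.

```python
def moving_average_forecast(history, window=24, horizon=3):
    """Naive load-forecast baseline: predict each future step as the mean of
    the last `window` observations, feeding forecasts back in recursively.
    A KDD-style methodology would aim to beat this on held-out data."""
    series = list(history)
    out = []
    for _ in range(horizon):
        pred = sum(series[-window:]) / min(window, len(series))
        out.append(pred)
        series.append(pred)
    return out

# Toy usage: a flat 30-step load history forecasts flatly.
preds = moving_average_forecast([10] * 30)
```

Comparing a learned model's error against a baseline like this is one concrete form of the "feedback for improving the forecasting performance" the paper describes.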