
    Optimizing Logistic Regression Coefficients for Discrimination and Calibration Using Estimation of Distribution Algorithms.

    Logistic regression is a simple and efficient supervised learning algorithm for estimating the probability of an outcome or class variable. In spite of its simplicity, it has shown very good performance across many fields and is widely accepted because its results are easy to interpret. Fitting the logistic regression model usually relies on the principle of maximum likelihood, and the Newton–Raphson algorithm is the most common numerical approach for obtaining the coefficients that maximize the likelihood of the data. This work presents a novel approach for fitting the logistic regression model based on estimation of distribution algorithms (EDAs), a tool from evolutionary computation. EDAs are suitable not only for maximizing the likelihood, but also for maximizing the area under the receiver operating characteristic curve (AUC). Thus, we tackle the logistic regression problem from a double perspective: likelihood-based to calibrate the model and AUC-based to discriminate between the different classes. Under these two objectives of calibration and discrimination, the Pareto front can be obtained in our EDA framework. These fronts are compared with those yielded by a multiobjective EDA recently introduced in the literature.
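    The baseline the abstract describes, maximum-likelihood fitting via Newton–Raphson, can be sketched as follows. This is an illustrative textbook implementation, not the authors' EDA method; the function name and convergence settings are my own.

```python
import numpy as np

def fit_logistic_newton(X, y, n_iter=25, tol=1e-8):
    """Fit logistic regression by Newton-Raphson (maximum likelihood).

    X: (n, p) design matrix (include a column of ones for an intercept),
    y: (n,) binary labels in {0, 1}.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted probabilities
        w = mu * (1.0 - mu)                    # IRLS weights
        grad = X.T @ (y - mu)                  # gradient of the log-likelihood
        hess = X.T @ (X * w[:, None])          # observed information matrix
        step = np.linalg.solve(hess, grad)     # Newton step
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta
```

The EDA approach replaces this gradient-based search with a population-based one, which is what allows a non-differentiable objective such as the AUC to be optimized directly.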

    DEVELOPING A CLINICAL LINGUISTIC FRAMEWORK FOR PROBLEM LIST GENERATION FROM CLINICAL TEXT

    Regulatory institutions such as the Institute of Medicine and the Joint Commission endorse problem lists as an effective method to facilitate transitions of care for patients. In practice, the problem list is a common model for documenting a care provider's medical reasoning with respect to a problem and its status during patient care. Although natural language processing (NLP) systems have been developed to support problem list generation, encoding many information layers - morphological, syntactic, semantic, discourse, and pragmatic - can prove computationally expensive. The contribution of each information layer to accurate problem list generation has not been formally assessed. We would expect a problem list generator that relies on natural language processing to improve its performance with the addition of rich semantic features. We hypothesize that problem list generation can be approached as a two-step classification problem - problem mention status (Aim One) and patient problem status (Aim Two) classification. In Aim One, we will automatically classify the status of each problem mention using semantic features about problems described in the clinical narrative. In Aim Two, we will classify active patient problems from individual problem mentions and their statuses. We believe our proposal is significant in two ways. First, our experiments will develop and evaluate semantic features, some commonly modeled in clinical text and others not. The annotations we use will be made openly available to other NLP researchers to encourage future research on this task and on related problems, including foundational NLP algorithms (assertion classification and coreference resolution) and applied clinical applications (patient timeline and record visualization). Second, by extending and evaluating existing NLP systems, we are building an open-source problem list generator and demonstrating its performance for problem list generation using these features.
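    The two-step decomposition can be illustrated with a deliberately simplified sketch. The cue lists and aggregation rule below are hypothetical stand-ins for the semantic-feature classifiers the proposal describes, not the actual system.

```python
from collections import defaultdict

# Step one (Aim One): classify the status of each problem mention.
# A toy rule-based stand-in for a trained mention-status classifier.
NEGATION_CUES = ("no ", "denies ", "without ")
RESOLVED_CUES = ("resolved", "history of")

def mention_status(sentence):
    """Classify one problem mention as 'active', 'negated', or 'resolved'."""
    s = sentence.lower()
    if any(cue in s for cue in NEGATION_CUES):
        return "negated"
    if any(cue in s for cue in RESOLVED_CUES):
        return "resolved"
    return "active"

# Step two (Aim Two): aggregate mention statuses into a patient problem list.
def patient_problem_list(mentions):
    """mentions: iterable of (sentence, problem) pairs for one record,
    in document order. A problem makes the list if its most recent
    mention is 'active' -- one plausible aggregation rule among many."""
    statuses = defaultdict(list)
    for sentence, problem in mentions:
        statuses[problem].append(mention_status(sentence))
    return sorted(p for p, hist in statuses.items() if hist[-1] == "active")
```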

    Learning from Teacher's Eye Movement: Expertise, Subject Matter and Video Modeling

    How teachers' eye movements can be used to understand and improve education is the central focus of the present paper. Three empirical studies were carried out to understand the nature of teachers' eye movements in natural settings and how they might be used to promote learning. The studies explored 1) the relationship between teacher expertise and eye movement in the course of teaching, 2) how individual differences and the demands of different subjects affect teachers' eye movements during literacy and mathematics instruction, and 3) whether including an expert's eye movement and hand information in instructional videos can promote learning. Each study looked at the nature and use of teacher eye movements from a different angle, but collectively they converge on answering the question: what can we learn from teachers' eye movements? The paper also contains an independent methodology chapter dedicated to reviewing and comparing methods of representing eye movements, in order to determine a suitable statistical procedure for representing the richness of current and similar eye-tracking data. Results show considerable differences between expert and novice teachers' eye movements in a real teaching situation, replicating patterns revealed by past studies on expertise and gaze behavior in athletics and other fields. The paper also identifies the mix of person-specific and subject-specific eye movement patterns that occur when the same teacher teaches different topics to the same children. The final study reports evidence that eye movement can be useful in teaching: learning increased when learners saw an expert model's eye movements in a video modeling example. The implications of these studies for teacher education and instruction are discussed.
    PHD, Education & Psychology, University of Michigan, Horace H. Rackham School of Graduate Studies
    https://deepblue.lib.umich.edu/bitstream/2027.42/145853/1/yizhenh_1.pd

    Simulation of Healthcare Processes: Challenges, Solutions, and Benefits

    The emergency department of a hospital plays a key role in managing incoming patients, so it is crucial to ensure an adequate level of organization and efficiency. In this project, we aim to analyze and improve the processes at emergency departments using business process simulation. Through simulation experiments, various ‘what-if’ scenarios can be tested, and redesign alternatives can be compared with respect to key performance indicators. The input for business process simulation is a process model extended with additional information for a probabilistic characterization of the different run-time aspects (case arrival rate, task durations, routing probabilities, roles, etc.). It is thus critical that the business simulation model is accurate, so that the simulations of the various ‘what-if’ scenarios reflect credible alternatives with realistic outcomes. This project aims to create emergency-department simulation models on the basis of the actual executions recorded in so-called event logs, which are typically extracted from the information systems that support the execution of processes at the hospital. The process model and the simulation parameters are extracted using techniques from the field of process mining, which analyzes event logs to gain insight into how processes are actually carried out. In particular, this project focused on the emergency department of a hospital in Tuscany. The quality of the extracted logs greatly influences the analysis that can be carried out. For example, to reduce patient waiting times and to optimize resources, two of the most critical emergency-department challenges, the data must report when each activity started and completed, and how hospital staff participated in the execution of each activity. Unfortunately, the event logs extracted from the information systems of the Tuscan hospital lacked relevant information, including the timestamps at which activities started, diminishing the realism of the business simulation model. To overcome this issue, the project extended a previous technique for estimating the missing start timestamps of activities and addressed some of its limitations. The project assessed the extended technique on the emergency department of the Tuscan hospital, and this case study shows how healthcare can leverage process mining to simulate different ‘what-if’ scenarios, on the basis of which decisions can be made on how to improve real processes.
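    The start-time estimation problem can be sketched with a simple heuristic in the spirit of the technique described: an activity cannot have started before the previous event of the same case completed, nor before its resource finished its previous activity. This is an illustrative simplification, not the project's actual extended technique; event-field names are assumptions.

```python
def estimate_start_times(log):
    """Estimate missing start timestamps in an event log.

    log: list of events sorted by completion time, each a dict with
    keys 'case', 'activity', 'resource', and 'end' (a datetime).
    Sets ev['start'] to the latest lower bound available; an event with
    no prior constraint is given zero duration (start == end).
    """
    case_ready = {}      # per case: completion time of its last event
    resource_free = {}   # per resource: when it finished its last activity
    for ev in log:
        bounds = [b for b in (case_ready.get(ev["case"]),
                              resource_free.get(ev["resource"])) if b]
        ev["start"] = max(bounds) if bounds else ev["end"]
        case_ready[ev["case"]] = ev["end"]
        resource_free[ev["resource"]] = ev["end"]
    return log
```

With estimated start times in place, activity durations and resource waiting times become available as simulation parameters.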

    Volumetric analysis of plexiform neurofibroma for patients treated with trametinib

    Recent research suggests that trametinib, a MEK inhibitor, can treat plexiform neurofibroma (PN) lesions associated with neurofibromatosis type 1 (NF1). PNs can appear anywhere in the body near nerves. These tumors are distinguished by their unusual shape and irregular morphology, which make them difficult to measure. To evaluate trametinib's effectiveness in treating PN, we propose a volumetric analysis (3D measurement) based on magnetic resonance imaging (MRI), rather than the usual 1D and 2D measures. For this study, MRI scans were performed at roughly three-month intervals for thirty-four patients with PN. I developed a semi-automatic method to segment PNs on MRI images, tested and validated the new approach, and submitted a manuscript describing the methodology and segmentation results for publication in the American Journal of Neuroradiology (AJNR). I implemented a practical tool for accurately estimating tumor volume using this segmentation method, making it feasible and reliable to track lesion changes throughout the course of therapy. The volumetric analysis performed on the 34 participants enrolled in the clinical trial reveals that trametinib decreased the median initial lesion volume by around 20% over the 18-month treatment period.
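    The volumetric endpoint itself is straightforward once a segmentation exists: count lesion voxels, scale by voxel size, and compare across scans. A minimal sketch, assuming a binary mask and known voxel spacing (this is not the thesis's segmentation pipeline, only the volume computation downstream of it):

```python
import numpy as np

def lesion_volume_ml(mask, voxel_dims_mm):
    """Volume of a segmented lesion from a binary 3-D mask.

    mask: boolean array (True inside the lesion),
    voxel_dims_mm: (dx, dy, dz) voxel spacing in millimetres.
    Returns the volume in millilitres (1 mL = 1000 mm^3).
    """
    voxel_mm3 = float(np.prod(voxel_dims_mm))
    return mask.sum() * voxel_mm3 / 1000.0

def percent_change(baseline_ml, follow_up_ml):
    """Relative change in lesion volume between two scans, in percent."""
    return 100.0 * (follow_up_ml - baseline_ml) / baseline_ml
```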

    Optimizing Outcomes of Colorectal Cancer Screening

    Colorectal cancer is a leading cause of cancer deaths. Screening for colorectal cancer is implemented in an increasing number of settings, but the performance of screening programs is often suboptimal. In this thesis, advanced modeling, informed by empirical data, was used to identify areas for improvement of screening programs. The thesis includes studies on the effect of the test used for screening, long-term adherence to screening, the quality of colorectal examinations, time to diagnostic examination, and risk-stratified screening.

    Healthcare Logistics: the art of balance

    Healthcare management is a very complex and demanding business. The processes involved, operational, tactical and strategic, are extremely diverse and sophisticated, and medical-technological advancements follow on each other's heels at breathtaking speed. There is also constant, great pressure from many sides: ever-increasing needs and demands from patients and society, thinking about organizations, growing competition, the necessity to incorporate rapidly succeeding medical-technological advancements into the organization, strict cost containment, growing demand for healthcare, and constantly tightening budgets. These developments force healthcare managers to find a balance between said developments, the capabilities of the organization in question, and the desired healthcare outcomes in an ever-changing world. The search for this organizational balance requires that the world of the professionals, i.e. the clinicians, and the world of healthcare managers speak the same language when weighing the various developments and translating the outcomes into organizational choices. For clinicians to make the right choices, they must be enabled to appraise the effects of their choices on organizational outcomes. Likewise, the healthcare managers' decision-making process should include the effects on the medical policies pursued by individual clinicians in their own organization. This thesis focuses on developing methods for allocating hospital resources within a framework that enables clinicians and healthcare managers to balance the developments on the various levels, thus providing a basis for policymaking.