
    Improving Palliative Care with Deep Learning

    Improving the quality of end-of-life care for hospitalized patients is a priority for healthcare organizations. Studies have shown that physicians tend to over-estimate prognoses, which in combination with treatment inertia results in a mismatch between patients' wishes and actual care at the end of life. We describe a method to address this problem using Deep Learning and Electronic Health Record (EHR) data, which is currently being piloted, with Institutional Review Board approval, at an academic medical center. The EHR data of admitted patients are automatically evaluated by an algorithm, which brings patients who are likely to benefit from palliative care services to the attention of the Palliative Care team. The algorithm is a Deep Neural Network trained on the EHR data from previous years to predict all-cause 3-12 month mortality of patients as a proxy for patients who could benefit from palliative care. Our predictions enable the Palliative Care team to take a proactive approach in reaching out to such patients, rather than relying on referrals from treating physicians or conducting time-consuming chart reviews of all patients. We also present a novel interpretation technique which we use to provide explanations of the model's predictions. Comment: IEEE International Conference on Bioinformatics and Biomedicine 2017
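
    The abstract describes a Deep Neural Network trained on historical EHR data to predict all-cause 3-12 month mortality and to rank admitted patients for proactive palliative-care review. No code accompanies the abstract, so the following is a minimal sketch of that kind of pipeline, assuming a pre-extracted tabular feature matrix and binary mortality labels; the feature dimensions, network sizes, and synthetic data are illustrative placeholders, not the authors' model.

```python
# Minimal sketch (not the authors' model): a feed-forward classifier that
# predicts 3-12 month all-cause mortality from tabular EHR features, used
# as a proxy signal for flagging patients to a palliative care team.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical stand-in for per-admission EHR features (diagnoses, labs,
# utilization counts, demographics) and binary 3-12 month mortality labels.
X = rng.normal(size=(5000, 200))
y = (rng.random(5000) < 0.1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(256, 128), early_stopping=True,
                      random_state=0)
model.fit(X_train, y_train)

# Rank held-out patients by predicted risk so the highest-risk cases can be
# surfaced for proactive review rather than waiting for referrals.
risk = model.predict_proba(X_test)[:, 1]
print("AUROC:", roc_auc_score(y_test, risk))
print("Top-risk patient indices:", np.argsort(risk)[::-1][:10])
```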

    Predicting Flows of Rarefied Gases

    DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as the standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.
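
    DAC itself is not reproduced here; as a rough illustration of the DSMC idea the abstract refers to, the sketch below runs one no-time-counter (NTC) collision step for hard-sphere particles in a single cell. All quantities (particle count, particle weighting, cell volume, cross-section, time step) are toy placeholders, not DAC inputs or defaults.

```python
# Minimal sketch of one DSMC (no-time-counter) collision step for
# hard-sphere particles in a single cell; a production code like DAC
# automates gridding, time stepping, and load balancing around steps
# like this one. All values below are placeholders.
import numpy as np

rng = np.random.default_rng(1)

N      = 500            # simulated particles in the cell
F_N    = 1e10           # real molecules represented per simulated particle
d      = 4.17e-10       # hard-sphere molecular diameter [m]
sigma  = np.pi * d**2   # total collision cross-section [m^2]
V_cell = 1e-9           # cell volume [m^3]
dt     = 1e-6           # time step [s]
v = rng.normal(0.0, 300.0, size=(N, 3))   # thermal velocities [m/s]

cr_max = 2000.0         # running estimate of the maximum relative speed [m/s]
n_pairs = int(0.5 * N * (N - 1) * F_N * sigma * cr_max * dt / V_cell)

for _ in range(n_pairs):
    i, j = rng.choice(N, size=2, replace=False)
    cr = np.linalg.norm(v[i] - v[j])
    cr_max = max(cr_max, cr)
    if rng.random() < cr / cr_max:        # accept this candidate pair
        # Isotropic post-collision relative velocity (equal masses),
        # conserving momentum and kinetic energy.
        vcm = 0.5 * (v[i] + v[j])
        cos_t = 2.0 * rng.random() - 1.0
        sin_t = np.sqrt(1.0 - cos_t**2)
        phi = 2.0 * np.pi * rng.random()
        cr_new = cr * np.array([cos_t, sin_t * np.cos(phi), sin_t * np.sin(phi)])
        v[i] = vcm + 0.5 * cr_new
        v[j] = vcm - 0.5 * cr_new

print("mean particle speed after collisions:", np.linalg.norm(v, axis=1).mean())
```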

    Creating Small Area Income Deprivation Estimates For Northern Ireland: Spatial Microsimulation Modelling

    This paper describes results from a preliminary investigation of the value of a spatial microsimulation technique in estimating, for each Super Output Area (SOA) in Northern Ireland, the incidence of income poverty as measured by the proportion of households whose income is below 60% of the UK median household income (%HBAI). In this paper we describe the spatial microsimulation approach and then present small area (SOA) estimates of median household income, validated against the equivalent measure in the NIMDM 2005 income domain score and against the Experian 2005 median income estimates. We then turn to the %HBAI estimates, first describing results based on unequivalised gross income for 2004/5 using the FRS 2004/5 and the UK Census 2001. We then discuss results based on equivalised net household income before housing costs for 2003/5, using the UK Census 2001 and a pooled 2003/4 and 2004/5 FRS dataset. We discuss the results of validation against the source FRS and against the NIMDM income domain score. The paper concludes with a summary of the findings and recommendations for further work.
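
    The abstract's method is spatial microsimulation; a common building block for such small-area estimation is iterative proportional fitting (IPF), which reweights survey households so their weighted totals match area-level census constraints, after which a %HBAI-style poverty rate can be read off the reweighted incomes. The sketch below applies generic IPF to toy data for one hypothetical area; the constraint categories, totals, and incomes are invented and this is not the authors' model.

```python
# Minimal sketch of IPF-style reweighting for one small area, followed by a
# households-below-60%-of-median-income (%HBAI-style) estimate.
# Survey records, constraint categories, and totals are all toy values.
import numpy as np

# Toy "survey" households: tenure category, household-size category, income.
tenure = np.array([0, 0, 1, 1, 1, 2, 2, 0, 1, 2])
size   = np.array([0, 1, 1, 2, 0, 2, 1, 2, 0, 0])
income = np.array([310., 540., 260., 720., 410., 180., 650., 300., 220., 480.])

# Toy small-area (census-style) constraint totals for the same categories.
tenure_target = np.array([120., 150., 80.])   # households per tenure category
size_target   = np.array([140., 130., 80.])   # households per size category

w = np.ones_like(income)                       # start with uniform weights
for _ in range(50):                            # alternate fitting each margin
    for cat, target in ((tenure, tenure_target), (size, size_target)):
        for k, t in enumerate(target):
            mask = cat == k
            w[mask] *= t / w[mask].sum()

# %HBAI-style statistic: weighted share of households below 60% of the
# (here: weighted survey) median income. The actual study uses the UK median.
order = np.argsort(income)
cum = np.cumsum(w[order]) / w.sum()
median = income[order][np.searchsorted(cum, 0.5)]
below = income < 0.6 * median
print("estimated %HBAI for this area:", 100.0 * w[below].sum() / w.sum())
```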

    Improved Lower Bounds for Constant GC-Content DNA Codes

    The design of large libraries of oligonucleotides having constant GC-content and satisfying Hamming distance constraints between oligonucleotides and their Watson-Crick complements is important in reducing hybridization errors in DNA computing, DNA microarray technologies, and molecular barcoding. Various techniques have been studied for the construction of such oligonucleotide libraries, ranging from algorithmic constructions via stochastic local search to theoretical constructions via coding theory. We introduce a new stochastic local search method which yields improvements to more than one third of the benchmark lower bounds of Gaborit and King (2005) for n-mer oligonucleotide libraries when n <= 14. We also found several optimal libraries by computing maximum cliques on certain graphs. Comment: 4 pages
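
    As a rough illustration of the kind of search involved (not the paper's algorithm), the sketch below greedily accumulates random n-mers with constant GC-content, rejecting any candidate that violates a minimum Hamming distance to the existing codewords or to their reverse Watson-Crick complements; the word length, distance, GC-content, and iteration budget are arbitrary toy values.

```python
# Minimal sketch (not the paper's method): a random greedy search for a set
# of n-mers with constant GC-content, pairwise Hamming distance >= D, and
# Hamming distance >= D to every reverse Watson-Crick complement.
import random

N, D, GC = 8, 4, 4            # word length, min distance, GC count (toy values)
COMP = str.maketrans("ACGT", "TGCA")

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def rev_comp(w):
    return w.translate(COMP)[::-1]

def random_word():
    # Exactly GC positions drawn from {G, C}, the rest from {A, T}.
    bases = ["G" if random.random() < 0.5 else "C" for _ in range(GC)]
    bases += ["A" if random.random() < 0.5 else "T" for _ in range(N - GC)]
    random.shuffle(bases)
    return "".join(bases)

def compatible(w, code):
    # Distance constraints against the code, its reverse complements,
    # and the candidate's own reverse complement.
    return (hamming(w, rev_comp(w)) >= D and
            all(hamming(w, c) >= D and hamming(w, rev_comp(c)) >= D
                for c in code))

random.seed(0)
code = []
for _ in range(20_000):                  # toy candidate budget
    w = random_word()
    if compatible(w, code):
        code.append(w)
print(len(code), "codewords found, e.g.", code[:5])
```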