
    Improving Palliative Care with Deep Learning

    Improving the quality of end-of-life care for hospitalized patients is a priority for healthcare organizations. Studies have shown that physicians tend to over-estimate prognoses, which, in combination with treatment inertia, results in a mismatch between patients' wishes and the actual care they receive at the end of life. We describe a method to address this problem using Deep Learning and Electronic Health Record (EHR) data, which is currently being piloted, with Institutional Review Board approval, at an academic medical center. The EHR data of admitted patients are automatically evaluated by an algorithm, which brings patients who are likely to benefit from palliative care services to the attention of the Palliative Care team. The algorithm is a Deep Neural Network trained on EHR data from previous years to predict all-cause 3-12 month mortality, used as a proxy for identifying patients who could benefit from palliative care. Our predictions enable the Palliative Care team to take a proactive approach in reaching out to such patients, rather than relying on referrals from treating physicians or conducting time-consuming chart reviews of all patients. We also present a novel interpretation technique that we use to provide explanations of the model's predictions.
    Comment: IEEE International Conference on Bioinformatics and Biomedicine 201
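    The screening workflow this abstract describes can be sketched roughly as follows. This is a hypothetical, illustrative toy: the feature dimensions, network size, synthetic data, and the 0.5 flagging threshold are all assumptions, not details from the paper.

    ```python
    import numpy as np

    # Toy sketch: EHR codes featurized as bag-of-codes count vectors, a small
    # feed-forward network scoring 3-12 month mortality risk, and high-scoring
    # admissions flagged for review by the palliative care team.
    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    class TinyMLP:
        """Untrained stand-in for the paper's Deep Neural Network."""
        def __init__(self, n_features, n_hidden=16):
            self.W1 = rng.normal(0.0, 0.1, (n_features, n_hidden))
            self.W2 = rng.normal(0.0, 0.1, (n_hidden, 1))

        def predict_proba(self, X):
            h = np.tanh(X @ self.W1)          # hidden layer
            return sigmoid(h @ self.W2).ravel()  # mortality-risk score in (0, 1)

    # Synthetic stand-in for featurized EHR data: rows = admissions, cols = code counts.
    X = rng.poisson(1.0, (100, 50)).astype(float)
    model = TinyMLP(n_features=50)
    risk = model.predict_proba(X)
    flagged = np.flatnonzero(risk > 0.5)  # admissions surfaced proactively
    ```

    The point of the sketch is the pipeline shape, not the model: scores are computed automatically for every admission, so the team reviews a short flagged list instead of all charts.
    
    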

    Three-Dimensional Crystallographic Reconstruction for Atomic Resolution

    Three-dimensional structures have recently been determined by electron crystallography at a resolution high enough to determine atomic arrangements in both protein and mineral specimens. The different nature of these two types of specimens produces some very significant differences in the way data are obtained and processed, although the principles are the same. The sensitivity of proteins to damage by the electron beam limits the signal-to-noise ratio in the image and the resolution to which data can be extracted from the image. A number of constraints, such as the amino acid sequence and the connectivity of atoms within amino acids, can be used in interpreting the limited image data. In materials samples, the relative insensitivity to damage allows a resolution limited only by the microscope. In many samples, dynamical scattering and other non-linear effects limit the information in the image, but this limit can be circumvented by working in very thin areas of the specimen.

    Cognitive therapy and spirituality: the battleground and the blend

    Includes bibliographical references

    Restoration of Weak Phase-Contrast Images Recorded With a High Degree of Defocus: The "Twin Image" Problem Associated With CTF Correction

    Relatively large values of objective-lens defocus must normally be used to produce detectable levels of image contrast for unstained biological specimens, which are generally weak phase objects. As a result, a subsequent restoration operation must be used to correct for oscillations in the contrast transfer function (CTF) at higher resolution. Currently used methods of CTF correction assume the ideal case in which Friedel mates in the scattered wave have contributed pairs of Fourier components that overlap with one another in the image plane. This "ideal" situation may be only poorly satisfied, or not satisfied at all, as the particle size gets smaller, the defocus value gets larger, and the resolution gets higher. We have therefore investigated whether currently used methods of CTF correction are also effective in restoring the single-sideband image information that becomes displaced (delocalized) by half (or more) the diameter of a particle of finite size. Computer simulations are used to show that restoration either by "phase flipping" or by multiplying by the CTF recovers only about half of the delocalized information. The other half of the delocalized information goes into a doubly defocused "twin" image of the type produced during optical reconstruction of an in-line hologram. Restoration with a Wiener filter is effective in recovering the delocalized information only when the signal-to-noise ratio (S/N) is orders of magnitude higher than that which exists in low-dose images of biological specimens, in which case the Wiener filter approaches division by the CTF (i.e., the formal inverse). For realistic values of the S/N, however, the "twin image" problem seen with a Wiener filter is very similar to that seen when either phase flipping or multiplication by the CTF is used for restoration.
    The results of these simulations suggest that CTF correction is a poor alternative to using a Zernike-type phase plate when imaging biological specimens, in which case the images can be recorded in a close-to-focus condition and delocalization of high-resolution information is thus minimized.
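    The two restoration schemes compared in this abstract can be sketched in one dimension. This is a minimal illustrative model, not the authors' simulation code: the CTF here keeps only the defocus term (spherical aberration and amplitude contrast are omitted), and the defocus, wavelength, and S/N values are arbitrary placeholders.

    ```python
    import numpy as np

    def ctf(s, defocus_A=20000.0, wavelength_A=0.025):
        """Toy contrast transfer function vs. spatial frequency s (1/Angstrom)."""
        return -np.sin(np.pi * wavelength_A * defocus_A * s ** 2)

    def phase_flip(image_row, pixel_A=1.0):
        """'Phase flipping': flip the sign of Fourier components where CTF < 0."""
        F = np.fft.rfft(image_row)
        s = np.fft.rfftfreq(image_row.size, d=pixel_A)
        F *= np.where(ctf(s) < 0.0, -1.0, 1.0)
        return np.fft.irfft(F, n=image_row.size)

    def wiener_restore(image_row, snr=10.0, pixel_A=1.0):
        """Wiener-filter restoration; approaches division by the CTF as snr grows."""
        F = np.fft.rfft(image_row)
        s = np.fft.rfftfreq(image_row.size, d=pixel_A)
        c = ctf(s)
        F *= c / (c ** 2 + 1.0 / snr)  # damped inverse; finite at CTF zeros
        return np.fft.irfft(F, n=image_row.size)
    ```

    Note that phase flipping only changes signs, so applying it twice returns the original image; the Wiener filter instead rescales each Fourier component, trading exact inversion for noise suppression near the CTF zeros.
    
    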

    U.S. Billion-ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    The Report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of “potential” biomass within the contiguous United States based on numerous assumptions about current and future inventory and production capacity, availability, and technology. In the 2005 BTS, a strategic analysis was undertaken to determine whether U.S. agriculture and forest resources have the capability to produce at least one billion dry tons of biomass annually, in a sustainable manner—enough to displace approximately 30% of the country’s present petroleum consumption. To ensure reasonable confidence in the study results, an effort was made to use relatively conservative assumptions. However, for both agriculture and forestry, the resource potential was not restricted by price. That is, all identified biomass was considered potentially available, even though some potential feedstock would more than likely be too expensive to be economically available. In addition to updating the 2005 study, this report attempts to address a number of its shortcomings.