
    Accurate multimodal probabilistic prediction of conversion to Alzheimer's disease in patients with mild cognitive impairment

    Accurately identifying patients with mild cognitive impairment (MCI) who will go on to develop Alzheimer's disease (AD) will become essential, as new treatments will require identification of AD patients at earlier stages of the disease process. Most previous work in this area has centred on the same automated techniques used to distinguish AD patients from healthy controls, coupling high-dimensional brain image data or other relevant biomarker data to modern machine learning techniques. Such studies can now distinguish between AD patients and controls as accurately as an experienced clinician. Models trained on patients with AD and control subjects can also distinguish between MCI patients that will convert to AD within a given timeframe (MCI-c) and those that remain stable (MCI-s), although differences between these groups are smaller and thus the corresponding accuracy is lower. The most common type of classifier used in these studies is the support vector machine, which gives categorical class decisions. In this paper, we introduce Gaussian process (GP) classification to the problem. This fully Bayesian method produces naturally probabilistic predictions, which we show correlate well with the actual chances of converting to AD within 3 years in a population of 96 MCI-s and 47 MCI-c subjects. Furthermore, we show that GPs can integrate multimodal data (in this study volumetric MRI, FDG-PET, cerebrospinal fluid, and APOE genotype) with the classification process through the use of a mixed kernel. The GP approach aids combination of different data sources by learning parameters automatically from training data via type-II maximum likelihood, which we compare to a more conventional method based on cross-validation and an SVM classifier. When the resulting probabilities from the GP are dichotomised to produce a binary classification, the results for predicting MCI conversion based on the combination of all three types of data show a balanced accuracy of 74%. This is a substantially higher accuracy than could be obtained using any individual modality or using a multikernel SVM, and is competitive with the highest accuracy yet achieved for predicting conversion within three years on the widely used ADNI dataset.
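
    As a rough illustration of the multimodal GP approach sketched in this abstract (not the authors' implementation), the Python fragment below sums one RBF kernel per modality into a mixed kernel and lets the GPy library fit the hyperparameters by maximising the approximate marginal likelihood (type-II ML). The feature dimensions, synthetic inputs and variable names are illustrative assumptions only.

    import numpy as np
    import GPy

    rng = np.random.default_rng(0)
    n = 143                                            # e.g. 96 MCI-s + 47 MCI-c subjects
    X_mri = rng.normal(size=(n, 10))                   # hypothetical MRI-derived features
    X_pet = rng.normal(size=(n, 5))                    # hypothetical FDG-PET features
    X_csf = rng.normal(size=(n, 3))                    # hypothetical CSF biomarker features
    y = rng.integers(0, 2, size=(n, 1)).astype(float)  # 0 = stable (MCI-s), 1 = converter (MCI-c)

    X = np.hstack([X_mri, X_pet, X_csf])
    d1, d2, d3 = X_mri.shape[1], X_pet.shape[1], X_csf.shape[1]

    # One RBF kernel per modality, each restricted to its own columns; their sum is
    # the "mixed" kernel, and the learned variances act as modality weights.
    kernel = (GPy.kern.RBF(d1, active_dims=np.arange(0, d1))
              + GPy.kern.RBF(d2, active_dims=np.arange(d1, d1 + d2))
              + GPy.kern.RBF(d3, active_dims=np.arange(d1 + d2, d1 + d2 + d3)))

    model = GPy.models.GPClassification(X, y, kernel=kernel)
    model.optimize()                       # type-II maximum likelihood for the hyperparameters

    p_convert, _ = model.predict(X)        # probabilistic predictions in [0, 1]
    y_hat = (p_convert > 0.5).astype(int)  # dichotomise to obtain a binary classification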

    Balloon Atrial Septostomy as Initial Therapy in Pediatric Pulmonary Hypertension

    Balloon atrial septostomy is a palliative procedure currently used to bridge medically refractory pulmonary hypertension patients to lung transplantation. In the current report, we present balloon atrial septostomy as an initial therapy for high-risk pediatric pulmonary hypertension patients at our institution. Nineteen patients with a median age of 4.3 years (range 0.1-14.3 years) underwent balloon atrial septostomy during the initial admission for pulmonary hypertension. There were no procedural complications or deaths within 24 h of balloon atrial septostomy. Patients were followed for a median of 2.6 years (interquartile range 1.0-4.8 years). Three (16%) patients died, 3 (16%) underwent lung transplantation, and 1 (5%) underwent a reverse Potts shunt. Transplant-free survival at 30 days, 1 year, and 3 years was 84%, 76%, and 67%, respectively. This single-center experience suggests that early balloon atrial septostomy in addition to pharmacotherapy is safe and warrants consideration in high-risk pediatric pulmonary hypertension patients.
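
    The transplant-free survival figures quoted above are the kind of estimate a Kaplan-Meier analysis produces; a minimal Python sketch using the lifelines package is shown below. The follow-up times and event flags are made-up placeholders, not the study's data.

    from lifelines import KaplanMeierFitter

    # Made-up follow-up times (years) and event flags: 1 = death or lung transplant, 0 = censored
    follow_up_years = [0.1, 0.4, 0.8, 1.2, 1.5, 2.0, 2.6, 3.0, 3.5, 4.8]
    event_observed = [1, 0, 1, 0, 1, 0, 0, 1, 0, 0]

    kmf = KaplanMeierFitter()
    kmf.fit(follow_up_years, event_observed=event_observed)

    # Transplant-free survival estimates at 30 days, 1 year, and 3 years
    print(kmf.predict([30 / 365.25, 1.0, 3.0]))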

    Herbivore Preference for Afternoon- and Morning-Cut Forages and Adoption of Cutting Management Strategies

    Photosynthesizing forage plants accumulate total nonstructural carbohydrates (TNC) during daylight, but TNC concentrations are then reduced during the night. Afternoon-cut (PM) forage therefore has greater TNC, and thus greater economic value, than morning-cut (AM) forage. Livestock prefer PM-cut hay, and this can be readily demonstrated by offering animals a choice of hays cut in the PM and AM. Alfalfa growers in the western United States are readily adopting PM-cutting technology to increase profits.

    The Discovery of Argon in Comet C/1995 O1 (Hale-Bopp)

    On 30.14 March 1997 we observed the EUV spectrum of the bright comet C/1995 O1 (Hale-Bopp) at the time of its perihelion, using our EUVS sounding rocket telescope/spectrometer. The spectra reveal the presence of H Ly-beta, O+, and, most notably, argon. Modelling of the retrieved Ar production rates indicates that comet Hale-Bopp is enriched in Ar relative to cosmogonic expectations. This in turn indicates that Hale-Bopp's deep interior has never been exposed to the 35-40 K temperatures necessary to deplete the comet's primordial argon supply.
    Comment: 9 pages, 2 figures. ApJ, 545, in press (2000).

    A gray-box model for a probabilistic estimate of regional ground magnetic perturbations: Enhancing the NOAA operational Geospace model with machine learning

    We present a novel algorithm that predicts the probability that the time derivative of the horizontal component of the ground magnetic field, dB/dt, exceeds a specified threshold at a given location. This quantity provides important information that is physically relevant to Geomagnetically Induced Currents (GIC), which are electric currents associated with sudden changes in the Earth's magnetic field due to Space Weather events. The model follows a 'gray-box' approach by combining the output of a physics-based model with machine learning. Specifically, we combine the University of Michigan's Geospace model, which is operational at the NOAA Space Weather Prediction Center, with a boosted ensemble of classification trees. We discuss the problem of re-calibrating the output of the decision tree to obtain reliable probabilities. The performance of the model is assessed by typical metrics for probabilistic forecasts: Probability of Detection and False Detection, True Skill Statistic, Heidke Skill Score, and the Receiver Operating Characteristic curve. We show that the ML-enhanced algorithm consistently improves all the metrics considered.
    Comment: under review
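
    For readers unfamiliar with the verification metrics named in this abstract, the Python sketch below shows one plausible way (not the operational code) to recalibrate a boosted-tree classifier with isotonic regression and compute Probability of Detection, Probability of False Detection and the True Skill Statistic. The features, labels and exceedance threshold are synthetic assumptions.

    import numpy as np
    from sklearn.calibration import CalibratedClassifierCV
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 8))                                 # stand-in for Geospace-model outputs
    y = (X[:, 0] + 0.5 * rng.normal(size=2000) > 1.0).astype(int)  # 1 if dB/dt exceeds the threshold

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Boosted tree ensemble wrapped in an isotonic re-calibrator to obtain reliable probabilities
    clf = CalibratedClassifierCV(GradientBoostingClassifier(), method="isotonic", cv=3)
    clf.fit(X_tr, y_tr)
    p = clf.predict_proba(X_te)[:, 1]

    pred = p > 0.5
    hits = np.sum(pred & (y_te == 1))
    misses = np.sum(~pred & (y_te == 1))
    false_alarms = np.sum(pred & (y_te == 0))
    correct_negatives = np.sum(~pred & (y_te == 0))

    pod = hits / (hits + misses)                              # Probability of Detection
    pofd = false_alarms / (false_alarms + correct_negatives)  # Probability of False Detection
    tss = pod - pofd                                          # True Skill Statistic
    print(f"POD={pod:.2f}  POFD={pofd:.2f}  TSS={tss:.2f}")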

    A comprehensive analysis of methods for assessing polygenic burden on Alzheimer’s disease pathology and risk beyond APOE

    Genome-wide association studies have identified dozens of loci that alter the risk of developing Alzheimer’s disease. However, with the exception of the APOE-ε4 allele, most variants bear only a small individual effect and have, therefore, limited diagnostic and prognostic value. Polygenic risk scores aim to collate the disease risk distributed across the genome into a single score. Recent works have demonstrated that polygenic risk scores designed for Alzheimer’s disease are predictive of clinical diagnosis, pathology-confirmed diagnosis and changes in imaging biomarkers. Methodological innovations in polygenic risk modelling include the polygenic hazard score, which derives effect estimates for individual single nucleotide polymorphisms from survival analysis, and methods that account for linkage disequilibrium between genomic loci. In this work, using data from the Alzheimer’s Disease Neuroimaging Initiative, we compared different approaches to quantify polygenic disease burden for Alzheimer’s disease and their association (beyond the APOE locus) with a broad range of Alzheimer’s disease-related traits: cross-sectional CSF biomarker levels, cross-sectional cortical amyloid burden, clinical diagnosis, clinical progression, longitudinal loss of grey matter and longitudinal decline in cognitive function. We found that polygenic scores were associated beyond APOE with clinical diagnosis, CSF-tau levels and, to a minor degree, with progressive atrophy. However, for many other tested traits, such as clinical disease progression, CSF amyloid, cognitive decline and cortical amyloid load, the additional effects of polygenic burden beyond APOE were minor. Overall, polygenic risk scores and the polygenic hazard score performed equally, and given the ease with which polygenic risk scores can be derived, they constitute the more practical choice in comparison with polygenic hazard scores. Furthermore, our results demonstrate that incomplete adjustment for the APOE locus, i.e. only adjusting for APOE-ε4 carrier status, can lead to overestimated effects of polygenic scores due to APOE-ε4 homozygous participants. Lastly, for many of the tested traits the major driving factor remained the APOE locus, with the exception of quantitative CSF-tau and p-tau measures.
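
    As a toy illustration of the basic polygenic risk score construction discussed above (not the paper's pipeline), the Python sketch below forms the effect-size-weighted sum of risk-allele counts for a few hypothetical non-APOE SNPs. The SNP identifiers, effect sizes and genotypes are invented placeholders.

    import pandas as pd

    # GWAS effect sizes (log odds ratios) for a handful of hypothetical non-APOE SNPs
    effects = pd.Series({"rs0001": 0.08, "rs0002": -0.05, "rs0003": 0.12})

    # Genotypes coded as risk-allele counts (0, 1, 2) for three hypothetical participants
    genotypes = pd.DataFrame(
        {"rs0001": [0, 1, 2], "rs0002": [1, 1, 0], "rs0003": [2, 0, 1]},
        index=["subj_A", "subj_B", "subj_C"],
    )

    # Polygenic risk score: effect-size-weighted sum of allele counts per participant
    prs = genotypes.mul(effects, axis=1).sum(axis=1)
    prs_z = (prs - prs.mean()) / prs.std()   # standardise before association testing
    print(prs_z)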

    Spatial Correlation Function of X-ray Selected AGN

    We present a detailed description of the first direct measurement of the spatial correlation function of X-ray selected AGN. This result is based on an X-ray flux-limited sample of 219 AGN discovered in the contiguous 80.7 deg^2 region of the ROSAT North Ecliptic Pole (NEP) Survey. Clustering is detected at the 4 sigma level at comoving scales in the interval r = 5-60 h^-1 Mpc. Fitting the data with a power law of slope gamma=1.8, we find a correlation length of r_0 = 7.4 (+1.8, -1.9) h^-1 Mpc (Omega_M=0.3, Omega_Lambda=0.7). The median redshift of the AGN contributing to the signal is z_xi=0.22. This clustering amplitude implies that X-ray selected AGN are spatially distributed in a manner similar to that of optically selected AGN. Furthermore, the ROSAT NEP determination establishes the local behavior of AGN clustering, a regime which is poorly sampled in general. Combined with high-redshift measures from optical studies, the ROSAT NEP results argue that the AGN correlation strength essentially does not evolve with redshift, at least out to z~2.2. In the local Universe, X-ray selected AGN appear to be unbiased relative to galaxies and the inferred X-ray bias parameter is near unity, b_X~1. Hence X-ray selected AGN closely trace the underlying mass distribution. The ROSAT NEP AGN catalog, presented here, features complete optical identifications and spectroscopic redshifts. The median redshift, X-ray flux, and X-ray luminosity are z=0.41, f_X=1.1*10^-13 cgs, and L_X=9.2*10^43 h_70^-2 cgs (0.5-2.0 keV), respectively. Unobscured, type 1 AGN are the dominant constituents (90%) of this soft X-ray selected sample of AGN.
    Comment: 17 pages, 8 figures, accepted for publication in ApJ. A version with high-resolution figures is available at http://www.eso.org/~cmullis/papers/Mullis_et_al_2004b.ps.gz, and a machine-readable version of the ROSAT NEP AGN catalog is available at http://www.eso.org/~cmullis/research/nep-catalog.htm
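
    A minimal Python sketch of the power-law fit described in this abstract is given below, with the slope fixed at gamma = 1.8; the binned xi(r) values and their errors are placeholders rather than the paper's measurements.

    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative binned estimates of xi(r); real values come from the survey's pair counts
    r = np.array([5.0, 10.0, 20.0, 40.0, 60.0])      # comoving separation, h^-1 Mpc
    xi = np.array([2.1, 0.55, 0.16, 0.045, 0.02])
    xi_err = 0.3 * xi

    gamma = 1.8                                      # slope fixed as in the abstract

    def power_law(r, r0):
        return (r / r0) ** (-gamma)

    popt, pcov = curve_fit(power_law, r, xi, p0=[7.0], sigma=xi_err, absolute_sigma=True)
    r0_fit, r0_err = popt[0], np.sqrt(pcov[0, 0])
    print(f"r0 = {r0_fit:.1f} +/- {r0_err:.1f} h^-1 Mpc")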

    Rocket FUV Observations of the Io Plasma Torus During the Shoemaker-Levy/9 Impacts

    We observed the Io torus from 820-1140 A on universal time (UT) 20.25 July 1994 from a sounding rocket telescope/spectrograph. These observations constitute only the fourth published spectrum of the torus in this wavelength range, and the only far-ultraviolet (FUV) data documenting the state of the torus during the Shoemaker-Levy 9 impacts.

    Statistical methods in cosmology

    The advent of large data sets in cosmology has meant that, in the past 10 or 20 years, our knowledge and understanding of the Universe have changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data in which a host of useful information is enclosed, but encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged on a standard cosmological model (the LCDM model), we should keep in mind that this model is described by 10 or more physical parameters, and if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both for testing for possible disagreements (which could indicate new physics) and for improving parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing an experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics, as there are several good books which are reported in the references. The reader should refer to those.
    Comment: 31 pages, 6 figures, notes from the 2nd Trans-Regio Winter School in Passo del Tonale. To appear in Lecture Notes in Physics, "Lectures on cosmology: Accelerated expansion of the universe", Feb 201
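
    As one concrete example of the forecasting question raised above (how much an experiment could teach us before it is done), the sketch below computes a simple Fisher-matrix forecast for a toy two-parameter model; the observable, error bars and fiducial values are assumptions for illustration, not taken from the lecture notes.

    import numpy as np

    x = np.linspace(0.1, 1.0, 50)             # e.g. redshifts at which the observable is measured
    sigma = 0.05 * np.ones_like(x)            # assumed Gaussian measurement errors

    def model(x, a, b):                       # toy observable depending on two parameters
        return a * x + b * x**2

    a0, b0 = 1.0, 0.5                         # fiducial parameter values
    eps = 1e-5

    # Numerical derivatives of the observable with respect to each parameter
    dm_da = (model(x, a0 + eps, b0) - model(x, a0 - eps, b0)) / (2 * eps)
    dm_db = (model(x, a0, b0 + eps) - model(x, a0, b0 - eps)) / (2 * eps)
    derivs = np.vstack([dm_da, dm_db])

    # Fisher matrix F_ij = sum_k (dM_k/dp_i)(dM_k/dp_j) / sigma_k^2; its inverse
    # forecasts the parameter covariance attainable from the experiment
    F = derivs @ np.diag(1.0 / sigma**2) @ derivs.T
    cov = np.linalg.inv(F)
    print("forecast 1-sigma errors:", np.sqrt(np.diag(cov)))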