
    Utilising artificial intelligence to determine patients at risk of a rare disease: idiopathic pulmonary arterial hypertension

    Idiopathic pulmonary arterial hypertension is a rare and life-shortening condition often diagnosed at an advanced stage. Despite increased awareness, the delay to diagnosis remains unchanged. This study explores whether a predictive model based on healthcare resource utilisation can be used to screen large populations to identify patients at high risk of idiopathic pulmonary arterial hypertension. Hospital Episode Statistics from the National Health Service in England, providing close to full national coverage, were used as a measure of healthcare resource utilisation. Data for patients with idiopathic pulmonary arterial hypertension from the National Pulmonary Hypertension Service in Sheffield were linked to pre-diagnosis Hospital Episode Statistics records. A non-idiopathic pulmonary arterial hypertension control cohort was selected from the Hospital Episode Statistics population. Patient history was limited to ≤5 years pre-diagnosis. Information on demographics, timing/frequency of diagnoses, medical specialities visited and procedures undertaken was captured. For modelling, a bagged gradient boosting trees algorithm was used to discriminate between cohorts. Between 2008 and 2016, 709 patients with idiopathic pulmonary arterial hypertension were identified and compared with a stratified cohort of 2,812,458 patients classified as non-idiopathic pulmonary arterial hypertension with ≥1 ICD-10 coded diagnosis of relevance to idiopathic pulmonary arterial hypertension. A predictive model was developed and validated using cross-validation. The timing and frequency of the clinical specialities seen, secondary diagnoses and age were the key variables driving the algorithm’s performance. To identify the 100 patients at highest risk of idiopathic pulmonary arterial hypertension, 969 patients would need to be screened, with a specificity of 99.99% and a sensitivity of 14.10%, based on a prevalence of 5.5 per million. The positive predictive and negative predictive values were 10.32% and 99.99%, respectively. This study highlights the potential application of artificial intelligence to readily available real-world data to screen for rare diseases such as idiopathic pulmonary arterial hypertension. This algorithm could provide low-cost screening at a population level, facilitating earlier diagnosis, improved diagnostic rates and patient outcomes. Studies to further validate this approach are warranted.
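
    The quoted positive predictive value follows directly from the screening figures: 100 true positives among 969 patients screened gives 100/969 ≈ 10.32%. As a rough illustration of the modelling approach named above (bagged gradient boosting trees validated by cross-validation), the Python sketch below uses scikit-learn (>= 1.2); the synthetic features, labels and hyperparameters are assumptions for illustration, not the study's actual pipeline.

    from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical features: timing/frequency of specialty visits, coded
    # secondary diagnoses and age from pre-diagnosis hospital records.
    X = rng.random((1000, 20))
    y = rng.integers(0, 2, size=1000)  # toy labels: 1 = IPAH, 0 = control

    # Bagged gradient boosting trees, as named in the abstract
    model = BaggingClassifier(
        estimator=GradientBoostingClassifier(n_estimators=100, max_depth=3),
        n_estimators=10,
    )
    # Discrimination assessed by cross-validation, as in the study
    print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())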

    Planck 2015 results. XIV. Dark energy and modified gravity

    We study the implications of Planck data for models of dark energy (DE) and modified gravity (MG) beyond the cosmological constant scenario. We start with cases where the DE only directly affects the background evolution, considering Taylor expansions of the equation of state, principal component analysis and parameterizations related to the potential of a minimally coupled DE scalar field. When estimating the density of DE at early times, we significantly improve present constraints. We then move to general parameterizations of the DE or MG perturbations that encompass both effective field theories and the phenomenology of gravitational potentials in MG models. Lastly, we test a range of specific models, such as k-essence, f(R) theories and coupled DE. In addition to the latest Planck data, for our main analyses we use baryon acoustic oscillations, type-Ia supernovae and local measurements of the Hubble constant. We further show the impact of measurements of the cosmological perturbations, such as redshift-space distortions and weak gravitational lensing. These additional probes are important tools for testing MG models and for breaking degeneracies that are still present in the combination of Planck and background data sets. All results that include only background parameterizations are in agreement with LCDM. When testing models that also change perturbations (even when the background is fixed to LCDM), some tensions appear in a few scenarios: the maximum one found is approximately 2σ for Planck TT+lowP when parameterizing observables related to the gravitational potentials with a chosen time dependence; the tension increases to at most 3σ when external data sets are included. It disappears, however, when CMB lensing is included.
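
    For concreteness, a standard first-order Taylor expansion of the dark-energy equation of state about the present epoch is the CPL form below (whether the paper uses exactly this expansion variable is an assumption here). In LaTeX:

    % CPL parameterization of the dark-energy equation of state:
    % w_0 is the present-day value, w_a its first-order variation
    % with scale factor a (a = 1 today); LCDM has w_0 = -1, w_a = 0.
    \begin{equation}
      w(a) = w_0 + w_a \, (1 - a)
    \end{equation}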

    Planck 2015 results. XIII. Cosmological parameters

    We present results based on full-mission Planck observations of temperature and polarization anisotropies of the CMB. These data are consistent with the six-parameter inflationary LCDM cosmology. From the Planck temperature and lensing data, for this cosmology we find a Hubble constant H0 = (67.8 +/- 0.9) km/s/Mpc, a matter density parameter Omega_m = 0.308 +/- 0.012 and a scalar spectral index n_s = 0.968 +/- 0.006. (We quote 68% errors on measured parameters and 95% limits on other parameters.) Combined with Planck temperature and lensing data, Planck LFI polarization measurements lead to a reionization optical depth of tau = 0.066 +/- 0.016. Combining Planck with other astrophysical data, we find N_eff = 3.15 +/- 0.23 for the effective number of relativistic degrees of freedom, and the sum of neutrino masses is constrained to < 0.23 eV. Spatial curvature is found to be |Omega_K| < 0.005. For LCDM we find a limit on the tensor-to-scalar ratio of r < 0.11, consistent with the B-mode constraints from an analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP data leads to a tighter constraint of r < 0.09. We find no evidence for isocurvature perturbations or cosmic defects. The equation of state of dark energy is constrained to w = -1.006 +/- 0.045. Standard big bang nucleosynthesis predictions for the Planck LCDM cosmology are in excellent agreement with observations. We investigate annihilating dark matter and deviations from standard recombination, finding no evidence for new physics. The Planck results for base LCDM are in agreement with BAO data and with the JLA SNe sample. However, the amplitude of the fluctuations is found to be higher than that inferred from rich cluster counts and weak gravitational lensing. Apart from these tensions, the base LCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.
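
    The headline values quoted above can be turned into derived quantities with a standard cosmology library. The sketch below uses astropy's FlatLambdaCDM (an illustrative tool choice, not the Planck analysis pipeline) with the Planck 2015 best-fit H0 and Omega_m:

    # Flat LCDM built from the Planck 2015 values quoted in the abstract;
    # astropy is an illustrative choice, not the Planck likelihood code.
    from astropy.cosmology import FlatLambdaCDM
    import astropy.units as u

    cosmo = FlatLambdaCDM(H0=67.8 * u.km / u.s / u.Mpc, Om0=0.308)
    print(cosmo.age(0))    # age of the Universe today, roughly 13.8 Gyr
    print(cosmo.H(0.5))    # Hubble parameter at redshift z = 0.5

    Note that astropy also ships a pre-built Planck15 cosmology object carrying the full published parameter set.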

    Planck 2015 results I. Overview of products and scientific results

    The European Space Agency's Planck satellite, which is dedicated to studying the early Universe and its subsequent evolution, was launched on 14 May 2009. It scanned the microwave and submillimetre sky continuously between 12 August 2009 and 23 October 2013. In February 2015, ESA and the Planck Collaboration released the second set of cosmology products based on data from the entire Planck mission, including both temperature and polarization, along with a set of scientific and technical papers and a web-based explanatory supplement. This paper gives an overview of the main characteristics of the data and the data products in the release, as well as the associated cosmological and astrophysical science results and papers. The data products include maps of the cosmic microwave background (CMB), the thermal Sunyaev-Zeldovich effect, diffuse foregrounds in temperature and polarization, catalogues of compact Galactic and extragalactic sources (including separate catalogues of Sunyaev-Zeldovich clusters and Galactic cold clumps), and extensive simulations of signals and noise used in assessing uncertainties and the performance of the analysis methods. The likelihood code used to assess cosmological models against the Planck data is described, along with a CMB lensing likelihood. Scientific results include cosmological parameters derived from CMB power spectra, gravitational lensing, and cluster counts, as well as constraints on inflation, non-Gaussianity, primordial magnetic fields, dark energy, and modified gravity, and new results on low-frequency Galactic foregrounds.

    RaVÆn: unsupervised change detection of extreme events using ML on-board satellites

    Applications such as disaster management benefit enormously from the rapid availability of satellite observations. Traditionally, data analysis is performed on the ground after the data have been transferred (downlinked) to a ground station. Constraints on the downlink capabilities, both in terms of data volume and timing, therefore heavily affect the response delay of any downstream application. In this paper, we introduce RaVÆn, a lightweight, unsupervised approach for change detection in satellite data based on Variational Auto-Encoders (VAEs), with the specific purpose of on-board deployment. RaVÆn pre-processes the sampled data directly on the satellite and flags changed areas to prioritise for downlink, shortening the response time. We verified the efficacy of our system on a dataset, released alongside this publication, composed of time series containing a catastrophic event, demonstrating that RaVÆn outperforms pixel-wise baselines. Finally, we tested our approach on resource-limited hardware to assess computational and memory limitations, simulating deployment on real hardware.
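
    As a minimal sketch of the core mechanism: a VAE encoder embeds co-located tiles from successive acquisitions, and a distance between their latent means flags change for priority downlink. The PyTorch architecture, tile size and cosine distance below are illustrative assumptions, not RaVÆn's published configuration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyVAEEncoder(nn.Module):
        """Toy VAE encoder mapping a multispectral tile to a latent Gaussian."""
        def __init__(self, in_ch=4, latent=32):  # e.g. 4 spectral bands
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Flatten(),
            )
            self.mu = nn.Linear(32 * 8 * 8, latent)      # for 32x32 tiles
            self.logvar = nn.Linear(32 * 8 * 8, latent)

        def forward(self, x):
            h = self.conv(x)
            return self.mu(h), self.logvar(h)

    enc = TinyVAEEncoder()
    before = torch.rand(1, 4, 32, 32)  # tile from an earlier acquisition
    after = torch.rand(1, 4, 32, 32)   # same location, later acquisition
    mu_b, _ = enc(before)
    mu_a, _ = enc(after)
    # Large latent distance => flag the tile as changed, prioritise downlink
    change_score = 1 - F.cosine_similarity(mu_b, mu_a).item()
    print(change_score)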

    Unsupervised change detection of extreme events using ML on-board

    In this paper, we introduce RaVAEn, a lightweight, unsupervised approach for change detection in satellite data based on Variational Auto-Encoders (VAEs), with the specific purpose of on-board deployment. Applications such as disaster management benefit enormously from the rapid availability of satellite observations. Traditionally, data analysis is performed on the ground after all data have been transferred (downlinked) to a ground station. Constraints on the downlink capabilities therefore affect any downstream application. In contrast, RaVAEn pre-processes the sampled data directly on the satellite and flags changed areas to prioritise for downlink, shortening the response time. We verified the efficacy of our system on a dataset composed of time series of catastrophic events, which we plan to release alongside this publication, demonstrating that RaVAEn outperforms pixel-wise baselines. Finally, we tested our approach on resource-limited hardware to assess computational and memory limitations.
