
    Gaussian process quantile regression using expectation propagation

    Direct quantile regression involves estimating a given quantile of a response variable as a function of input variables. We present a new framework for direct quantile regression in which a Gaussian process model is learned by minimising the expected tilted loss function. The integration required in learning is not analytically tractable, so to speed up learning we employ the Expectation Propagation algorithm. We describe how this work relates to other quantile regression methods and apply the method to both synthetic and real data sets. The method is shown to be competitive with state-of-the-art methods while allowing the full Gaussian process probabilistic framework to be leveraged.
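    A minimal Python sketch of the tilted (pinball) loss and a Monte Carlo estimate of its expectation under a Gaussian predictive marginal, the kind of quantity the abstract says is minimised; the function names and the Monte Carlo shortcut are illustrative and not taken from the paper, which uses Expectation Propagation for the intractable integral.

    import numpy as np

    def tilted_loss(tau, y, f):
        # Pinball / tilted loss at quantile level tau:
        # tau*(y - f) if y >= f, (tau - 1)*(y - f) otherwise
        e = y - f
        return np.maximum(tau * e, (tau - 1.0) * e)

    def expected_tilted_loss(tau, y, mu, sigma, n_samples=10000, seed=0):
        # Monte Carlo estimate of E[tilted_loss(tau, y, f)] with f ~ N(mu, sigma^2);
        # the integral has no closed form, hence the paper's use of EP
        rng = np.random.default_rng(seed)
        f = rng.normal(mu, sigma, size=n_samples)
        return tilted_loss(tau, y, f).mean()

    # Example: expected 0.9-quantile loss for one observation under N(1.0, 0.5^2)
    print(expected_tilted_loss(0.9, y=1.3, mu=1.0, sigma=0.5))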

    Kernel conditional quantile estimation via reduction revisited

    Quantile regression refers to the process of estimating the quantiles of a conditional distribution and has many important applications within econometrics and data mining, among other domains. In this paper, we show how to estimate these conditional quantile functions within a Bayes risk minimization framework using a Gaussian process prior. The resulting non-parametric probabilistic model is easy to implement and allows non-crossing quantile functions to be enforced. Moreover, it can directly be used in combination with tools and extensions of standard Gaussian Processes such as principled hyperparameter estimation, sparsification, and quantile regression with input-dependent noise rates. No existing approach enjoys all of these desirable properties. Experiments on benchmark datasets show that our method is competitive with state-of-the-art approaches.
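    As a rough illustration of kernel-based conditional quantile estimation (not the paper's Bayes-risk GP formulation), the Python sketch below minimises a regularised pinball loss over kernel-expansion weights and then applies a crude post-hoc sort so the predicted quantile curves do not cross; all names, hyperparameters, and the toy data are invented, and the sort stands in for the paper's built-in non-crossing constraint.

    import numpy as np
    from scipy.optimize import minimize

    def rbf_kernel(a, b, lengthscale=1.0):
        # Squared-exponential kernel on 1-D inputs
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

    def fit_kernel_quantile(x, y, tau, lam=1e-2):
        # f(x) = K(x, x_train) @ alpha, fitted by minimising pinball loss + RKHS penalty
        K = rbf_kernel(x, x)
        def objective(alpha):
            e = y - K @ alpha
            pinball = np.maximum(tau * e, (tau - 1.0) * e).sum()
            return pinball + lam * alpha @ K @ alpha
        return minimize(objective, np.zeros(len(x)), method="L-BFGS-B").x

    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 5, 60))
    y = np.sin(x) + rng.normal(0, 0.3 + 0.1 * x)       # input-dependent noise
    K = rbf_kernel(x, x)
    curves = np.stack([K @ fit_kernel_quantile(x, y, t) for t in (0.1, 0.5, 0.9)])
    curves = np.sort(curves, axis=0)                   # crude non-crossing fix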

    Disease Progression Modeling and Prediction through Random Effect Gaussian Processes and Time Transformation

    The development of statistical approaches for the joint modelling of the temporal changes of imaging, biochemical, and clinical biomarkers is of paramount importance for improving the understanding of neurodegenerative disorders, and for providing a reference for the prediction and quantification of the pathology in unseen individuals. Nonetheless, the use of disease progression models for probabilistic predictions still requires investigation, for example for accounting for missing observations in clinical data, and for accurate uncertainty quantification. We tackle this problem by proposing a novel Gaussian process-based method for the joint modeling of imaging and clinical biomarker progressions from time series of individual observations. The model is formulated to account for individual random effects and time reparameterization, allowing non-parametric estimates of the biomarker evolution, as well as high flexibility in specifying correlation structure, and time transformation models. Thanks to the Bayesian formulation, the model naturally accounts for missing data, and allows for uncertainty quantification in the estimate of evolutions, as well as for probabilistic prediction of disease staging in unseen patients. The experimental results show that the proposed model provides a biologically plausible description of the evolution of Alzheimer's pathology across the whole disease time-span as well as remarkable predictive performance when tested on a large clinical cohort with missing observations.
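    A generative Python sketch of the model structure described above: a population-level trajectory drawn from a GP, an individual time shift acting as a disease-stage reparameterisation, and a subject-specific random intercept. It only simulates data from such a model; it does not reproduce the paper's Bayesian inference, and all variance parameters are invented.

    import numpy as np

    rng = np.random.default_rng(1)

    def rbf_cov(t, lengthscale=2.0, variance=1.0):
        return variance * np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / lengthscale ** 2)

    # Population-level biomarker trajectory: one draw from a GP prior on a dense grid
    t_grid = np.linspace(0, 20, 200)
    f_pop = rng.multivariate_normal(np.zeros_like(t_grid),
                                    rbf_cov(t_grid) + 1e-8 * np.eye(len(t_grid)))

    def observe_subject(n_visits=5):
        # Each subject sees the population curve through an individual time shift
        # (time reparameterisation) plus a random intercept and observation noise
        shift = rng.normal(0.0, 3.0)
        intercept = rng.normal(0.0, 0.2)
        t_obs = np.sort(rng.uniform(5, 15, n_visits))
        f_obs = np.interp(t_obs + shift, t_grid, f_pop)
        return t_obs, f_obs + intercept + rng.normal(0.0, 0.1, n_visits)

    subjects = [observe_subject() for _ in range(10)]   # 10 simulated individuals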

    High-dimensional Bayesian optimization with intrinsically low-dimensional response surfaces

    Bayesian optimization is a powerful technique for the optimization of expensive black-box functions. It is used in a wide range of applications such as drug and material design and the training of machine learning models, e.g. large deep networks. We propose to extend this approach to high-dimensional settings, that is, where the number of parameters to be optimized exceeds 10--20. In this thesis, we scale Bayesian optimization by exploiting different types of projections and the intrinsic low-dimensionality assumption of the objective function. We reformulate the problem in a low-dimensional subspace, where we learn a response surface and maximize an acquisition function. Contributions include i) a probabilistic model for axis-aligned projections, such as the quantile-Gaussian process, and ii) a probabilistic model for learning a feature space by means of manifold Gaussian processes. In the latter contribution, we propose to learn a low-dimensional feature space jointly with (a) the response surface and (b) a reconstruction mapping. Finally, we present empirical results against well-known baselines in high-dimensional Bayesian optimization and provide possible directions for future research in this field.
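    The core idea (optimise in a low-dimensional subspace, map candidates back to the ambient space, and fit the surrogate in the subspace) can be sketched in Python with a random linear embedding, a scikit-learn GP surrogate, and an expected-improvement acquisition. This is a generic illustration of projection-based Bayesian optimisation, not the quantile-GP or manifold-GP models contributed by the thesis, and the toy objective and dimensions are invented.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(2)
    D, d = 50, 2                        # ambient and assumed intrinsic dimensionality
    A = rng.normal(size=(D, d))         # random linear embedding z -> x = A z

    def objective(x):
        # Toy black-box with low effective dimensionality (only two coordinates matter)
        return -np.sum((x[:2] - 0.5) ** 2)

    # Initial design and GP response surface in the low-dimensional subspace
    Z = rng.uniform(-1, 1, size=(10, d))
    y = np.array([objective(np.clip(A @ z, -1, 1)) for z in Z])
    gp = GaussianProcessRegressor(normalize_y=True).fit(Z, y)

    def expected_improvement(z_cand, best):
        mu, sd = gp.predict(z_cand, return_std=True)
        imp = mu - best
        u = imp / np.maximum(sd, 1e-12)
        return imp * norm.cdf(u) + sd * norm.pdf(u)

    # Maximise the acquisition over random candidates in the subspace
    Z_cand = rng.uniform(-1, 1, size=(500, d))
    z_next = Z_cand[np.argmax(expected_improvement(Z_cand, y.max()))]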

    Uncertainty and sensitivity analysis of functional risk curves based on Gaussian processes

    A functional risk curve gives the probability of an undesirable event as a function of the value of a critical parameter of a considered physical system. In many practical situations, this curve is built using phenomenological numerical models which simulate complex physical phenomena. To avoid computationally expensive numerical models, we propose to use Gaussian process regression to build functional risk curves. An algorithm is given to provide confidence bounds due to this approximation. Two methods for the global sensitivity analysis of the model's random input parameters with respect to the functional risk curve are also studied. In particular, the PLI sensitivity indices allow one to understand the effect of misjudging the input parameters' probability density functions.
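    A crude Python sketch of the overall idea: replace the expensive simulator by a GP surrogate (scikit-learn here), estimate the exceedance probability as a function of the critical parameter by Monte Carlo over the random input, and derive rough bounds from the surrogate's predictive standard deviation. The toy simulator, threshold, and bound construction are assumptions for illustration and do not reproduce the paper's algorithm or its PLI indices.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(3)

    def simulator(s, x):
        # Stand-in for the expensive physical model: output depends on the
        # critical parameter s and a random input x
        return s * (1.0 + 0.5 * x) + 0.1 * np.sin(5 * s)

    # Small design of experiments on (s, x), then a GP surrogate
    S, X = rng.uniform(0, 2, 80), rng.normal(0, 1, 80)
    gp = GaussianProcessRegressor(normalize_y=True).fit(np.column_stack([S, X]),
                                                        simulator(S, X))

    threshold = 1.5
    s_grid = np.linspace(0, 2, 50)
    x_mc = rng.normal(0, 1, 2000)        # Monte Carlo sample of the random input

    def risk_curve(offset=0.0):
        # P(output > threshold) as a function of s, using mean + offset*std of the surrogate
        probs = []
        for s in s_grid:
            pts = np.column_stack([np.full_like(x_mc, s), x_mc])
            mu, sd = gp.predict(pts, return_std=True)
            probs.append(((mu + offset * sd) > threshold).mean())
        return np.array(probs)

    curve, lower, upper = risk_curve(0.0), risk_curve(-1.96), risk_curve(1.96)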