
    Bayesian Nonparametric Calibration and Combination of Predictive Distributions

    We introduce a Bayesian approach to predictive density calibration and combination that accounts for parameter uncertainty and model set incompleteness through the use of random calibration functionals and random combination weights. Building on the work of Ranjan, R. and Gneiting, T. (2010) and Gneiting, T. and Ranjan, R. (2013), we use infinite beta mixtures for the calibration. The proposed Bayesian nonparametric approach takes advantage of the flexibility of Dirichlet process mixtures to achieve any continuous deformation of linearly combined predictive distributions. The inference procedure is based on Gibbs sampling and accounts for uncertainty in the number of mixture components, the mixture weights, and the calibration parameters. Weak posterior consistency of the Bayesian nonparametric calibration is established under suitable conditions on the unknown true density. We study the methodology in simulation examples with fat tails and multimodal densities, and apply it to density forecasts of daily S&P returns and daily maximum wind speed at the Frankfurt airport.

    Comment: arXiv admin note: text overlap with arXiv:1305.2026 by other authors
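    The calibration mechanism described here can be sketched with a finite beta mixture standing in for the infinite one (a minimal illustration, not the authors' Gibbs sampler; the forecaster CDFs, combination weights, and beta parameters below are invented for the example):

```python
import numpy as np
from scipy.stats import norm, beta

# Two predictive CDFs to be combined (hypothetical forecasters).
forecasters = [norm(loc=-0.5, scale=1.0), norm(loc=0.8, scale=1.5)]
w = np.array([0.6, 0.4])    # combination weights, summing to 1

# Finite beta mixture standing in for the infinite mixture:
#   G(y) = sum_k pi_k * BetaCDF( sum_j w_j F_j(y); a_k, b_k )
pi = np.array([0.7, 0.3])   # mixture weights
a = np.array([2.0, 0.5])    # beta shape parameters
b = np.array([2.0, 0.5])

def calibrated_cdf(y):
    """Deform the linear pool of forecaster CDFs through a beta mixture."""
    linear_pool = sum(wj * F.cdf(y) for wj, F in zip(w, forecasters))
    return sum(pk * beta.cdf(linear_pool, ak, bk)
               for pk, ak, bk in zip(pi, a, b))

ys = np.linspace(-5, 5, 201)
G = calibrated_cdf(ys)      # a valid, recalibrated predictive CDF
```

    Because a beta CDF is a continuous nondecreasing map of [0, 1] onto itself, any such mixture deforms the linear pool while keeping it a proper distribution function; the Dirichlet process version lets the number of components grow with the data.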

    Local Exchangeability

    Exchangeability, in which the distribution of an infinite sequence is invariant to reorderings of its elements, implies the existence of a simple conditional independence structure that may be leveraged in the design of probabilistic models, efficient inference algorithms, and randomization-based testing procedures. In practice, however, this assumption is too strong an idealization: the distribution typically fails to be exactly invariant to permutations, and de Finetti's representation theorem does not apply. There is thus a need for a distributional assumption that is weak enough to hold in practice, yet strong enough to guarantee a useful underlying representation. We introduce such a relaxed notion, local exchangeability, in which swapping data associated with nearby covariates causes only a bounded change in the distribution. We prove that locally exchangeable processes correspond to independent observations from an underlying measure-valued stochastic process. We thereby show that de Finetti's theorem is robust to perturbation, providing further justification for the Bayesian modelling approach. Using this probabilistic result, we develop three novel statistical procedures: (1) estimating the underlying process via local empirical measures, (2) testing via local randomization, and (3) estimating the canonical premetric of local exchangeability. These procedures extend the applicability of previous exchangeability-based methods without sacrificing rigorous statistical guarantees. The paper concludes with examples of popular statistical models that exhibit local exchangeability.
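    Procedure (1), estimation via local empirical measures, can be caricatured with a kernel-weighted empirical CDF (an illustrative stand-in only, assuming a Gaussian kernel and synthetic data; the paper's actual construction may differ):

```python
import numpy as np

def local_ecdf(x0, xs, ys, bandwidth=0.5):
    """Kernel-weighted empirical CDF of the responses ys whose covariates
    xs lie near x0 -- a stand-in for a local empirical measure.  The
    Gaussian kernel and bandwidth are illustrative choices."""
    w = np.exp(-0.5 * ((xs - x0) / bandwidth) ** 2)
    w = w / w.sum()                      # normalize weights to sum to 1
    def F(t):
        return np.sum(w * (ys <= t))     # weighted fraction of ys below t
    return F

# Synthetic locally exchangeable data: the distribution drifts smoothly
# with the covariate, so nearby observations are nearly interchangeable.
rng = np.random.default_rng(0)
xs = rng.uniform(0, 1, 500)
ys = rng.normal(loc=xs, scale=0.2)
F = local_ecdf(0.5, xs, ys)              # estimate of the law near x = 0.5
```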

    Generalized Dirichlet distributions on the ball and moments

    The geometry of unit N-dimensional ℓ_p balls has been intensively investigated in the past decades. A particular topic of interest has been the study of the asymptotics of their projections. Apart from their intrinsic interest, such questions have applications in several probabilistic and geometric contexts (Barthe et al. 2005). In this paper, our aim is to revisit some known results of this flavour from a new point of view. Roughly speaking, we endow the ball with a Dirichlet-type distribution that generalizes the uniform one, and follow the method developed in Skibinsky (1967) and Chang et al. (1993) in the context of the randomized moment space. The main idea is to build a suitable coordinate change involving independent random variables. Moreover, we shed light on a nice connection between the randomized balls and the randomized moment space.

    Comment: Last section modified. Article accepted by ALEA
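    The coordinate-change idea can be illustrated with the independent-variable representation from Barthe et al. (2005), which the abstract's Dirichlet-type setting generalizes (a sketch; the function name and parameter choices are ours):

```python
import numpy as np

def uniform_lp_ball(n, p, size, seed=None):
    """Uniform sample on the n-dimensional l_p ball via the
    independent-coordinate representation of Barthe et al. (2005):
        x = g / (||g||_p^p + E)^(1/p),
    where the g_i have density proportional to exp(-|t|^p) and E is an
    independent Exp(1) variable.  |g_i|^p is Gamma(1/p, 1) distributed,
    which gives the sampling recipe below."""
    rng = np.random.default_rng(seed)
    g = rng.gamma(1.0 / p, 1.0, size=(size, n)) ** (1.0 / p)
    g *= rng.choice([-1.0, 1.0], size=(size, n))     # random signs
    e = rng.exponential(1.0, size=(size, 1))
    return g / (np.sum(np.abs(g) ** p, axis=1, keepdims=True) + e) ** (1.0 / p)

pts = uniform_lp_ball(n=3, p=1.5, size=1000)
```

    Every sampled point satisfies ||x||_p^p = ||g||_p^p / (||g||_p^p + E) < 1 by construction, so the output always lies strictly inside the ball.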

    Scalable Bayesian nonparametric regression via a Plackett-Luce model for conditional ranks

    We present a novel Bayesian nonparametric regression model for covariates X and a continuous, real-valued response variable Y. The model is parametrized in terms of marginal distributions for Y and X and a regression function that tunes the stochastic ordering of the conditional distributions F(y|x). By adopting an approximate composite likelihood approach, we show that the resulting posterior inference can be decoupled across the separate components of the model. This procedure scales to very large datasets and allows standard, existing software for Bayesian nonparametric density estimation and Plackett-Luce ranking estimation to be applied directly. As an illustration, we apply our approach to a US Census dataset with over 1,300,000 data points and more than 100 covariates.
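    For reference, the standard Plackett-Luce ranking likelihood that this model builds on fits in a few lines (a standalone sketch of the textbook likelihood, not the paper's composite-likelihood procedure; the item scores are invented):

```python
import numpy as np

def plackett_luce_logprob(ranking, scores):
    """Log-probability of an observed ranking (best first) under a
    Plackett-Luce model with positive item scores: each position is a
    softmax-style choice among the items not yet placed."""
    s = np.asarray(scores, dtype=float)[list(ranking)]
    # log prod_i  s_i / (s_i + s_{i+1} + ...)  via reversed cumulative sums
    tails = np.cumsum(s[::-1])[::-1]
    return float(np.sum(np.log(s) - np.log(tails)))

# Two items with equal scores: either ordering has probability 1/2.
lp = plackett_luce_logprob([0, 1], [1.0, 1.0])
```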

    The Bayesian sampler: generic Bayesian inference causes incoherence in human probability judgments

    Human probability judgments are systematically biased, in apparent tension with Bayesian models of cognition. But perhaps the brain does not represent probabilities explicitly, and instead approximates probabilistic calculations through a process of sampling, as used in computational probabilistic models in statistics. Naïve probability estimates can be obtained by calculating the relative frequency of an event within a sample, but these estimates tend to be extreme when the sample size is small. We propose instead that people use a generic prior to improve the accuracy of their probability estimates based on samples, and we call this model the Bayesian sampler. The Bayesian sampler trades off the coherence of probabilistic judgments for improved accuracy, and provides a single framework for explaining phenomena associated with diverse biases and heuristics such as conservatism and the conjunction fallacy. The approach turns out to provide a rational reinterpretation of “noise” in an important recent model of probability judgment, the probability theory plus noise model (Costello & Watts, 2014, 2016a, 2017, 2019; Costello, Watts, & Fisher, 2018), making equivalent average predictions for simple events, conjunctions, and disjunctions. The Bayesian sampler does, however, make distinct predictions for conditional probabilities and for distributions of probability estimates. We show in two new experiments that this model better captures mean judgments both qualitatively and quantitatively; which model best fits individual distributions of responses depends on the assumed size of the cognitive sample.
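    The core estimator described here, a sample frequency regularized by a generic prior, can be sketched as a posterior mean under a symmetric Beta prior (a minimal sketch consistent with the description; the parameter values are illustrative, not fitted):

```python
def bayesian_sampler_estimate(k, n, beta=1.0):
    """Probability estimate from k successes in n mental samples,
    regularized by a symmetric Beta(beta, beta) prior over the true
    probability: the posterior mean (k + beta) / (n + 2 * beta).
    Raw relative frequency k / n is recovered as beta -> 0."""
    return (k + beta) / (n + 2 * beta)

# With few samples the estimate is pulled toward 1/2, so extreme
# observed frequencies are moderated (conservatism): 5/5 -> 6/7, not 1.
moderated = bayesian_sampler_estimate(k=5, n=5, beta=1.0)
```

    Because every judgment is shrunk toward 1/2 by a different amount depending on its sample, the resulting estimates are individually more accurate yet jointly incoherent, which is the trade-off the title refers to.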

    Four moments theorems on Markov chaos

    We obtain quantitative Four Moments Theorems establishing convergence of the laws of elements of a Markov chaos to a Pearson distribution, where the only assumption we make on the Pearson distribution is that it admits four moments. While in general one cannot use moments to establish convergence to a heavy-tailed distribution, we provide a context in which the first four moments suffice. These results are obtained by proving a general carré du champ bound on the distance between laws of random variables in the domain of a Markov diffusion generator and invariant measures of diffusions. For elements of a Markov chaos, this bound can be reduced to just the first four moments.

    First author draft
