Prior distributions for objective Bayesian analysis
We provide a review of prior distributions for objective Bayesian analysis. We start by examining some foundational issues and then organize our exposition into priors for: i) estimation or prediction; ii) model selection; iii) high-dimensional models. With regard to i), we present some basic notions, and then move to more recent contributions on discrete parameter spaces, hierarchical models, nonparametric models, and penalizing-complexity priors. Point ii) is the focus of this paper: it discusses principles for objective Bayesian model comparison, and singles out some major concepts for building priors, which are subsequently illustrated in some detail for the classic problem of variable selection in normal linear models. We also present some recent contributions in the area of objective priors on model space. With regard to point iii), we only provide a short summary of some default priors for high-dimensional models, a rapidly growing area of research.
Bayesian Model Comparison in Genetic Association Analysis: Linear Mixed Modeling and SNP Set Testing
We consider the problems of hypothesis testing and model comparison under a
flexible Bayesian linear regression model whose formulation is closely
connected with the linear mixed effect model and the parametric models for SNP
set analysis in genetic association studies. We derive a class of analytic
approximate Bayes factors and illustrate their connections with a variety of
frequentist test statistics, including the Wald statistic and the variance
component score statistic. Taking advantage of Bayesian model averaging and
hierarchical modeling, we demonstrate some distinct advantages and
flexibilities in the approaches utilizing the derived Bayes factors in the
context of genetic association studies. We demonstrate our proposed methods
using real or simulated numerical examples in applications of single SNP
association testing, multi-locus fine-mapping, and SNP set association testing.
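The link between a Wald statistic and an analytic approximate Bayes factor can be sketched with a standard asymptotic approximation (a Wakefield-style calculation for a single effect; this is an illustrative sketch built from the same Wald ingredients, not the exact factors derived in the paper). With a normal prior N(0, W) on the effect size, the Bayes factor depends on the data only through the estimate and its standard error:

```python
import math

def log_abf01(beta_hat, se, W):
    """Log of an asymptotic Bayes factor for H0: beta = 0 vs H1: beta ~ N(0, W).

    Uses only the point estimate and its standard error (the same
    ingredients as a Wald test); negative values favor association (H1).
    """
    V = se ** 2                    # sampling variance of the estimate
    z2 = (beta_hat / se) ** 2      # squared Wald statistic
    # BF01 = sqrt((V + W) / V) * exp(-(z^2 / 2) * W / (V + W))
    return 0.5 * math.log((V + W) / V) - 0.5 * z2 * W / (V + W)

# A strong signal (|z| = 10) gives overwhelming evidence against H0,
# while a weak one (|z| = 0.2) mildly favors the null.
strong = log_abf01(0.5, 0.05, W=0.04)
weak = log_abf01(0.01, 0.05, W=0.04)
print(strong, weak)
```

The prior variance `W` plays the role of the hyper-parameter that Bayesian model averaging can mix over; `W = 0.04` above is an arbitrary illustrative choice.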
Criteria for Bayesian model choice with application to variable selection
In objective Bayesian model selection, no single criterion has emerged as
dominant in defining objective prior distributions. Indeed, many criteria have
been separately proposed and used to justify differing prior choices. We
first formalize the most general and compelling of the various criteria that
have been suggested, together with a new criterion. We then illustrate the
potential of these criteria in determining objective model selection priors by
considering their application to the problem of variable selection in normal
linear models. This results in a new model selection objective prior with a
number of compelling properties. Comment: Published in the Annals of
Statistics (http://www.imstat.org/aos/) at
http://dx.doi.org/10.1214/12-AOS1013 by the Institute of Mathematical
Statistics (http://www.imstat.org).
Objective Bayes testing of Poisson versus inflated Poisson models
The Poisson distribution is often used as a standard model for count data.
Quite often, however, such data sets are not well fit by a Poisson model
because they have more zeros than are compatible with this model. For these
situations, a zero-inflated Poisson (ZIP) distribution is often proposed. This
article addresses testing a Poisson versus a ZIP model, using Bayesian
methodology based on suitable objective priors. Specific choices of objective
priors are justified and their properties investigated. The methodology is
extended to include covariates in regression models. Several applications are
given. Comment: Published in the IMS Collections
(http://www.imstat.org/publications/imscollections.htm) at
http://dx.doi.org/10.1214/074921708000000093 by the Institute of
Mathematical Statistics (http://www.imstat.org).
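The Poisson-versus-ZIP comparison can be sketched numerically with a Bayes factor computed under simple proper priors — an Exp(1) prior on the rate and a Uniform(0,1) prior on the inflation probability are illustrative stand-ins, not the objective priors the article develops:

```python
import math

def log_sum_exp(vals):
    # numerically stable log of a sum of exponentials
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

def poisson_loglik(lam, data):
    # i.i.d. Poisson(lam) log-likelihood
    return sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in data)

def zip_loglik(lam, p, data):
    # zero-inflated Poisson: an extra zero with probability p
    ll = 0.0
    for x in data:
        if x == 0:
            ll += math.log(p + (1.0 - p) * math.exp(-lam))
        else:
            ll += math.log(1.0 - p) + x * math.log(lam) - lam - math.lgamma(x + 1)
    return ll

def log_marginal_poisson(data, n_grid=400, lam_max=20.0):
    # marginal likelihood under an Exp(1) prior on lambda (midpoint rule)
    h = lam_max / n_grid
    terms = [poisson_loglik(h * (i + 0.5), data) - h * (i + 0.5) + math.log(h)
             for i in range(n_grid)]
    return log_sum_exp(terms)

def log_marginal_zip(data, n_grid=200, lam_max=20.0):
    # marginal likelihood with Exp(1) on lambda and Uniform(0,1) on p
    hl, hp = lam_max / n_grid, 1.0 / n_grid
    terms = []
    for i in range(n_grid):
        lam = hl * (i + 0.5)
        for j in range(n_grid):
            p = hp * (j + 0.5)
            terms.append(zip_loglik(lam, p, data) - lam
                         + math.log(hl) + math.log(hp))
    return log_sum_exp(terms)

data = [0] * 15 + [3, 4, 2, 5, 3]   # counts with an excess of zeros
log_bf01 = log_marginal_poisson(data) - log_marginal_zip(data)
print(log_bf01)   # negative: the data favor the ZIP model
```

With many zeros alongside moderately large counts, no single Poisson rate fits both features, so the marginal likelihood under the ZIP model dominates and the Bayes factor rejects the plain Poisson model.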
Objective Bayes Factors for Gaussian Directed Acyclic Graphical Models
We propose an objective Bayesian method for the comparison of all Gaussian directed acyclic graphical models defined on a given set of variables. The method, which is based on the notion of fractional Bayes factor, requires a single default (typically improper) prior on the space of unconstrained covariance matrices, together with a prior sample size hyper-parameter, which can be set to its minimal value. We show that our approach produces genuine Bayes factors. The implied prior on the concentration matrix of any complete graph is a data-dependent Wishart distribution, and this in turn guarantees that Markov equivalent graphs are scored with the same marginal likelihood. We specialize our results to the smaller class of Gaussian decomposable undirected graphical models, and show that in this case they coincide with those recently obtained using limiting versions of hyper-inverse Wishart distributions as priors on the graph-constrained covariance matrices.
Keywords: Bayes factor; Bayesian model selection; Directed acyclic graph; Exponential family; Fractional Bayes factor; Gaussian graphical model; Objective Bayes; Standard conjugate prior; Stochastic search; Structural learning.
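The fractional Bayes factor idea — spend a fraction b of the likelihood to turn an improper default prior into a usable one, and score models with the remaining fraction — can be sketched on a toy normal-mean comparison (flat improper prior on the mean, known unit variance). This is an illustrative O'Hagan-style calculation, not the graphical-model computation of the paper:

```python
import math

def log_sum_exp(vals):
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

def log_lik_normal(mu, data):
    # i.i.d. N(mu, 1) log-likelihood
    return sum(-0.5 * math.log(2 * math.pi) - 0.5 * (x - mu) ** 2
               for x in data)

def log_frac_marginal(data, b, n_grid=4000, lo=-10.0, hi=10.0):
    # int L(mu)^b d(mu) under a flat improper prior, midpoint rule
    h = (hi - lo) / n_grid
    terms = [b * log_lik_normal(lo + h * (i + 0.5), data) + math.log(h)
             for i in range(n_grid)]
    return log_sum_exp(terms)

def log_fbf10(data, b):
    # fractional Bayes factor of H1 (mu free) vs H0 (mu = 0):
    # the arbitrary constant in the improper prior cancels in m1(y)/m1_b(y)
    log_m1 = log_frac_marginal(data, 1.0)
    log_m1b = log_frac_marginal(data, b)
    log_l0 = log_lik_normal(0.0, data)          # H0 has no free parameter
    return (log_m1 - log_m1b) - (1.0 - b) * log_l0

signal = [1.8, 2.2, 2.5, 1.9, 2.1]   # mean clearly away from 0
null = [0.1, -0.2, 0.05, -0.1, 0.15]  # mean consistent with 0
b = 1.0 / len(signal)                 # minimal training fraction, b = 1/n
log_fbf_signal = log_fbf10(signal, b)
log_fbf_null = log_fbf10(null, b)
print(log_fbf_signal, log_fbf_null)
```

Setting b to its minimal value 1/n mirrors the paper's choice of the smallest prior sample size hyper-parameter: just enough of the likelihood is used to make the fractional marginals finite, leaving the rest for model comparison.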