
Conditionally conjugate mean-field variational Bayes for logistic models

Abstract

Variational Bayes (VB) is a common strategy for approximate Bayesian inference, but simple methods are only available for specific classes of models including, in particular, representations having conditionally conjugate constructions within an exponential family. Models with logit components are an apparently notable exception to this class, due to the absence of conjugacy between the logistic likelihood and the Gaussian priors for the coefficients in the linear predictor. To facilitate approximate inference within this widely used class of models, Jaakkola and Jordan (2000) proposed a simple variational approach which relies on a family of tangent quadratic lower bounds of logistic log-likelihoods, thus restoring conjugacy between these approximate bounds and the Gaussian priors. This strategy is still implemented successfully, but fewer attempts have been made to formally understand the reasons underlying its excellent performance. To fill this key gap, we provide a formal connection between the above bound and a recent Pólya-gamma data augmentation for logistic regression. This result places the computational methods associated with the aforementioned bounds within the framework of variational inference for conditionally conjugate exponential family models, thereby allowing recent advances for this class to be inherited also by the methods relying on Jaakkola and Jordan (2000).
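The tangent quadratic bound referred to above is the well-known Jaakkola and Jordan (2000) inequality log σ(x) ≥ log σ(ξ) + (x − ξ)/2 − λ(ξ)(x² − ξ²), with λ(ξ) = tanh(ξ/2)/(4ξ), which is tight at x = ±ξ; notably, λ(ξ) equals half the mean of a Pólya-gamma PG(1, ξ) variable, which is the kind of link the abstract alludes to. As a minimal numerical sketch (the function names are illustrative, not from the paper):

```python
import math

def log_sigmoid(x):
    # Numerically stable log σ(x) = min(x, 0) - log(1 + exp(-|x|)).
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

def lam(xi):
    # λ(ξ) = tanh(ξ/2) / (4ξ), with the ξ → 0 limit equal to 1/8.
    return math.tanh(xi / 2.0) / (4.0 * xi) if xi != 0.0 else 0.125

def jj_bound(x, xi):
    # Quadratic lower bound of Jaakkola & Jordan (2000); tight at x = ±ξ.
    return log_sigmoid(xi) + (x - xi) / 2.0 - lam(xi) * (x * x - xi * xi)

# The bound never exceeds the true logistic log-likelihood term,
# and touches it exactly at the tangency points x = ξ and x = -ξ.
for xi in (0.5, 1.0, 3.0):
    for x in (-4.0, -1.0, 0.0, 0.7, 2.5):
        assert jj_bound(x, xi) <= log_sigmoid(x) + 1e-12
    assert abs(jj_bound(xi, xi) - log_sigmoid(xi)) < 1e-12
    assert abs(jj_bound(-xi, xi) - log_sigmoid(-xi)) < 1e-12
```

Because the bound is quadratic in x, substituting it for each logistic log-likelihood term restores Gaussian conjugacy with the prior on the coefficients, which is what makes the resulting VB updates available in closed form.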
