Random Effects Models with Deep Neural Network Basis Functions: Methodology and Computation
Deep neural networks (DNNs) are a powerful tool for functional approximation. We describe flexible versions of generalized linear and generalized linear mixed models incorporating basis functions formed by a deep neural network. Neural networks with random effects appear to be little explored in the literature, perhaps because of the computational challenge of incorporating subject-specific parameters into already complex models. Efficient computational methods for Bayesian inference are developed based on Gaussian variational approximation methods. A parsimonious but flexible factor parametrization of the covariance matrix is used in the Gaussian variational approximation. We implement natural gradient methods for the optimization, exploiting the factor structure of the variational covariance matrix to perform fast matrix-vector multiplications in iterative conjugate gradient linear solvers in natural gradient computations. The method can be implemented in high dimensions, and the use of the natural gradient allows faster and more stable convergence of the variational algorithm. In the case of random effects, we compute unbiased estimates of the gradient of the lower bound in the model with the random effects integrated out by making use of Fisher's identity. The proposed methods are illustrated in several examples for DNN random effects models and high-dimensional logistic regression with sparse signal shrinkage priors.
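The key computational point in the abstract above is that a factor parametrization of the variational covariance, Sigma = B B^T + D^2 with B low rank and D diagonal, lets matrix-vector products (and hence conjugate gradient solves) run in O(pk) rather than O(p^2) time. A minimal sketch of that idea, with illustrative names (p, k, B, d) that are not taken from the paper:

```python
import numpy as np

# Hedged sketch of a factor-parametrized covariance matrix,
#   Sigma = B B^T + D^2,
# where B is p x k (low rank) and D is diagonal. Products with Sigma
# never form the p x p matrix, so each matvec costs O(pk) -- the
# property exploited inside iterative conjugate-gradient solvers.
# All dimensions and variable names here are illustrative assumptions.

rng = np.random.default_rng(0)
p, k = 1000, 5                       # p parameters, rank-k factor
B = rng.standard_normal((p, k))
d = rng.uniform(0.5, 1.5, size=p)    # diagonal entries of D

def sigma_matvec(v):
    """Compute (B B^T + D^2) @ v without forming the p x p matrix."""
    return B @ (B.T @ v) + (d**2) * v

# Sanity check against the dense computation on one random vector.
v = rng.standard_normal(p)
dense = (B @ B.T + np.diag(d**2)) @ v
assert np.allclose(sigma_matvec(v), dense)
```

The same structured matvec can be passed to an iterative solver (e.g. a conjugate gradient routine) as a linear operator, which is what makes the natural gradient computations feasible in high dimensions.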
Variational Inference in Nonconjugate Models
Mean-field variational methods are widely used for approximate posterior inference in many probabilistic models. In a typical application, mean-field methods approximately compute the posterior with a coordinate-ascent optimization algorithm. When the model is conditionally conjugate, the coordinate updates are easily derived and in closed form. However, many models of interest---like the correlated topic model and Bayesian logistic regression---are nonconjugate. In these models, mean-field methods cannot be directly applied, and practitioners have had to develop variational algorithms on a case-by-case basis. In this paper, we develop two generic methods for nonconjugate models: Laplace variational inference and delta method variational inference. Our methods have several advantages: they yield easily derived variational algorithms for a wide class of nonconjugate models; they extend and unify some of the existing algorithms that have been derived for specific models; and they work well on real-world datasets. We study our methods on the correlated topic model, Bayesian logistic regression, and hierarchical Bayesian logistic regression.
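The Laplace idea underlying Laplace variational inference can be illustrated on one of the abstract's own examples, Bayesian logistic regression: approximate the posterior by a Gaussian centred at the MAP estimate, with covariance given by the inverse Hessian of the negative log posterior. A minimal sketch under a standard-normal prior; the data, prior choice, and variable names are illustrative assumptions, not details from the paper:

```python
import numpy as np

# Hedged sketch: Laplace approximation for Bayesian logistic regression
# with a N(0, I) prior on the weights. Newton's method finds the MAP;
# the Gaussian approximation uses the inverse Hessian as covariance.
# Synthetic data and all names here are illustrative assumptions.

rng = np.random.default_rng(1)
n, p = 200, 3
X = rng.standard_normal((n, p))
w_true = np.array([1.0, -2.0, 0.5])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

w = np.zeros(p)
for _ in range(50):                      # Newton iterations to the MAP
    mu = 1 / (1 + np.exp(-X @ w))        # predicted probabilities
    grad = X.T @ (mu - y) + w            # gradient of -log posterior
    H = X.T @ (X * (mu * (1 - mu))[:, None]) + np.eye(p)
    w -= np.linalg.solve(H, grad)

# Gaussian approximation q(w) = N(w_map, H^{-1}), with H at the MAP.
mu = 1 / (1 + np.exp(-X @ w))
H = X.T @ (X * (mu * (1 - mu))[:, None]) + np.eye(p)
Sigma = np.linalg.inv(H)
```

Laplace variational inference applies this Gaussian approximation within a coordinate-ascent scheme rather than once globally, but the building block is the same second-order expansion shown here.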