282 research outputs found

    Fully simplified multivariate normal updates in non-conjugate variational message passing

    Fully simplified expressions for Multivariate Normal updates in non-conjugate variational message passing approximate inference schemes are obtained. The simplicity of these expressions means that the updates can be achieved very efficiently. Since the Multivariate Normal family is the most common for approximating the joint posterior density function of a continuous parameter vector, these fully simplified updates are of great practical benefit. © 2014 Matt P. Wand

    Penalized wavelets: Embedding wavelets into semiparametric regression

    We introduce the concept of penalized wavelets to facilitate seamless embedding of wavelets into semiparametric regression models. In particular, we show that penalized wavelets are analogous to penalized splines; the latter being the established approach to function estimation in semiparametric regression. They differ only in the type of penalization that is appropriate. This fact is not borne out by the existing wavelet literature, where the regression modelling and fitting issues are overshadowed by computational issues such as efficiency gains afforded by the Discrete Wavelet Transform and partially obscured by a tendency to work in the wavelet coefficient space. With penalized wavelet structure in place, we then show that fitting and inference can be achieved via the same general approaches used for penalized splines: penalized least squares, maximum likelihood and best prediction within a frequentist mixed model framework, and Markov chain Monte Carlo and mean field variational Bayes within a Bayesian framework. Penalized wavelets are also shown to have a close relationship with wide data ("p ≫ n") regression and benefit from ongoing research on that topic.
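The penalized-spline analogy above can be made concrete with a minimal sketch. The basis choice (Haar) and the ridge-type penalty below are our own illustrative assumptions, not necessarily the paper's; the point is only that an orthonormal wavelet design reduces penalized least squares to simple coefficient shrinkage, with the scaling coefficient left unpenalized much like the unpenalized polynomial part of a penalized spline.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar wavelet transform matrix for n a power of 2."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    scaling = np.kron(h, [1.0, 1.0])               # coarse-scale rows
    detail = np.kron(np.eye(n // 2), [1.0, -1.0])  # fine-scale rows
    return np.vstack([scaling, detail]) / np.sqrt(2.0)

def penalized_wavelet_fit(y, lam):
    """Ridge-type penalized least squares in a Haar wavelet basis.

    Wavelet (detail) coefficients are shrunk toward zero; the overall
    scaling coefficient is left unpenalized.
    """
    n = len(y)
    H = haar_matrix(n)       # rows are orthonormal basis functions
    Z = H.T                  # design matrix: columns = basis functions
    penalty = np.ones(n)
    penalty[0] = 0.0         # do not penalize the scaling coefficient
    # Orthonormality gives Z'Z = I, so the solution is a simple shrinkage.
    coef = (Z.T @ y) / (1.0 + lam * penalty)
    return Z @ coef

# Noisy step function: lam = 0 interpolates, larger lam smooths.
rng = np.random.default_rng(0)
truth = np.repeat([0.0, 2.0], 16)
y = truth + 0.3 * rng.standard_normal(32)
fit = penalized_wavelet_fit(y, lam=5.0)
```

With an L1 rather than ridge penalty, the same shrinkage step becomes soft thresholding, which is closer to classical wavelet denoising.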

    The explicit form of expectation propagation for a simple statistical model

    © 2016, Institute of Mathematical Statistics. All rights reserved. We derive the explicit form of expectation propagation for approximate deterministic Bayesian inference in a simple statistical model. The model corresponds to a random sample from the Normal distribution. The explicit forms, and their derivation, allow a deeper understanding of the issues and challenges involved in practical implementation of expectation propagation for statistical analyses. No auxiliary approximations are used: we follow the expectation propagation prescription exactly. A simulation study shows expectation propagation to be more accurate than mean field variational Bayes for larger sample sizes, but at the cost of considerably more algebraic and computational effort.

    Variational inference for count response semiparametric regression

    © 2015 International Society for Bayesian Analysis. Fast variational approximate algorithms are developed for Bayesian semiparametric regression when the response variable is a count, i.e., a nonnegative integer. We treat both the Poisson and Negative Binomial families as models for the response variable. Our approach utilizes recently developed methodology known as non-conjugate variational message passing. For concreteness, we focus on generalized additive mixed models, although our variational approximation approach extends to a wide class of semiparametric regression models such as those containing interactions and elaborate random effect structure.

    Explicit connections between longitudinal data analysis and kernel machines

    Two areas of research – longitudinal data analysis and kernel machines – have large, but mostly distinct, literatures. This article shows explicitly that both fields have much in common with each other. In particular, many popular longitudinal data fitting procedures are special types of kernel machines. These connections have the potential to provide fruitful cross-fertilization between longitudinal data analytic and kernel machine methodology. © 2009, Institute of Mathematical Statistics. All rights reserved

    Mean field variational Bayes for continuous sparse signal shrinkage: Pitfalls and remedies

    © 2014, Institute of Mathematical Statistics. All rights reserved. We investigate mean field variational approximate Bayesian inference for models that use continuous distributions (Horseshoe, Negative-Exponential-Gamma and Generalized Double Pareto) for sparse signal shrinkage. Our principal finding is that the most natural, and simplest, mean field variational Bayes algorithm can perform quite poorly due to posterior dependence among auxiliary variables. More sophisticated algorithms, based on special functions, are shown to be superior. Continued fraction approximations via Lentz's Algorithm are developed to make the algorithms practical.

    Functional regression via variational Bayes

    We introduce variational Bayes methods for fast approximate inference in functional regression analysis. Both the standard cross-sectional and the increasingly common longitudinal settings are treated. The methodology allows Bayesian functional regression analyses to be conducted without the computational overhead of Monte Carlo methods. Confidence intervals of the model parameters are obtained both using the approximate variational approach and nonparametric resampling of clusters. The latter approach is possible because our variational Bayes functional regression approach is computationally efficient. A simulation study indicates that variational Bayes is highly accurate in estimating the parameters of interest and in approximating the Markov chain Monte Carlo-sampled joint posterior distribution of the model parameters. The methods apply generally, but are motivated by a longitudinal neuroimaging study of multiple sclerosis patients. Code used in simulations is made available as a web-supplement.

    Asymptotics for general multivariate kernel density derivative estimators

    We investigate kernel estimators of multivariate density derivative functions using general (or unconstrained) bandwidth matrix selectors. These density derivative estimators have been relatively less well researched than their density estimator analogues. A major obstacle for progress has been the intractability of the matrix analysis when treating higher order multivariate derivatives. With an alternative vectorization of these higher order derivatives, mathematical intractabilities are surmounted in an elegant and unified framework. The finite sample and asymptotic analysis of squared errors for density estimators are generalized to density derivative estimators. Moreover, we are able to exhibit a closed form expression for a normal scale bandwidth matrix for density derivative estimators. These normal scale bandwidths are employed in a numerical study to demonstrate the gain in performance of unconstrained selectors over their constrained counterparts.
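To fix ideas, here is a minimal sketch of the first-derivative case with a Gaussian kernel and an unconstrained (non-diagonal) bandwidth matrix. The particular data, bandwidth matrix and evaluation point are illustrative choices of ours, not values from the paper; the gradient formula follows from differentiating the Gaussian kernel density estimate.

```python
import numpy as np

def gaussian_kde(x, data, H):
    """Gaussian kernel density estimate at point x with bandwidth matrix H."""
    d = data.shape[1]
    Hinv = np.linalg.inv(H)
    norm = (2.0 * np.pi) ** (-d / 2) * np.linalg.det(H) ** (-0.5)
    diffs = x - data                           # (n, d)
    quad = np.einsum('ij,jk,ik->i', diffs, Hinv, diffs)
    return norm * np.mean(np.exp(-0.5 * quad))

def gaussian_kde_gradient(x, data, H):
    """Kernel estimate of the density gradient at x.

    grad f_hat(x) = -(1/n) * sum_i H^{-1} (x - X_i) K_H(x - X_i),
    obtained by differentiating the Gaussian kernel in gaussian_kde.
    """
    d = data.shape[1]
    Hinv = np.linalg.inv(H)
    norm = (2.0 * np.pi) ** (-d / 2) * np.linalg.det(H) ** (-0.5)
    diffs = x - data
    quad = np.einsum('ij,jk,ik->i', diffs, Hinv, diffs)
    weights = norm * np.exp(-0.5 * quad)       # K_H(x - X_i), shape (n,)
    return -(Hinv @ (weights[:, None] * diffs).mean(axis=0))

rng = np.random.default_rng(1)
data = rng.standard_normal((500, 2))
H = np.array([[0.25, 0.05], [0.05, 0.30]])     # unconstrained: off-diagonals allowed
x0 = np.array([0.5, -0.2])
grad = gaussian_kde_gradient(x0, data, H)
```

The analytic gradient can be checked against a central finite difference of `gaussian_kde`, which is a useful sanity test when experimenting with bandwidth selectors.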

    Theory of Gaussian variational approximation for a Poisson mixed model

    Likelihood-based inference for the parameters of generalized linear mixed models is hindered by the presence of intractable integrals. Gaussian variational approximation provides a fast and effective means of approximate inference. We provide some theory for this type of approximation for a simple Poisson mixed model. In particular, we establish consistency at rate m^(-1/2) + n^(-1), where m is the number of groups and n is the number of repeated measurements.
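A minimal sketch of the kind of model and approximation described above: a random-intercept Poisson model y_ij | u_i ~ Poisson(exp(β0 + u_i)), u_i ~ N(0, σ²), with a Gaussian variational posterior q(u_i) = N(μ_i, λ_i). For brevity we treat σ² as known and drop additive constants from the lower bound; the data, sample sizes, and optimizer are our illustrative choices, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
m, n = 30, 10                 # m groups, n repeated measurements per group
sigma2 = 0.25                 # random-effect variance, taken as known here
beta0_true = 0.5
u = np.sqrt(sigma2) * rng.standard_normal(m)
y = rng.poisson(np.exp(beta0_true + u)[:, None], size=(m, n))
s = y.sum(axis=1)             # group totals

def neg_elbo(theta):
    """Negative evidence lower bound for q(u_i) = N(mu_i, lam_i).

    Uses E_q[exp(u_i)] = exp(mu_i + lam_i / 2); additive constants
    (log y! terms and constants in the Gaussian KL) are dropped.
    """
    beta0, mu, log_lam = theta[0], theta[1:m + 1], theta[m + 1:]
    lam = np.exp(log_lam)     # log parameterization keeps variances positive
    elbo = np.sum(
        s * (beta0 + mu)
        - n * np.exp(beta0 + mu + 0.5 * lam)
        - (mu ** 2 + lam) / (2.0 * sigma2)
        + 0.5 * np.log(lam)
    )
    return -elbo

theta0 = np.zeros(2 * m + 1)
res = minimize(neg_elbo, theta0, method='L-BFGS-B')
beta0_hat = res.x[0]
```

Because every expectation is available in closed form (the Poisson log-likelihood is linear in u apart from exp(u), whose Gaussian mean is explicit), the lower bound is exact to evaluate and the only approximation is the Gaussian form of q.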