
    R-VGAL: A Sequential Variational Bayes Algorithm for Generalised Linear Mixed Models

    Models with random effects, such as generalised linear mixed models (GLMMs), are often used for analysing clustered data. Parameter inference with these models is difficult because of the presence of cluster-specific random effects, which must be integrated out when evaluating the likelihood function. Here, we propose a sequential variational Bayes algorithm, called Recursive Variational Gaussian Approximation for Latent variable models (R-VGAL), for estimating parameters in GLMMs. The R-VGAL algorithm operates on the data sequentially, requires only a single pass through the data, and can provide parameter updates as new data are collected without re-processing the previous data. At each update, R-VGAL requires the gradient and Hessian of a "partial" log-likelihood function evaluated at the new observation, which are generally not available in closed form for GLMMs. To circumvent this issue, we propose an importance-sampling-based approach for estimating the gradient and Hessian via Fisher's and Louis' identities. We find that R-VGAL can be unstable when traversing the first few data points, but this issue can be mitigated by using a variant of variational tempering in the initial steps of the algorithm. Through illustrations on both simulated and real datasets, we show that R-VGAL provides good approximations to the exact posterior distributions, that it can be made robust through tempering, and that it is computationally efficient.
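    The core recursive update in sequential variational schemes of this kind can be sketched on a toy model. Below is a hedged illustration for Bayesian logistic regression (a plain GLM without random effects, so the per-observation gradient and Hessian are available in closed form rather than via importance sampling, and the update reduces to a Laplace-style online Newton step); the function names, the tempering ramp, and the simulated data are all illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rvga_step(mu, prec, x, y, temper=1.0):
    """One recursive variational-Gaussian update for logistic regression.

    Uses the gradient and Hessian of the per-observation log-likelihood
    evaluated at the current mean (a simplification of the exact update).
    `temper` in (0, 1] damps the influence of early observations.
    """
    p = sigmoid(x @ mu)
    grad = (y - p) * x                        # d/dmu log p(y | x, mu)
    hess = -p * (1 - p) * np.outer(x, x)      # Hessian (negative semi-definite)
    prec_new = prec - temper * hess           # precision grows with information
    mu_new = mu + temper * np.linalg.solve(prec_new, grad)
    return mu_new, prec_new

# Single pass through simulated data, with tempering on the first steps.
rng = np.random.default_rng(0)
d, n = 3, 400
theta = np.array([1.0, -2.0, 0.5])            # hypothetical true parameters
X = rng.normal(size=(n, d))
y = rng.binomial(1, sigmoid(X @ theta))

mu, prec = np.zeros(d), np.eye(d)             # N(0, I) prior
for t, (x_t, y_t) in enumerate(zip(X, y)):
    temper = min(1.0, (t + 1) / 10)           # ramp up over the first 10 points
    mu, prec = rvga_step(mu, prec, x_t, y_t, temper)
```

Damping the first few updates with `temper < 1` mimics the stabilising role the abstract attributes to variational tempering in the information-poor early steps.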

    A Cyber Physical System Crowdsourcing Inference Method Based on Tempering: An Advancement in Artificial Intelligence Algorithms

    Activity selection is critical for smart environments and Cyber-Physical Systems (CPSs) that must provide timely and intelligent services, especially as the number of connected devices grows at an unprecedented speed. Because labels must be collected from various agents in CPSs, crowdsourcing inference algorithms are designed to help acquire accurate labels that involve high-level knowledge. However, existing algorithms have limitations: they may incur extra budget, scale poorly, require knowledge of the prior distribution, be difficult to implement, or become trapped in poor local optima. In this paper, we provide a crowdsourcing inference method with variational tempering that recovers the ground truth, accounts for both the reliability of workers and the difficulty of the tasks, and is guaranteed to converge to a local optimum. Numerical experiments on real-world data indicate that our variational tempering inference algorithm outperforms existing algorithms. This paper therefore provides a new, efficient algorithm for CPSs and machine learning, and thus makes a new contribution to the literature.
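    The tempering idea shared by this and the previous entry can be illustrated on a conjugate toy model: raising the likelihood to a power beta in (0, 1] flattens it, so the data pull less hard on the posterior, and annealing beta toward 1 recovers the ordinary update. This is a minimal sketch of the general mechanism, not the paper's crowdsourcing algorithm; all names and numbers are illustrative.

```python
import numpy as np

def tempered_posterior(prior_mu, prior_var, ys, noise_var, beta):
    """Tempered conjugate update for a Gaussian mean with known noise
    variance: the likelihood is raised to the power `beta` in (0, 1],
    which inflates the posterior variance and damps the data's influence
    (the basic idea behind variational tempering)."""
    n = len(ys)
    post_prec = 1.0 / prior_var + beta * n / noise_var
    post_mu = (prior_mu / prior_var + beta * np.sum(ys) / noise_var) / post_prec
    return post_mu, 1.0 / post_prec

ys = np.array([2.1, 1.9, 2.3, 2.0])
for beta in (0.1, 0.5, 1.0):          # annealing schedule toward beta = 1
    mu, var = tempered_posterior(0.0, 10.0, ys, 1.0, beta)
```

As `beta` increases toward 1, the posterior mean moves toward the sample mean and the posterior variance shrinks; small `beta` keeps early inferences deliberately cautious.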

    Variational channel estimation with tempering: An artificial intelligence algorithm for wireless intelligent networks

    This article belongs to the Special Issue Trends on Edge Computing and Artificial Intelligence for Next Generation Sensor Network

    Factorized Variational Autoencoders for Modeling Audience Reactions to Movies

    Matrix and tensor factorization methods are often used for finding underlying low-dimensional patterns in noisy data. In this paper, we study non-linear tensor factorization methods based on deep variational autoencoders. Our approach is well-suited to settings where the relationship between the latent representation to be learned and the raw data representation is highly complex. We apply our approach to a large dataset of facial expressions of movie-watching audiences (over 16 million faces). Our experiments show that, compared to conventional linear factorization methods, our method achieves better reconstruction of the data and further discovers interpretable latent factors.
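    The structural difference between linear factorization and an autoencoder-style non-linear decoder can be shown in a few lines: one latent vector per viewer, one per timestep, combined either by an inner product or by a small network. This sketch illustrates only the factorized reconstruction; it omits the encoder and variational training entirely, and every dimension and weight here is made up rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_viewers, n_times, k = 5, 8, 3

U = rng.normal(size=(n_viewers, k))   # per-viewer latent factors
V = rng.normal(size=(n_times, k))     # per-timestep latent factors
W1 = rng.normal(size=(2 * k, 16))     # illustrative decoder weights
W2 = rng.normal(size=(16, 1))

def decode(u, v):
    """Non-linear decoder: maps a (viewer, timestep) factor pair to a
    reconstructed expression intensity, replacing the inner product
    u @ v of a linear factorization with a small network."""
    h = np.tanh(np.concatenate([u, v]) @ W1)
    return float(h @ W2)

# Linear baseline vs. non-linear reconstruction of the viewer-by-time slice.
linear = U @ V.T
recon = np.array([[decode(U[i], V[t]) for t in range(n_times)]
                  for i in range(n_viewers)])
```

Both reconstructions share the same factorized latent structure; the non-linear decoder is what lets the model capture complex latent-to-data relationships.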

    A view of Estimation of Distribution Algorithms through the lens of Expectation-Maximization

    We show that a large class of Estimation of Distribution Algorithms, including, but not limited to, Covariance Matrix Adaptation, can be written as a Monte Carlo Expectation-Maximization algorithm, and as exact EM in the limit of infinite samples. Because EM sits on a rigorous statistical foundation and has been thoroughly analyzed, this connection provides a new coherent framework with which to reason about EDAs.

    Boosting Variational Inference: an Optimization Perspective

    Variational inference is a popular technique for approximating a possibly intractable Bayesian posterior with a more tractable one. Recently, boosting variational inference has been proposed as a new paradigm that approximates the posterior by a mixture of densities, greedily adding components to the mixture. However, as is the case with many other variational inference algorithms, its theoretical properties have not been studied. In the present work, we study the convergence properties of this approach from a modern optimization viewpoint by establishing connections to the classic Frank-Wolfe algorithm. Our analysis yields novel theoretical insights regarding sufficient conditions for convergence, explicit rates, and algorithmic simplifications. Since much of the focus in previous work on variational inference has been on tractability, our work is a much-needed attempt to bridge the gap between probabilistic models and their corresponding theoretical properties.
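    The Frank-Wolfe view can be sketched numerically: treat densities on a grid as points in a convex set, compute the functional gradient of KL(q || p), let a linear minimization oracle pick the best single component from a fixed dictionary, and blend it into the mixture with the classic 2/(t+2) step size. Everything below (the target, the dictionary, the grid) is an illustrative toy, not the paper's setup.

```python
import numpy as np

xs = np.linspace(-6, 6, 1201)
dx = xs[1] - xs[0]

def gauss(mu, s):
    """Normalised Gaussian density evaluated on the grid."""
    g = np.exp(-0.5 * ((xs - mu) / s) ** 2)
    return g / (g.sum() * dx)

def kl(q, p):
    """Discretised KL divergence between two grid densities."""
    return float(np.sum(q * np.log((q + 1e-12) / (p + 1e-12))) * dx)

# Target: a bimodal posterior that a single Gaussian fits poorly.
p = 0.5 * gauss(-2.0, 0.7) + 0.5 * gauss(2.0, 0.7)

# Dictionary of candidate mixture components for the greedy steps.
atoms = [gauss(m, s) for m in np.arange(-4, 4.5, 0.5) for s in (0.5, 1.0)]

q = gauss(0.0, 1.0)                    # initial single-Gaussian approximation
for t in range(25):
    grad = np.log((q + 1e-12) / (p + 1e-12))  # functional gradient of KL(q || p)
    s = min(atoms, key=lambda a: float(np.sum(a * grad) * dx))  # LMO step
    gamma = 2.0 / (t + 2.0)                   # classic Frank-Wolfe step size
    q = (1 - gamma) * q + gamma * s           # greedily blend in the component
```

Because each iterate is a convex combination of normalised densities, `q` stays a valid density throughout, and the 2/(t+2) schedule is exactly the step size for which Frank-Wolfe's standard convergence rates are stated.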