
    View-based 3D Objects Recognition with Expectation Propagation Learning

    In this thesis, we present improvements to the Expectation Propagation (EP) learning framework, specifically enhancements to both its speed and accuracy. We use this enhanced EP learning with the Inverted Dirichlet mixture model, as well as the Dirichlet mixture model, to implement an algorithm for recognizing 3D objects. The objects come from a view-based 3D model database that we have assembled. Following decision rules derived from analyzing our test results, we have been able to obtain good recognition rates. Experimental results are presented for different object classes by comparing recognition rates and confidence levels under different tuning parameters, which we can refine towards specific classes for better specialized accuracy.
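    A minimal sketch of the view-based recognition pipeline described above, under stated assumptions: per-view feature vectors are already extracted, and scikit-learn's GaussianMixture stands in for the Inverted Dirichlet / Dirichlet mixtures learned with EP in the thesis (scikit-learn ships no EP learner). Class names and numbers are illustrative.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def fit_class_models(views_per_class, n_components=3, seed=0):
        """Fit one mixture model per object class from its training view features."""
        models = {}
        for label, views in views_per_class.items():
            gm = GaussianMixture(n_components=n_components, random_state=seed)
            gm.fit(views)  # views: array of shape (n_views, n_features)
            models[label] = gm
        return models

    def recognize(models, query_views):
        """Score a query object by summing per-view log-likelihoods under each class model."""
        scores = {label: gm.score_samples(query_views).sum() for label, gm in models.items()}
        best = max(scores, key=scores.get)
        # Softmax of the summed log-likelihoods as a rough confidence heuristic
        # (not the confidence measure defined in the thesis).
        z = np.array(list(scores.values()))
        confidence = float(np.exp(z - z.max()).max() / np.exp(z - z.max()).sum())
        return best, confidence

    # Usage with synthetic features standing in for real view descriptors.
    rng = np.random.default_rng(0)
    train = {"mug": rng.normal(0.0, 1.0, (60, 8)),
             "chair": rng.normal(2.0, 1.0, (60, 8))}
    models = fit_class_models(train)
    print(recognize(models, rng.normal(2.0, 1.0, (12, 8))))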

    Approximate Bayesian Inference for Count Data Modeling

    Bayesian inference allows us to draw conclusions from observations combined with prior knowledge. It additionally allows us to quantify uncertainty, which is important in Machine Learning for making better predictions and improving model interpretability. However, in real applications we often deal with complicated models for which it is infeasible to perform full Bayesian inference. This thesis explores the use of approximate Bayesian inference for count data modeling using Expectation Propagation and Stochastic Expectation Propagation. In Chapter 2, we develop an Expectation Propagation approach to learn an EDCM finite mixture model. The EDCM distribution is an exponential-family approximation to the widely used Dirichlet Compound Multinomial distribution and has been shown to offer excellent modeling capabilities in the case of sparse count data. Chapter 3 develops an efficient generative mixture model of EMSD distributions. Here we use Stochastic Expectation Propagation, which reduces memory consumption, an important characteristic when performing inference on large datasets. Finally, Chapter 4 develops a probabilistic topic model using the generalized Dirichlet distribution (LGDA) in order to capture topic correlation while maintaining conjugacy. We make use of Expectation Propagation to approximate the posterior, resulting in a model that achieves more accurate inference than variational inference. We show that the latent topics can be used as a proxy for improving supervised tasks.
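    A toy sketch of the generic Expectation Propagation loop referenced above, applied to a 1D Gaussian-mean model with a clutter likelihood rather than the EDCM/EMSD mixtures or LGDA developed in the thesis; it only illustrates the cavity / moment-matching / site-update cycle that those models also rely on. All names and constants are illustrative assumptions.

    import numpy as np

    def ep_gaussian_mean(x, prior_var=100.0, noise_var=1.0, w=0.2,
                         clutter_var=10.0, n_sweeps=20):
        """Approximate p(theta | x) by a Gaussian via EP, with one site per data point."""
        n = len(x)
        grid = np.linspace(-15.0, 15.0, 4001)    # grid for numerical moment matching
        dg = grid[1] - grid[0]

        def lik(xi):
            # Clutter likelihood: inlier N(xi; theta, noise_var) plus a theta-free background.
            inlier = np.exp(-(xi - grid) ** 2 / (2 * noise_var)) / np.sqrt(2 * np.pi * noise_var)
            background = np.exp(-xi ** 2 / (2 * clutter_var)) / np.sqrt(2 * np.pi * clutter_var)
            return (1 - w) * inlier + w * background

        r = np.zeros(n)                           # site precisions
        h = np.zeros(n)                           # site precision-times-mean
        r_q, h_q = 1.0 / prior_var, 0.0           # global approximation q starts at the prior

        for _ in range(n_sweeps):
            for i in range(n):
                r_cav, h_cav = r_q - r[i], h_q - h[i]      # cavity: remove site i from q
                if r_cav <= 0:
                    continue
                log_cav = -0.5 * r_cav * grid ** 2 + h_cav * grid
                tilted = np.exp(log_cav - log_cav.max()) * lik(x[i])
                tilted /= tilted.sum() * dg
                m = (grid * tilted).sum() * dg             # moment matching: mean
                v = ((grid - m) ** 2 * tilted).sum() * dg  # moment matching: variance
                r_q, h_q = 1.0 / v, m / v                  # q matches the tilted moments
                r[i], h[i] = r_q - r_cav, h_q - h_cav      # refreshed site i
        return h_q / r_q, 1.0 / r_q                        # posterior mean and variance

    # Usage: data centred around theta = 3 with a few clutter points.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(3.0, 1.0, 40), rng.normal(0.0, 3.0, 5)])
    print(ep_gaussian_mean(x))

    Stochastic Expectation Propagation, as used in Chapter 3, would keep a single averaged site in place of the n per-datum sites r and h above, which is where its memory saving comes from.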