
    An optimal data ordering scheme for Dirichlet process mixture models

    In recent years, there has been increasing interest in Bayesian nonparametric methods due to their flexibility and the availability of Markov chain Monte Carlo (MCMC) methods for sampling from the posterior distribution. As MCMC methods are generally computationally expensive, there is a need for faster methods that can be executed within seconds. A fast alternative to MCMC for sampling the well-known and widely used Dirichlet process mixture (DPM) model is investigated: approximate independent and identically distributed samples are drawn from the posterior distribution of the latent allocations, and samples of the weights and locations are then drawn conditional on the allocations. To address the order dependence of the proposed algorithm, an optimal ordering scheme based on a sequence of optimizations is proposed: first obtain an optimal order of the data, then run the algorithm on this ordering. The fast sampling algorithm is assisted by parallel computing using commands within MATLAB.
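
    To make the order dependence concrete, below is a minimal sketch of the kind of sequential allocation sampler the abstract describes, assuming a 1-D Gaussian likelihood with fixed variance and a conjugate zero-mean normal prior on cluster locations; the function name, parameter values, and Python (rather than MATLAB) are illustrative choices, not the paper's implementation. Permuting x changes the resulting allocations, which is exactly what the proposed ordering scheme targets.

```python
import numpy as np

def sequential_dpm_allocations(x, alpha=1.0, sigma2=1.0, tau2=4.0, rng=None):
    """One approximate draw of DPM cluster allocations, taken sequentially.

    Each point joins an existing cluster with probability proportional to
    the cluster size times its posterior predictive density, or opens a
    new cluster with probability proportional to alpha times the prior
    predictive density (a Chinese-restaurant-process-style rule).
    """
    rng = np.random.default_rng() if rng is None else rng
    z = np.empty(len(x), dtype=int)   # cluster label of each point
    sums, counts = [], []             # per-cluster sufficient statistics
    for i, xi in enumerate(x):
        probs = []
        for s, n in zip(sums, counts):
            var_post = 1.0 / (n / sigma2 + 1.0 / tau2)  # posterior var of cluster mean
            mean = var_post * s / sigma2                # posterior mean (prior mean 0)
            var = var_post + sigma2                     # posterior predictive variance
            probs.append(n * np.exp(-0.5 * (xi - mean) ** 2 / var) / np.sqrt(var))
        var_new = tau2 + sigma2                         # prior predictive variance
        probs.append(alpha * np.exp(-0.5 * xi ** 2 / var_new) / np.sqrt(var_new))
        probs = np.asarray(probs)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(sums):            # point opened a new cluster
            sums.append(0.0)
            counts.append(0)
        sums[k] += xi
        counts[k] += 1
        z[i] = k
    return z
```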

    Approximate Decentralized Bayesian Inference

    This paper presents an approximate method for performing Bayesian inference in models with conditional independence over a decentralized network of learning agents. The method first employs variational inference on each individual learning agent to generate a local approximate posterior; the agents then transmit their local posteriors to other agents in the network, and finally each agent combines its set of received local posteriors. The key insight in this work is that, for many Bayesian models, approximate inference schemes destroy symmetry and dependencies in the model that are crucial to the correct application of Bayes' rule when combining the local posteriors. The proposed method addresses this issue by including an additional optimization step in the combination procedure that accounts for these broken dependencies. Experiments on synthetic and real data demonstrate that the decentralized method provides advantages in computational performance and predictive test likelihood over previous batch and distributed methods.
    Comment: This paper was presented at UAI 2014. Please use the following BibTeX citation: @inproceedings{Campbell14_UAI, Author = {Trevor Campbell and Jonathan P. How}, Title = {Approximate Decentralized Bayesian Inference}, Booktitle = {Uncertainty in Artificial Intelligence (UAI)}, Year = {2014}}
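
    The simplest version of the combination step, for Gaussian local posteriors, is a product of densities in natural parameters with the surplus copies of the prior divided out; this is the naive baseline that ignores broken symmetries, which the paper's additional optimization step is designed to correct. A minimal sketch, assuming each agent reports a univariate Gaussian and all share one Gaussian prior (names and defaults are illustrative):

```python
import numpy as np

def combine_gaussian_posteriors(means, variances, prior_mean=0.0, prior_var=1.0):
    """Naive Bayes-rule combination of A local Gaussian posteriors.

    Multiplies the A local densities and divides out the A - 1 surplus
    copies of the prior, working in precision (natural) parameters.
    Note: performs no correction for broken model symmetries.
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    prior_prec = 1.0 / prior_var
    A = len(means)
    comb_prec = precisions.sum() - (A - 1) * prior_prec
    comb_mean = ((precisions * means).sum()
                 - (A - 1) * prior_prec * prior_mean) / comb_prec
    return comb_mean, 1.0 / comb_prec
```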

    Optimal client recommendation for market makers in illiquid financial products

    The process of liquidity provision in financial markets can result in prolonged exposure to illiquid instruments for market makers. In this case, where a proprietary position is not desired, proactively targeting the right client, one who is likely to be interested, can be an effective means to offset this position, rather than relying on commensurate interest arising through natural demand. In this paper, we consider the inference of a client profile for the purpose of corporate bond recommendation, based on typical recorded information available to the market maker. Given a historical record of corporate bond transactions and bond meta-data, we use a topic-modelling analogy to develop a probabilistic technique for compiling a curated list of client recommendations for a particular bond that needs to be traded, ranked by probability of interest. We show that a model based on Latent Dirichlet Allocation offers promising performance in delivering relevant recommendations for sales traders.
    Comment: 12 pages, 3 figures, 1 table
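
    As a rough illustration of the topic-modelling analogy (not the paper's exact model), one can treat each client's transaction history as a "document" of bond feature tokens, fit LDA, and rank clients by the overlap between their topic mixture and that of the bond to be traded. The tokens, data, and scoring rule below are invented for the sketch:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Each "document" is one client's history, written as bond feature tokens
# (issuer, sector, maturity bucket, rating); the data here is made up.
client_histories = [
    "issuer_acme sector_utils mat_5y rating_bbb issuer_acme",
    "issuer_glob sector_tech mat_10y rating_a sector_tech",
    "issuer_acme sector_utils mat_10y rating_bbb",
]
vec = CountVectorizer()
X = vec.fit_transform(client_histories)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
client_topics = lda.fit_transform(X)             # per-client topic mixtures

# Profile the bond to be traded the same way, then rank clients by the
# inner product of their topic mixture with the bond's.
bond = vec.transform(["issuer_acme sector_utils mat_5y rating_bbb"])
bond_topics = lda.transform(bond)[0]
scores = client_topics @ bond_topics
ranking = np.argsort(scores)[::-1]               # most likely interested first
```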

    Streaming, Distributed Variational Inference for Bayesian Nonparametrics

    This paper presents a methodology for creating streaming, distributed inference algorithms for Bayesian nonparametric (BNP) models. In the proposed framework, processing nodes receive a sequence of data minibatches, compute a variational posterior for each, and make asynchronous streaming updates to a central model. In contrast to previous algorithms, the proposed framework is truly streaming, distributed, asynchronous, learning-rate-free, and truncation-free. The key challenge in developing the framework, arising from the fact that BNP models do not impose an inherent ordering on their components, is finding the correspondence between minibatch and central BNP posterior components before performing each update. To address this, the paper develops a combinatorial optimization problem over component correspondences, and provides an efficient solution technique. The paper concludes with an application of the methodology to the DP mixture model, with experimental results demonstrating its practical scalability and performance.
    Comment: This paper was presented at NIPS 2015. Please use the following BibTeX citation: @inproceedings{Campbell15_NIPS, Author = {Trevor Campbell and Julian Straub and John W. {Fisher III} and Jonathan P. How}, Title = {Streaming, Distributed Variational Inference for Bayesian Nonparametrics}, Booktitle = {Advances in Neural Information Processing Systems (NIPS)}, Year = {2015}}
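
    The flavor of the correspondence step can be shown with a standard assignment solver: build a cost between minibatch and central components, then match them before merging. This is a simplification, since the paper's combinatorial problem also handles the creation of new components; the squared-distance cost below is an illustrative stand-in:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_components(central_means, minibatch_means):
    """Match minibatch components to central components before an update.

    Solves a rectangular assignment problem with squared-distance costs
    via the Hungarian algorithm; unmatched minibatch components would, in
    the full framework, spawn new central components.
    """
    central_means = np.asarray(central_means, dtype=float)
    minibatch_means = np.asarray(minibatch_means, dtype=float)
    cost = ((central_means[:, None, :]
             - minibatch_means[None, :, :]) ** 2).sum(axis=-1)
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols))   # (central index, minibatch index) pairs
```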

    A New Approach to Probabilistic Programming Inference

    We introduce and demonstrate a new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo. Our approach is simple to implement and easy to parallelize. It applies to Turing-complete probabilistic programming languages and supports accurate inference in models that make use of complex control flow, including stochastic recursion. It also includes primitives from Bayesian nonparametric statistics. Our experiments show that this approach can be more efficient than previously introduced single-site Metropolis-Hastings methods.
    Comment: Updated version of the 2014 AISTATS paper (to reflect changes in new language syntax). 10 pages, 3 figures. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings, Vol 33, 2014
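
    Particle-based inference of this kind boils down to running many copies of the program forward, reweighting at each observe statement, and resampling. A minimal sketch for a toy program (a Gaussian random walk with noisy observations, standing in for an arbitrary program trace); the model and names are illustrative, and the full method wraps a sequential Monte Carlo sweep like this inside a particle MCMC outer loop:

```python
import numpy as np

def smc_toy_program(observations, n_particles=1000, rng=None):
    """Sequential Monte Carlo over executions of a toy program.

    Toy program: x_t ~ N(x_{t-1}, 1); observe y_t ~ N(x_t, 1).
    Each particle is one partial execution; each observe statement
    reweights the particles, which are then resampled by weight.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(n_particles)                  # each particle's program state
    for y in observations:
        x = x + rng.normal(size=n_particles)   # sample statement
        logw = -0.5 * (y - x) ** 2             # observe statement (log weight)
        w = np.exp(logw - logw.max())
        x = x[rng.choice(n_particles, size=n_particles, p=w / w.sum())]
    return x   # approximate posterior samples of the final program state
```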