Particle Gibbs for Infinite Hidden Markov Models
This is the final version of the article. It first appeared from Curran Associates via http://papers.nips.cc/paper/5968-particle-gibbs-for-infinite-hidden-markov-models

Infinite Hidden Markov Models (iHMMs) are an attractive, nonparametric generalization of the classical Hidden Markov Model that can automatically infer the number of hidden states in the system. However, due to the infinite-dimensional nature of the transition dynamics, performing inference in the iHMM is difficult. In this paper, we present an infinite-state Particle Gibbs (PG) algorithm to resample state trajectories for the iHMM. The proposed algorithm uses an efficient proposal optimized for iHMMs and leverages ancestor sampling to improve the mixing of the standard PG algorithm. Our algorithm demonstrates significant convergence improvements on synthetic and real-world data sets.
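The conditional SMC step the abstract describes can be made concrete with a minimal sketch of particle Gibbs with ancestor sampling on a finite-state HMM, where a finite truncation stands in for the iHMM's infinite state space. All names and the specific proposal (bootstrap, i.e. the transition prior) are illustrative assumptions, not the paper's optimized proposal:

```python
import numpy as np

def csmc_ancestor(obs, ref_traj, trans, emis, init, n_particles=10, rng=None):
    """Conditional SMC with ancestor sampling for a finite-state HMM.

    obs: (T,) observed symbols; ref_traj: (T,) retained reference trajectory;
    trans: (K, K) transition matrix; emis: (K, V) emission matrix;
    init: (K,) initial distribution. Returns one resampled state trajectory.
    """
    rng = rng or np.random.default_rng()
    T, N = len(obs), n_particles
    particles = np.zeros((T, N), dtype=int)
    ancestors = np.zeros((T, N), dtype=int)
    # Particle N-1 is pinned to the reference trajectory throughout.
    particles[0] = rng.choice(len(init), size=N, p=init)
    particles[0, N - 1] = ref_traj[0]
    logw = np.log(emis[particles[0], obs[0]])
    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling for the free particles.
        ancestors[t] = rng.choice(N, size=N, p=w)
        # Ancestor sampling: reconnect the reference particle to a sampled past,
        # weighting each candidate by how well it transitions into ref_traj[t].
        aw = w * trans[particles[t - 1], ref_traj[t]]
        ancestors[t, N - 1] = rng.choice(N, p=aw / aw.sum())
        # Propagate from the transition prior (bootstrap proposal).
        particles[t] = np.array([
            rng.choice(trans.shape[0], p=trans[particles[t - 1, a]])
            for a in ancestors[t]
        ])
        particles[t, N - 1] = ref_traj[t]
        logw = np.log(emis[particles[t], obs[t]])
    # Trace back one trajectory drawn in proportion to the final weights.
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T, dtype=int)
    traj[T - 1] = particles[T - 1, k]
    for t in range(T - 2, -1, -1):
        k = ancestors[t + 1, k]
        traj[t] = particles[t, k]
    return traj
```

The ancestor-sampling line is the key difference from plain particle Gibbs: instead of forcing the reference particle to keep its own history, its ancestor is redrawn, which is what improves mixing.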
Infinite Factorial Finite State Machine for Blind Multiuser Channel Estimation
New communication standards need to deal with machine-to-machine
communications, in which users may start or stop transmitting at any time in an
asynchronous manner. Thus, the number of users is an unknown and time-varying
parameter that needs to be accurately estimated in order to properly recover
the symbols transmitted by all users in the system. In this paper, we address
the problem of joint channel parameter and data estimation in a multiuser
communication channel in which the number of transmitters is not known. For
that purpose, we develop the infinite factorial finite state machine model, a
Bayesian nonparametric model based on the Markov Indian buffet that allows for
an unbounded number of transmitters with arbitrary channel length. We propose
an inference algorithm that makes use of slice sampling and particle Gibbs with
ancestor sampling. Our approach is fully blind as it does not require a prior
channel estimation step, prior knowledge of the number of transmitters, or any
signaling information. Our experimental results, loosely based on the LTE
random access channel, show that the proposed approach can effectively recover
the data-generating process for a wide range of scenarios, with varying number
of transmitters, number of receivers, constellation order, channel length, and
signal-to-noise ratio.
Comment: 15 pages, 15 figures
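Of the two inference ingredients named above, the slice-sampling one can be sketched generically. Below is a minimal univariate slice sampler with stepping-out, offered as an illustration of the technique rather than the paper's exact update (which uses auxiliary slice variables to bound the number of active transmitter chains); all names are illustrative:

```python
import numpy as np

def slice_sample(logpdf, x0, width=1.0, n_samples=100, rng=None):
    """Univariate slice sampler with stepping-out.

    logpdf: unnormalized log-density; x0: starting point;
    width: initial bracket size. Returns an array of n_samples draws.
    """
    rng = rng or np.random.default_rng()
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        # Draw the auxiliary slice level uniformly under the density at x.
        logu = logpdf(x) + np.log(rng.random())
        # Step out an interval until both ends fall below the slice.
        left = x - width * rng.random()
        right = left + width
        while logpdf(left) > logu:
            left -= width
        while logpdf(right) > logu:
            right += width
        # Shrink the bracket until a proposal lands inside the slice.
        while True:
            xp = rng.uniform(left, right)
            if logpdf(xp) > logu:
                x = xp
                break
            if xp < x:
                left = xp
            else:
                right = xp
        samples[i] = x
    return samples
```

In nonparametric models like the one above, this kind of auxiliary-variable trick is what turns an unbounded sum over chains into a finite computation per iteration.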
A Compilation Target for Probabilistic Programming Languages
Forward inference techniques such as sequential Monte Carlo and particle
Markov chain Monte Carlo for probabilistic programming can be implemented in
any programming language by creative use of standardized operating system
functionality including processes, forking, mutexes, and shared memory.
Exploiting this, we have defined, developed, and tested an intermediate representation for probabilistic programming languages that we call Probabilistic C, which can itself be compiled to machine code by standard compilers and linked against operating system libraries, yielding an efficient, scalable, and portable compilation target for probabilistic programming. This opens up a new hardware and systems research path for optimizing probabilistic programming systems.
Comment: In Proceedings of the 31st International Conference on Machine Learning (ICML), 2014
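The core mechanism, particles as ordinary OS processes duplicated by forking, can be illustrated with a much-simplified sketch: each particle runs in a forked child and reports a weight to the parent over a pipe. This is POSIX-only, and `forked_particles`/`simulate` are hypothetical names for illustration; a real SMC backend would fork high-weight particles again at each observation barrier rather than only launching them once:

```python
import os
import random
import struct

def forked_particles(n_particles, simulate, seed=0):
    """Run independent particle simulations in forked OS processes.

    simulate: callable taking a seeded `random` module and returning a
    float weight (a stand-in for a program's likelihood contribution).
    Returns the list of weights, one per particle.
    """
    r, w = os.pipe()
    for i in range(n_particles):
        pid = os.fork()
        if pid == 0:  # child: run one particle, report its weight, exit
            random.seed(seed + i)
            weight = simulate(random)
            os.write(w, struct.pack('id', i, weight))
            os._exit(0)
    os.close(w)  # parent keeps only the read end
    results = {}
    record = struct.calcsize('id')
    with os.fdopen(r, 'rb') as pipe:
        for _ in range(n_particles):
            idx, weight = struct.unpack('id', pipe.read(record))
            results[idx] = weight
    for _ in range(n_particles):
        os.wait()  # reap all children
    return [results[i] for i in range(n_particles)]
```

The point of the abstract is that nothing probabilistic-programming-specific is needed here: fork, pipes, and process exit codes are enough to express particle duplication and weighting.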
A Tutorial on Bayesian Nonparametric Models
A key problem in statistical modeling is model selection: how to choose a
model at an appropriate level of complexity. This problem appears in many
settings, most prominently in choosing the number of clusters in mixture models
or the number of factors in factor analysis. In this tutorial we describe
Bayesian nonparametric methods, a class of methods that side-steps this issue
by allowing the data to determine the complexity of the model. This tutorial is
a high-level introduction to Bayesian nonparametric methods and contains
several examples of their application.
Comment: 28 pages, 8 figures
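The tutorial's central point, letting the data determine model complexity, can be made concrete with the Chinese restaurant process, the partition distribution underlying the Dirichlet process mixture. This is a standard construction, not code from the tutorial, and the helper name is illustrative:

```python
import numpy as np

def crp_partition(n, alpha, rng=None):
    """Sample a partition of n items from the Chinese restaurant process.

    Each new customer joins an existing table with probability proportional
    to its occupancy, or opens a new table with probability proportional to
    alpha. Returns (table assignments, number of tables used).
    """
    rng = rng or np.random.default_rng()
    counts = []       # customers per table
    assignments = []  # table index for each customer
    for _ in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        table = rng.choice(len(probs), p=probs)
        if table == len(counts):
            counts.append(1)   # open a new table (new cluster)
        else:
            counts[table] += 1
        assignments.append(int(table))
    return assignments, len(counts)
```

The number of occupied tables is not fixed in advance: it grows with the data (in expectation roughly like alpha * log n), which is exactly the sense in which these models sidestep choosing the number of clusters up front.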