Conditionally conjugate mean-field variational Bayes for logistic models
Variational Bayes (VB) is a common strategy for approximate Bayesian
inference, but simple methods are only available for specific classes of models
including, in particular, representations having conditionally conjugate
constructions within an exponential family. Models with logit components are an
apparently notable exception to this class, due to the absence of conjugacy
between the logistic likelihood and the Gaussian priors for the coefficients in
the linear predictor. To facilitate approximate inference within this widely
used class of models, Jaakkola and Jordan (2000) proposed a simple variational
approach which relies on a family of tangent quadratic lower bounds of logistic
log-likelihoods, thus restoring conjugacy between these approximate bounds and
the Gaussian priors. This strategy is still successfully implemented, but fewer
attempts have been made to formally understand the reasons underlying its
excellent performance. To cover this key gap, we provide a formal connection
between the above bound and a recent Pólya-gamma data augmentation for
logistic regression. This result places the computational methods associated
with the aforementioned bounds within the framework of variational inference
for conditionally conjugate exponential family models, thereby allowing recent
advances for this class to be inherited also by the methods relying on Jaakkola
and Jordan (2000).
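The tangent quadratic bound referred to above is the standard Jaakkola–Jordan inequality, log σ(x) ≥ log σ(ξ) + (x − ξ)/2 − λ(ξ)(x² − ξ²) with λ(ξ) = tanh(ξ/2)/(4ξ); the Pólya-gamma link comes from λ(ξ) being half the mean of a PG(1, ξ) variable. A minimal sketch (illustrative code, not from the paper) checking the bound numerically:

```python
import math

def log_sigmoid(x):
    # numerically stable log sigma(x) = -log(1 + exp(-x))
    return -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))

def lam(xi):
    # lambda(xi) = tanh(xi/2) / (4 xi); equals E[omega]/2 for omega ~ PG(1, xi).
    # The limit as xi -> 0 is 1/8.
    return 0.125 if xi == 0 else math.tanh(xi / 2.0) / (4.0 * xi)

def jj_bound(x, xi):
    # Tangent quadratic lower bound of log sigma(x) with variational parameter xi
    return log_sigmoid(xi) + (x - xi) / 2.0 - lam(xi) * (x * x - xi * xi)

xi = 1.5
for x in [-3.0, -1.5, 0.0, 1.5, 3.0]:
    # the quadratic never exceeds the true logistic log-likelihood term
    assert jj_bound(x, xi) <= log_sigmoid(x) + 1e-12
# the bound is tight exactly at x = +xi and x = -xi
assert abs(jj_bound(xi, xi) - log_sigmoid(xi)) < 1e-12
assert abs(jj_bound(-xi, xi) - log_sigmoid(-xi)) < 1e-12
```

Because the bound is quadratic in x, substituting it for each logistic log-likelihood term leaves a Gaussian-conjugate objective, which is what restores closed-form variational updates.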
Hierarchically-coupled hidden Markov models for learning kinetic rates from single-molecule data
We address the problem of analyzing sets of noisy time-varying signals that
all report on the same process but confound straightforward analyses due to
complex inter-signal heterogeneities and measurement artifacts. In particular
we consider single-molecule experiments which indirectly measure the distinct
steps in a biomolecular process via observations of noisy time-dependent
signals such as a fluorescence intensity or bead position. Straightforward
hidden Markov model (HMM) analyses attempt to characterize such processes in
terms of a set of conformational states, the transitions that can occur between
these states, and the associated rates at which those transitions occur; but
require ad-hoc post-processing steps to combine multiple signals. Here we
develop a hierarchically coupled HMM that allows experimentalists to deal with
inter-signal variability in a principled and automatic way. Our approach is a
generalized expectation maximization hyperparameter point estimation procedure
with variational Bayes at the level of individual time series that learns a
single, interpretable representation of the overall data-generating process.

Comment: 9 pages, 5 figures
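The building block of any such HMM analysis is the per-signal likelihood computation; transition probabilities between conformational states are what encode the kinetic rates. A minimal sketch of a scaled forward pass for one noisy signal with Gaussian emissions (a generic HMM ingredient, not the authors' hierarchically coupled model; all numbers below are toy values):

```python
import math

def gauss_pdf(x, mu, sigma):
    # univariate Gaussian density, used as the emission model for a noisy signal
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def forward_loglik(obs, pi, A, mus, sigmas):
    """Scaled forward algorithm: log-likelihood of one observed signal under a
    K-state HMM with initial distribution pi, transition matrix A[i][j], and
    Gaussian emissions N(mus[k], sigmas[k]) per conformational state."""
    K = len(pi)
    alpha = [pi[k] * gauss_pdf(obs[0], mus[k], sigmas[k]) for k in range(K)]
    c = sum(alpha)                       # scaling constant avoids underflow
    alpha = [a / c for a in alpha]
    loglik = math.log(c)
    for x in obs[1:]:
        alpha = [gauss_pdf(x, mus[j], sigmas[j])
                 * sum(alpha[i] * A[i][j] for i in range(K))
                 for j in range(K)]
        c = sum(alpha)
        alpha = [a / c for a in alpha]
        loglik += math.log(c)
    return loglik

# toy two-state example: low- and high-intensity fluorescence levels
obs = [0.1, 0.0, 0.9, 1.1, 1.0, 0.2]
pi = [0.5, 0.5]
A = [[0.9, 0.1], [0.1, 0.9]]            # sticky transitions, i.e. slow kinetics
ll = forward_loglik(obs, pi, A, mus=[0.0, 1.0], sigmas=[0.2, 0.2])
assert math.isfinite(ll)
```

In an EM-style procedure of the kind the abstract describes, this forward pass (with its backward counterpart) supplies the per-signal expectations, while the hierarchical coupling shares hyperparameters across signals instead of relying on ad-hoc post-processing.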