Hidden Semi Markov Models for Multiple Observation Sequences: The mhsmm Package for R
This paper describes the R package mhsmm which implements estimation and prediction methods for hidden Markov and semi-Markov models for multiple observation sequences. Such techniques are of interest when observed data is thought to be dependent on some unobserved (or hidden) state. Hidden Markov models only allow a geometrically distributed sojourn time in a given state, while hidden semi-Markov models extend this by allowing an arbitrary sojourn distribution. We demonstrate the software with simulation examples and an application involving the modelling of the ovarian cycle of dairy cows.
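The geometric-sojourn property that distinguishes hidden Markov from hidden semi-Markov models is easy to check by simulation. The sketch below is plain Python rather than the mhsmm R package, and the self-transition probability 0.8 is an arbitrary choice for illustration: in a Markov chain, each additional step spent in a state occurs with the self-transition probability, so sojourn times are geometric with mean 1/(1 - p_stay).

```python
import random

def markov_sojourn(p_stay, n_samples, seed=0):
    # Each extra step in the state happens with probability p_stay,
    # so durations are geometric with mean 1 / (1 - p_stay).
    rng = random.Random(seed)
    durations = []
    for _ in range(n_samples):
        d = 1
        while rng.random() < p_stay:
            d += 1
        durations.append(d)
    return durations

durs = markov_sojourn(p_stay=0.8, n_samples=10_000)
mean_sojourn = sum(durs) / len(durs)  # close to 1 / 0.2 = 5
```

A hidden semi-Markov model replaces this implicit geometric law with an explicitly specified sojourn distribution, which is what mhsmm estimates.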
Fast Automatic Bayesian Cubature Using Matching Kernels and Designs
Automatic cubatures approximate integrals to user-specified error tolerances.
For high dimensional problems, it is difficult to adaptively change the
sampling pattern to focus on peaks because peaks can hide more easily in high
dimensional space. But one can automatically determine the sample size,
given a reasonable, fixed sampling pattern. This approach is pursued in
Jagadeeswaran and Hickernell, Stat. Comput., 29:1214-1229, 2019, where a
Bayesian perspective is used to construct a credible interval for the integral,
and the computation is terminated when the half-width of the interval is no
greater than the required error tolerance. Our earlier work employs integration
lattice sampling, and the computations are expedited by the fast Fourier
transform because the covariance kernels for the Gaussian process prior on the
integrand are chosen to be shift-invariant. In this chapter, we extend our fast
automatic Bayesian cubature to digital net sampling via \emph{digitally}
shift-invariant covariance kernels and fast Walsh transforms.
Our algorithm is implemented in the MATLAB Guaranteed Automatic Integration
Library (GAIL) and the QMCPy Python library. Comment: PhD thesis
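The stopping rule described above can be sketched in a few lines. This is a deliberately simplified stand-in: plain Monte Carlo with a CLT-based confidence interval replaces the paper's Bayesian credible interval over lattice or digital-net points, and all names (`auto_cubature`, the tolerance value) are invented for illustration; only the terminate-when-half-width-meets-tolerance logic is the point.

```python
import math, random

def auto_cubature(f, dim, tol, seed=0, z=2.58, n0=256, n_max=2**20):
    # Double the sample size until the interval half-width meets tol.
    # Plain Monte Carlo stands in for the Bayesian cubature of the text.
    rng = random.Random(seed)
    n = n0
    while True:
        vals = [f([rng.random() for _ in range(dim)]) for _ in range(n)]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / (n - 1)
        half_width = z * math.sqrt(var / n)
        if half_width <= tol or n >= n_max:
            return mean, half_width, n
        n *= 2

# The integral of x1 + x2 + x3 over the unit cube is 1.5.
est, hw, n = auto_cubature(lambda x: sum(x), dim=3, tol=0.025)
```

The fast transforms in the paper (FFT for lattices, fast Walsh for digital nets) make the per-iteration cost of the Bayesian version comparable to this naive loop.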
Building Proteins in a Day: Efficient 3D Molecular Reconstruction
Discovering the 3D atomic structure of molecules such as proteins and viruses
is a fundamental research problem in biology and medicine. Electron
Cryomicroscopy (Cryo-EM) is a promising vision-based technique for structure
estimation which attempts to reconstruct 3D structures from 2D images. This
paper addresses the challenging problem of 3D reconstruction from 2D Cryo-EM
images. A new framework for estimation is introduced which relies on modern
stochastic optimization techniques to scale to large datasets. We also
introduce a novel technique which reduces the cost of evaluating the objective
function during optimization by over five orders of magnitude. The net result
is an approach capable of estimating 3D molecular structure from large scale
datasets in about a day on a single workstation.Comment: To be presented at IEEE Conference on Computer Vision and Pattern
Recognition (CVPR) 201
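The scaling argument above rests on stochastic optimization: each update touches only a small random subset of the images, so per-iteration cost does not grow with the dataset. The toy sketch below illustrates that principle only; the function names, the one-dimensional objective, and all constants are invented, and the paper's actual objective (over 2D Cryo-EM images and orientations) is far richer.

```python
import random

def sgd(grad_fn, data, theta0, lr=0.05, batch=32, steps=50, seed=0):
    # Minibatch SGD: each step samples `batch` points, so the cost per
    # iteration is independent of len(data).
    rng = random.Random(seed)
    theta = theta0
    for _ in range(steps):
        sample = rng.sample(data, batch)
        g = sum(grad_fn(theta, d) for d in sample) / batch
        theta -= lr * g
    return theta

# Toy objective: squared error to noisy observations centred on 2.0.
data = [2.0 + random.Random(i).gauss(0, 0.1) for i in range(1000)]
theta = sgd(lambda t, d: 2.0 * (t - d), data, theta0=0.0)
```

Whether the dataset holds a thousand points or a million, each of the 50 steps above evaluates only 32 gradients.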
Kernel-Based Just-In-Time Learning for Passing Expectation Propagation Messages
We propose an efficient nonparametric strategy for learning a message
operator in expectation propagation (EP), which takes as input the set of
incoming messages to a factor node, and produces an outgoing message as output.
This learned operator replaces the multivariate integral required in classical
EP, which may not have an analytic expression. We use kernel-based regression,
which is trained on a set of probability distributions representing the
incoming messages, and the associated outgoing messages. The kernel approach
has two main advantages: first, it is fast, as it is implemented using a novel
two-layer random feature representation of the input message distributions;
second, it has principled uncertainty estimates, and can be cheaply updated
online, meaning it can request and incorporate new training data when it
encounters inputs on which it is uncertain. In experiments, our approach is
able to solve learning problems where a single message operator is required for
multiple, substantially different data sets (logistic regression for a variety
of classification problems), where it is essential to accurately assess
uncertainty and to efficiently and robustly update the message operator. Comment: accepted to UAI 2015. Corrected typos. Added more content to the
appendix. Main results unchanged.
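The speed claim rests on random feature representations. The paper uses a two-layer construction tailored to distributions; the sketch below shows only the standard single-layer building block (random Fourier features in the sense of Rahimi and Recht), with dimensions and test points chosen arbitrarily: an inner product of feature maps approximates a Gaussian kernel evaluation, turning kernel regression into cheap linear algebra.

```python
import math, random

def rff(x, ws, bs):
    # Random Fourier feature map: z(x) . z(y) approximates the Gaussian
    # kernel exp(-||x - y||^2 / 2) when w ~ N(0, I), b ~ U[0, 2*pi].
    scale = math.sqrt(2.0 / len(ws))
    return [scale * math.cos(sum(wi * xi for wi, xi in zip(w, x)) + b)
            for w, b in zip(ws, bs)]

random.seed(0)
d, D = 2, 5000
ws = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(D)]
bs = [random.uniform(0.0, 2.0 * math.pi) for _ in range(D)]

x, y = [0.3, -0.1], [0.5, 0.2]
approx = sum(a * b for a, b in zip(rff(x, ws, bs), rff(y, ws, bs)))
exact = math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)) / 2.0)
```

The approximation error shrinks like 1/sqrt(D), so a few thousand features suffice for the accuracies typical of message approximation.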
A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms
The benefits of automating design cycles for Bayesian inference-based
algorithms are becoming increasingly recognized by the machine learning
community. As a result, interest in probabilistic programming frameworks has
much increased over the past few years. This paper explores a specific
probabilistic programming paradigm, namely message passing in Forney-style
factor graphs (FFGs), in the context of automated design of efficient Bayesian
signal processing algorithms. To this end, we developed "ForneyLab"
(https://github.com/biaslab/ForneyLab.jl) as a Julia toolbox for message
passing-based inference in FFGs. We show by example how ForneyLab enables
automatic derivation of Bayesian signal processing algorithms, including
algorithms for parameter estimation and model comparison. Crucially, due to the
modular makeup of the FFG framework, both the model specification and inference
methods are readily extensible in ForneyLab. In order to test this framework,
we compared variational message passing as implemented by ForneyLab with
automatic differentiation variational inference (ADVI) and Monte Carlo methods
as implemented by state-of-the-art tools "Edward" and "Stan". In terms of
performance, extensibility, and stability, ForneyLab appears to enjoy an
edge relative to its competitors for automated inference in state-space models. Comment: Accepted for publication in the International Journal of Approximate
Reasoning
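The message-passing idea that ForneyLab automates can be seen on a minimal example. This is a toy discrete sum-product computation written in plain Python, not ForneyLab's Julia API: on the chain factor graph f(a) — g(a,b) — h(b), the marginal of a is obtained by summing b out of g (the message from g to a) and multiplying by the local factor.

```python
def marginal_via_messages(f, g, h):
    # Sum-product on the chain f(a) -- g(a, b) -- h(b): the message from
    # g to a sums out b; the marginal of a is the normalized product of
    # the incoming messages at a.
    states = range(len(f))
    msg_g_to_a = [sum(g[a][b] * h[b] for b in states) for a in states]
    unnorm = [f[a] * msg_g_to_a[a] for a in states]
    Z = sum(unnorm)
    return [u / Z for u in unnorm]

f = [0.6, 0.4]                # unary factor on a
g = [[0.9, 0.1], [0.2, 0.8]]  # pairwise factor g(a, b)
h = [0.5, 0.5]                # unary factor on b
p_a = marginal_via_messages(f, g, h)
```

On larger graphs the same local sum-and-multiply pattern, scheduled automatically over a Forney-style factor graph, is exactly what a toolbox like ForneyLab derives for the user.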
EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under MATLAB. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data; and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.