21,751 research outputs found
An Efficient Bayesian Inference Framework for Coalescent-Based Nonparametric Phylodynamics
Phylodynamics focuses on the problem of reconstructing past population size
dynamics from current genetic samples taken from the population of interest.
This technique has been extensively used in many areas of biology, but is
particularly useful for studying the spread of quickly evolving infectious
disease agents, e.g., the influenza virus. Phylodynamic inference uses a
coalescent model that defines a probability density for the genealogy of
randomly sampled individuals from the population. When we assume that such a
genealogy is known, the coalescent model, equipped with a Gaussian process
prior on population size trajectory, allows for nonparametric Bayesian
estimation of population size dynamics. While this approach is quite powerful,
large data sets collected during infectious disease surveillance challenge the
state of the art in Bayesian phylodynamics and demand a computationally more
efficient inference framework. To satisfy this demand, we provide a
computationally efficient Bayesian inference framework based on Hamiltonian
Monte Carlo for coalescent process models. Moreover, we show that by splitting
the Hamiltonian function we can further improve the efficiency of this
approach. Using several simulated and real datasets, we show that our method
provides accurate estimates of population size dynamics and is substantially
faster than alternative methods based on elliptical slice sampling and the
Metropolis-adjusted Langevin algorithm.
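The abstract leaves the coalescent-specific details to the paper itself, but the machinery it builds on is standard Hamiltonian Monte Carlo with leapfrog integration. The following is a minimal sketch of that machinery in Python; the function names and the toy Gaussian target are illustrative placeholders, not the paper's phylodynamic model or its split-Hamiltonian variant.

```python
import numpy as np

def hmc_sample(log_post, grad_log_post, x0, n_samples=1000,
               step_size=0.1, n_leapfrog=20, rng=None):
    """Basic Hamiltonian Monte Carlo with a leapfrog integrator.

    log_post / grad_log_post: log target density and its gradient.
    """
    rng = rng or np.random.default_rng(0)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)           # fresh Gaussian momentum
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog: half momentum step, alternating full steps, half step.
        p_new += 0.5 * step_size * grad_log_post(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new += step_size * grad_log_post(x_new)
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_post(x_new)
        # Metropolis correction on the joint Hamiltonian H = -log p(x) + K(p).
        h_old = -log_post(x) + 0.5 * p @ p
        h_new = -log_post(x_new) + 0.5 * p_new @ p_new
        if np.log(rng.random()) < h_old - h_new:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Toy usage: sample a 2-D standard normal.
logp = lambda x: -0.5 * x @ x
grad = lambda x: -x
draws = hmc_sample(logp, grad, x0=np.zeros(2))
print(draws.mean(axis=0), draws.std(axis=0))
```

The splitting idea mentioned in the abstract amounts to decomposing the Hamiltonian so that an analytically tractable part of the dynamics is simulated exactly and only the remainder is integrated numerically, which typically permits larger step sizes than the plain leapfrog scheme sketched here.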
An Efficient Primal-Dual Prox Method for Non-Smooth Optimization
We study non-smooth optimization problems in machine learning, where both
the loss function and the regularizer are non-smooth functions. Previous
studies on efficient empirical loss minimization assume either a smooth loss
function or a strongly convex regularizer, making them unsuitable for
non-smooth optimization. We develop a simple yet efficient method for a family
of non-smooth optimization problems where the dual form of the loss function is
bilinear in the primal and dual variables. We cast a non-smooth optimization
problem into a minimax optimization problem, and develop a primal-dual prox
method that solves the minimax optimization problem at a rate of $O(1/T)$,
assuming that the proximal step can be solved efficiently, significantly
faster than a standard subgradient descent method that has an $O(1/\sqrt{T})$
convergence rate. Our empirical study verifies the efficiency of the proposed
method for various non-smooth optimization problems that arise ubiquitously in
machine learning by comparing it to state-of-the-art first-order methods.
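As a concrete illustration of the saddle-point setting described above (a non-smooth loss and a non-smooth regularizer coupled bilinearly through the data matrix), the following is a small primal-dual prox sketch in Python, in the Chambolle-Pock style, for $\min_x \|Ax - b\|_1 + \lambda \|x\|_1$. This is a generic member of the same algorithmic family, not necessarily the paper's exact method, and the problem instance and step sizes are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: shrink each coordinate toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def primal_dual_prox(A, b, lam=0.1, n_iter=500):
    """Primal-dual prox iterations for min_x ||Ax - b||_1 + lam * ||x||_1,
    viewed as the saddle problem
        min_x max_{||y||_inf <= 1}  y^T (A x - b) + lam * ||x||_1.
    """
    m, n = A.shape
    L = np.linalg.norm(A, 2)      # spectral norm bounds the bilinear coupling
    tau = sigma = 1.0 / L         # step sizes satisfy tau * sigma * L**2 <= 1
    x, y = np.zeros(n), np.zeros(m)
    x_bar = x.copy()
    for _ in range(n_iter):
        # Dual ascent step, then projection onto the l_inf unit ball.
        y = np.clip(y + sigma * (A @ x_bar - b), -1.0, 1.0)
        # Primal descent step, then the soft-thresholding prox of the l1 term.
        x_new = soft_threshold(x - tau * (A.T @ y), tau * lam)
        x_bar = 2 * x_new - x     # extrapolation for the next dual step
        x = x_new
    return x

# Toy usage: recover a sparse vector via l1 regression on noisy data.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)
print(primal_dual_prox(A, b))
```

Both per-iteration prox steps here are closed-form (a clip and a soft-threshold), which is exactly the "proximal step can be solved efficiently" assumption behind the $O(1/T)$ rate; a subgradient method on the same objective would need no prox machinery but converges at only $O(1/\sqrt{T})$.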
- …