Bayesian Pose Graph Optimization via Bingham Distributions and Tempered Geodesic MCMC
We introduce Tempered Geodesic Markov Chain Monte Carlo (TG-MCMC) algorithm
for initializing pose graph optimization problems, arising in various scenarios
such as SfM (structure from motion) or SLAM (simultaneous localization and
mapping). TG-MCMC is the first of its kind, as it unites asymptotically global
non-convex optimization on the spherical manifold of quaternions with posterior
sampling, in order to provide both reliable initial poses and uncertainty
estimates that are informative about the quality of individual solutions. We
devise rigorous theoretical convergence guarantees for our method and
extensively evaluate it on synthetic and real benchmark datasets. Besides its
elegance in formulation and theory, we show that our method is robust to
missing data and noise, and that the estimated uncertainties capture intuitive
properties of the data.
Comment: Published at NeurIPS 2018, 25 pages with supplement
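TG-MCMC itself combines tempering, geodesic integration, and posterior sampling on the quaternion sphere; as a minimal sketch of only the manifold-sampling ingredient (not the authors' algorithm), the following runs a projected Langevin sampler on the unit sphere S^3, assuming a user-supplied log-posterior gradient. The von Mises-Fisher-style toy target and all step sizes are illustrative.

```python
import numpy as np

def tangent_project(q, g):
    """Project a Euclidean vector onto the tangent space of the sphere at q."""
    return g - np.dot(g, q) * q

def sphere_langevin(logp_grad, q0, n_steps=2000, step=1e-2, rng=None):
    """Projected (retraction-based) Langevin sampler on the unit sphere.

    A generic sketch, not TG-MCMC: gradient step plus Gaussian noise,
    projected to the tangent space, followed by renormalisation back
    onto the sphere.
    """
    rng = np.random.default_rng(rng)
    q = q0 / np.linalg.norm(q0)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(q.shape)
        v = tangent_project(q, step * logp_grad(q) + np.sqrt(2 * step) * noise)
        q = q + v
        q /= np.linalg.norm(q)          # retraction back onto S^3
        samples.append(q.copy())
    return np.array(samples)

# Toy von Mises-Fisher-style target on the quaternion sphere S^3:
# log p(q) = kappa * <mu, q>, so the Euclidean gradient is kappa * mu.
mu = np.array([1.0, 0.0, 0.0, 0.0])
kappa = 20.0
grad = lambda q: kappa * mu

qs = sphere_langevin(grad, np.array([0.0, 1.0, 0.0, 0.0]), rng=0)
print(np.mean(qs[-500:] @ mu))          # concentrates near 1
```

Renormalising after each step is the simplest retraction; geodesic integrators, as used by the paper, follow the manifold exactly instead.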
Mirrored Langevin Dynamics
We consider the problem of sampling from constrained distributions, which has posed significant challenges to both non-asymptotic analysis and algorithmic design. We propose a unified framework, inspired by classical mirror descent, to derive novel first-order sampling schemes. We prove that, for a general target distribution with strongly convex potential, our framework implies the existence of a first-order algorithm achieving Õ(ε^{-2}d) convergence, suggesting that the state-of-the-art Õ(ε^{-6}d^5) can be vastly improved. With the important Latent Dirichlet Allocation (LDA) application in mind, we specialize our algorithm to sample from Dirichlet posteriors, and derive the first non-asymptotic Õ(ε^{-2}d^2) rate for first-order sampling. We further extend our framework to the mini-batch setting and prove convergence rates when only stochastic gradients are available. Finally, we report promising experimental results for LDA on real datasets.
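The framework's general form targets Dirichlet posteriors; as a one-dimensional sketch of the idea (not the paper's algorithm), the entropic mirror map on (0, 1) has gradient equal to the logit, and running unadjusted Langevin in the dual variable y = logit(x) samples a Beta target. The target parameters and step size below are illustrative.

```python
import numpy as np

def mirrored_langevin_beta(a, b, n_steps=20000, step=5e-2, rng=None):
    """Sample Beta(a, b) on (0, 1) via Langevin dynamics in the mirror
    (dual) variable y = logit(x).

    The pushforward of Beta(a, b) under the logit has potential
    W(y) = -a*log(sigmoid(y)) - b*log(1 - sigmoid(y)),
    whose gradient is (a + b)*sigmoid(y) - a, so the dual dynamics are
    unconstrained.  A sketch of the mechanism, not the paper's general
    mini-batch scheme.
    """
    rng = np.random.default_rng(rng)
    y = 0.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        x = 1.0 / (1.0 + np.exp(-y))   # mirror back to the primal space
        grad_w = (a + b) * x - a
        y = y - step * grad_w + np.sqrt(2 * step) * rng.standard_normal()
        xs[k] = x
    return xs

xs = mirrored_langevin_beta(3.0, 2.0, rng=0)
print(xs.mean())   # should approach a / (a + b) = 0.6
```

Note that the iterates never touch the constraint boundary: the mirror map absorbs the constraint, which is exactly what makes first-order analysis tractable.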
Generalized Fiducial Inference on Differentiable Manifolds
We introduce a novel approach to inference on parameters that take values in
a Riemannian manifold embedded in a Euclidean space. Parameter spaces of this
form are ubiquitous across many fields, including chemistry, physics, computer
graphics, and geology. This new approach uses generalized fiducial inference to
obtain a posterior-like distribution on the manifold, without needing to know a
parameterization that maps the constrained space to an unconstrained Euclidean
space. The proposed methodology, called the constrained generalized fiducial
distribution (CGFD), is obtained by using mathematical tools from Riemannian
geometry. A Bernstein-von Mises-type result for the CGFD, which provides
intuition for how the desirable asymptotic qualities of the unconstrained
generalized fiducial distribution are inherited by the CGFD, is provided. To
demonstrate the practical use of the CGFD, we provide three proof-of-concept
examples: inference for data from a multivariate normal density with the mean
parameters on a sphere, a linear logspline density estimation problem, and a
reimagined approach to the AR(1) model, all of which exhibit desirable
coverages via simulation. We discuss two Markov chain Monte Carlo algorithms
for the exploration of these constrained parameter spaces and adapt them for
the CGFD.
Comment: 31 pages, 7 figures
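The abstract mentions Markov chain Monte Carlo for exploring constrained parameter spaces; as a generic sketch (not the authors' specific samplers), the following runs random-walk Metropolis on the unit sphere by perturbing with isotropic Gaussian noise and projecting back. Because the projected proposal depends only on the angle between states, it is symmetric and the acceptance ratio needs no correction. The von Mises-Fisher toy target is an assumption for illustration.

```python
import numpy as np

def sphere_rwm(logp, q0, n_steps=5000, scale=0.3, rng=None):
    """Random-walk Metropolis on the unit sphere: perturb, project back,
    accept/reject.  A generic constrained-space sampler sketch."""
    rng = np.random.default_rng(rng)
    q = q0 / np.linalg.norm(q0)
    samples = []
    for _ in range(n_steps):
        prop = q + scale * rng.standard_normal(q.shape)
        prop /= np.linalg.norm(prop)            # project onto the sphere
        if np.log(rng.uniform()) < logp(prop) - logp(q):
            q = prop
        samples.append(q.copy())
    return np.array(samples)

# Toy target: von Mises-Fisher density on S^2 (up to normalisation),
# standing in for, e.g., a spherical mean parameter's posterior.
mu = np.array([0.0, 0.0, 1.0])
logp = lambda q: 5.0 * (q @ mu)

qs = sphere_rwm(logp, np.array([1.0, 0.0, 0.0]), rng=0)
print(qs[-1000:].mean(axis=0))   # mean direction roughly aligned with mu
```

The key property matching the paper's setting is that no global parameterization of the sphere is ever needed: every operation is intrinsic or a projection from the ambient space.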
Riemannian Langevin Algorithm for Solving Semidefinite Programs
We propose a Langevin diffusion-based algorithm for non-convex optimization
and sampling on a product manifold of spheres. Under a logarithmic Sobolev
inequality, we establish a guarantee for finite iteration convergence to the
Gibbs distribution in terms of Kullback--Leibler divergence. We show that with
an appropriate temperature choice, the suboptimality gap to the global minimum
is guaranteed to be arbitrarily small with high probability.
As an application, we consider the Burer--Monteiro approach for solving a
semidefinite program (SDP) with diagonal constraints, and analyze the proposed
Langevin algorithm for optimizing the non-convex objective. In particular, we
establish a logarithmic Sobolev inequality for the Burer--Monteiro problem when
there are no spurious local minima, but in the presence of saddle points.
Combining the results, we then provide a global optimality guarantee for the
SDP and the Max-Cut problem. More precisely, we show that the Langevin
algorithm achieves the desired accuracy with high probability in a finite
number of iterations.
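As a minimal numerical sketch of the setting (not the paper's analysed scheme or tuned parameters), the following runs a projected Riemannian Langevin iteration on the product of unit spheres, minimising the Burer-Monteiro objective ⟨C, YY^T⟩ over matrices Y with unit-norm rows. The toy graph, step size, and temperature are illustrative assumptions.

```python
import numpy as np

def riemannian_langevin_bm(C, k, n_steps=3000, step=1e-3, beta=1e3, rng=None):
    """Riemannian Langevin on a product of unit spheres (the rows of Y),
    minimising f(Y) = <C, Y Y^T>: the Burer-Monteiro factorisation of an
    SDP with diagonal constraints.  Gradient and noise are projected to
    each row's tangent space; renormalisation serves as the retraction.
    """
    rng = np.random.default_rng(rng)
    n = C.shape[0]
    Y = rng.standard_normal((n, k))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)
    for _ in range(n_steps):
        G = 2 * C @ Y                                   # Euclidean gradient
        G -= np.sum(G * Y, axis=1, keepdims=True) * Y   # tangent projection
        noise = rng.standard_normal((n, k))
        noise -= np.sum(noise * Y, axis=1, keepdims=True) * Y
        Y = Y - step * G + np.sqrt(2 * step / beta) * noise
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)   # retract to spheres
    return Y

# Toy Max-Cut-style instance: with C = -L/4 (L the graph Laplacian),
# minimising <C, YY^T> is the relaxed Max-Cut problem.
rng = np.random.default_rng(0)
A = (rng.random((8, 8)) < 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T
L = np.diag(A.sum(1)) - A
Y = riemannian_langevin_bm(-L / 4, k=3, rng=1)
print(np.trace(Y.T @ (-L / 4) @ Y))   # relaxed objective value (negative)
```

The small temperature 1/beta makes the dynamics behave almost like Riemannian gradient descent while the noise helps escape saddle points, which is the regime the paper's suboptimality guarantee concerns.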
The bracket geometry of statistics
In this thesis we build a geometric theory of Hamiltonian Monte Carlo, with an emphasis on symmetries and its bracket generalisations, construct the canonical geometry of smooth measures and Stein operators, and derive the complete recipe of measure-constraints preserving dynamics and diffusions on arbitrary manifolds.
Specifically, we will explain the central role played by mechanics with symmetries to obtain efficient numerical integrators, and provide a general method to construct explicit integrators for HMC on geodesic orbit manifolds via symplectic reduction.
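On geodesic orbit manifolds such as the sphere, the geodesic flow is available in closed form, which is what makes explicit HMC integrators possible. As a concrete standard instance (shown only to illustrate the kind of integrator the thesis constructs in far greater generality), here is one geodesic leapfrog step on the unit sphere:

```python
import numpy as np

def geodesic_flow(x, v, t):
    """Exact geodesic flow on the unit sphere: great-circle rotation."""
    speed = np.linalg.norm(v)
    if speed < 1e-12:
        return x, v
    c, s = np.cos(speed * t), np.sin(speed * t)
    return c * x + s * v / speed, -speed * s * x + c * v

def geodesic_leapfrog(x, v, grad_logp, eps):
    """One split HMC step on the sphere: half kick with the
    tangent-projected gradient, exact geodesic drift, half kick."""
    g = grad_logp(x)
    v = v + 0.5 * eps * (g - (g @ x) * x)   # kick, projected to tangent space
    x, v = geodesic_flow(x, v, eps)          # drift along the geodesic
    g = grad_logp(x)
    v = v + 0.5 * eps * (g - (g @ x) * x)
    return x, v

# Sanity check: the step preserves the sphere constraint and tangency.
x = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 0.5, 0.2])
x, v = geodesic_leapfrog(x, v, lambda y: -y, eps=0.1)
print(np.linalg.norm(x), x @ v)   # ~1 and ~0
```

Because the drift is the exact geodesic flow rather than a Euclidean step plus projection, the constraint is preserved to machine precision with no stabilisation needed.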
Following ideas developed by Maxwell, Volterra, Poincaré, de Rham, Koszul, Dufour, Weinstein, and others, we will then show that any smooth distribution generates considerable geometric content, including "musical" isomorphisms between multi-vector fields and twisted differential forms, and a boundary operator, the rotationnel (curl), which, in particular, engenders the canonical Stein operator.
We then introduce the ``bracket formalism" and its induced mechanics, a generalisation of Poisson mechanics and gradient flows that provides a general mechanism to associate unnormalised probability densities to flows depending on the score pointwise.
Most importantly, we will characterise all measure-constraints preserving flows on arbitrary manifolds, showing the intimate relation between measure-preserving Nambu mechanics and closed twisted forms.
Our results are canonical. As a special case we obtain the characterisation of measure-preserving bracket mechanical systems and measure-preserving diffusions, thus explaining and extending to manifolds the complete recipe of SGMCMC.
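In the flat Euclidean special case, the complete recipe of SGMCMC (with identity diffusion matrix and no skew term) reduces to stochastic gradient Langevin dynamics. A minimal sketch, with an unbiased minibatch gradient estimator and a toy Gaussian-mean target; model, batch size, and step size are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def sgld(grad_logp_minibatch, data, theta0, n_steps=5000, step=1e-2,
         batch=10, rng=None):
    """Stochastic gradient Langevin dynamics: the D = I, Q = 0 instance
    of the complete SGMCMC recipe, driven by minibatch gradients."""
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    out = np.empty((n_steps,) + theta.shape)
    for k in range(n_steps):
        idx = rng.choice(len(data), size=batch, replace=False)
        g = grad_logp_minibatch(theta, data[idx], len(data))
        theta = theta + step * g + np.sqrt(2 * step) * rng.standard_normal(theta.shape)
        out[k] = theta
    return out

# Toy target: posterior over the mean of a unit-variance Gaussian, flat prior.
rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=200)

def grad(theta, batch_x, n):
    # Rescale the minibatch log-likelihood gradient by n/|batch| for unbiasedness.
    return (n / len(batch_x)) * np.sum(batch_x - theta)

thetas = sgld(grad, data, 0.0, step=1e-3, rng=1)
print(thetas[-2000:].mean())   # near the sample mean of the data
```

The thesis's contribution is the manifold counterpart: characterising every choice of bracket and diffusion that leaves the target measure invariant, of which this flat scheme is the simplest instance.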
We will discuss the geometry of Stein operators and extend the density approach by showing these are simply a reformulation of the exterior derivative on twisted forms satisfying Stokes' theorem.
Combining the canonical Stein operator with brackets allows us to naturally recover the Riemannian and diffusion Stein operators as special cases.
Finally, we shall introduce the minimum Stein discrepancy estimators, which provide a unifying perspective of parameter inference based on score matching, contrastive divergence, and minimum probability flow.