
    Convergence Rate of Riemannian Hamiltonian Monte Carlo and Faster Polytope Volume Computation

    We give the first rigorous proof of the convergence of Riemannian Hamiltonian Monte Carlo, a general (and practical) method for sampling Gibbs distributions. Our analysis shows that the rate of convergence is bounded in terms of natural smoothness parameters of an associated Riemannian manifold. We then apply the method with the manifold defined by the log barrier function to the problems of (1) uniformly sampling a polytope and (2) computing its volume, the latter by extending Gaussian cooling to the manifold setting. In both cases, the total number of steps needed is O^{*}(mn^{\frac{2}{3}}), improving the state of the art. A key ingredient of our analysis is a proof of an analog of the KLS conjecture for Gibbs distributions over manifolds.
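
    As a rough illustration of the objects involved, the sketch below (Python/NumPy; an assumed illustration, not the authors' implementation) writes out the Hamiltonian used when the Riemannian metric is the Hessian of the log barrier of a polytope {x : Ax < b}; uniform sampling of the polytope corresponds to f being identically zero.

```python
import numpy as np

def log_barrier_metric(A, b, x):
    """Hessian of the log barrier of {x : Ax < b}: g(x) = A^T diag(1/s^2) A with s = b - Ax."""
    s = b - A @ x                              # slacks; strictly positive inside the polytope
    return A.T @ ((1.0 / s**2)[:, None] * A)

def hamiltonian(A, b, x, v, f):
    """H(x, v) = f(x) + 1/2 log det g(x) + 1/2 v^T g(x)^{-1} v for the target density e^{-f}."""
    g = log_barrier_metric(A, b, x)
    _, logdet = np.linalg.slogdet(g)
    return f(x) + 0.5 * logdet + 0.5 * v @ np.linalg.solve(g, v)

def draw_velocity(A, b, x, rng):
    """Refresh the velocity: v ~ N(0, g(x)), via a Cholesky factor of the metric."""
    g = log_barrier_metric(A, b, x)
    return np.linalg.cholesky(g) @ rng.standard_normal(len(x))
```

    Because this metric blows up near the facets, Hamiltonian trajectories slow down at the boundary and stay inside the polytope without explicit rejections.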

    Condition-number-independent convergence rate of Riemannian Hamiltonian Monte Carlo with numerical integrators

    We study the convergence rate of discretized Riemannian Hamiltonian Monte Carlo for sampling from distributions of the form e^{-f(x)} on a convex body \mathcal{M} \subset \mathbb{R}^{n}. We show that for distributions of the form e^{-\alpha^{\top} x} on a polytope with m constraints, the convergence rate of a family of commonly used integrators is independent of \|\alpha\|_{2} and of the geometry of the polytope. In particular, the implicit midpoint method (IMM) and the generalized Leapfrog method (LM) have a mixing time of \widetilde{O}(mn^{3}) to achieve \epsilon total variation distance to the target distribution. These guarantees are based on a general bound on the convergence rate for densities of the form e^{-f(x)} in terms of parameters of the manifold and the integrator. Our theoretical guarantee complements the empirical results of [KLSV22], which show that RHMC with IMM can efficiently sample ill-conditioned, non-smooth and constrained distributions in very high dimension in practice. Comment: Improved writing & Theory for arXiv:2202.0190
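
    As a concrete illustration of one of the integrators analyzed, the following minimal sketch (Python/NumPy; the fixed-point solver, iteration count and tolerance are assumptions, not the paper's implementation) performs a single implicit midpoint step for a Hamiltonian supplied through its partial derivatives.

```python
import numpy as np

def implicit_midpoint_step(x, v, h, dH_dx, dH_dv, iters=30, tol=1e-12):
    """One implicit midpoint step for Hamilton's equations x' = dH/dv, v' = -dH/dx,
    solved approximately by fixed-point iteration."""
    x_new, v_new = x.copy(), v.copy()
    for _ in range(iters):
        xm, vm = 0.5 * (x + x_new), 0.5 * (v + v_new)   # midpoint of the step
        x_next = x + h * dH_dv(xm, vm)
        v_next = v - h * dH_dx(xm, vm)
        done = max(np.max(np.abs(x_next - x_new)), np.max(np.abs(v_next - v_new))) < tol
        x_new, v_new = x_next, v_next
        if done:
            break
    return x_new, v_new
```

    The exact implicit midpoint map is symmetric and symplectic, which is why it can serve as a reversible proposal inside a Metropolis-adjusted RHMC step.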

    Practical Volume Estimation by a New Annealing Schedule for Cooling Convex Bodies

    We study the problem of estimating the volume of convex polytopes, focusing on H- and V-polytopes, as well as zonotopes. Although a lot of effort is devoted to practical algorithms for H-polytopes, there is no such method for the latter two representations. We propose a new, practical algorithm for all representations, which is faster than existing methods. It relies on Hit-and-Run sampling, and combines a new simulated annealing method with the Multiphase Monte Carlo (MMC) approach. Our method introduces the following key features to make it adaptive: (a) It defines a sequence of convex bodies in MMC by introducing a new annealing schedule, whose length is shorter than in previous methods with high probability, and the need for computing an enclosing and an inscribed ball is removed; (b) It exploits statistical properties of rejection sampling and proposes a better empirical convergence criterion for specifying each step; (c) For zonotopes, it may use a sequence of convex bodies for MMC other than balls, where the chosen body adapts to the input. We offer an open-source, optimized C++ implementation, and analyze its performance to show that it outperforms state-of-the-art software for H-polytopes by Cousins-Vempala (2016) and Emiris-Fisikopoulos (2018), while it undertakes volume computations that were intractable until now, as it is the first polynomial-time, practical method for V-polytopes and zonotopes that scales to high dimensions (currently 100). We further focus on zonotopes, and characterize them by their order (number of generators over dimension), because this largely determines sampling complexity. We analyze a related application, where we evaluate methods of zonotope approximation in engineering. Comment: 20 pages, 12 figures, 3 tables
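
    For intuition, the Multiphase Monte Carlo backbone that the annealing schedule feeds can be summarized as the telescoping product below (a generic Python/NumPy sketch with an assumed body interface and sampler; the paper's optimized C++ implementation and its specific schedule differ).

```python
import numpy as np

def mmc_volume(bodies, vol_smallest, sample_uniform, n_samples=1000, rng=None):
    """Estimate vol(K_m) from a nested sequence K_0 <= K_1 <= ... <= K_m.

    bodies         : list [K_0, ..., K_m]; each body exposes a membership test contains(x).
    vol_smallest   : exact volume of K_0 (e.g. a ball or cube).
    sample_uniform : callable(body, n, rng) returning n approximately uniform points
                     from `body`, e.g. produced by Hit-and-Run.
    """
    rng = np.random.default_rng() if rng is None else rng
    volume = vol_smallest
    for K_small, K_big in zip(bodies[:-1], bodies[1:]):
        pts = sample_uniform(K_big, n_samples, rng)
        frac = np.mean([K_small.contains(p) for p in pts])  # estimates vol(K_i) / vol(K_{i+1})
        volume /= max(frac, 1e-12)                          # volume now estimates vol(K_{i+1})
    return volume
```

    The cost is dominated by the number of phases and the number of samples per phase, which is why a shorter annealing schedule and a sharper per-phase convergence criterion translate directly into speedups.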

    Optimal Convergence Rate of Hamiltonian Monte Carlo for Strongly Logconcave Distributions

    We study Hamiltonian Monte Carlo (HMC) for sampling from a strongly logconcave density proportional to e^{-f} where f:R^d -> R is mu-strongly convex and L-smooth (the condition number is kappa = L/mu). We show that the relaxation time (inverse of the spectral gap) of ideal HMC is O(kappa), improving on the previous best bound of O(kappa^{1.5}); we complement this with an example where the relaxation time is Omega(kappa). When implemented using a nearly optimal ODE solver, HMC returns an epsilon-approximate point in 2-Wasserstein distance using O~((kappa d)^{0.5} epsilon^{-1}) gradient evaluations per step and O~((kappa d)^{1.5} epsilon^{-1}) total time.
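
    For reference, a standard Metropolis-adjusted leapfrog HMC step for a density proportional to exp(-f) is sketched below (Python/NumPy; the paper analyzes idealized HMC and a nearly optimal ODE solver, so this textbook leapfrog version is only an assumed stand-in).

```python
import numpy as np

def hmc_step(x, f, grad_f, step, n_leap, rng):
    """One leapfrog HMC step targeting the density proportional to exp(-f(x))."""
    v0 = rng.standard_normal(x.shape)              # momentum refresh
    x_new, v = x.copy(), v0.copy()
    v -= 0.5 * step * grad_f(x_new)                # initial half kick
    for _ in range(n_leap):
        x_new += step * v                          # drift
        v -= step * grad_f(x_new)                  # kick
    v += 0.5 * step * grad_f(x_new)                # trim the last kick back to a half kick
    # Metropolis correction with H(x, v) = f(x) + |v|^2 / 2
    h_old = f(x) + 0.5 * v0 @ v0
    h_new = f(x_new) + 0.5 * v @ v
    return x_new if rng.random() < np.exp(min(0.0, h_old - h_new)) else x
```

    The result above concerns how the relaxation time of the idealized dynamics scales with kappa; the discretization (here, leapfrog; in the paper, a nearly optimal ODE solver) determines the per-step gradient cost.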

    Geometric Algorithms for Sampling the Flux Space of Metabolic Networks

    Systems Biology is a fundamental field and paradigm that introduces a new era in Biology. The crux of its functionality and usefulness relies on metabolic networks that model the reactions occurring inside an organism and provide the means to understand the underlying mechanisms that govern biological systems. Moreover, metabolic networks have a broader impact that ranges from the resolution of ecosystems to personalized medicine. The analysis of metabolic networks is a computational-geometry-oriented field, as one of the main operations it depends on is uniformly sampling points from polytopes; the latter provide a representation of the steady states of the metabolic networks. However, the polytopes that result from biological data are of very high dimension (on the order of thousands) and in most, if not all, cases are considerably skinny. Therefore, to perform uniform random sampling efficiently in this setting, we need a novel algorithmic and computational framework specially tailored to the properties of metabolic networks. We present a complete software framework to handle sampling in metabolic networks. Its backbone is a Multiphase Monte Carlo Sampling (MMCS) algorithm that unifies rounding and sampling in one pass, obtaining both upon termination. It exploits an improved variant of the Billiard Walk that enjoys faster arithmetic complexity per step. We demonstrate the efficiency of our approach by performing extensive experiments on various metabolic networks. Notably, sampling on the most complicated human metabolic network accessible today, Recon3D, corresponding to a polytope of dimension 5335, took less than 30 hours. To our knowledge, that is out of reach for existing software.
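
    The Billiard Walk at the core of this pipeline can be sketched in a few lines (Python/NumPy; parameter names and the give-up rule after too many reflections are assumptions drawn from the standard Billiard Walk, not the paper's improved variant).

```python
import numpy as np

def billiard_walk_step(A, b, x, tau, rng, max_reflections=50):
    """One Billiard Walk step inside the polytope {x : Ax <= b}: travel a random
    length ell = -tau * log(U) along a random direction, reflecting off facets."""
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)                         # uniform random direction
    ell = -tau * np.log(rng.random())              # random trajectory length
    p = x.copy()
    for _ in range(max_reflections):
        slack = b - A @ p
        denom = A @ u
        with np.errstate(divide="ignore", invalid="ignore"):
            t = np.where(denom > 1e-12, slack / denom, np.inf)
        t_hit = float(np.min(t))                   # time to the first facet hit
        if ell <= t_hit:
            return p + ell * u                     # trajectory ends strictly inside
        j = int(np.argmin(t))
        p = p + t_hit * u                          # move to facet j
        n = A[j] / np.linalg.norm(A[j])            # facet normal
        u = u - 2.0 * (u @ n) * n                  # specular reflection
        ell -= t_hit
    return x                                       # too many reflections: reject the step
```

    In this naive form every bounce costs matrix-vector products with A; reducing that per-step arithmetic cost is where the improved variant used in the paper gains.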

    Improving Polytope Volume Computations Based on Hamiltonian Monte Carlo with Boundary Reflections and Sweetened Arithmetics

    Computing the volume of a high-dimensional polytope is a fundamental problem in geometry, also connected to the calculation of densities of states in statistical physics, and a central building block of such algorithms is the method used to sample a target probability distribution. This paper studies Hamiltonian Monte Carlo (HMC) with reflections on the boundary of a domain, providing an enhanced alternative to Hit-and-Run (HAR) for sampling a target distribution restricted to the polytope. We make three contributions. First, we provide a convergence bound, paving the way to more precise mixing time analysis. Second, we present a robust implementation based on multi-precision arithmetic, a mandatory ingredient to guarantee exact predicates and robust constructions. We however allow controlled failures to happen, introducing the Sweeten Exact Geometric Computing (SEGC) paradigm. Third, we use our HMC random walk to perform H-polytope volume calculations, using it as an alternative to HAR within the volume algorithm by Cousins and Vempala. The systematic tests conducted up to dimension n = 100 on the cube, the isotropic and the standard simplex show that HMC significantly outperforms HAR both in terms of accuracy and running time. Additional tests show that calculations may be handled up to dimension n = 500. These tests also establish that multi-precision is mandatory to avoid exits from the polytope.
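
    The reflecting trajectory at the heart of this walk can be sketched as a modified leapfrog step (Python/NumPy; a simplified double-precision illustration that deliberately omits the multi-precision and SEGC machinery the paper adds for robustness).

```python
import numpy as np

def reflect(u, a):
    """Specular reflection of direction u off a facet with outward normal a."""
    n = a / np.linalg.norm(a)
    return u - 2.0 * (u @ n) * n

def leapfrog_with_reflections(A, b, x, v, grad_f, step, max_bounces=100):
    """One leapfrog step for a target exp(-f) restricted to {x : Ax <= b};
    the drift reflects the velocity whenever the trajectory meets a facet."""
    v = v - 0.5 * step * grad_f(x)                 # half kick
    t_left = step
    for _ in range(max_bounces):                   # reflecting drift
        denom = A @ v
        slack = b - A @ x
        with np.errstate(divide="ignore", invalid="ignore"):
            t = np.where(denom > 1e-12, slack / denom, np.inf)
        t_hit = float(np.min(t))
        if t_left <= t_hit:
            x = x + t_left * v                     # no facet reached in the remaining time
            break
        j = int(np.argmin(t))
        x = x + t_hit * v                          # hit facet j
        v = reflect(v, A[j])
        t_left -= t_hit
    return x, v - 0.5 * step * grad_f(x)           # closing half kick
```

    Deciding which facet is hit first, and whether the endpoint is still inside the polytope, are exactly the numerical predicates whose reliability motivates the multi-precision arithmetic and the controlled failures of the SEGC paradigm.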
    • 

    corecore