
    Numerical methods for a Kohn-Sham density functional model based on optimal transport

    In this paper, we study numerical discretizations to solve density functional models in the "strictly correlated electrons" (SCE) framework. Unlike previous studies, our work is not restricted to radially symmetric densities. In the SCE framework, the exchange-correlation functional encodes the effects of the strong correlation regime by minimizing the pairwise Coulomb repulsion, resulting in an optimal transport problem. We give a mathematical derivation of the self-consistent Kohn-Sham-SCE equations, construct an efficient numerical discretization for this type of problem for N = 2 electrons, and apply it to the H2 molecule in its dissociating limit. Moreover, we prove that the SCE density functional model is correct for the H2 molecule in its dissociating limit. Comment: 22 pages, 6 figures
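
    To illustrate the optimal transport structure mentioned in this abstract, the sketch below (not the paper's scheme) discretizes a one-dimensional density on a grid and solves the N = 2 Kantorovich problem with Coulomb cost c(x, y) = 1/|x - y| as a linear program. The grid size, the placeholder density, and the diagonal regularization are illustrative assumptions.

```python
# Minimal sketch: discrete two-marginal optimal transport with Coulomb cost,
# solved as a linear program. Placeholder density and grid, for illustration.
import numpy as np
from scipy.optimize import linprog

n = 40                                   # number of grid points (illustrative)
x = np.linspace(-4.0, 4.0, n)
rho = np.exp(-x**2)                      # placeholder electron density
rho /= rho.sum()                         # normalize to a probability vector

# Coulomb cost matrix; small offset on the diagonal avoids division by zero.
C = 1.0 / (np.abs(x[:, None] - x[None, :]) + 1e-3)

# Kantorovich LP: minimize <C, gamma> subject to both marginals equal to rho.
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0     # row-marginal constraint
    A_eq[n + i, i::n] = 1.0              # column-marginal constraint
b_eq = np.concatenate([rho, rho])

res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
gamma = res.x.reshape(n, n)              # discrete optimal transport plan
print("SCE-type transport cost:", res.fun)
```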

    A Numerical Method to solve Optimal Transport Problems with Coulomb Cost

    In this paper, we present a numerical method, based on iterative Bregman projections, to solve the optimal transport problem with Coulomb cost. This is related to the strong interaction limit of Density Functional Theory. The first idea is to introduce an entropic regularization of the Kantorovich formulation of the Optimal Transport problem. The regularized problem then corresponds to the projection of a vector on the intersection of the constraints with respect to the Kullback-Leibler distance. Iterative Bregman projections on each marginal constraint are explicit, which enables us to approximate the optimal transport plan. We validate the numerical method against analytical test cases.
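
    A minimal Sinkhorn-style sketch of the entropic regularization and alternating Bregman (KL) projections described above, for two marginals on a one-dimensional grid with Coulomb cost. The grid, the placeholder marginals, and the regularization parameter are illustrative assumptions, not the paper's setup.

```python
# Entropic optimal transport via iterative Bregman (KL) projections
# on the two marginal constraints (Sinkhorn-type iterations).
import numpy as np

n = 50
x = np.linspace(-4.0, 4.0, n)
mu = np.exp(-(x + 1.0) ** 2); mu /= mu.sum()   # first marginal (placeholder)
nu = np.exp(-(x - 1.0) ** 2); nu /= nu.sum()   # second marginal (placeholder)

C = 1.0 / (np.abs(x[:, None] - x[None, :]) + 1e-3)   # regularized Coulomb cost
epsilon = 0.05                                        # entropic regularization
K = np.exp(-C / epsilon)                              # Gibbs kernel

# Alternate explicit KL projections onto each marginal constraint.
u = np.ones(n)
v = np.ones(n)
for _ in range(2000):
    u = mu / (K @ v)
    v = nu / (K.T @ u)

gamma = u[:, None] * K * v[None, :]      # approximate optimal transport plan
print("approximate transport cost:", np.sum(gamma * C))
```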

    Learning Arbitrary Statistical Mixtures of Discrete Distributions

    We study the problem of learning from unlabeled samples very general statistical mixture models on large finite sets. Specifically, the model to be learned, $\vartheta$, is a probability distribution over probability distributions $p$, where each such $p$ is a probability distribution over $[n] = \{1, 2, \dots, n\}$. When we sample from $\vartheta$, we do not observe $p$ directly, but only indirectly and in very noisy fashion, by sampling from $[n]$ repeatedly and independently, $K$ times, from the distribution $p$. The problem is to infer $\vartheta$ to high accuracy in transportation (earthmover) distance. We give the first efficient algorithms for learning this mixture model without making any restricting assumptions on the structure of the distribution $\vartheta$. We bound the quality of the solution as a function of the size of the samples $K$ and the number of samples used. Our model and results have applications to a variety of unsupervised learning scenarios, including learning topic models and collaborative filtering. Comment: 23 pages. Preliminary version in the Proceedings of the 47th ACM Symposium on the Theory of Computing (STOC 2015)
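
    A minimal sketch of the sampling model described in this abstract: draw a latent distribution $p$ from the mixture $\vartheta$, then observe only $K$ i.i.d. draws from $p$ over $[n]$. The concrete mixture used here (two Dirichlet-drawn components) is purely illustrative, not from the paper.

```python
# Generative model sketch: latent p ~ vartheta, observed data = K draws from p.
import numpy as np

rng = np.random.default_rng(0)
n, K, num_samples = 10, 5, 1000

# Illustrative mixture vartheta: two components, each a distribution over [n].
components = rng.dirichlet(np.ones(n), size=2)
weights = np.array([0.3, 0.7])

observations = []
for _ in range(num_samples):
    p = components[rng.choice(2, p=weights)]   # latent p ~ vartheta (unobserved)
    counts = rng.multinomial(K, p)             # K noisy i.i.d. draws from p
    observations.append(counts)

observations = np.array(observations)          # the only data the learner sees
print(observations[:3])
```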