461 research outputs found

    Learning Mixtures of Gaussians in High Dimensions

    Efficiently learning mixtures of Gaussians is a fundamental problem in statistics and learning theory. Given samples drawn from one of k Gaussian distributions in R^n, chosen at random, the learning problem asks to estimate the means and covariance matrices of these Gaussians. This problem arises in many areas, ranging from the natural sciences to the social sciences, and has found many machine learning applications. Unfortunately, learning mixtures of Gaussians is an information-theoretically hard problem: in the worst case, the number of samples required to learn the parameters to reasonable accuracy is exponential in the number of Gaussian components. In this work, we show that in sufficiently high dimensions, the class of Gaussian mixtures is learnable in its most general form under a smoothed analysis framework, where the parameters are randomly perturbed from an adversarial starting point. In particular, given samples from a mixture of Gaussians with randomly perturbed parameters, when n > Ω(k^2), we give an algorithm that learns the parameters in polynomial running time using a polynomial number of samples. The central algorithmic ideas are new ways to decompose the moment tensor of the Gaussian mixture by exploiting its structural properties. The symmetries of this tensor derive from the combinatorial structure of the higher-order moments of Gaussian distributions (sometimes referred to as Isserlis' theorem or Wick's theorem). We also develop new tools for bounding the smallest singular values of structured random matrices, which could be useful in other smoothed analysis settings.
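    The Wick/Isserlis structure invoked in this abstract is easy to check numerically: for a zero-mean Gaussian, every fourth-order moment equals a sum over the pairings of the covariances. A minimal sketch of that identity, not the paper's algorithm; the covariance matrix and index choice below are arbitrary illustrative values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Arbitrary illustrative covariance for a zero-mean Gaussian in R^3.
    S = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
    x = rng.multivariate_normal(np.zeros(3), S, size=1_000_000)

    # Empirical fourth-order moment E[x_i x_j x_k x_l].
    i, j, k, l = 0, 1, 2, 0
    emp = np.mean(x[:, i] * x[:, j] * x[:, k] * x[:, l])

    # Isserlis'/Wick's theorem: sum over the three pairings of (i, j, k, l).
    wick = S[i, j] * S[k, l] + S[i, k] * S[j, l] + S[i, l] * S[j, k]

    print(emp, wick)  # the two values agree up to Monte Carlo error
    ```

    The same pairing combinatorics is what gives the higher-order moment tensor of a Gaussian mixture the symmetries the decomposition exploits.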

    A compactness theorem for n-harmonic maps

    For n ≥ 3, let Ω be a bounded domain in R^n and let N be a compact Riemannian manifold in R^L without boundary. Suppose that u_n ∈ W^{1,n}(Ω, N) is a Palais-Smale sequence of the Dirichlet n-energy functional and u_n converges weakly in W^{1,n} to a map u ∈ W^{1,n}(Ω, N). Then u is an n-harmonic map. In particular, the space of n-harmonic maps is sequentially compact in the weak W^{1,n} topology.
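    For reference, the Dirichlet n-energy named in the abstract is the standard functional below; the Euler-Lagrange equation is a sketch under the usual convention, with A denoting the second fundamental form of N in R^L (a detail not spelled out in the abstract):

    ```latex
    E_n(u) = \frac{1}{n}\int_\Omega |\nabla u|^n \, dx, \qquad u \in W^{1,n}(\Omega, N),
    ```

    whose critical points satisfy, weakly, the n-harmonic map equation

    ```latex
    -\,\mathrm{div}\big(|\nabla u|^{n-2}\nabla u\big) = |\nabla u|^{n-2}\, A(u)(\nabla u, \nabla u).
    ```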

    Creating the Common and Fracturing Capitalism: an exchange of letters between Michael Hardt and John Holloway


    Orientability and energy minimization in liquid crystal models

    Uniaxial nematic liquid crystals are modelled in the Oseen-Frank theory through a unit vector field n. This theory has the apparent drawback that it does not respect the head-to-tail symmetry in which n should be equivalent to -n. This symmetry is preserved in the constrained Landau-de Gennes theory, which works with the tensor Q = s(n ⊗ n - (1/3) Id). We study the differences and the overlaps between the two theories. These depend on the regularity class used as well as on the topology of the underlying domain. We show that for simply-connected domains and in the natural energy class W^{1,2} the two theories coincide, but otherwise there can be differences between the two theories, which we identify. In the case of planar domains we completely characterise the instances in which the predictions of the constrained Landau-de Gennes theory differ from those of the Oseen-Frank theory.
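    The head-to-tail symmetry described above can be verified numerically: the Q-tensor is invariant under n ↦ -n, while the director field itself is not. A minimal sketch, where the values of s and the director n are arbitrary illustrative choices:

    ```python
    import numpy as np

    def q_tensor(n, s=0.8):
        """Constrained Landau-de Gennes order tensor Q = s(n⊗n - Id/3)."""
        n = np.asarray(n, dtype=float)
        n = n / np.linalg.norm(n)  # ensure a unit director
        return s * (np.outer(n, n) - np.eye(3) / 3.0)

    n = np.array([1.0, 2.0, 2.0])  # arbitrary director (normalised inside)

    Q_plus = q_tensor(n)
    Q_minus = q_tensor(-n)

    # Q is insensitive to the sign of n: the head-to-tail symmetry.
    print(np.allclose(Q_plus, Q_minus))   # True
    # Q is symmetric and (numerically) traceless, as an order tensor must be.
    print(np.allclose(Q_plus, Q_plus.T))  # True
    print(abs(np.trace(Q_plus)) < 1e-12)  # True
    ```

    The interesting question studied in the paper is the converse direction: when a line field (a Q-tensor field) can be consistently "oriented" into a vector field, which is where the topology of the domain and the regularity class enter.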