Learning Mixtures of Gaussians in High Dimensions
Efficiently learning mixtures of Gaussians is a fundamental problem in
statistics and learning theory. Given samples, each drawn from one of k
Gaussian distributions in R^n chosen at random, the learning problem asks to estimate the means
and the covariance matrices of these Gaussians. This learning problem arises in
many areas ranging from the natural sciences to the social sciences, and has
also found many machine learning applications. Unfortunately, learning mixtures
of Gaussians is an information-theoretically hard problem: in order to learn
the parameters up to a reasonable accuracy, the number of samples required is
exponential in the number of Gaussian components in the worst case. In this
work, we show that provided we are in high enough dimensions, the class of
Gaussian mixtures is learnable in its most general form under a smoothed
analysis framework, where the parameters are randomly perturbed from an
adversarial starting point. In particular, given samples from a mixture of
Gaussians with randomly perturbed parameters, when n > {\Omega}(k^2), we give
an algorithm that learns the parameters in polynomial time using a
polynomial number of samples. The central algorithmic ideas consist of new ways
to decompose the moment tensor of the Gaussian mixture by exploiting its
structural properties. The symmetries of this tensor are derived from the
combinatorial structure of higher order moments of Gaussian distributions
(sometimes referred to as Isserlis' theorem or Wick's theorem). We also develop
new tools for bounding the smallest singular values of structured random
matrices, which could be useful in other smoothed analysis settings.
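The combinatorial structure of Gaussian moments invoked above (Isserlis' or Wick's theorem) says that every higher-order moment of a zero-mean Gaussian is a sum, over all pairings of the indices, of products of covariances. A minimal sketch in Python, with illustrative function names not taken from the paper:

```python
from math import prod

def pair_partitions(idx):
    """Yield all ways to split the index list into unordered pairs."""
    if not idx:
        yield []
        return
    first, rest = idx[0], idx[1:]
    for i in range(len(rest)):
        for tail in pair_partitions(rest[:i] + rest[i + 1:]):
            yield [(first, rest[i])] + tail

def gaussian_moment(cov, idx):
    """E[x_{i1} * ... * x_{im}] for a zero-mean Gaussian with covariance cov,
    via Isserlis'/Wick's theorem: sum over pairings of covariance products."""
    if len(idx) % 2 == 1:
        return 0.0  # odd moments of a centered Gaussian vanish
    return sum(prod(cov[a][b] for a, b in pairing)
               for pairing in pair_partitions(list(idx)))

# Sanity checks against classical formulas:
# E[x^4] = 3*sigma^4 for a scalar Gaussian with variance sigma^2 = 2
print(gaussian_moment([[2.0]], [0, 0, 0, 0]))  # 12.0
# E[x0^2 x1^2] = s00*s11 + 2*s01^2 with s01 = 0.5
print(gaussian_moment([[1.0, 0.5], [0.5, 1.0]], [0, 0, 1, 1]))  # 1.5
```

The three pairings of four indices are exactly the three covariance products in the textbook fourth-moment formula, which is the symmetry the moment-tensor decomposition exploits.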
A compactness theorem of $n$-harmonic maps
Let $\Omega$ be a bounded domain in $\mathbb{R}^n$, $n \ge 2$, and let $N$ be a
compact Riemannian manifold without boundary, isometrically embedded in a
Euclidean space. Suppose that $\{u_k\} \subset W^{1,n}(\Omega, N)$ is a
Palais-Smale sequence of the Dirichlet $n$-energy functional and that $u_k$
converges weakly in $W^{1,n}$ to a map $u$. Then $u$ is an $n$-harmonic map. In
particular, the space of $n$-harmonic maps is sequentially compact in the weak
$W^{1,n}$-topology
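For context, the objects in this abstract can be written out explicitly. The following is the standard definition of the Dirichlet $n$-energy and the $n$-harmonic map equation, stated here for the reader's convenience rather than quoted from the paper:

```latex
% Dirichlet n-energy of a map u : \Omega \to N
E_n(u) = \frac{1}{n} \int_{\Omega} |\nabla u|^{n} \, dx,
\qquad u \in W^{1,n}(\Omega, N).

% Critical points (n-harmonic maps) satisfy, in the weak sense,
-\,\operatorname{div}\!\bigl( |\nabla u|^{n-2} \nabla u \bigr)
  = |\nabla u|^{n-2}\, A(u)(\nabla u, \nabla u),

% where A is the second fundamental form of N, so that the left-hand
% side is orthogonal to the tangent space T_u N.
```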
Orientability and energy minimization in liquid crystal models
Uniaxial nematic liquid crystals are modelled in the Oseen-Frank theory
through a unit vector field $n$. This theory has the apparent drawback that it
does not respect the head-to-tail symmetry in which $n$ should be equivalent to
$-n$. This symmetry is preserved in the constrained Landau-de Gennes theory
that works with the tensor $Q = s\bigl(n \otimes n - \frac{1}{3}\,Id\bigr)$. We study
the differences and the overlaps between the two theories. These depend on the
regularity class used as well as on the topology of the underlying domain. We
show that for simply-connected domains and in the natural energy class
$W^{1,2}$ the two theories coincide, but otherwise there can be differences
between the two theories, which we identify. In the case of planar domains we
completely characterise the instances in which the predictions of the
constrained Landau-de Gennes theory differ from those of the Oseen-Frank
theory.
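The head-to-tail symmetry discussed above is easy to check numerically for the constrained Landau-de Gennes order tensor $Q = s(n \otimes n - \frac{1}{3}Id)$: $Q$ is unchanged under $n \mapsto -n$, and the director can be recovered from $Q$ only up to sign, as the leading eigenvector. A small illustrative sketch (not code from the paper):

```python
import numpy as np

def Q_tensor(n_vec, s=1.0):
    """Constrained Landau-de Gennes order tensor Q = s (n (x) n - I/3)
    for a unit director n; illustrative helper, not from the paper."""
    n_vec = np.asarray(n_vec, dtype=float)
    return s * (np.outer(n_vec, n_vec) - np.eye(3) / 3.0)

n = np.array([0.0, 0.0, 1.0])

# Head-to-tail symmetry: Q is blind to the sign of the director.
assert np.allclose(Q_tensor(n), Q_tensor(-n))

# The director is recovered from Q only up to sign, as the eigenvector
# belonging to the largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(Q_tensor(n))
leading = eigvecs[:, np.argmax(eigvals)]
assert np.allclose(np.abs(leading), np.abs(n))

print(Q_tensor(n).diagonal())  # the traceless diagonal (-1/3, -1/3, 2/3)
```

This sign ambiguity is exactly where orientability enters: passing from a $Q$-tensor field back to a vector field requires a consistent global choice of sign, which can fail on non-simply-connected domains.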