Group Symmetry and non-Gaussian Covariance Estimation
We consider robust covariance estimation with group symmetry constraints.
Non-Gaussian covariance estimation, e.g., Tyler scatter estimator and
Multivariate Generalized Gaussian distribution methods, usually involve
non-convex minimization problems. Recently, it was shown that the underlying
principle behind their success is an extended form of convexity over the
geodesics in the manifold of positive definite matrices. A modern approach to
improve estimation accuracy is to exploit prior knowledge via additional
constraints, e.g., restricting the attention to specific classes of covariances
which adhere to prior symmetry structures. In this paper, we prove that such
group symmetry constraints are also geodesically convex and can therefore be
incorporated into various non-Gaussian covariance estimators. Practical
examples of such sets include: circulant, persymmetric and complex/quaternion
proper structures. We provide a simple numerical technique for finding maximum
likelihood estimates under such constraints, and demonstrate their performance
advantage using synthetic experiments.
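To illustrate how a symmetry constraint can be folded into a scatter-estimation fixed point, the sketch below runs Tyler's iteration and projects each iterate onto the persymmetric matrices, one of the structures listed above. This is a minimal sketch, not necessarily the paper's exact numerical technique; the projection-after-each-step scheme and the trace normalization are our assumed choices.

```python
import numpy as np

def tyler_persymmetric(X, n_iter=100, tol=1e-8):
    """Tyler's scatter fixed point with a persymmetric projection step.

    X: (N, p) data matrix, rows are (zero-mean) samples.
    Returns a trace-normalized scatter estimate satisfying S = J S' J,
    where J is the exchange (flip) matrix.
    """
    N, p = X.shape
    J = np.fliplr(np.eye(p))              # exchange matrix
    S = np.eye(p)
    for _ in range(n_iter):
        Si = np.linalg.inv(S)
        # per-sample quadratic forms x_i' S^{-1} x_i
        w = np.einsum('ij,jk,ik->i', X, Si, X)
        S_new = (p / N) * (X.T / w) @ X   # Tyler fixed-point map
        # Euclidean projection onto persymmetric matrices
        S_new = 0.5 * (S_new + J @ S_new.T @ J)
        S_new *= p / np.trace(S_new)      # fix the scale ambiguity
        if np.linalg.norm(S_new - S, 'fro') < tol:
            return S_new
        S = S_new
    return S
```

The projection preserves symmetry and positive definiteness, so each iterate stays a valid scatter matrix.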
Multivariate Generalized Gaussian Distribution: Convexity and Graphical Models
We consider covariance estimation in the multivariate generalized Gaussian
distribution (MGGD) and elliptically symmetric (ES) distribution. The maximum
likelihood optimization associated with this problem is non-convex, yet it has
been proved that its global solution can often be computed via simple fixed
point iterations. Our first contribution is a new analysis of this likelihood
based on geodesic convexity that requires weaker assumptions. Our second
contribution is a generalized framework for structured covariance estimation
under sparsity constraints. We show that the optimizations can be formulated as
convex minimization as long as the MGGD shape parameter is larger than one half
and the sparsity pattern is chordal. Examples include maximum likelihood
estimation of banded inverse covariances in multivariate Laplace distributions,
which are associated with time-varying autoregressive processes.
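The simple fixed-point iterations mentioned above can be sketched as follows for an MGGD-type weight u(s) = s**(beta - 1), which recovers the Gaussian sample covariance at beta = 1 and approaches Tyler-like heavy-tail weights as beta decreases. This is a hedged sketch of a standard elliptical M-estimation iteration, not the paper's exact algorithm; the trace normalization is our way of fixing the scale ambiguity.

```python
import numpy as np

def mggd_scatter(X, beta=0.5, n_iter=200, tol=1e-8):
    """Fixed-point scatter estimation under an MGGD-type model.

    Each iteration reweights samples by s_i**(beta - 1), where
    s_i = x_i' S^{-1} x_i, then renormalizes the trace.
    """
    N, p = X.shape
    S = np.eye(p)
    for _ in range(n_iter):
        s = np.einsum('ij,jk,ik->i', X, np.linalg.inv(S), X)
        S_new = (X.T * s**(beta - 1)) @ X / N   # X' diag(w) X / N
        S_new *= p / np.trace(S_new)            # fix the scale
        if np.linalg.norm(S_new - S, 'fro') < tol:
            return S_new
        S = S_new
    return S
```

At beta = 1 the weights are constant and the iteration reproduces the (trace-normalized) sample covariance in one step, which is a convenient sanity check.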
Covariance Estimation in Elliptical Models with Convex Structure
We address structured covariance estimation in Elliptical distribution. We
assume it is a priori known that the covariance belongs to a given convex set,
e.g., the set of Toeplitz or banded matrices. We consider the General Method of
Moments (GMM) optimization subject to these convex constraints. Unfortunately,
GMM is still non-convex due to its objective. Instead, we propose COCA, a convex
relaxation which can be efficiently solved. We prove that the relaxation is
tight in the unconstrained case for a finite number of samples, and in the
constrained case asymptotically. We then illustrate the advantages of COCA in
synthetic simulations with structured Compound Gaussian distributions. In these
examples, COCA outperforms competing methods such as Tyler's estimate and its
projection onto a convex set.
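The baseline mentioned above, projecting an unstructured estimate onto a convex set, is easy to instantiate for the Toeplitz example: the Frobenius projection of a symmetric matrix onto the symmetric Toeplitz matrices simply replaces each diagonal by its average. A minimal sketch (the function name is ours):

```python
import numpy as np

def project_toeplitz(S):
    """Frobenius projection of a symmetric matrix onto the set of
    symmetric Toeplitz matrices: average each diagonal."""
    p = S.shape[0]
    # mean of the k-th diagonal, k = 0 .. p-1 (symmetry covers k < 0)
    diag_means = [np.diagonal(S, offset=k).mean() for k in range(p)]
    T = np.empty_like(S)
    for i in range(p):
        for j in range(p):
            T[i, j] = diag_means[abs(i - j)]
    return T
```

A Toeplitz input is a fixed point of the projection, and any symmetric input maps to the nearest Toeplitz matrix in Frobenius norm.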
Robust subspace recovery by Tyler's M-estimator
This paper considers the problem of robust subspace recovery: given a set of
points in R^D, if many of them lie in a d-dimensional subspace, can we recover
the underlying subspace? We show that Tyler's M-estimator can be used to
recover the underlying subspace if the fraction of inliers is larger than d/D
and the data points lie in general position. Empirically,
Tyler's M-estimator compares favorably with other convex subspace recovery
algorithms in both simulations and experiments on real data sets.
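The recovery procedure described above can be sketched as follows: run Tyler's fixed-point iteration and take the top-d eigenspace of the resulting scatter as the estimated subspace. The synthetic setup in the usage note (a noisy line in R^3 plus Gaussian outliers) is our illustration, not the paper's experiment.

```python
import numpy as np

def tyler_scatter(X, n_iter=200, tol=1e-9):
    """Plain Tyler fixed-point iteration, trace-normalized."""
    N, p = X.shape
    S = np.eye(p)
    for _ in range(n_iter):
        s = np.einsum('ij,jk,ik->i', X, np.linalg.inv(S), X)
        S_new = (p / N) * (X.T / s) @ X
        S_new *= p / np.trace(S_new)
        if np.linalg.norm(S_new - S, 'fro') < tol:
            return S_new
        S = S_new
    return S

def recover_subspace(X, d):
    """Orthonormal basis for the top-d eigenspace of Tyler's scatter,
    used here as the recovered subspace."""
    S = tyler_scatter(X)
    _, V = np.linalg.eigh(S)   # eigenvalues in ascending order
    return V[:, -d:]
```

With a majority of inliers near a one-dimensional subspace, the leading eigenvector of the scatter should align closely with the true direction.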