90 research outputs found

    Covariance Estimation in Elliptical Models with Convex Structure

    We address structured covariance estimation in elliptical distributions. We assume it is a priori known that the covariance belongs to a given convex set, e.g., the set of Toeplitz or banded matrices. We consider the Generalized Method of Moments (GMM) optimization subject to these convex constraints. Unfortunately, GMM remains non-convex due to its objective. Instead, we propose COCA - a convex relaxation which can be efficiently solved. We prove that the relaxation is tight in the unconstrained case for a finite number of samples, and in the constrained case asymptotically. We then illustrate the advantages of COCA in synthetic simulations with structured compound Gaussian distributions. In these examples, COCA outperforms competing methods such as Tyler's estimator and its projection onto a convex set.
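
    The COCA relaxation itself is not reproduced here; as a rough, illustrative sketch of the kind of convex structure constraint the abstract refers to, the snippet below projects a sample covariance onto the set of Toeplitz positive semidefinite matrices with cvxpy. The helper name and all parameters are assumptions made for illustration.

```python
# Illustrative only: Frobenius-norm projection onto the Toeplitz PSD set,
# one example of the convex structure constraints discussed above.
import numpy as np
import cvxpy as cp

def project_to_toeplitz_psd(S):
    """Project a symmetric matrix S onto {Toeplitz, positive semidefinite}."""
    p = S.shape[0]
    T = cp.Variable((p, p), symmetric=True)
    constraints = [T >> 0]
    for i in range(p):
        for j in range(p):
            # Toeplitz structure: entries depend only on |i - j|.
            constraints.append(T[i, j] == T[0, abs(i - j)])
    cp.Problem(cp.Minimize(cp.norm(T - S, "fro")), constraints).solve()
    return T.value

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))        # 200 samples in dimension 5
S = X.T @ X / X.shape[0]                 # sample covariance
Sigma_structured = project_to_toeplitz_psd(S)
```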

    Joint Covariance Estimation with Mutual Linear Structure

    We consider the problem of joint estimation of structured covariance matrices. Assuming the structure is unknown, estimation is achieved using heterogeneous training sets. Namely, given groups of measurements coming from centered populations with different covariances, our aim is to determine the mutual structure of these covariance matrices and estimate them. Supposing that the covariances span a low dimensional affine subspace in the space of symmetric matrices, we develop a new efficient algorithm that discovers the structure and uses it to improve the estimation. Our technique is based on the application of principal component analysis in the matrix space. We also derive an upper performance bound for the proposed algorithm in the Gaussian scenario and compare it with the Cramér-Rao lower bound. Numerical simulations are presented to illustrate the performance benefits of the proposed method.
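
    A minimal sketch of the underlying idea, not the paper's exact algorithm: treat each sample covariance as a vector in matrix space, run PCA on the stack to recover a shared low dimensional affine subspace, and project every estimate back onto it. The function name and the assumed subspace dimension r are illustrative.

```python
# Minimal sketch: PCA in the space of symmetric matrices to recover a
# low dimensional affine subspace shared by several sample covariances.
import numpy as np

def joint_structure_pca(sample_covs, r):
    """sample_covs: list of (p, p) symmetric matrices; r: subspace dimension."""
    vecs = np.array([S.ravel() for S in sample_covs])   # covariances as vectors
    mean = vecs.mean(axis=0)
    centered = vecs - mean
    # Principal directions of the covariances in matrix (vector) space.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    basis = Vt[:r]                                       # top-r directions
    # Project every covariance onto the affine subspace mean + span(basis).
    projected = mean + centered @ basis.T @ basis
    p = sample_covs[0].shape[0]
    return [v.reshape(p, p) for v in projected]

rng = np.random.default_rng(1)
p, groups = 6, 5
covs = []
for _ in range(groups):
    X = rng.standard_normal((50, p))
    covs.append(X.T @ X / X.shape[0])
structured = joint_structure_pca(covs, r=2)
```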

    Compressed matched filter for non-Gaussian noise

    We consider estimation of a deterministic unknown parameter vector in a linear model with non-Gaussian noise. In the Gaussian case, dimensionality reduction via a linear matched filter provides a simple low dimensional sufficient statistic which can be easily communicated and/or stored for future inference. Such a statistic is usually unknown in the general non-Gaussian case. Instead, we propose a hybrid matched filter coupled with a randomized compressed sensing procedure, which together create a low dimensional statistic. We also derive a complementary algorithm for robust reconstruction given this statistic. Our recovery method is based on the fast iterative shrinkage-thresholding algorithm (FISTA), which is used for outlier rejection given the compressed data. We demonstrate the advantages of the proposed framework using synthetic simulations.
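
    For reference, a generic FISTA iteration for an l1-regularized least squares problem is sketched below; this is the standard shrinkage-thresholding machinery the recovery method builds on, not the paper's specific outlier-rejection formulation. Step size, regularization weight and the toy data are assumptions.

```python
# Generic FISTA sketch for  min_x 0.5 * ||y - A x||^2 + lam * ||x||_1.
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, y, lam, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - y)
        x_new = soft_threshold(z - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 300))
x_true = np.zeros(300)
x_true[rng.choice(300, 10, replace=False)] = 5.0      # sparse ground truth
y = A @ x_true + 0.1 * rng.standard_normal(100)
x_hat = fista(A, y, lam=1.0)
```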

    Group Symmetry and non-Gaussian Covariance Estimation

    We consider robust covariance estimation with group symmetry constraints. Non-Gaussian covariance estimation, e.g., Tyler's scatter estimator and multivariate generalized Gaussian distribution methods, usually involves non-convex minimization problems. Recently, it was shown that the underlying principle behind their success is an extended form of convexity over the geodesics in the manifold of positive definite matrices. A modern approach to improving estimation accuracy is to exploit prior knowledge via additional constraints, e.g., restricting attention to specific classes of covariances which adhere to prior symmetry structures. In this paper, we prove that such group symmetry constraints are also geodesically convex and can therefore be incorporated into various non-Gaussian covariance estimators. Practical examples of such sets include circulant, persymmetric and complex/quaternion proper structures. We provide a simple numerical technique for finding maximum likelihood estimates under such constraints, and demonstrate their performance advantage using synthetic experiments.
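
    As a small illustration of what a group symmetry constraint looks like in practice (not the maximum likelihood technique of the paper), the sketch below forces a covariance estimate into the circulant class by averaging over the cyclic-shift group; the helper names and toy data are assumptions.

```python
# Illustrative only: impose circulant (cyclic-shift) symmetry on a covariance
# estimate by averaging over the group action, (1/p) * sum_k P_k S P_k^T.
import numpy as np

def cyclic_shift_matrix(p, k):
    """Permutation matrix that shifts coordinates by k positions."""
    P = np.zeros((p, p))
    P[np.arange(p), (np.arange(p) + k) % p] = 1.0
    return P

def symmetrize_circulant(S):
    p = S.shape[0]
    return sum(cyclic_shift_matrix(p, k) @ S @ cyclic_shift_matrix(p, k).T
               for k in range(p)) / p

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 4))
S = X.T @ X / X.shape[0]
S_circ = symmetrize_circulant(S)          # entries depend only on (i - j) mod p
```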

    Tyler's Covariance Matrix Estimator in Elliptical Models with Convex Structure

    We address structured covariance estimation in elliptical distributions by assuming that the covariance is a priori known to belong to a given convex set, e.g., the set of Toeplitz or banded matrices. We consider the Generalized Method of Moments (GMM) optimization applied to robust Tyler's scatter M-estimator subject to these convex constraints. Unfortunately, GMM turns out to be non-convex due to the objective. Instead, we propose a new COCA estimator - a convex relaxation which can be efficiently solved. We prove that the relaxation is tight in the unconstrained case for a finite number of samples, and in the constrained case asymptotically. We then illustrate the advantages of COCA in synthetic simulations with structured compound Gaussian distributions. In these examples, COCA outperforms competing methods such as Tyler's estimator and its projection onto the structure set.
    Comment: arXiv admin note: text overlap with arXiv:1311.059
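
    Since Tyler's scatter M-estimator is the building block and baseline here, a minimal sketch of its standard fixed-point iteration is given below; the trace normalization and stopping rule are common conventions, and the heavy-tailed toy data are an assumption.

```python
# Minimal sketch of Tyler's scatter M-estimator via its fixed-point iteration.
import numpy as np

def tyler_estimator(X, n_iter=100, tol=1e-8):
    """X: (n, p) centered samples.  Returns a (p, p) scatter estimate."""
    n, p = X.shape
    Sigma = np.eye(p)
    for _ in range(n_iter):
        inv = np.linalg.inv(Sigma)
        w = 1.0 / np.einsum("ij,jk,ik->i", X, inv, X)   # 1 / (x_i^T Sigma^{-1} x_i)
        Sigma_new = (p / n) * (X * w[:, None]).T @ X
        Sigma_new /= np.trace(Sigma_new) / p            # fix the scale ambiguity
        if np.linalg.norm(Sigma_new - Sigma, "fro") < tol:
            Sigma = Sigma_new
            break
        Sigma = Sigma_new
    return Sigma

rng = np.random.default_rng(4)
X = rng.standard_normal((500, 3)) * rng.gamma(1.0, size=(500, 1))  # heavy tails
Sigma_hat = tyler_estimator(X)
```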

    Decomposable Principal Component Analysis

    We consider principal component analysis (PCA) in decomposable Gaussian graphical models. We exploit the prior information in these models in order to distribute the computation of the PCA. For this purpose, we reformulate the problem in the sparse inverse covariance (concentration) domain and solve the global eigenvalue problem using a sequence of local eigenvalue problems in each of the cliques of the decomposable graph. We demonstrate the application of our methodology in the context of decentralized anomaly detection in the Abilene backbone network. Based on the topology of the network, we propose an approximate statistical graphical model and distribute the computation of PCA.
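
    The sketch below illustrates the distribution idea in the simplest possible way: when the global (concentration) matrix splits into clique-local terms, a plain power iteration can evaluate each matrix-vector product locally per clique and aggregate. This is a generic stand-in, not the paper's sequence of local eigenvalue problems, and the toy clique structure is an assumption.

```python
# Illustrative distributed power iteration: no node ever forms the full matrix,
# each clique applies only its local block to its own variables.
import numpy as np

def distributed_power_iteration(local_blocks, cliques, p, n_iter=200):
    """local_blocks[c]: (|clique|, |clique|) matrix; cliques[c]: index array.
    Computes the leading eigenvector of K = sum_c embed(local_blocks[c])."""
    v = np.ones(p) / np.sqrt(p)
    for _ in range(n_iter):
        Kv = np.zeros(p)
        for block, idx in zip(local_blocks, cliques):
            Kv[idx] += block @ v[idx]       # each clique works on its variables only
        v = Kv / np.linalg.norm(Kv)
    return v

# Toy decomposable structure: two overlapping cliques {0,1,2} and {2,3,4}.
rng = np.random.default_rng(5)
cliques = [np.array([0, 1, 2]), np.array([2, 3, 4])]
local_blocks = []
for idx in cliques:
    A = rng.standard_normal((len(idx), len(idx)))
    local_blocks.append(A @ A.T)            # symmetric PSD local term
v_lead = distributed_power_iteration(local_blocks, cliques, p=5)
```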