
    Tensors: a Brief Introduction

    No full text
    Tensor decompositions are at the core of many Blind Source Separation (BSS) algorithms, either explicitly or implicitly. In particular, the Canonical Polyadic (CP) tensor decomposition plays a central role in the identification of underdetermined mixtures. Despite some similarities, the CP decomposition and the Singular Value Decomposition (SVD) are quite different. More generally, tensors and matrices enjoy different properties, as pointed out in this brief survey.
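
    As a hedged illustration of the CP model this survey discusses, the short numpy sketch below builds a third-order tensor of known rank R from three factor matrices; all sizes and names are illustrative assumptions, not taken from the survey.

        import numpy as np

        # Illustrative (assumed) sizes: a rank-R CP model for a 3rd-order tensor.
        I, J, K, R = 4, 5, 6, 3
        rng = np.random.default_rng(0)
        A = rng.standard_normal((I, R))  # mode-1 factor matrix
        B = rng.standard_normal((J, R))  # mode-2 factor matrix
        C = rng.standard_normal((K, R))  # mode-3 factor matrix

        # CP model: T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]
        T = np.einsum('ir,jr,kr->ijk', A, B, C)

        # Unlike the factors of a matrix SVD, A, B, C need not be orthogonal,
        # and the rank R may exceed every dimension (the underdetermined case).
        print(T.shape)  # (4, 5, 6)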

    Approximate Rank-Detecting Factorization of Low-Rank Tensors

    Full text link
    We present an algorithm, AROFAC2, which detects the (CP-)rank of a degree-3 tensor and computes its factorization into rank-one components. We give generative conditions under which the algorithm works and demonstrate, on both synthetic and real-world data, that AROFAC2 is a potentially outperforming alternative to the gold-standard PARAFAC: it can intrinsically detect the true rank, avoids spurious components, and is stable with respect to outliers and non-Gaussian noise.
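
    AROFAC2 itself is not reproduced here; as a plainly labeled stand-in, the sketch below applies a generic rank-detection heuristic, fitting CP models of increasing rank with the tensorly library (an assumed dependency) and watching where the reconstruction error plateaus.

        import numpy as np
        import tensorly as tl
        from tensorly.decomposition import parafac

        # Synthetic degree-3 tensor of known CP rank 3 (illustrative).
        rng = np.random.default_rng(1)
        factors = [rng.standard_normal((d, 3)) for d in (8, 9, 10)]
        T = tl.tensor(np.einsum('ir,jr,kr->ijk', *factors))

        # Generic heuristic (not AROFAC2): the relative error drops sharply
        # at the true rank and then levels off.
        for r in range(1, 6):
            cp = parafac(T, rank=r, random_state=0)
            err = tl.norm(T - tl.cp_to_tensor(cp)) / tl.norm(T)
            print(f"rank {r}: relative error {err:.2e}")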

    Finding a low-rank basis in a matrix subspace

    Full text link
    For a given matrix subspace, how can we find a basis that consists of low-rank matrices? This is a generalization of the sparse-vector problem. It turns out that when the subspace is spanned by rank-1 matrices, such a basis can be obtained via the tensor CP decomposition. For the higher-rank case, the situation is not as straightforward. In this work we present an algorithm based on a greedy process that is applicable to higher-rank problems. Our algorithm first estimates the minimum rank by applying soft singular value thresholding to a nuclear norm relaxation, and then computes a matrix of that rank using the method of alternating projections. We provide local convergence results and compare our algorithm with several alternative approaches. Applications include data compression beyond the classical truncated SVD, computing accurate eigenvectors of a near-multiple eigenvalue, image separation, and graph Laplacian eigenproblems.
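
    The soft singular value thresholding step mentioned in the abstract has a compact closed form: it is the proximal operator of the nuclear norm. The numpy sketch below shows that single ingredient only; the threshold tau is an arbitrary assumption, and the paper's full greedy/alternating-projections pipeline is not reproduced.

        import numpy as np

        def soft_threshold_singular_values(X, tau):
            """Shrink the singular values of X by tau (prox of the nuclear norm)."""
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

        rng = np.random.default_rng(2)
        # Illustrative input: a noisy rank-2 matrix.
        L = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 8))
        X = L + 0.05 * rng.standard_normal((8, 8))
        Y = soft_threshold_singular_values(X, tau=0.5)  # tau is an assumed value
        print(np.linalg.matrix_rank(Y))  # the small singular values are zeroed out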

    Estimating multivariate latent-structure models

    Get PDF
    A constructive proof of identification of multilinear decompositions of multiway arrays is presented. It can be applied to show identification in a variety of multivariate latent structures; examples are finite-mixture models and hidden Markov models. The key step in showing identification is the joint diagonalization of a set of matrices in the same nonorthogonal basis. An estimator of the latent-structure model may then be based on a sample version of this joint-diagonalization problem. Algorithms are available for computation, and we derive distribution theory. We further develop asymptotic theory for orthogonal-series estimators of component densities in mixture models and emission densities in hidden Markov models. Supported by European Research Council Grant ERC-2010-StG-0263107-ENMUH. Supported by Sciences Po's SAB grant "Nonparametric estimation of finite mixtures." Supported by European Research Council Grant ERC-2010-AdG-269693-WASP and by Economic and Social Research Council Grant RES-589-28-0001 through the Centre for Microdata Methods and Practice.
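
    For the special case of two matrices, joint diagonalization in a common nonorthogonal basis reduces to a generalized eigenproblem, a well-known device in this literature. The scipy sketch below illustrates that reduction only; it is not the paper's estimator, and all dimensions are assumptions.

        import numpy as np
        from scipy.linalg import eig

        rng = np.random.default_rng(3)
        n = 4
        A = rng.standard_normal((n, n))         # shared nonorthogonal basis
        D1 = np.diag(rng.uniform(1.0, 2.0, n))  # distinct diagonal factors
        D2 = np.diag(rng.uniform(1.0, 2.0, n))
        M1 = A @ D1 @ np.linalg.inv(A)          # both matrices share the basis A
        M2 = A @ D2 @ np.linalg.inv(A)

        # Generalized eigenproblem M1 v = lambda M2 v: its eigenvectors
        # recover the columns of A up to scaling and permutation.
        w, V = eig(M1, M2)
        for M in (M1, M2):
            D = np.linalg.inv(V) @ M @ V
            print(np.allclose(D, np.diag(np.diag(D)), atol=1e-6))  # True, True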