7 research outputs found

    Joint Independent Subspace Analysis Using Second-Order Statistics

    This paper deals with a novel generalization of classical blind source separation (BSS) in two directions. First, it relaxes the constraint that the latent sources must be statistically independent; this generalization is well known and sometimes termed independent subspace analysis (ISA). Second, it jointly analyzes several ISA problems, where the link is due to statistical dependence among corresponding sources in different mixtures. When the data are one-dimensional, i.e., multiple classical BSS problems, this model, known as independent vector analysis (IVA), has already been studied. In this paper, we combine IVA with ISA and term this new model joint independent subspace analysis (JISA). We provide a full performance analysis of JISA, including closed-form expressions for the minimal mean square error (MSE), the Fisher information, and the Cramér-Rao lower bound, in the separation of Gaussian data. The derived MSE also applies to non-Gaussian data when only second-order statistics are used. We generalize previously known results on IVA, including its ability to uniquely resolve instantaneous mixtures of real Gaussian stationary data and to enforce the same arbitrary permutation at all mixtures. Numerical experiments validate our theoretical results and show the gain with respect to two competing approaches that either use a finer block partition or a different norm.
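    The JISA data model described above can be illustrated with a small generative sketch. The variable names, block sizes, and the way the cross-mixture coupling is built are all illustrative assumptions on my part, not taken from the paper: sources are partitioned into subspaces, corresponding subspaces are correlated across the K mixtures, and different subspaces are independent.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 2            # number of coupled mixtures (datasets)
blocks = [2, 1]  # subspace sizes: one 2-D source subspace, one 1-D source
n = sum(blocks)  # channels per mixture
T = 5000         # samples

# Joint source covariance: corresponding subspaces are correlated across
# the K mixtures, while different subspaces stay uncorrelated.
C = np.zeros((K * n, K * n))
off = np.cumsum([0] + blocks)
for lo, hi in zip(off[:-1], off[1:]):
    d = hi - lo
    # random SPD coupling for this subspace across the K datasets
    G = rng.standard_normal((K * d, K * d))
    Cb = G @ G.T + K * d * np.eye(K * d)
    for k1 in range(K):
        for k2 in range(K):
            C[k1*n+lo:k1*n+hi, k2*n+lo:k2*n+hi] = Cb[k1*d:(k1+1)*d, k2*d:(k2+1)*d]

L = np.linalg.cholesky(C)
s = L @ rng.standard_normal((K * n, T))              # stacked latent sources
A = [rng.standard_normal((n, n)) for _ in range(K)]  # one mixing matrix per dataset
x = [A[k] @ s[k*n:(k+1)*n] for k in range(K)]        # observed mixtures
```

    By construction, only the dependence across datasets within a subspace links the K separation problems, which is exactly the coupling JISA exploits.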

    An Alternative Proof for the Identifiability of Independent Vector Analysis Using Second Order Statistics

    In this paper, we present an alternative proof characterizing the (non-)identifiability conditions of independent vector analysis (IVA). IVA extends blind source separation to several mixtures by taking into account statistical dependencies between mixtures. We focus on IVA in the presence of real Gaussian data with temporally independent and identically distributed samples. This model is always non-identifiable when each mixture is considered separately; however, it can be shown to be generically identifiable within the IVA framework. Our proof differs from previous ones in that it is based on a direct factorization of a closed-form expression for the Fisher information matrix. Our analysis rests on a rigorous linear-algebraic formulation and leads to a new type of factorization of a structured matrix. The proposed approach is therefore of potential interest for a broader range of problems.
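    The single-mixture non-identifiability mentioned above has a short numerical illustration (my own sketch, not the paper's proof): for i.i.d. real Gaussian sources, whitening makes the output covariance the identity, and any further orthogonal rotation leaves it the identity, so second-order statistics alone cannot pin down the sources within one mixture.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 3, 10000
A = rng.standard_normal((n, n))
s = rng.standard_normal((n, T))   # i.i.d. real Gaussian sources
x = A @ s

# Whitening: the empirical output covariance becomes the identity...
Cx = x @ x.T / T
W = np.linalg.inv(np.linalg.cholesky(Cx))
y = W @ x

# ...and it stays the identity after ANY orthogonal rotation Q, so every
# rotated unmixing Q @ W explains the second-order statistics equally well.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
z = Q @ y
```

    The paper's point is that this rotation ambiguity is generically resolved once the mixtures are coupled in the IVA framework.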

    Approximate joint diagonalization with Riemannian optimization on the general linear group

    We consider the classical problem of approximate joint diagonalization of matrices, which can be cast as an optimization problem on the general linear group. We propose a versatile Riemannian optimization framework for solving this problem, unifying existing methods and creating new ones. We use two standard Riemannian metrics (left- and right-invariant metrics) that have opposite features regarding the structure of solutions and the model. We introduce the Riemannian optimization tools (gradient, retraction, vector transport) in this context for the two standard non-degeneracy constraints (oblique and non-holonomic constraints). We also develop tools beyond the classical Riemannian optimization framework to handle the non-Riemannian quotient manifold induced by the non-holonomic constraint with the right-invariant metric. We illustrate our theoretical developments with numerical experiments on both simulated data and a real electroencephalographic recording.
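    The underlying criterion can be sketched in a few lines. The sketch below is my own minimal baseline, not the paper's method: it minimizes the sum of squared off-diagonal entries of the transformed matrices by plain Euclidean gradient descent. The paper instead works with Riemannian gradients under left-/right-invariant metrics and the oblique or non-holonomic constraints, which in particular rule out the trivial minimum B = 0 that the unconstrained criterion admits.

```python
import numpy as np

rng = np.random.default_rng(2)
n, K = 4, 10
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))                 # hidden joint diagonalizer
C = [Q @ np.diag(rng.uniform(1, 5, n)) @ Q.T for _ in range(K)]  # target matrices

def off(M):
    """Zero out the diagonal of M."""
    return M - np.diag(np.diag(M))

def cost(B):
    """Sum of squared off-diagonal entries of B C_k B^T over all k."""
    return sum(np.linalg.norm(off(B @ Ck @ B.T)) ** 2 for Ck in C)

# Euclidean gradient of the criterion: 4 * sum_k off(B C_k B^T) B C_k.
B = np.eye(n)
step = 1e-4
history = [cost(B)]
for _ in range(500):
    grad = 4 * sum(off(B @ Ck @ B.T) @ B @ Ck for Ck in C)
    B = B - step * grad
    history.append(cost(B))
```

    Even this naive descent reduces the off-diagonal energy on exactly jointly diagonalizable matrices; the framework in the paper supplies the geometry (metric, retraction, vector transport) needed to do this properly on the constrained sets.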

    Cramér-Rao Bounds for Complex-Valued Independent Component Extraction: Determined and Piecewise Determined Mixing Models

    This paper presents the Cramér-Rao lower bound (CRLB) for the complex-valued Blind Source Extraction (BSE) problem, based on the assumption that the target signal is independent of the other signals. Two instantaneous mixing models are considered. First, we consider the standard determined mixing model used in Independent Component Analysis (ICA), where the mixing matrix is square and non-singular and the number of latent sources equals the number of observed signals. We compute the CRLB for Independent Component Extraction (ICE), where the mixing matrix is re-parameterized in order to extract only one independent target source. The target source is assumed to be non-Gaussian or non-circular Gaussian, while the other signals (background) are circular Gaussian or non-Gaussian. The results confirm some previous observations known for the real domain and bring new results for the complex domain. Also, the CRLB for ICE is shown to coincide with that for ICA when the non-Gaussianity of the background is taken into account. Second, we extend the CRLB analysis to piecewise determined mixing models. Here, the observed signals are assumed to obey the determined mixing model within short blocks, where the mixing matrices can vary from block to block; however, either the mixing vector or the separating vector corresponding to the target source is assumed to be constant across the blocks. The CRLBs for the parameters of these models provide new performance bounds for the BSE problem. (25 pages, 8 figures)
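    The generic CRLB recipe behind such results is worth recalling: invert the Fisher information matrix, and an efficient estimator attains the resulting variance bound. The toy example below is mine, not the paper's complex-valued ICE bound: for the mean of N i.i.d. real Gaussian samples, the Fisher information is N/σ², so the CRLB is σ²/N, and the sample mean attains it.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma2, N, trials = 2.0, 50, 4000

# CRLB for the mean of N i.i.d. Gaussian samples: FIM = N / sigma^2,
# so the bound on the estimator variance is its inverse.
crlb = sigma2 / N

# The sample mean is efficient here: its Monte Carlo variance sits at the bound.
estimates = rng.normal(0.0, np.sqrt(sigma2), size=(trials, N)).mean(axis=1)
empirical_var = estimates.var()
```

    The paper carries out the same program for the far more structured ICE parameterizations, where the FIM is a block matrix over the mixing/separating parameters.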

    A Generalization to Schur's Lemma with an Application to Joint Independent Subspace Analysis

    This paper makes a threefold contribution. First, it introduces a generalization of Schur's lemma of 1905 on irreducible representations. Second, it provides a comprehensive uniqueness analysis of a recently introduced source separation model. Third, it reinforces the link between signal processing and representation theory, a field of algebra that is more often associated with quantum mechanics than with signal processing. The source separation model that this paper relies on performs joint independent subspace analysis (JISA) using second-order statistics. In previous work, we derived the Fisher information matrix (FIM) that corresponds to this model. The uniqueness analysis in this paper is based on analyzing the FIM, where the core of the derivation rests on our proposed generalization of Schur's lemma. We provide proofs of both the new lemma and the uniqueness conditions. From a different perspective, the generalization of Schur's lemma is inspired by a coupled matrix block-diagonalization problem that arises from the JISA model. The results in this paper generalize previous results on the identifiability of independent vector analysis (IVA). This paper complements previously known results on the uniqueness of joint block diagonalization (JBD) and block term decompositions (BTD), as well as of their coupled counterparts.
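    The classical 1905 lemma that the paper generalizes can be checked numerically in a small case (a sketch of the classical statement only, not of the paper's generalization; the representation choice is my own): any matrix commuting with every element of an irreducible representation is a scalar multiple of the identity. Here the commutant of the irreducible 2-D standard representation of S3 is computed as the nullspace of a linear system and comes out one-dimensional, spanned by the identity.

```python
import numpy as np

# Generators of the 2-D (standard) representation of S3: a rotation by
# 120 degrees and a reflection. This representation is irreducible.
c, s = -0.5, np.sqrt(3) / 2
R = np.array([[c, -s], [s, c]])
F = np.array([[1.0, 0.0], [0.0, -1.0]])

def commutation_constraints(A):
    """Rows of the linear system vec(M A - A M) = 0 (column-major vec)."""
    I = np.eye(2)
    return np.kron(A.T, I) - np.kron(I, A)

sys_mat = np.vstack([commutation_constraints(R), commutation_constraints(F)])
_, svals, Vt = np.linalg.svd(sys_mat)
null_dim = int(np.sum(svals < 1e-10))   # dimension of the commutant
M = Vt[-1].reshape(2, 2, order="F")     # a basis vector of the commutant
```

    Schur's lemma predicts `null_dim == 1` with `M` proportional to the identity; the paper's JISA analysis needs a generalized, coupled block-structured version of this statement.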