
    Joint Independent Subspace Analysis Using Second-Order Statistics

    This paper deals with a novel generalization of classical blind source separation (BSS) in two directions. First, it relaxes the constraint that the latent sources must be statistically independent; this generalization is well known and sometimes termed independent subspace analysis (ISA). Second, it jointly analyzes several ISA problems, where the link is due to statistical dependence among corresponding sources in different mixtures. When the data are one-dimensional, i.e., multiple classical BSS problems, this model, known as independent vector analysis (IVA), has already been studied. In this paper, we combine IVA with ISA and term the new model joint independent subspace analysis (JISA). We provide a full performance analysis of JISA, including closed-form expressions for the minimal mean square error (MSE), Fisher information, and Cramér-Rao lower bound in the separation of Gaussian data. The derived MSE applies also to non-Gaussian data when only second-order statistics are used. We generalize previously known results on IVA, including its ability to uniquely resolve instantaneous mixtures of real Gaussian stationary data and to have the same arbitrary permutation at all mixtures. Numerical experiments validate our theoretical results and show the gain with respect to two competing approaches that either use a finer block partition or a different norm.
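The ISA structure at the heart of this model (dependence within a source block, independence across blocks) can be illustrated with a toy simulation. The following is a minimal numpy sketch with made-up covariances, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100_000
# two independent 2-D source subspaces (ISA: dependence within a block only);
# the within-block covariances C1, C2 are arbitrary illustrative choices
C1 = np.array([[1.0, 0.6], [0.6, 1.0]])
C2 = np.array([[1.0, -0.4], [-0.4, 1.0]])
S = np.vstack([
    rng.multivariate_normal(np.zeros(2), C1, T).T,
    rng.multivariate_normal(np.zeros(2), C2, T).T,
])
A = rng.standard_normal((4, 4))   # square mixing matrix
X = A @ S                          # observed mixture

Rs = S @ S.T / T                   # empirical source covariance
# cross-block covariance is near zero: Rs is (approximately) block diagonal
print(np.abs(Rs[:2, 2:]).max() < 0.05)
```

Second-order separation then amounts to finding a demixing matrix that restores this block-diagonal covariance structure from `X`.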

    Joint Independent Subspace Analysis: A Quasi-Newton Algorithm

    In this paper, we present a quasi-Newton (QN) algorithm for joint independent subspace analysis (JISA). JISA is a recently proposed generalization of independent vector analysis (IVA). JISA extends classical blind source separation (BSS) to jointly resolve several BSS problems by exploiting statistical dependence between latent sources across mixtures, as well as by relaxing the assumption of statistical independence within each mixture. Algebraically, JISA based on second-order statistics amounts to coupled block diagonalization of a set of covariance and cross-covariance matrices, as well as block diagonalization of a single permuted covariance matrix. The proposed QN algorithm asymptotically achieves the minimal mean square error (MMSE) in the separation of multidimensional Gaussian components. Numerical experiments demonstrate the convergence and source separation properties of the proposed algorithm.
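As a sanity check on the coupled block-diagonalization viewpoint, one can verify that with oracle demixing matrices the cross-covariance between two demixed datasets becomes (block) diagonal. This is an illustrative numpy sketch, not the quasi-Newton algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200_000
# shared latent signal z links corresponding sources across the two datasets
z = rng.standard_normal((4, T))
S1 = z + 0.3 * rng.standard_normal((4, T))
S2 = z + 0.3 * rng.standard_normal((4, T))
A1, A2 = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
X1, X2 = A1 @ S1, A2 @ S2          # two observed mixtures

W1, W2 = np.linalg.inv(A1), np.linalg.inv(A2)   # oracle demixing matrices
R12 = (W1 @ X1) @ (W2 @ X2).T / T  # cross-covariance after demixing
# with 2-D blocks {0,1} and {2,3}, the off-block entries should vanish
print(np.abs(R12[:2, 2:]).max() < 0.05)
```

A JISA algorithm estimates `W1, W2` directly from `X1, X2` by driving such covariance and cross-covariance matrices toward this block-diagonal form.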

    Joint Analysis of Multiple Datasets by Cross-Cumulant Tensor (Block) Diagonalization

    In this paper, we propose approximate diagonalization of a cross-cumulant tensor as a means to achieve independent component analysis (ICA) in several linked datasets. This approach generalizes existing cumulant-based independent vector analysis (IVA). In certain scenarios, it leads to uniqueness, identifiability, and resilience to noise that exceed those in the literature. The proposed method can achieve blind identification of underdetermined mixtures when single-dataset cumulant-based methods that use the same order of statistics fall short. In addition, it is possible to analyse more than two datasets in a single tensor factorization. The proposed approach readily extends to independent subspace analysis (ISA) by tensor block diagonalization. It can be used as-is or as an ingredient in various data fusion frameworks, using coupled decompositions. The core idea can be used to generalize existing ICA methods from one dataset to an ensemble.
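To make the cross-cumulant tensor concrete, the sketch below estimates the fourth-order cross-cumulant tensor between two datasets that share the same non-Gaussian sources; `cross_cumulant` is a made-up helper name, and the formula is the standard fourth-order joint cumulant for zero-mean data. For independent sources the tensor is approximately diagonal, which is what diagonalization-based methods exploit:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 100_000
N = 3
S = rng.laplace(size=(N, T))       # non-Gaussian sources (nonzero kurtosis)
S -= S.mean(axis=1, keepdims=True)
Y1, Y2 = S, S                      # same sources observed in two linked datasets

def cross_cumulant(y1, y2):
    """C[i,j,k,l] = cum(y1_i, y1_j, y2_k, y2_l) for zero-mean data."""
    T = y1.shape[1]
    E4 = np.einsum('it,jt,kt,lt->ijkl', y1, y1, y2, y2) / T
    R11, R22, R12 = y1 @ y1.T / T, y2 @ y2.T / T, y1 @ y2.T / T
    return (E4
            - np.einsum('ij,kl->ijkl', R11, R22)
            - np.einsum('ik,jl->ijkl', R12, R12)
            - np.einsum('il,jk->ijkl', R12, R12))

C = cross_cumulant(Y1, Y2)
# only "diagonal" entries C[i,i,i,i] are far from zero for independent sources
print(C[0, 0, 0, 0] > 5, abs(C[0, 1, 0, 1]) < 0.5)
```

Mixing the sources fills in the off-diagonal entries, and recovering the mixture amounts to (approximately) re-diagonalizing this tensor.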

    Joint Blind Source Separation of Multidimensional Components: Model and Algorithm

    This paper deals with joint blind source separation (JBSS) of multidimensional components. JBSS extends classical BSS to simultaneously resolve several BSS problems by assuming statistical dependence between latent sources across mixtures. JBSS offers significant advantages over BSS, such as the ability to identify more than one Gaussian white stationary source within a mixture. Multidimensional BSS extends classical BSS to a more general and more flexible model within each mixture: the sources can be partitioned into groups that exhibit dependence within a group but independence between different groups. Motivated by various applications, we present a model inspired by both extensions. We derive an algorithm that asymptotically achieves the minimal mean square error (MMSE) in the estimation of Gaussian multidimensional components. We demonstrate the superior performance of this model over a two-step approach in which JBSS, ignoring the multidimensional structure, is followed by a clustering step.
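The two-step baseline can be mocked up as follows: given 1-D components already recovered by a separation step, the clustering step groups components by the magnitude of their correlation. This is an illustrative numpy sketch with invented signals, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 50_000
# 4 recovered 1-D components; components {0,2} and {1,3} actually
# form two 2-D multidimensional components (correlated pairs)
z = rng.standard_normal((2, T))
Y = np.vstack([z[0], z[1],
               0.8 * z[0] + 0.6 * rng.standard_normal(T),
               0.8 * z[1] + 0.6 * rng.standard_normal(T)])
R = np.corrcoef(Y)
# clustering step: group components whose |correlation| exceeds a threshold
groups = (np.abs(R) > 0.5).astype(int)
print(groups[0, 2] == 1, groups[0, 1] == 0)  # {0,2} grouped, {0,1} not
```

The point of the joint model is that this grouping is built into the estimation itself rather than recovered by such a post hoc thresholding step, which can fail when cross-component correlations are weak or noisy.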