24 research outputs found

    Robust Iterative Fitting of Multilinear Models


    Joint Tensor Factorization and Outlying Slab Suppression with Applications

    We consider factoring low-rank tensors in the presence of outlying slabs. This problem is important in practice, because data collected in many real-world applications, such as speech, fluorescence, and some social network data, fit this paradigm. Prior work tackles this problem by iteratively selecting a fixed number of slabs and fitting, a procedure which may not converge. We formulate this problem from a group-sparsity-promoting point of view, and propose an alternating optimization framework to handle the corresponding ℓ_p (0 < p ≤ 1) minimization-based low-rank tensor factorization problem. The proposed algorithm features per-iteration complexity similar to that of the plain trilinear alternating least squares (TALS) algorithm. Convergence of the proposed algorithm is also easy to analyze under the framework of alternating optimization and its variants. In addition, regularization and constraints can be easily incorporated to make use of a priori information on the latent loading factors. Simulations and real-data experiments on blind speech separation, fluorescence data analysis, and social network mining showcase the effectiveness of the proposed algorithm.
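    The slab-downweighting idea described above can be sketched as an IRLS-style variant of trilinear ALS in NumPy. This is one illustrative reading of the approach, not the paper's exact algorithm: the function name `robust_tals`, the random initialization, and the reweighting rule w_k = (||residual slab k||_F^2 + eps)^(p/2 - 1) are assumptions made for the sketch.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of U (m x R) and V (n x R)."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def robust_tals(X, R, p=0.5, n_iter=30, eps=1e-6, seed=0):
    """Illustrative slab-reweighted trilinear ALS (an assumed sketch, not
    the paper's algorithm). Frontal slabs X[:, :, k] with large residuals
    get small weights w_k = (||R_k||_F^2 + eps)^(p/2 - 1), a standard
    IRLS surrogate for a group-sparse lp objective on slab residuals."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    w = np.ones(K)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        # C-update: each slab contributes one independent row of C, so
        # the per-slab weights cancel in this subproblem.
        X3 = X.transpose(2, 0, 1).reshape(K, I * J)   # row k = vec(X[:,:,k])
        C = np.linalg.lstsq(khatri_rao(A, B), X3.T, rcond=None)[0].T
        # A-update: weight every unfolded column from slab k by sqrt(w_k).
        X1 = X.reshape(I, J * K)                      # column index = j*K + k
        wt = np.tile(sw, J)
        M = khatri_rao(B, C)
        A = np.linalg.lstsq(M * wt[:, None], (X1 * wt).T, rcond=None)[0].T
        # B-update, with the same slab weighting.
        X2 = X.transpose(1, 0, 2).reshape(J, I * K)   # column index = i*K + k
        wt = np.tile(sw, I)
        M = khatri_rao(A, C)
        B = np.linalg.lstsq(M * wt[:, None], (X2 * wt).T, rcond=None)[0].T
        # Recompute the slab weights from the current residuals.
        fit = np.einsum('ir,jr,kr->ijk', A, B, C)
        r2 = ((X - fit) ** 2).sum(axis=(0, 1))
        w = (r2 + eps) ** (p / 2 - 1)
    return A, B, C, w
```

    On synthetic low-rank data with one corrupted frontal slab, the returned weights single out the outlying slab (its weight collapses while clean slabs keep large weights), which is the group-sparsity effect the abstract refers to.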

    A distributionally robust linear receiver design for multi-access space-time block coded MIMO systems

    A receiver design problem for multi-access space-time block coded multiple-input multiple-output (MIMO) systems is considered. To hedge against the mismatch between the true and the estimated channel state information (CSI), several robust receivers have been developed in the past decades. Among these receivers, the Gaussian robust receiver has been shown to be superior in performance. This receiver is designed under the assumption that the CSI mismatch follows a Gaussian distribution. However, in real-world applications, the assumption of Gaussianity might not hold. Motivated by this fact, a more general distributionally robust receiver is proposed in this paper, where only the mean and the variance of the CSI mismatch distribution are required in the receiver design. A tractable semi-definite programming (SDP) reformulation of the robust receiver design is developed. To suppress self-interference, a more advanced distributionally robust receiver is proposed; a tight convex approximation is given and the corresponding tractable SDP reformulation is developed. Moreover, for the sake of easy implementation, we present a simplified distributionally robust receiver. Simulation results, compared against some existing well-known receivers, show the effectiveness of our design.
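    A small identity helps explain why only the mean and the (co)variance of the CSI mismatch are needed: the expectation of any quadratic cost is fully determined by those two moments, for every distribution sharing them. The snippet below is a generic illustration of that fact, not the paper's receiver design; the matrices Q, mu, and Sigma are made-up numbers.

```python
import numpy as np

def expected_quadratic(Q, mu, Sigma):
    """E[e^T Q e] for any mismatch e with mean mu and covariance Sigma:
    tr(Q Sigma) + mu^T Q mu. The identity holds for every distribution
    with these two moments, which is what makes moment-based
    (distributionally robust) formulations tractable."""
    return np.trace(Q @ Sigma) + mu @ Q @ mu

# Illustrative numbers (not from the paper): identity cost, zero-mean
# mismatch with covariance diag(1, 2, 3).
Q = np.eye(3)
mu = np.zeros(3)
Sigma = np.diag([1.0, 2.0, 3.0])
print(expected_quadratic(Q, mu, Sigma))   # 6.0
```

    Because this expectation is distribution-free given the moments, a design that optimizes it is automatically robust over the whole family of mismatch distributions with that mean and covariance.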

    Tensor Decompositions for Signal Processing Applications From Two-way to Multiway Component Analysis

    The widespread use of multi-sensor technology and the emergence of big datasets have highlighted the limitations of standard flat-view matrix models and the necessity to move towards more versatile data analysis tools. We show that higher-order tensors (i.e., multiway arrays) enable such a fundamental paradigm shift towards models that are essentially polynomial and whose uniqueness, unlike that of matrix methods, is guaranteed under very mild and natural conditions. Benefiting from the power of multilinear algebra as their mathematical backbone, data analysis techniques using tensor decompositions are shown to have great flexibility in the choice of constraints that match data properties, and to find more general latent components in the data than matrix-based methods. A comprehensive introduction to tensor decompositions is provided from a signal processing perspective, starting from the algebraic foundations, via basic Canonical Polyadic and Tucker models, through to advanced cause-effect and multi-view data analysis schemes. We show that tensor decompositions enable natural generalizations of some commonly used signal processing paradigms, such as canonical correlation and subspace techniques, signal separation, linear regression, feature extraction, and classification. We also cover computational aspects, and point out how ideas from compressed sensing and scientific computing may be used for addressing the otherwise unmanageable storage and manipulation problems associated with big datasets. The concepts are supported by illustrative real-world case studies illuminating the benefits of the tensor framework, as efficient and promising tools for modern signal processing, data analysis, and machine learning applications; these benefits also extend to vector/matrix data through tensorization.
    Keywords: ICA, NMF, CPD, Tucker decomposition, HOSVD, tensor networks, Tensor Train
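    The Canonical Polyadic (CP) model mentioned above, and the matricization identity that underlies its alternating least-squares fitting, can be checked in a few lines of NumPy. This is a minimal sketch of the standard model X[i,j,k] = Σ_r A[i,r] B[j,r] C[k,r]; the sizes and the unfolding convention (last mode varying slowest) are choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 3, 4, 5, 2
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Rank-R CP tensor: X[i,j,k] = sum_r A[i,r] B[j,r] C[k,r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Mode-1 unfolding (column index k*J + j) satisfies X_(1) = A (C ⊙ B)^T,
# where ⊙ is the column-wise Khatri-Rao product. This identity turns each
# factor update in ALS into an ordinary linear least-squares problem.
X1 = X.transpose(0, 2, 1).reshape(I, K * J)
CkB = np.einsum('kr,jr->kjr', C, B).reshape(K * J, R)   # C ⊙ B
print(np.allclose(X1, A @ CkB.T))   # True
```

    The same identity holds for the other two modes with the roles of A, B, C permuted, which is why CP fitting alternates between three structurally identical least-squares subproblems.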