
    Blind Source Separation with Compressively Sensed Linear Mixtures

    This work studies the problem of simultaneously separating and reconstructing signals from compressively sensed linear mixtures. We assume that all source signals share a common sparse representation basis. The approach combines classical Compressive Sensing (CS) theory with a linear mixing model and allows the mixtures to be sampled independently of each other. If samples are acquired in the time domain, this means that the sensors need not be synchronized. Since Blind Source Separation (BSS) from a linear mixture is only possible up to permutation and scaling, factoring out these ambiguities leads to a minimization problem on the so-called oblique manifold. We develop a geometric conjugate subgradient method that scales to large systems for solving the problem. Numerical results demonstrate the promising performance of the proposed algorithm compared to several state-of-the-art methods.
    Comment: 9 pages, 2 figures
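    A minimal Python sketch of the sensing model described in this abstract: each mixture of the sources is compressively sampled with its own measurement matrix, and candidate unmixing matrices are normalized onto the oblique manifold (unit-norm columns) to factor out the scaling ambiguity. All dimensions, variable names, and the choice of Gaussian measurement matrices are illustrative assumptions, not the paper's setup.

```python
# Sketch only: compressively sensed linear mixtures and the oblique-manifold
# normalization; dimensions and measurement matrices are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_mixtures, sig_len, n_samples = 3, 4, 256, 64

S = rng.standard_normal((n_sources, sig_len))        # source signals (rows)
A = rng.standard_normal((n_mixtures, n_sources))     # unknown mixing matrix
X = A @ S                                            # linear mixtures

# Each mixture gets its own measurement matrix, so the mixtures are sampled
# independently of each other and the sensors need not be synchronized.
Phis = [rng.standard_normal((n_samples, sig_len)) for _ in range(n_mixtures)]
Y = np.stack([Phi @ x for Phi, x in zip(Phis, X)])   # compressive measurements

def to_oblique(W):
    """Normalize columns to unit norm, i.e. project onto the oblique manifold."""
    return W / np.linalg.norm(W, axis=0, keepdims=True)

W0 = to_oblique(rng.standard_normal((n_mixtures, n_sources)))  # initial unmixing iterate
```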

    Riemannian Smoothing Gradient Type Algorithms for Nonsmooth Optimization Problem on Compact Riemannian Submanifold Embedded in Euclidean Space

    In this paper, we introduce the notion of generalized $\epsilon$-stationarity for a class of nonconvex and nonsmooth composite minimization problems on a compact Riemannian submanifold embedded in Euclidean space. To find a generalized $\epsilon$-stationary point, we develop a family of Riemannian gradient-type methods based on the Moreau envelope technique with a decreasing sequence of smoothing parameters, namely the Riemannian smoothing gradient and Riemannian smoothing stochastic gradient methods. We prove that the Riemannian smoothing gradient method has an iteration complexity of $\mathcal{O}(\epsilon^{-3})$ for reaching a generalized $\epsilon$-stationary point. To our knowledge, this is the best-known iteration complexity result for nonconvex and nonsmooth composite problems on manifolds. For the Riemannian smoothing stochastic gradient method, one can achieve an iteration complexity of $\mathcal{O}(\epsilon^{-5})$ for reaching a generalized $\epsilon$-stationary point. Numerical experiments are conducted to validate the superiority of our algorithms.
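    The Python sketch below illustrates the smoothing idea described above on the simplest compact submanifold, the unit sphere, with an $\ell_1$ nonsmooth term: the gradient of the Moreau envelope stands in for the nonsmooth part, and the smoothing parameter is decreased across iterations. The manifold, objective, step size, and parameter schedule are assumptions made for illustration, not the paper's algorithmic details.

```python
# Sketch of a Riemannian smoothing gradient iteration on the unit sphere.
# Nonsmooth term: ||x||_1, smoothed via its Moreau envelope with parameter mu.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def grad_moreau_l1(x, mu):
    """Gradient of the Moreau envelope of ||.||_1: (x - prox_{mu||.||_1}(x)) / mu."""
    return (x - soft_threshold(x, mu)) / mu

def riemannian_smoothing_gradient(grad_f, x0, mu0=1.0, theta=0.9,
                                  step=1e-2, iters=1000):
    x = x0 / np.linalg.norm(x0)
    mu = mu0
    for _ in range(iters):
        g = grad_f(x) + grad_moreau_l1(x, mu)   # Euclidean gradient of smoothed objective
        rg = g - (x @ g) * x                    # project onto the tangent space of the sphere
        x = x - step * rg
        x /= np.linalg.norm(x)                  # retract back onto the sphere
        mu = max(theta * mu, 1e-8)              # decreasing sequence of smoothing parameters
    return x

# Illustrative use: a sparse-PCA-like problem, min_{||x||=1} -x'Mx + ||x||_1
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20)); M = M @ M.T
x_hat = riemannian_smoothing_gradient(lambda x: -2.0 * (M @ x), rng.standard_normal(20))
```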

    Smoothing algorithms for nonsmooth and nonconvex minimization over the Stiefel manifold

    We consider a class of nonsmooth and nonconvex optimization problems over the Stiefel manifold in which the objective function is the sum of a nonconvex smooth function and a nonsmooth Lipschitz continuous convex function composed with a linear mapping. We propose three numerical algorithms for solving this problem by combining smoothing methods with some existing algorithms for smooth optimization over the Stiefel manifold. In particular, we approximate the aforementioned nonsmooth convex function by its Moreau envelope in our smoothing methods, and we prove that the Moreau envelope has many favorable properties. Thanks to this and the scheme for updating the smoothing parameter, we show that any accumulation point of the solution sequence generated by the proposed algorithms is a stationary point of the original optimization problem. Numerical experiments on building a graph Fourier basis are conducted to demonstrate the efficiency of the proposed algorithms.
    Comment: 22 pages
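    As a rough illustration of the smoothing scheme described in this abstract, the Python sketch below performs one smoothed gradient step over the Stiefel manifold: the nonsmooth term ||A X||_1 is replaced by its Moreau envelope, and the iterate is mapped back onto the manifold with a QR retraction. The test problem, dimensions, and step size are assumptions, not the authors' experiments.

```python
# Sketch: one Moreau-envelope smoothing step on the Stiefel manifold St(n, p).
import numpy as np

def qr_retraction(X):
    """Retract onto the Stiefel manifold via a thin QR factorization."""
    Q, R = np.linalg.qr(X)
    return Q * np.sign(np.diag(R))  # fix column signs (assumes a nonzero diagonal)

def proj_tangent_stiefel(X, G):
    """Project G onto the tangent space of the Stiefel manifold at X."""
    XtG = X.T @ G
    return G - X @ (XtG + XtG.T) / 2.0

def smoothing_step(X, grad_f, A, mu, step):
    """One Riemannian gradient step on f(X) + Moreau envelope of ||A X||_1."""
    AX = A @ X
    prox = np.sign(AX) * np.maximum(np.abs(AX) - mu, 0.0)    # prox of mu*||.||_1
    G = grad_f(X) + A.T @ ((AX - prox) / mu)                  # smoothed Euclidean gradient
    return qr_retraction(X - step * proj_tangent_stiefel(X, G))

# Illustrative use with a quadratic smooth term trace(X' C X)
rng = np.random.default_rng(2)
n, p = 30, 5
C = rng.standard_normal((n, n)); C = C + C.T
A = rng.standard_normal((40, n))
X = qr_retraction(rng.standard_normal((n, p)))
X = smoothing_step(X, grad_f=lambda X: 2.0 * (C @ X), A=A, mu=1e-1, step=1e-2)
```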