    A Nonconvex Splitting Method for Symmetric Nonnegative Matrix Factorization: Convergence Analysis and Optimality

    Symmetric nonnegative matrix factorization (SymNMF) has important applications in data analytics problems such as document clustering, community detection and image segmentation. In this paper, we propose a novel nonconvex variable splitting method for solving SymNMF. The proposed algorithm is guaranteed to converge to the set of Karush-Kuhn-Tucker (KKT) points of the nonconvex SymNMF problem. Furthermore, it achieves a global sublinear convergence rate. We also show that the algorithm can be efficiently implemented in parallel. Further, sufficient conditions are provided which guarantee the global and local optimality of the obtained solutions. Extensive numerical results on both synthetic and real data sets suggest that the proposed algorithm converges quickly to a local minimum solution. Comment: IEEE Transactions on Signal Processing (to appear).
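
    As a concrete illustration of the variable-splitting idea, here is a minimal NumPy sketch: rewrite min_{H >= 0} ||M - H H^T||_F^2 with an auxiliary factor W penalized toward H, and alternate closed-form updates on the two blocks. The penalty parameter rho and the simple projected H-update are illustrative choices, not the exact algorithm analyzed in the paper.

        # A minimal sketch of a variable-splitting scheme for SymNMF,
        # min_{H >= 0} ||M - H H^T||_F^2. The penalty rho and the projected
        # updates below are illustrative, not the paper's exact method.
        import numpy as np

        def symnmf_split(M, k, rho=1.0, iters=200, seed=0):
            rng = np.random.default_rng(seed)
            n = M.shape[0]
            H = np.abs(rng.standard_normal((n, k)))
            W = H.copy()
            I = np.eye(k)
            for _ in range(iters):
                # W-update: exact minimizer of ||M - W H^T||^2 + rho ||W - H||^2
                W = (M @ H + rho * H) @ np.linalg.inv(H.T @ H + rho * I)
                # H-update: symmetric role, then project onto the nonnegative orthant
                H = (M.T @ W + rho * W) @ np.linalg.inv(W.T @ W + rho * I)
                H = np.maximum(H, 0.0)
            return H

        # Usage on a small synthetic similarity matrix:
        A = np.random.rand(30, 5)
        M = A @ A.T
        H = symnmf_split(M, k=5)
        print(np.linalg.norm(M - H @ H.T) / np.linalg.norm(M))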

    Algorithms for Positive Semidefinite Factorization

    This paper considers the problem of positive semidefinite factorization (PSD factorization), a generalization of exact nonnegative matrix factorization. Given an m-by-n nonnegative matrix X and an integer k, the PSD factorization problem consists in finding, if possible, symmetric k-by-k positive semidefinite matrices {A^1, ..., A^m} and {B^1, ..., B^n} such that X_{i,j} = trace(A^i B^j) for i = 1, ..., m and j = 1, ..., n. PSD factorization is NP-hard. In this work, we introduce several local optimization schemes to tackle this problem: a fast projected gradient method and two algorithms based on the coordinate descent framework. The main application of PSD factorization is the computation of semidefinite extensions, that is, the representation of polyhedra as projections of spectrahedra, for which the matrix to be factorized is the slack matrix of the polyhedron. We compare the performance of our algorithms on this class of problems. In particular, we compute the PSD extensions of size k = 1 + ⌈log_2(n)⌉ for the regular n-gons when n = 5, 8 and 10. We also show how to generalize our algorithms to compute the square root rank (the size of the factors in a PSD factorization where all factor matrices A^i and B^j have rank one) and completely PSD factorizations (the special case where the input matrix is symmetric and the equality A^i = B^i is required for all i). Comment: 21 pages, 3 figures, 3 tables.
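
    A plain projected gradient sketch of the objective sum_{i,j} (X_{i,j} - trace(A^i B^j))^2 is given below; the PSD projection clips negative eigenvalues after symmetrizing. The fixed step size and the unaccelerated alternating scheme are assumptions for illustration, not the paper's fast projected gradient method or its coordinate descent variants.

        # Hedged sketch of projected gradient for PSD factorization:
        # find PSD {A^i}, {B^j} with X[i, j] ~ trace(A^i B^j).
        import numpy as np

        def project_psd(S):
            # Project a matrix onto the PSD cone by clipping eigenvalues.
            w, V = np.linalg.eigh((S + S.T) / 2)
            return (V * np.maximum(w, 0)) @ V.T

        def psd_factorize(X, k, step=1e-2, iters=500, seed=0):
            rng = np.random.default_rng(seed)
            m, n = X.shape
            A = np.stack([project_psd(rng.standard_normal((k, k))) for _ in range(m)])
            B = np.stack([project_psd(rng.standard_normal((k, k))) for _ in range(n)])
            for _ in range(iters):
                E = np.einsum('ikl,jlk->ij', A, B) - X   # trace(A^i B^j) - X_ij
                gA = 2 * np.einsum('ij,jkl->ikl', E, B)  # gradient w.r.t. each A^i
                A = np.stack([project_psd(A[i] - step * gA[i]) for i in range(m)])
                E = np.einsum('ikl,jlk->ij', A, B) - X
                gB = 2 * np.einsum('ij,ikl->jkl', E, A)  # gradient w.r.t. each B^j
                B = np.stack([project_psd(B[j] - step * gB[j]) for j in range(n)])
            return A, B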

    Two Algorithms for Orthogonal Nonnegative Matrix Factorization with Application to Clustering

    Approximate matrix factorization techniques with both nonnegativity and orthogonality constraints, referred to as orthogonal nonnegative matrix factorization (ONMF), have been recently introduced and shown to work remarkably well for clustering tasks such as document classification. In this paper, we introduce two new methods to solve ONMF. First, we show a mathematical equivalence between ONMF and a weighted variant of spherical k-means, from which we derive our first method, a simple EM-like algorithm. This also allows us to determine when ONMF should be preferred to k-means and spherical k-means. Our second method is based on an augmented Lagrangian approach. Standard ONMF algorithms typically enforce nonnegativity for their iterates while trying to achieve orthogonality in the limit (e.g., using a proper penalization term or a suitably chosen search direction). Our method works the opposite way: orthogonality is strictly imposed at each step while nonnegativity is obtained asymptotically, using a quadratic penalty. Finally, we show that the two proposed approaches compare favorably with standard ONMF algorithms on synthetic, text and image data sets. Comment: 17 pages, 8 figures. New numerical experiments (document and synthetic data sets).
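
    The ONMF / weighted spherical k-means connection can be made concrete with a small EM-like loop: each data column is assigned to exactly one unit-norm "centroid" column of W, and that single nonzero per column is exactly what makes the rows of H orthogonal. The assignment scores and centroid weighting below are simplified assumptions, not the paper's exact derivation.

        # EM-like sketch of ONMF via a weighted spherical k-means view.
        import numpy as np

        def onmf_em(M, r, iters=100, seed=0):
            rng = np.random.default_rng(seed)
            m, n = M.shape
            W = np.abs(rng.standard_normal((m, r)))
            W /= np.linalg.norm(W, axis=0)           # unit-norm "centroids"
            H = np.zeros((r, n))
            for _ in range(iters):
                # E-step: assign each column of M to its best centroid.
                scores = W.T @ M                      # (r, n) inner products
                labels = np.argmax(scores, axis=0)
                H[:] = 0.0
                H[labels, np.arange(n)] = scores[labels, np.arange(n)]
                # M-step: recompute each centroid from its assigned columns.
                for j in range(r):
                    cols = M[:, labels == j]
                    if cols.size:
                        c = cols @ H[j, labels == j]  # weighted sum of members
                        nc = np.linalg.norm(c)
                        if nc > 0:
                            W[:, j] = c / nc
            return W, H   # rows of H are orthogonal: H @ H.T is diagonal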

    A multilevel approach for nonnegative matrix factorization

    Nonnegative Matrix Factorization (NMF) is the problem of approximating a nonnegative matrix with the product of two low-rank nonnegative matrices and has been shown to be particularly useful in many applications, e.g., in text mining, image processing, computational biology, etc. In this paper, we explain how algorithms for NMF can be embedded into the framework of multilevel methods in order to accelerate their convergence. This technique can be applied in situations where the data admit a good approximate representation in a lower-dimensional space through linear transformations preserving nonnegativity. A simple multilevel strategy is described and is experimentally shown to significantly speed up three popular NMF algorithms (alternating nonnegative least squares, multiplicative updates and hierarchical alternating least squares) on several standard image datasets.
    Keywords: nonnegative matrix factorization, algorithms, multigrid and multilevel methods, image processing
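
    For intuition, here is a rough two-level sketch on vectorized square images: restrict the data with a nonnegativity-preserving linear map (2-by-2 pixel averaging), run a cheap NMF at the coarse level, and prolong the basis back as a warm start for the fine level. The inner solver uses standard multiplicative updates (one of the three algorithms mentioned); the restriction operator and iteration counts are illustrative choices, not the paper's exact setup.

        # Two-level NMF sketch for images of shape (side, side), side even.
        import numpy as np

        def mu_nmf(M, W, H, iters):
            # Lee-Seung multiplicative updates for ||M - W H||_F^2.
            eps = 1e-12
            for _ in range(iters):
                H *= (W.T @ M) / (W.T @ W @ H + eps)
                W *= (M @ H.T) / (W @ (H @ H.T) + eps)
            return W, H

        def multilevel_nmf(M, r, side, coarse_iters=100, fine_iters=30, seed=0):
            # M: (side*side, n) matrix of vectorized square images.
            rng = np.random.default_rng(seed)
            n = M.shape[1]
            # Restriction: average 2x2 pixel blocks (linear, keeps nonnegativity).
            imgs = M.T.reshape(n, side, side)
            coarse = imgs.reshape(n, side // 2, 2, side // 2, 2).mean(axis=(2, 4))
            Mc = coarse.reshape(n, -1).T
            # Solve cheaply at the coarse level.
            Wc = np.abs(rng.standard_normal((Mc.shape[0], r)))
            Hc = np.abs(rng.standard_normal((r, n)))
            Wc, Hc = mu_nmf(Mc, Wc, Hc, coarse_iters)
            # Prolongation: replicate each coarse pixel into its 2x2 block.
            Wimgs = Wc.T.reshape(r, side // 2, side // 2)
            Wf = np.repeat(np.repeat(Wimgs, 2, axis=1), 2, axis=2).reshape(r, -1).T
            # Warm-started refinement at the fine level.
            return mu_nmf(M, Wf, Hc, fine_iters)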

    Using Underapproximations for Sparse Nonnegative Matrix Factorization

    Nonnegative Matrix Factorization (NMF) consists in (approximately) factorizing a nonnegative data matrix as the product of two low-rank nonnegative matrices. It has been successfully applied as a data analysis technique in numerous domains, e.g., text mining, image processing, microarray data analysis, collaborative filtering, etc. We introduce a novel approach to solve NMF problems, based on the use of an underapproximation technique, and show its effectiveness in obtaining sparse solutions. This approach, based on Lagrangian relaxation, allows the resolution of NMF problems in a recursive fashion. We also prove that the underapproximation problem is NP-hard for any fixed factorization rank, using a reduction from the maximum edge biclique problem in bipartite graphs. We test two variants of our underapproximation approach on several standard image datasets and show that they provide sparse part-based representations with low reconstruction error. Our results are comparable and sometimes superior to those obtained by two standard Sparse Nonnegative Matrix Factorization techniques. Comment: Version 2 removed the section about convex reformulations, which was not central to the development of our main results; added material to the introduction; added a review of previous related work (section 2.3); completely rewrote the last part (section 4) to provide extensive numerical results supporting our claims. Accepted in J. of Pattern Recognition.
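
    The recursive underapproximation idea can be sketched as follows: fit a rank-one term w h^T <= R to the current residual R via a simple Lagrangian scheme, subtract it (the residual stays nonnegative by construction), and recurse. The multiplier update schedule below is a simplified assumption, not the paper's exact method.

        # Sketch of recursive rank-one nonnegative matrix underapproximation.
        import numpy as np

        def rank_one_nmu(R, iters=200):
            m, n = R.shape
            w = R.mean(axis=1) + 1e-9
            h = R.mean(axis=0) + 1e-9
            L = np.zeros((m, n))                        # multipliers for w h^T <= R
            for t in range(1, iters + 1):
                A = R - L                               # Lagrangian "data" matrix
                h = np.maximum(0.0, (w @ A) / max(w @ w, 1e-12))
                w = np.maximum(0.0, (A @ h) / max(h @ h, 1e-12))
                # Raise multipliers where w h^T overshoots R (diminishing step 1/t).
                L = np.maximum(0.0, L + (np.outer(w, h) - R) / t)
            return w, h

        def nmu(M, r):
            R = M.copy()
            W = np.zeros((M.shape[0], r))
            H = np.zeros((r, M.shape[1]))
            for k in range(r):
                W[:, k], H[k] = rank_one_nmu(R)
                # Subtracting an underapproximation keeps the residual nonnegative.
                R = np.maximum(0.0, R - np.outer(W[:, k], H[k]))
            return W, H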