A multilevel approach for nonnegative matrix factorization
Nonnegative Matrix Factorization (NMF) is the problem of approximating a nonnegative matrix with the product of two low-rank nonnegative matrices, and it has been shown to be particularly useful in many applications, e.g., in text mining, image processing, and computational biology. In this paper, we explain how algorithms for NMF can be embedded into the framework of multilevel methods in order to accelerate their convergence. This technique can be applied in situations where data admit a good approximate representation in a lower-dimensional space through linear transformations preserving nonnegativity. A simple multilevel strategy is described and is experimentally shown to significantly speed up three popular NMF algorithms (alternating nonnegative least squares, multiplicative updates, and hierarchical alternating least squares) on several standard image datasets.
Keywords: nonnegative matrix factorization, algorithms, multigrid and multilevel methods, image processing
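As context for the abstract above, the multiplicative-updates baseline it mentions (one of the three accelerated algorithms) can be sketched in NumPy. This is the standard Lee-Seung update, not the paper's multilevel scheme; the function name and parameters are illustrative only:

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-10, seed=0):
    """Approximate nonnegative V (m x n) as W @ H, with W (m x r) and
    H (r x n), using Lee-Seung multiplicative updates. Each update keeps
    the factors nonnegative and does not increase the Frobenius error."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps  # random nonnegative initialization
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)  # update H with W fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)  # update W with H fixed
    return W, H
```

Because the updates are elementwise multiplications by nonnegative ratios, nonnegativity of W and H is preserved automatically, which is the property the multilevel transfer operators must also respect.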
ebnm: An R Package for Solving the Empirical Bayes Normal Means Problem Using a Variety of Prior Families
The empirical Bayes normal means (EBNM) model is important to many areas of
statistics, including (but not limited to) multiple testing, wavelet denoising,
multiple linear regression, and matrix factorization. There are several
existing software packages that can fit EBNM models under different prior
assumptions and using different algorithms; however, the differences across
interfaces complicate direct comparisons. Further, a number of important prior
assumptions do not yet have implementations. Motivated by these issues, we
developed the R package ebnm, which provides a unified interface for
efficiently fitting EBNM models using a variety of prior assumptions, including
nonparametric approaches. In some cases, we incorporated existing
implementations into ebnm; in others, we implemented new fitting procedures
with a focus on speed and numerical stability. To demonstrate the capabilities
of the unified interface, we compare results using different prior assumptions
in two extended examples: the shrinkage estimation of baseball statistics; and
the matrix factorization of genetics data (via the new R package flashier). In
summary, ebnm is a convenient and comprehensive package for performing EBNM
analyses under a wide range of prior assumptions.
Comment: 43 pages, 19 figures
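To make the EBNM model concrete, here is a minimal sketch of the simplest case: observations x_i ~ N(theta_i, s^2) with a zero-mean normal prior theta_i ~ N(0, tau^2), where tau^2 is estimated by marginal moment matching. The function name and estimator are illustrative assumptions, not the ebnm package's actual API:

```python
import numpy as np

def ebnm_normal(x, s):
    """Empirical Bayes normal means under a N(0, tau^2) prior with known
    standard error s. Marginally x_i ~ N(0, tau^2 + s^2), so tau^2 is
    estimated as mean(x^2) - s^2 (truncated at zero); posterior means
    shrink each observation toward zero."""
    tau2 = max(np.mean(x**2) - s**2, 0.0)  # moment-matched prior variance
    shrink = tau2 / (tau2 + s**2)          # posterior shrinkage factor in [0, 1]
    return shrink * x
```

Richer prior families (point-normal, unimodal nonparametric, etc.), as supported by ebnm, replace this closed-form shrinkage with prior-specific fitting procedures, but the interface idea is the same: data plus standard errors in, posterior summaries out.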
Multiplicative Updates for Nonnegative Quadratic Programming
Many problems in neural computation and statistical learning involve optimizations with nonnegativity constraints. In this article, we study convex problems in quadratic programming where the optimization is confined to an axis-aligned region in the nonnegative orthant. For these problems, we derive multiplicative updates that improve the value of the objective function at each iteration and converge monotonically to the global minimum. The updates have a simple closed form and do not involve any heuristics or free parameters that must be tuned to ensure convergence. Despite their simplicity, they differ strikingly in form from other multiplicative updates used in machine learning. We provide complete proofs of convergence for these updates and describe their application to problems in signal processing and pattern recognition.
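The closed-form multiplicative update this abstract describes can be sketched as follows for minimizing F(v) = (1/2) v'Av + b'v over v >= 0, splitting A into its positive part A+ and negative part A-; this is a sketch of the published update rule, and all identifier names are illustrative:

```python
import numpy as np

def nqp_multiplicative(A, b, n_iter=500, eps=1e-12, seed=0):
    """Minimize 0.5 * v'Av + b'v subject to v >= 0 via multiplicative
    updates. With a_i = (A+ v)_i and c_i = (A- v)_i, each component is
    rescaled by a nonnegative factor with a simple closed form, so no
    step size or other free parameter needs tuning."""
    Ap = np.maximum(A, 0.0)   # A+ : positive part of A
    Am = np.maximum(-A, 0.0)  # A- : negative part of A
    rng = np.random.default_rng(seed)
    v = rng.random(len(b)) + eps  # strictly positive initialization
    for _ in range(n_iter):
        a = Ap @ v
        c = Am @ v
        v *= (-b + np.sqrt(b**2 + 4.0 * a * c)) / (2.0 * a + eps)
    return v
```

Because each factor is nonnegative, the iterates never leave the nonnegative orthant, and components whose optimal value lies on the boundary are driven multiplicatively to zero rather than clipped.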