Conjugate gradient acceleration of iteratively re-weighted least squares methods
Iteratively Re-weighted Least Squares (IRLS) is a method for solving
minimization problems involving non-quadratic cost functions, possibly
non-convex and non-smooth, that can nevertheless be expressed as the infimum over a
family of quadratic functions. This transformation suggests an algorithmic
scheme that solves a sequence of quadratic problems to be tackled efficiently
by tools of numerical linear algebra. Its general scope and its usually simple
implementation, transforming the initial non-convex and non-smooth minimization
problem into a more familiar and easily solvable quadratic optimization
problem, make it a versatile algorithm. However, despite its simplicity,
versatility, and elegant analysis, the complexity of IRLS strongly depends on
the way the solution of the successive quadratic optimizations is addressed.
For the important special case of sparse
recovery problems in signal processing, we investigate theoretically and
numerically how accurately one needs to solve the quadratic problems by means
of the conjugate gradient (CG) method in each iteration in order to
guarantee convergence. The use of the CG method may significantly speed up the
numerical solution of the quadratic subproblems, in particular, when fast
matrix-vector multiplication (exploiting for instance the FFT) is available for
the matrix involved. In addition, we study convergence rates. Our modified IRLS
method outperforms state-of-the-art first-order methods such as Iterative Hard
Thresholding (IHT) and the Fast Iterative Soft-Thresholding Algorithm (FISTA) in
many situations, especially in large dimensions. Moreover, IRLS is often able
to recover sparse vectors from fewer measurements than required for IHT and
FISTA.
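As a hedged illustration of the scheme the abstract describes, the following sketch runs IRLS for sparse recovery with each weighted quadratic subproblem solved inexactly by CG. The function name, the smoothing-parameter schedule, and the least-norm initialization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def irls_l1(A, b, n_iter=50, eps=1.0):
    """Sketch of IRLS for min ||x||_1 subject to A x = b.

    Each outer iteration solves the weighted quadratic subproblem
        min sum_i x_i^2 / d_i  s.t.  A x = b,   d_i = sqrt(x_i^2 + eps^2),
    whose solution is x = D A^T y with (A D A^T) y = b.  The inner
    symmetric positive definite system is solved inexactly by CG.
    """
    m, n = A.shape
    # least-norm starting point x = A^T (A A^T)^{-1} b
    x = A.T @ np.linalg.lstsq(A @ A.T, b, rcond=None)[0]
    for _ in range(n_iter):
        d = np.sqrt(x**2 + eps**2)                  # D = diag(d)
        M = LinearOperator((m, m), matvec=lambda v: A @ (d * (A.T @ v)))
        y, _ = cg(M, b)                             # inexact inner solve
        x = d * (A.T @ y)
        eps = max(0.7 * eps, 1e-6)                  # heuristic smoothing decay
    return x
```

Because the inner matrix is only applied through matrix-vector products, a fast transform (e.g. an FFT-based operator) could be substituted for the dense products above, which is the setting in which the abstract reports the largest speed-ups.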
Expectation-maximization for logistic regression
We present a family of expectation-maximization (EM) algorithms for binary
and negative-binomial logistic regression, drawing a sharp connection with the
variational-Bayes algorithm of Jaakkola and Jordan (2000). Indeed, our results
allow a version of this variational-Bayes approach to be re-interpreted as a
true EM algorithm. We study several interesting features of the algorithm, and
of this previously unrecognized connection with variational Bayes. We also
generalize the approach to sparsity-promoting priors, and to an online method
whose convergence properties are easily established. This latter method
compares favorably with stochastic gradient descent in situations with marked
collinearity.
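A minimal sketch of the kind of EM update the abstract describes, built on the Jaakkola-Jordan quadratic bound (equivalently, expected Polya-Gamma weights). The function name, ridge term, and iteration count are illustrative assumptions rather than the authors' code.

```python
import numpy as np

def em_logistic(X, y, n_iter=200, ridge=1e-6):
    """Sketch of EM for logistic regression via the Jaakkola-Jordan bound.

    E-step: given psi = X beta, the expected latent weight is
        w_i = tanh(psi_i / 2) / (2 psi_i)   (limit 1/4 as psi_i -> 0).
    M-step: beta solves the weighted least-squares system
        (X^T W X + ridge * I) beta = X^T kappa,   kappa_i = y_i - 1/2.
    """
    n, p = X.shape
    kappa = y - 0.5
    beta = np.zeros(p)
    for _ in range(n_iter):
        psi = X @ beta
        w = np.full(n, 0.25)                        # limit value at psi = 0
        nz = np.abs(psi) > 1e-8
        w[nz] = np.tanh(psi[nz] / 2) / (2 * psi[nz])
        beta = np.linalg.solve(X.T @ (w[:, None] * X) + ridge * np.eye(p),
                               X.T @ kappa)
    return beta
```

Each M-step is an exact weighted least-squares solve, so every iteration increases the observed-data log-likelihood, which is the monotonicity property that distinguishes a true EM algorithm from a generic bound-optimization heuristic.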
PhasePack: A Phase Retrieval Library
Phase retrieval deals with the estimation of complex-valued signals solely
from the magnitudes of linear measurements. While there has been a recent
explosion in the development of phase retrieval algorithms, the lack of a
common interface has made it difficult to compare new methods against the
state-of-the-art. The purpose of PhasePack is to create a common software
interface for a wide range of phase retrieval algorithms and to provide a
common testbed using both synthetic data and empirical imaging datasets.
PhasePack is able to benchmark a large number of recent phase retrieval methods
against one another to generate comparisons using a range of different
performance metrics. The software package handles single method testing as well
as multiple method comparisons.
The algorithm implementations in PhasePack differ slightly from their
original descriptions in the literature in order to achieve faster speed and
improved robustness. In particular, PhasePack uses adaptive stepsizes,
line-search methods, and fast eigensolvers to speed up and automate
convergence.
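To make the problem class concrete, here is a hedged sketch of the classical Gerchberg-Saxton-style error-reduction iteration for recovering a signal from magnitude-only measurements. This illustrates the kind of algorithm PhasePack benchmarks; it is not PhasePack's own interface, and the function name and parameters are assumptions.

```python
import numpy as np

def error_reduction(A, b, n_iter=300, seed=0):
    """Error reduction for generic phase retrieval: find x with |A x| = b.

    Alternates between (1) keeping the current phases while imposing the
    measured magnitudes b, and (2) projecting back onto the range of A
    via a least-squares (pseudoinverse) step.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    Apinv = np.linalg.pinv(A)
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # random init
    for _ in range(n_iter):
        z = A @ x
        z = b * z / np.maximum(np.abs(z), 1e-12)   # impose magnitudes
        x = Apinv @ z                              # project onto range(A)
    return x
```

The measurement residual || |A x| - b || is non-increasing under this iteration, but the solution is only determined up to a global phase, which is one reason a common testbed with shared error metrics, as PhasePack provides, is needed for fair comparisons.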