A Coordinate Descent Approach to Atomic Norm Minimization
Atomic norm minimization is of great interest in various applications of
sparse signal processing including super-resolution line-spectral estimation
and signal denoising. In practice, atomic norm minimization (ANM) is formulated
as a semi-definite program (SDP), which is generally hard to solve. This
work introduces a low-complexity, matrix-free method for solving ANM. The
method uses the framework of coordinate descent and exploits the
sparsity-induced nature of atomic-norm regularization. Specifically, an
equivalent, non-convex formulation of ANM is first proposed. It is then proved
that applying the coordinate descent framework on the non-convex formulation
leads to convergence to the global optimal point. For the case of a single
measurement vector of length N in the discrete Fourier transform (DFT) basis, the
complexity of each iteration of the coordinate descent procedure is O(N log N),
rendering the proposed method efficient even for large-scale problems. The
proposed coordinate descent framework can be readily modified to solve a
variety of ANM problems, including multi-dimensional ANM with multiple
measurement vectors. It is easy to implement and can essentially be applied to
any atomic set as long as the corresponding rank-1 problem can be solved.
Through extensive numerical simulations, it is verified that for solving sparse
problems the proposed method is much faster than the alternating direction
method of multipliers (ADMM) or the customized interior-point SDP solver.
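The abstract does not spell out the paper's algorithm, which operates on a non-convex reformulation of ANM with rank-1 subproblems. As a hypothetical illustration of the coordinate descent framework it builds on, here is a minimal sketch of cyclic coordinate descent for the closely related lasso problem (all names and parameters below are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|.|: shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(A, y, lam, n_sweeps=50):
    """Cyclic coordinate descent for 0.5*||A x - y||^2 + lam*||x||_1.

    Each coordinate update is an exact one-dimensional minimization
    (a closed-form soft-thresholding step), mirroring the idea of
    solving a cheap subproblem per coordinate.
    """
    m, n = A.shape
    x = np.zeros(n)
    r = y.copy()                      # residual r = y - A x
    col_sq = np.sum(A ** 2, axis=0)   # squared column norms
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            # Exact minimizer over coordinate j with the others fixed.
            z = x[j] + A[:, j] @ r / col_sq[j]
            x_new = soft_threshold(z, lam / col_sq[j])
            r += A[:, j] * (x[j] - x_new)   # keep residual in sync
            x[j] = x_new
    return x
```

Each sweep touches one coordinate at a time and maintains the residual incrementally, which is the same structural trick that makes per-iteration costs low; in the paper's DFT setting the analogous inner computation can be carried out with the FFT.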
An Extragradient-Based Alternating Direction Method for Convex Minimization
In this paper, we consider the problem of minimizing the sum of two convex
functions subject to linear linking constraints. The classical alternating
direction type methods usually assume that the two convex functions have
relatively easy proximal mappings. However, many problems arising from
statistics, image processing and other fields have the structure that while one
of the two functions has easy proximal mapping, the other function is smoothly
convex but does not have an easy proximal mapping. Therefore, the classical
alternating direction methods cannot be applied. To deal with the difficulty,
we propose in this paper an alternating direction method based on
extragradients. Under the assumption that the smooth function has a Lipschitz
continuous gradient, we prove that the proposed method returns an
ε-optimal solution within O(1/ε) iterations. We apply the
proposed method to solve a new statistical model called fused logistic
regression. Our numerical experiments show that the proposed method performs
very well when solving the test problems. We also test the performance of the
proposed method through solving the lasso problem arising from statistics and
compare the result with several existing efficient solvers for this problem;
the results are very encouraging indeed.
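The abstract does not state the extragradient update it builds on. A generic sketch of the textbook extragradient scheme (a predictor gradient step followed by a corrector step using the gradient at the predicted point) applied to an unconstrained smooth convex problem might look like this; the function and step-size names are assumptions, and the paper's actual method interleaves such steps with alternating-direction updates:

```python
import numpy as np

def extragradient(grad, x0, step, n_iters=500):
    """Extragradient iteration for min f(x) with Lipschitz gradient.

    x_bar = x - step * grad(x)       (predictor / "extra" step)
    x     = x - step * grad(x_bar)   (corrector step)
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        x_bar = x - step * grad(x)    # look ahead along the gradient
        x = x - step * grad(x_bar)    # update using the look-ahead gradient
    return x
```

Evaluating the gradient at the predicted point rather than the current one is what gives extragradient-type methods their robustness, and it is also why each iteration needs two gradient evaluations instead of one.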