A Fast Interior Point Method for Atomic Norm Soft Thresholding
The atomic norm provides a generalization of the ℓ1-norm to continuous
parameter spaces. When applied as a sparse regularizer for line spectral
estimation, the solution can be obtained by solving a convex optimization
problem. This problem is known as atomic norm soft thresholding (AST). It can
be cast as a semidefinite program and solved by standard methods. In the
semidefinite formulation there is a large number of dual variables, which
complicates the implementation of a standard primal-dual interior-point method
based on symmetric cones. This has led researchers to consider the alternating
direction method of multipliers (ADMM) for solving AST, but ADMM is still
somewhat slow for large problem sizes. To obtain a faster algorithm we
reformulate AST as a non-symmetric conic program. This formulation has two
properties of key importance to its numerical solution: it has far fewer dual
variables, and the Toeplitz structure inherent to AST is preserved. Based on
this formulation we derive FastAST, a primal-dual interior-point method for
solving AST. Two variants are considered, with the faster one having a low
per-iteration flop count. Extensive numerical experiments demonstrate that
FastAST solves AST significantly faster than a state-of-the-art solver based on
ADMM.

Comment: 31 pages, accepted for publication in Elsevier Signal Processing
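As background for the "soft thresholding" in the name: in the discrete ℓ1 case that the atomic norm generalizes, the analogous denoising problem min_x ½‖x − y‖² + τ‖x‖₁ is solved in closed form by elementwise soft thresholding. A minimal NumPy sketch (the function name is ours, for illustration only):

```python
import numpy as np

def soft_threshold(y, tau):
    """Elementwise soft thresholding: the proximal operator of tau*||.||_1,
    i.e. the minimizer of 0.5*||x - y||^2 + tau*||x||_1."""
    return np.sign(y) * np.maximum(np.abs(y) - tau, 0.0)

# shrinks every entry toward zero by tau, setting small entries exactly to zero
print(soft_threshold(np.array([3.0, -0.5, 1.0]), 1.0))  # → [ 2. -0.  0.]
```

AST replaces the ℓ1-norm here with the atomic norm, whose evaluation requires the semidefinite (or conic) formulation discussed above rather than a closed-form shrinkage.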
Interior-point algorithms for convex optimization based on primal-dual metrics
We propose and analyse primal-dual interior-point algorithms for convex
optimization problems in conic form. The families of algorithms we analyse are
so-called short-step algorithms, and they match the current best iteration
complexity bounds for the primal-dual symmetric interior-point algorithms of
Nesterov and Todd, for symmetric cone programming problems with given
self-scaled barriers. Our results apply to any self-concordant barrier for any
convex cone. We also prove that certain specializations of our algorithms to
hyperbolic cone programming problems (which lie strictly between symmetric cone
programming and general convex optimization problems in terms of generality)
can take advantage of the favourable special structure of hyperbolic barriers.
We make new connections to Riemannian geometry, integrals over operator spaces,
Gaussian quadrature, and strengthen the connection of our algorithms to
quasi-Newton updates and hence first-order methods in general.

Comment: 36 pages
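To make the primal-dual path-following setting concrete, here is a hedged sketch of a basic damped-Newton method for the simplest conic case, linear programming over the nonnegative orthant. All names, parameter choices, and the stopping rule are ours, not from the paper; short-step analyses restrict the step and centering parameters much more carefully than this illustration does.

```python
import numpy as np

def lp_ipm(c, A, b, x, lam, s, sigma=0.2, tol=1e-9, max_iter=100):
    """Primal-dual path-following sketch for min c@x s.t. A@x = b, x >= 0,
    started from a strictly feasible triple (x, lam, s)."""
    m, n = A.shape
    for _ in range(max_iter):
        mu = x @ s / n                      # duality measure
        r_d = A.T @ lam + s - c             # dual residual
        r_p = A @ x - b                     # primal residual
        if mu < tol and np.linalg.norm(r_p) < tol and np.linalg.norm(r_d) < tol:
            break
        # Newton system for the perturbed KKT conditions (target: X S e = sigma*mu*e)
        K = np.block([
            [np.zeros((n, n)), A.T,              np.eye(n)],
            [A,                np.zeros((m, m)), np.zeros((m, n))],
            [np.diag(s),       np.zeros((n, m)), np.diag(x)],
        ])
        rhs = np.concatenate([-r_d, -r_p, sigma * mu - x * s])
        d = np.linalg.solve(K, rhs)
        dx, dlam, ds = d[:n], d[n:n + m], d[n + m:]
        # fraction-to-the-boundary rule keeps x and s strictly positive
        alpha = 1.0
        for v, dv in ((x, dx), (s, ds)):
            neg = dv < 0
            if np.any(neg):
                alpha = min(alpha, 0.99 * np.min(-v[neg] / dv[neg]))
        x, lam, s = x + alpha * dx, lam + alpha * dlam, s + alpha * ds
    return x, lam, s
```

For example, minimizing x1 + 2*x2 subject to x1 + x2 = 1, x >= 0, from the strictly feasible start x = (0.5, 0.5), lam = (0.5,), s = (0.5, 1.5), the iterates converge to the vertex (1, 0). The papers above generalize the complementarity target and Newton system from the orthant to symmetric, hyperbolic, and general convex cones via barrier functions.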
Generalization Of Primal-Dual Interior-Point Methods To Convex Optimization Problems In Conic Form
We generalize primal-dual interior-point methods for linear programming problems to convex optimization problems in conic form. Previously, the most comprehensive theory of symmetric primal-dual interior-point algorithms was given by Nesterov and Todd [8, 9] for feasible regions expressed as the intersection of a symmetric cone with an affine subspace. In our setting, we allow an arbitrary convex cone in place of the symmetric cone. Even though some of the impressive properties attained by the Nesterov-Todd algorithms are impossible in this general setting of convex optimization problems, we show that essentially all primal-dual interior-point algorithms for LP can be extended easily to the general setting. We provide three frameworks for primal-dual algorithms, each framework corresponding to a different level of sophistication in the algorithms. As the level of sophistication increases, we demand better formulations of the feasible solution sets, but our algorithms, in return, atta..
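The barriers underlying such frameworks are typically logarithmically homogeneous self-concordant barriers, which satisfy identities that primal-dual methods exploit. A small numerical check for the standard orthant barrier F(x) = -Σ log x_i (our illustrative choice, not a construction from the paper):

```python
import numpy as np

def barrier(x):
    """Standard logarithmically homogeneous barrier for the nonnegative
    orthant: F(x) = -sum(log x_i), with barrier parameter nu = n."""
    return -np.sum(np.log(x))

def grad(x):
    return -1.0 / x

def hess(x):
    return np.diag(1.0 / x**2)

x = np.array([0.5, 2.0, 1.0])
nu = len(x)
# identities implied by logarithmic homogeneity, F(t*x) = F(x) - nu*log(t):
assert np.isclose(x @ grad(x), -nu)          # <x, F'(x)> = -nu
assert np.allclose(hess(x) @ x, -grad(x))    # F''(x) x = -F'(x)
```

For a symmetric cone these identities combine with self-scaled structure to give the Nesterov-Todd scaling point; the generalization discussed above must work with the identities alone.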