Adaptive Smoothing Algorithms for Nonsmooth Composite Convex Minimization
We propose an adaptive smoothing algorithm based on Nesterov's smoothing
technique in \cite{Nesterov2005c} for solving "fully" nonsmooth composite
convex optimization problems. Our method combines Nesterov's accelerated
proximal gradient scheme with a new homotopy strategy for the smoothness
parameter. By an appropriate choice of smoothing functions, we develop a new
algorithm that achieves the $\mathcal{O}(1/\varepsilon)$ worst-case
iteration-complexity, preserves the same per-iteration complexity as
Nesterov's method, and allows one to automatically update the smoothness
parameter at each iteration. Then, we customize our algorithm to solve four
special cases that cover various applications. We also specify our algorithm to
solve constrained convex optimization problems and show its convergence
guarantee on a primal sequence of iterates. We demonstrate our algorithm
through three numerical examples and compare it with other related algorithms.
Comment: This paper has 23 pages, 3 figures and 1 table.
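The general idea behind the abstract above can be sketched in a few lines: run an accelerated gradient method on a smoothed surrogate of a nonsmooth objective while shrinking the smoothness parameter each iteration (the homotopy). The example below minimizes $|x-b|$ via Huber smoothing; the schedule $\mu_k = 1/k$ and the step size $\mu_k$ (i.e. $1/L_k$, since the smoothed gradient is $(1/\mu)$-Lipschitz) are illustrative assumptions, not the paper's exact update rules.

```python
import math

def smoothed_min_abs(b, iters=200):
    # Minimize f(x) = |x - b| by accelerated gradient descent on the
    # Huber-smoothed surrogate, whose gradient is clip((x - b)/mu, -1, 1),
    # while shrinking mu_k = 1/k each iteration (homotopy on the
    # smoothness parameter). Illustrative sketch only.
    x, y, t = 0.0, 0.0, 1.0
    for k in range(1, iters + 1):
        mu = 1.0 / k
        grad = max(-1.0, min(1.0, (y - b) / mu))  # gradient of surrogate at y
        x_new = y - mu * grad                     # step size 1/L_k = mu
        t_new = 0.5 * (1.0 + math.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov momentum
        x, t = x_new, t_new
    return x

x_star = smoothed_min_abs(-2.0)  # minimizer of |x + 2| is x = -2
```

Once the iterate lands within $\mu_k$ of the minimizer, the smoothed gradient step recovers it exactly, so the homotopy never needs a fixed target accuracy in advance.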
Sequential Convex Programming Methods for Solving Nonlinear Optimization Problems with DC constraints
This paper investigates the relation between sequential convex programming
(SCP) as, e.g., defined in [24] and DC (difference of two convex functions)
programming. We first present an SCP algorithm for solving nonlinear
optimization problems with DC constraints and prove its convergence. Then we
combine the proposed algorithm with a relaxation technique to handle
inconsistent linearizations. Numerical tests are performed to investigate the
behaviour of this class of algorithms.
Comment: 18 pages, 1 figure.
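The linearization idea underlying SCP for DC structure can be shown on a toy unconstrained problem: for $f(x) = g(x) - h(x)$ with $g, h$ convex, keep $g$ and linearize $h$ at the current iterate, then minimize the resulting convex surrogate. The choice $g(x) = x^4$, $h(x) = 2x^2$ below is an assumed illustration (the paper's algorithm handles DC *constraints*, which this sketch does not).

```python
def scp_dc_step(xk):
    # One DC / sequential-convex step for f(x) = g(x) - h(x) with
    # g(x) = x**4 (kept convex) and h(x) = 2*x**2 (linearized at xk):
    # minimizing g(x) - h'(xk)*x = x**4 - 4*xk*x in x gives the
    # closed-form update 4*x**3 = 4*xk, i.e. x = cbrt(xk).
    return xk ** (1.0 / 3.0) if xk >= 0 else -((-xk) ** (1.0 / 3.0))

x = 2.0
for _ in range(60):
    x = scp_dc_step(x)
# x approaches 1.0, a stationary point of f(x) = x**4 - 2*x**2
```

Each surrogate upper-bounds $f$ and touches it at $x_k$, so the iterates decrease $f$ monotonically and accumulate at stationary points, which is the flavor of convergence result such methods typically admit.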
A Primal-Dual Algorithmic Framework for Constrained Convex Minimization
We present a primal-dual algorithmic framework to obtain approximate
solutions to a prototypical constrained convex optimization problem, and
rigorously characterize how common structural assumptions affect the numerical
efficiency. Our main analysis technique provides a fresh perspective on
Nesterov's excessive gap technique in a structured fashion and unifies it with
smoothing and primal-dual methods. For instance, through the choice of a dual
smoothing strategy and a center point, our framework subsumes decomposition
algorithms, the augmented Lagrangian method, and the alternating direction
method of multipliers as special cases, and provides optimal convergence
rates on both the primal objective residual and the primal feasibility gap
of the iterates for all of them.
Comment: This paper consists of 54 pages with 7 tables and 12 figures.
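One of the special cases mentioned above, the augmented Lagrangian (method of multipliers), can be illustrated on a toy equality-constrained problem where the primal subproblem has a closed form. The problem instance, penalty $\rho = 1$, and iteration count below are illustrative assumptions; the paper's framework is far more general.

```python
def method_of_multipliers(b=2.0, rho=1.0, iters=50):
    # Minimize 0.5*(x1**2 + x2**2) subject to x1 + x2 = b.
    # Augmented Lagrangian: 0.5*(x1**2 + x2**2) + lam*(x1 + x2 - b)
    #                       + (rho/2)*(x1 + x2 - b)**2.
    # By symmetry the x-subproblem minimizer is x1 = x2 = c with
    # c*(1 + 2*rho) = rho*b - lam (set the gradient to zero).
    lam, c = 0.0, 0.0
    for _ in range(iters):
        c = (rho * b - lam) / (1.0 + 2.0 * rho)  # primal step (closed form)
        lam += rho * (2.0 * c - b)               # multiplier (dual) update
    return (c, c), lam

(x1, x2), lam = method_of_multipliers()
# converges to x = (1, 1) with multiplier lam = -1 (the KKT point)
```

Here the dual update contracts the multiplier error by a factor $1/(1 + 2\rho)$ per iteration, so both the objective residual and the feasibility gap $|x_1 + x_2 - b|$ vanish geometrically on this instance.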