Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization
We propose a new randomized coordinate descent method for a convex
optimization template with broad applications. Our analysis relies on a novel
combination of four ideas applied to the primal-dual gap function: smoothing,
acceleration, homotopy, and coordinate descent with non-uniform sampling. As a
result, our method is the first coordinate descent algorithm whose convergence
rate guarantees are the best known under a variety of common structure
assumptions on the template. We provide numerical evidence supporting the
theoretical results, with comparisons to state-of-the-art algorithms.
Comment: NIPS 201
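The basic building block named in the abstract, coordinate descent with non-uniform sampling, can be illustrated with a minimal sketch (this is not the authors' full smoothed, accelerated, homotopy scheme): on a smooth quadratic, each coordinate is sampled with probability proportional to its coordinate-wise Lipschitz constant. The problem data below is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative smooth problem: f(x) = 0.5 x^T A x - b^T x, A positive definite.
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)

# Coordinate-wise Lipschitz constants L_i = A_ii and sampling probabilities.
L = np.diag(A).copy()
p = L / L.sum()

x = np.zeros(n)
for _ in range(5000):
    i = rng.choice(n, p=p)       # sample coordinate i with probability p_i ∝ L_i
    g_i = A[i] @ x - b[i]        # partial derivative ∂f/∂x_i
    x[i] -= g_i / L[i]           # coordinate-wise gradient step of size 1/L_i

x_star = np.linalg.solve(A, b)
print("distance to optimum:", np.linalg.norm(x - x_star))
```

Sampling proportionally to the L_i spends more updates on "stiffer" coordinates, which is what improves the rate over uniform sampling in ill-conditioned problems.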
Optimization with Sparsity-Inducing Penalties
Sparse estimation methods are aimed at using or obtaining parsimonious
representations of data or models. They were first dedicated to linear variable
selection but numerous extensions have now emerged such as structured sparsity
or kernel selection. It turns out that many of the related estimation problems
can be cast as convex optimization problems by regularizing the empirical risk
with appropriate non-smooth norms. The goal of this paper is to present from a
general perspective optimization tools and techniques dedicated to such
sparsity-inducing penalties. We cover proximal methods, block-coordinate
descent, reweighted-ℓ2-penalized techniques, working-set and homotopy
methods, as well as non-convex formulations and extensions, and provide an
extensive set of experiments comparing various algorithms from a computational
point of view.
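Of the techniques the survey covers, proximal methods are the most widely used. A minimal sketch, on synthetic data, of the proximal gradient method (ISTA) for the ℓ1-regularized least-squares problem (the Lasso): the only ingredient beyond a gradient step is the proximal operator of the ℓ1 norm, i.e. coordinate-wise soft-thresholding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Lasso: min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1, on synthetic data.
m, n = 50, 100
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:5] = rng.standard_normal(5)          # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(m)

lam = 0.1 * np.abs(A.T @ b).max()            # regularization strength
step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1/L with L = ||A||_2^2

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: coordinate-wise shrinkage."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - b)                 # gradient of the smooth part
    x = soft_threshold(x - step * grad, step * lam)

print(np.count_nonzero(x), "nonzero coefficients out of", n)
```

The soft-thresholding step is what makes iterates exactly sparse, which is the computational payoff of these non-smooth penalties.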
A generic coordinate descent solver for nonsmooth convex optimization
We present a generic coordinate descent solver for the minimization of a nonsmooth convex objective with structure. The method can deal in particular with problems with linear constraints. The implementation makes use of efficient residual updates and automatically determines which dual variables should be duplicated. A list of basic functional atoms is pre-compiled for efficiency, and a modelling language in Python allows the user to combine them at run time. As a result, the algorithm can be used to solve a large variety of problems, including the Lasso, sparse multinomial logistic regression, and linear and quadratic programs.
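The "efficient residual updates" mentioned in the abstract can be sketched on the Lasso (a minimal illustration, not the paper's solver or its modelling language): maintaining the residual r = b - Ax explicitly makes each coordinate update cost O(m) instead of recomputing Ax from scratch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Coordinate descent for min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1,
# keeping the residual r = b - A x up to date across coordinate updates.
m, n = 40, 80
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 1.0

col_sq = (A ** 2).sum(axis=0)        # ||a_j||^2 for each column j
x = np.zeros(n)
r = b.copy()                          # residual for x = 0

for _ in range(100):                  # full passes over the coordinates
    for j in range(n):
        if col_sq[j] == 0.0:
            continue
        r += A[:, j] * x[j]                      # remove coordinate j's contribution
        rho = A[:, j] @ r                        # partial correlation with residual
        x[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
        r -= A[:, j] * x[j]                      # restore residual with new x_j

print("objective:", 0.5 * r @ r + lam * np.abs(x).sum())
```

Each inner step touches only one column of A, which is what makes coordinate descent attractive for the problem classes the solver targets.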
International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book
The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions.
This book comprises the full conference program. It contains the scientific program, both in survey form and in full detail, together with information on the social program, the venue, special meetings, and more.
Hybrid Methods in Polynomial Optimisation
The Moment/Sum-of-squares hierarchy provides a way to compute the global
minimizers of polynomial optimization problems (POP), at the cost of solving a
sequence of increasingly large semidefinite programs (SDPs). We consider
large-scale POPs, for which interior-point methods are no longer able to solve
the resulting SDPs. We propose an algorithm that combines a first-order
Burer-Monteiro-type method for solving the SDP relaxation, and a second-order
method on a non-convex problem obtained from the POP. The switch from the first
to the second-order method is based on a quantitative criterion, whose
satisfaction ensures that Newton's method converges quadratically from its
first iteration. This criterion leverages the point-estimation theory of Smale
and the active-set identification. We illustrate the methodology by computing
global minimizers of large-scale optimal power flow problems.
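The payoff of the switching criterion is the quadratic convergence of Newton's method from the very first iteration. A toy illustration (a scalar root-finding problem, not the paper's POP setting): once started close enough to the solution, the error of Newton's method roughly squares at every step.

```python
import math

# Newton's method on g(x) = x - cos(x), g'(x) = 1 + sin(x).
# Started near the root, the error roughly squares each iteration; the
# Smale-type criterion in the paper certifies this regime a priori.
def newton_iterates(x, steps):
    out = []
    for _ in range(steps):
        x = x - (x - math.cos(x)) / (1.0 + math.sin(x))
        out.append(x)
    return out

iters = newton_iterates(1.0, 6)
root = iters[-1]
for x in iters[:-1]:
    print("error:", abs(x - root))   # drops roughly as err -> C * err^2
```

The printed errors shrink from about 1e-2 to machine precision within three or four steps, which is the behavior the quantitative switching test is designed to guarantee.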