277 research outputs found
Semi-proximal Mirror-Prox for Nonsmooth Composite Minimization
We propose a new first-order optimization algorithm for high-dimensional
non-smooth composite minimization problems. Typical examples of such problems
have an objective that decomposes into a non-smooth empirical risk part and a
non-smooth regularization penalty. The proposed algorithm, called Semi-Proximal
Mirror-Prox, leverages the Fenchel-type representation of one part of the
objective while handling the other part via linear minimization over the
domain. The algorithm stands in contrast to more
classical proximal gradient algorithms with smoothing, which require the
computation of proximal operators at each iteration and can therefore be
impractical for high-dimensional problems. We establish the theoretical
convergence rate of Semi-Proximal Mirror-Prox, which attains the optimal
complexity bound on the number of calls to the linear minimization oracle. We
present promising experimental results demonstrating the advantages of the
approach over competing methods.
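A minimal sketch of the linear-minimization-oracle idea referred to above, assuming a simple l1-ball domain and a smooth least-squares loss: a plain conditional-gradient (Frank-Wolfe) loop in which each iteration calls only the LMO rather than a proximal operator. This is not the Semi-Proximal Mirror-Prox algorithm itself; all function names and parameters (`lmo_l1_ball`, `tau`, the step-size rule) are illustrative.

```python
# Illustrative only: plain Frank-Wolfe with an l1-ball LMO, not Semi-Proximal Mirror-Prox.
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam*||.||_1 (soft-thresholding), shown for contrast."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def lmo_l1_ball(g, tau):
    """Linear minimization oracle: argmin_{||s||_1 <= tau} <g, s>.
    Only the largest-magnitude coordinate of g is needed, so the call is very cheap."""
    s = np.zeros_like(g)
    i = np.argmax(np.abs(g))
    s[i] = -tau * np.sign(g[i])
    return s

def frank_wolfe_least_squares(A, b, tau, iters=200):
    """Conditional-gradient iterations for min_{||x||_1 <= tau} 0.5*||Ax - b||^2,
    relying only on the LMO (no projections, no proximal computations)."""
    x = np.zeros(A.shape[1])
    for k in range(iters):
        grad = A.T @ (A @ x - b)
        s = lmo_l1_ball(grad, tau)
        gamma = 2.0 / (k + 2.0)          # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 200)), rng.standard_normal(50)
x_hat = frank_wolfe_least_squares(A, b, tau=5.0)
print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
```

The l1-ball LMO touches a single coordinate, which is why LMO-based methods can be attractive when proximal operators (for instance, full singular value decompositions for nuclear-norm balls) become the bottleneck in high dimensions.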
Inexact Model: A Framework for Optimization and Variational Inequalities
In this paper we propose a general algorithmic framework for first-order
methods in optimization in a broad sense, including minimization problems,
saddle-point problems, and variational inequalities. The framework recovers
many known methods as special cases, including the accelerated gradient
method, composite optimization methods, level-set methods, and proximal
methods. The idea of the framework is to construct an inexact model of the
main problem component, i.e., the objective function in optimization or the
operator in variational inequalities. Besides reproducing known results, the
framework allows us to construct new methods, which we illustrate by
constructing a universal method for variational inequalities with composite
structure. This method works for smooth and non-smooth problems with optimal
complexity without a priori knowledge of the problem smoothness. We also
generalize the framework to strongly convex objectives and strongly monotone
variational inequalities.
Inexact relative smoothness and strong convexity for optimization and variational inequalities by inexact model
In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems, and variational inequalities. The framework recovers many known methods as special cases, including the accelerated gradient method, composite optimization methods, level-set methods, and Bregman proximal methods. The idea of the framework is to construct an inexact model of the main problem component, i.e., the objective function in optimization or the operator in variational inequalities. Besides reproducing known results, the framework allows us to construct new methods, which we illustrate by constructing a universal conditional gradient method and a universal method for variational inequalities with composite structure. These methods work for smooth and non-smooth problems with optimal complexity without a priori knowledge of the problem smoothness. As a particular case of our general framework, we introduce relative smoothness for operators and propose an algorithm for variational inequalities with such an operator. We also generalize the framework to relatively strongly convex objectives and strongly monotone variational inequalities.
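For the Bregman and relative-smoothness ingredients mentioned above, a minimal sketch, assuming the entropy reference function on the probability simplex: the Bregman proximal step then has a closed multiplicative form. This only illustrates the step, not the paper's universal method, and all names are illustrative.

```python
# Illustrative only: entropic Bregman proximal steps on the probability simplex.
import numpy as np

def bregman_step_simplex(x, grad, L):
    """argmin_y <grad, y> + L*D_h(y, x) over the simplex, with h the negative entropy;
    the solution is the multiplicative update y_i proportional to x_i * exp(-grad_i/L)."""
    w = x * np.exp(-grad / L)
    return w / w.sum()

def bregman_method_quadratic(Q, iters=500, L=None):
    """Minimize 0.5*x^T Q x over the probability simplex with entropic Bregman steps."""
    n = Q.shape[0]
    x = np.full(n, 1.0 / n)
    L = L if L is not None else np.linalg.norm(Q, 2)   # crude relative-smoothness constant
    for _ in range(iters):
        x = bregman_step_simplex(x, Q @ x, L)
    return x

rng = np.random.default_rng(2)
M = rng.standard_normal((30, 30))
Q = M @ M.T                               # positive semidefinite quadratic
x_hat = bregman_method_quadratic(Q)
print("objective:", 0.5 * x_hat @ Q @ x_hat, "sum of coordinates:", x_hat.sum())
```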
An Inexact Primal-Dual Smoothing Framework for Large-Scale Non-Bilinear Saddle Point Problems
We develop an inexact primal-dual first-order smoothing framework to solve a
class of non-bilinear saddle point problems with primal strong convexity.
Compared with existing methods, our framework yields a significantly improved
primal oracle complexity while retaining a competitive dual oracle complexity.
In addition, we consider the situation where the primal-dual
coupling term has a large number of component functions. To efficiently handle
this situation, we develop a randomized version of our smoothing framework,
which allows the primal and dual sub-problems in each iteration to be solved by
randomized algorithms inexactly in expectation. The convergence of this
framework is analyzed both in expectation and with high probability. In terms
of the primal and dual oracle complexities, this framework significantly
improves over its deterministic counterpart. As an important application, we
adapt both frameworks for solving convex optimization problems with many
functional constraints. To obtain an ε-optimal and ε-feasible solution, both
frameworks achieve the best-known oracle complexities (in terms of their
dependence on ε).
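A toy sketch of the smoothing ingredient behind such primal-dual frameworks, assuming a max-of-affine coupling term: the nonsmooth max is replaced by its entropy-smoothed surrogate, and the strongly convex primal objective is then minimized by plain gradient steps. This is a simplified illustration with made-up names, not the inexact primal-dual framework of the paper.

```python
# Illustrative only: entropy smoothing of a max-type coupling term, not the paper's method.
import numpy as np

def smoothed_max(A, b, x, mu):
    """Smooth surrogate mu*log(sum_i exp((a_i^T x + b_i)/mu)) of max_i (a_i^T x + b_i),
    returned together with its gradient in x (a convex combination of the rows a_i)."""
    z = A @ x + b
    zmax = z.max()                         # shift for numerical stability
    w = np.exp((z - zmax) / mu)
    val = zmax + mu * np.log(w.sum())
    grad = A.T @ (w / w.sum())
    return val, grad

def minimize_smoothed(A, b, mu=0.1, iters=5000):
    """Gradient descent on the strongly convex objective 0.5*||x||^2 + smoothed max."""
    x = np.zeros(A.shape[1])
    step = 1.0 / (1.0 + np.linalg.norm(A, 2) ** 2 / mu)   # 1/L-type step size
    for _ in range(iters):
        _, g = smoothed_max(A, b, x, mu)
        x -= step * (x + g)
    return x

rng = np.random.default_rng(3)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
x_hat = minimize_smoothed(A, b)
print("smoothed objective:", 0.5 * x_hat @ x_hat + smoothed_max(A, b, x_hat, 0.1)[0])
```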
- …