Gradient methods for problems with inexact model of the objective
We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective, which includes, as particular cases, the inexact oracle [19] and the relative smoothness condition [43]. We analyze a gradient method that uses this inexact model and obtain convergence rates for convex and strongly convex problems. To show potential applications of our general framework, we consider three particular problems. The first is clustering by the electoral model introduced in [49]. The second is approximating the optimal transport distance, for which we propose a Proximal Sinkhorn algorithm. The third is approximating the optimal transport barycenter, for which we propose a Proximal Iterative Bregman Projections algorithm. We also illustrate the practical performance of our algorithms by numerical experiments.
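The abstract gives no pseudocode, so the following is only a minimal sketch of the basic step such a framework covers: plain gradient descent in which the gradient comes from an inexact (delta, L)-oracle rather than the exact objective. The function names `inexact_gradient_method` and `grad_delta`, the quadratic test problem, and the noise level are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def inexact_gradient_method(grad_delta, x0, L, n_iters=100):
    """Gradient method driven by an inexact first-order oracle.

    grad_delta(x) is assumed to return an approximate gradient whose
    quadratic upper model with constant L majorizes the objective up to
    an additive error delta (a (delta, L)-oracle in the spirit of [19]).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad_delta(x)   # inexact gradient supplied by the oracle
        x = x - g / L       # standard 1/L step on the inexact model
    return x

# Hypothetical usage: a quadratic objective with slightly noisy gradients.
rng = np.random.default_rng(0)
A = np.diag([1.0, 10.0])
b = np.array([1.0, -2.0])
L = 10.0  # Lipschitz constant of the exact gradient
grad_delta = lambda x: A @ x - b + 1e-3 * rng.standard_normal(2)
x_approx = inexact_gradient_method(grad_delta, np.zeros(2), L)
```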
Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems
The proximal gradient algorithm has been popularly used for convex
optimization. Recently, it has also been extended for nonconvex problems, and
the current state-of-the-art is the nonmonotone accelerated proximal gradient
algorithm. However, it typically requires two exact proximal steps in each
iteration, and can be inefficient when the proximal step is expensive. In this
paper, we propose an efficient proximal gradient algorithm that requires only
one inexact (and thus less expensive) proximal step in each iteration.
Convergence to a critical point is still guaranteed, with a convergence rate that matches the best known rate for nonconvex problems with first-order methods. Experiments on a number of problems demonstrate that the proposed algorithm has performance comparable to the state-of-the-art, but is much faster.
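To make the "one inexact proximal step per iteration" idea concrete, here is a rough sketch under assumed conditions: a forward gradient step on the smooth part, followed by a proximal step that is only approximately solved by a few inner iterations. This is not the paper's nonmonotone accelerated scheme; the function names, the assumption of a smooth regularizer, and the inner step size are all illustrative choices.

```python
import numpy as np

def inexact_prox(r_grad, v, eta, inner_iters=5):
    """Approximate prox_{eta*r}(v) with a few inner gradient steps.

    Only roughly solves min_u r(u) + ||u - v||^2 / (2 * eta); truncating the
    inner solve is what makes the proximal step inexact and cheap. Assumes a
    smooth regularizer r and an untuned inner step size, for illustration only.
    """
    u = np.array(v, dtype=float)
    for _ in range(inner_iters):
        inner_grad = r_grad(u) + (u - v) / eta  # gradient of the prox subproblem
        u = u - 0.5 * eta * inner_grad
    return u

def inexact_proximal_gradient(f_grad, r_grad, x0, eta, n_iters=200):
    """One forward gradient step on f, then a single inexact prox step on r."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        v = x - eta * f_grad(x)            # forward step on the smooth part f
        x = inexact_prox(r_grad, v, eta)   # single, approximate backward step
    return x

# Hypothetical usage: smooth loss f plus a small smooth regularizer r.
f_grad = lambda x: 2.0 * (x - 1.0)
r_grad = lambda x: 0.1 * x
x_approx = inexact_proximal_gradient(f_grad, r_grad, np.zeros(3), eta=0.4)
```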
Accelerated Inexact Composite Gradient Methods for Nonconvex Spectral Optimization Problems
This paper presents two inexact composite gradient methods, one inner
accelerated and another doubly accelerated, for solving a class of nonconvex
spectral composite optimization problems. More specifically, the objective function for these problems is of the form f_1 + f_2 + h, where f_1 and f_2 are differentiable nonconvex matrix functions with Lipschitz continuous gradients, h is a proper closed convex matrix function, and both f_2 and h can be expressed as functions that operate on the singular values of their inputs. The methods essentially use an accelerated composite gradient method to solve a sequence of proximal subproblems involving the linear approximation of f_1 and the singular value functions underlying f_2 and h. Unlike other
composite gradient-based methods, the proposed methods take advantage of both
the composite and spectral structure underlying the objective function in order
to efficiently generate their solutions. Numerical experiments are presented to
demonstrate the practicality of these methods on a set of real-world and
randomly generated spectral optimization problems.
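As a rough illustration of "functions that operate on the singular values of their inputs", the sketch below shows how the prox of such a spectral function reduces to a scalar prox applied to the singular values, which is the kind of structure these methods exploit in their proximal subproblems. The helper `spectral_prox` and the nuclear-norm example are illustrative assumptions, not code from the paper.

```python
import numpy as np

def spectral_prox(Z, scalar_prox):
    """Prox of a matrix function h(X) = phi(sigma(X)) depending only on the
    singular values of X: apply the scalar prox of phi to the singular values
    of Z and rebuild the matrix from the same singular vectors.
    """
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(scalar_prox(s)) @ Vt

# Example: with phi a scaled l1 norm, h is the nuclear norm and the prox
# reduces to singular value soft-thresholding.
lam = 0.5
soft_threshold = lambda s: np.maximum(s - lam, 0.0)
Z = np.random.default_rng(1).standard_normal((5, 3))
X_prox = spectral_prox(Z, soft_threshold)
```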