271 research outputs found
Variational Principles for Monotone and Maximal Bifunctions
2000 Mathematics Subject Classification: 49J40, 49J35, 58E30, 47H05.
We establish variational principles of Brøndsted-Rockafellar type for monotone and maximal bifunctions, using our characterization of a bifunction's maximality in reflexive Banach spaces. As applications, we give an existence result for saddle points of convex-concave functions and solve an approximate inclusion governed by a maximal monotone operator.
Combining Strong Convergence, Fast Convergence of Values, and Vanishing of Gradients for a Proximal Point Algorithm Using Tikhonov Regularization in a Hilbert Space
In a real Hilbert space, given a convex differentiable function whose solution set is nonempty, we consider the proximal algorithm $x_{k+1}=\operatorname{prox}_{\beta_k f}(d\,x_k)$, where $d$ is a fixed parameter and $(\beta_k)$ is a nondecreasing sequence. Under suitable assumptions on $(\beta_k)$, we show that the values of the objective function along the generated sequence converge, with a rate estimate, to its global minimum, and that the sequence converges strongly to the minimum norm element of the solution set; we also obtain a convergence rate of the gradients toward zero. We then extend these results to non-smooth convex functions with extended real values.
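As a concrete illustration of the scheme just described, here is a minimal Python sketch. The toy objective $f(x)=\frac{1}{2}\|x\|^2$ (whose proximal map is explicit), the factor d = 0.99, and the schedule beta_k = k + 1 are all assumptions for illustration; the abstract does not specify them.

```python
import numpy as np

# Illustrative sketch of the iteration x_{k+1} = prox_{beta_k f}(d * x_k).
# Assumptions (not from the abstract): f(x) = 0.5 * ||x||^2, for which
# prox_{b f}(y) = y / (1 + b); d < 1 plays the Tikhonov role, and
# beta_k = k + 1 is a nondecreasing step sequence.

def prox_quadratic(y, b):
    """Proximal map of f(x) = 0.5 * ||x||^2 with step b."""
    return y / (1.0 + b)

def tikhonov_proximal_point(x0, d=0.99, n_iter=200):
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        beta_k = k + 1.0  # nondecreasing step sequence (assumed)
        x = prox_quadratic(d * x, beta_k)
    return x

x_star = tikhonov_proximal_point(np.array([5.0, -3.0]))
print(x_star)  # approaches the minimum norm solution (the origin here)
```

The contraction $d\,x_k$ with $d<1$ injects the Tikhonov-type regularization that, per the abstract, steers the iterates toward the minimum norm element of the solution set.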
Fast Convex Optimization via a Third-Order in Time Evolution Equation
In a Hilbert space $H$, we develop fast convex optimization methods based on a third-order in time evolution system. The function to minimize $f : H \to \mathbb{R}$ is convex and continuously differentiable, with $\operatorname{argmin} f \neq \emptyset$, and enters the dynamic via its gradient. On the basis of a Lyapunov analysis and temporal scaling techniques, we show a convergence rate of the values of order $1/t^3$, and obtain the convergence of the trajectories towards optimal solutions. When $f$ is strongly convex, an exponential rate of convergence is obtained. We complete the study of the continuous dynamic by introducing a damping term induced by the Hessian of $f$, which allows the oscillations to be controlled and attenuated. Then, we analyze the convergence of the proximal-based algorithms obtained by temporal discretization of this system, and obtain similar convergence rates. The algorithmic results are valid for a general convex, lower semicontinuous, and proper function $f : H \to \mathbb{R} \cup \{+\infty\}$.
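The abstract does not state the evolution equation itself, but any third-order in time system can be integrated numerically after reduction to a first-order system. A purely illustrative sketch, assuming a toy dynamic $\dddot{x} + a\ddot{x} + b\dot{x} + \nabla f(x) = 0$ with a quadratic $f$ (not the paper's system):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic reduction of a third-order in time system to first order, the
# standard step before numerical integration. The coefficients (a, b) and
# the quadratic objective are illustrative assumptions, not the system
# studied in the paper.

def grad_f(x):
    return x  # gradient of the toy objective f(x) = 0.5 * x^2

def rhs(t, u, a=3.0, b=3.0):
    x, v, w = u  # u = (x, x', x'')
    return [v, w, -a * w - b * v - grad_f(x)]

sol = solve_ivp(rhs, (0.0, 20.0), [2.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 20.0, 201))
print(sol.y[0, -1])  # x(t) approaches argmin f = 0
```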
The Effects of Explanatory Feedback and Remediation on Mathematics Achievement, in Relation to Students' Degree of Internality, Within a Self-Correction Approach [in French]
Québec: Université Laval, Bibliothèque, 201
Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
In a Hilbert setting, for convex differentiable optimization, we consider
accelerated gradient dynamics combining Tikhonov regularization with
Hessian-driven damping. The Tikhonov regularization parameter is assumed to
tend to zero as time tends to infinity, which preserves equilibria. The
presence of the Tikhonov regularization term induces a strong convexity
property which vanishes asymptotically. To take advantage of the exponential
convergence rates attached to the heavy ball method in the strongly convex
case, we consider the inertial dynamic where the viscous damping coefficient is
taken proportional to the square root of the Tikhonov regularization parameter,
and therefore also converges towards zero. Moreover, the dynamic involves a
geometric damping which is driven by the Hessian of the function to be
minimized, which induces a significant attenuation of the oscillations. Under
an appropriate tuning of the parameters, based on a Lyapunov analysis, we show that the trajectories simultaneously enjoy several remarkable properties: fast convergence of values, fast convergence of gradients towards zero, and strong convergence to the minimum norm minimizer. This study extends a previous paper by the authors where similar issues were examined but without Hessian-driven damping.
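A hedged numerical sketch of this kind of dynamic. The specific form below, $\ddot{x} + \delta\sqrt{\varepsilon(t)}\,\dot{x} + \beta\,\nabla^2 f(x)\dot{x} + \nabla f(x) + \varepsilon(t)x = 0$ with $\varepsilon(t)\to 0$, is an assumption typical of this literature rather than a quotation from the abstract; the Hessian-driven term is discretized as a difference of consecutive gradients, a standard device that avoids forming the Hessian:

```python
import numpy as np

# Hedged sketch of an explicit discretization of an inertial dynamic with
# Tikhonov regularization eps(t) -> 0 and Hessian-driven damping. The form
#   x'' + delta*sqrt(eps)*x' + beta*Hess f(x) x' + grad f(x) + eps*x = 0
# is an assumption, not taken verbatim from the abstract. The Hessian term
# is approximated by a gradient difference:
#   Hess f(x_k) x'_k  ~  (grad f(x_k) - grad f(x_{k-1})) / h.

def grad_f(x, A):
    return A @ x  # gradient of f(x) = 0.5 * x^T A x (illustrative objective)

def run(A, x0, h=0.01, delta=2.0, beta=0.5, n_iter=5000):
    x_prev = np.array(x0, dtype=float)
    x = x_prev.copy()
    for k in range(1, n_iter + 1):
        eps_k = 1.0 / (k * h + 1.0) ** 2  # vanishing Tikhonov parameter (assumed schedule)
        g, g_prev = grad_f(x, A), grad_f(x_prev, A)
        v = (x - x_prev) / h              # finite-difference velocity
        accel = -(delta * np.sqrt(eps_k) * v
                  + beta * (g - g_prev) / h + g + eps_k * x)
        x_prev, x = x, x + h * v + h * h * accel
    return x

A = np.diag([1.0, 10.0])
print(run(A, [3.0, -2.0]))  # approaches the minimum norm minimizer (origin)
```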
Survey of the pelagic fish resources off north west Africa. Part III: Morocco, 17 November - 18 December 2001
Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics
In this paper, we propose in a Hilbertian setting a second-order
time-continuous dynamic system with fast convergence guarantees to solve
structured convex minimization problems with an affine constraint. The system
is associated with the augmented Lagrangian formulation of the minimization
problem. The corresponding dynamics brings into play three general time-varying
parameters, each with specific properties, and which are respectively
associated with viscous damping, extrapolation and temporal scaling. By
appropriately adjusting these parameters, we develop a Lyapunov analysis which
provides fast convergence properties of the values and of the feasibility gap.
These results naturally pave the way for corresponding accelerated ADMM algorithms, obtained by temporal discretization.
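For context, the classical ADMM scheme that such dynamics accelerate alternates closed-form minimizations of the augmented Lagrangian with a dual update. A minimal sketch on an assumed consensus instance, min f(x) + g(z) s.t. x - z = 0, rather than the paper's general affine constraint Ax + Bz = c:

```python
import numpy as np

# Baseline ADMM sketch for min f(x) + g(z) s.t. x - z = 0, with
# f(x) = 0.5*||x - a||^2 and g(z) = 0.5*||z - b||^2. These problem data are
# illustrative assumptions; the paper's contribution is a continuous-time
# dynamic whose discretization accelerates this kind of scheme.

def admm(a, b, rho=1.0, n_iter=100):
    x = np.zeros_like(a)
    z = np.zeros_like(b)
    y = np.zeros_like(a)  # dual variable for the constraint x - z = 0
    for _ in range(n_iter):
        x = (a - y + rho * z) / (1.0 + rho)  # minimize L_rho over x (closed form)
        z = (b + y + rho * x) / (1.0 + rho)  # minimize L_rho over z (closed form)
        y = y + rho * (x - z)                # dual ascent on the feasibility gap
    return x, z

x, z = admm(np.array([4.0, 0.0]), np.array([0.0, 2.0]))
print(x, z)  # both approach (a + b) / 2, the consensus solution
```

The three time-varying parameters mentioned in the abstract (viscous damping, extrapolation, temporal scaling) have no counterpart in this fixed-parameter baseline; they are precisely what the continuous-time analysis tunes to obtain the fast rates.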
- …