196 research outputs found
Projections Onto Convex Sets (POCS) Based Optimization by Lifting
Two new optimization techniques based on the projections onto convex sets (POCS)
framework are presented for solving convex and some non-convex optimization problems.
presented. The dimension of the minimization problem is lifted by one and sets
corresponding to the cost function are defined. If the cost function is a
convex function in R^N the corresponding set is a convex set in R^(N+1). The
iterative optimization approach starts with an arbitrary initial estimate in
R^(N+1), and at each step an orthogonal projection is performed onto one of the
sets in a sequential manner. The method provides globally optimal solutions for
total-variation, filtered-variation, l1, and entropic cost functions. It is also
experimentally observed that cost functions based on lp, p<1, can be handled by
using the supporting hyperplane concept.
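The core POCS mechanism, alternating orthogonal projections onto convex sets until a point in their intersection is reached, can be sketched as follows. This is a minimal illustration of plain POCS in R^2 (a unit disk and a line), not the paper's lifted epigraph construction; the sets and starting point are chosen only for the example.

```python
import numpy as np

def project_disk(p):
    # Orthogonal projection onto the unit disk {p : ||p|| <= 1}.
    n = np.linalg.norm(p)
    return p if n <= 1 else p / n

def project_line(p):
    # Orthogonal projection onto the hyperplane {p : a.p = b},
    # here the line x + y = 1 (an arbitrary illustrative choice).
    a = np.array([1.0, 1.0]); b = 1.0
    return p - (a @ p - b) / (a @ a) * a

x = np.array([3.0, -2.0])  # arbitrary initial estimate
for _ in range(200):
    # one POCS sweep: project onto each set in sequence
    x = project_disk(project_line(x))
# x now lies (to numerical precision) in the intersection of the two sets
```

Because both sets are convex and intersect, the alternating projections are guaranteed to converge to a point of the intersection; the lifting in the paper applies the same machinery to sets built from the cost function in R^(N+1).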
A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
We introduce Bella, a locally superlinearly convergent Bregman forward-backward
splitting method for minimizing the sum of two nonconvex functions, one of
which satisfies a relative smoothness condition while the other may be
nonsmooth. A key tool of our methodology is the Bregman
forward-backward envelope (BFBE), an exact and continuous penalty function with
favorable first- and second-order properties, and enjoying a nonlinear error
bound when the objective function satisfies a Lojasiewicz-type property. The
proposed algorithm performs a linesearch over the BFBE along candidate update
directions. It converges subsequentially to stationary points, globally under a
Kurdyka-Lojasiewicz (KL) condition, and, owing to the nonlinear error bound, it
can attain superlinear convergence rates even when the limit point is a
nonisolated minimum, provided the directions are suitably selected.
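The Euclidean forward-backward (proximal gradient) step underlying this family of methods can be sketched as below; Bella's Bregman variant replaces the squared-norm kernel with a general Legendre function and adds the BFBE linesearch, but with h(x) = ||x||^2/2 the Bregman step reduces to this classical one. The least-squares-plus-l1 problem is an illustrative assumption, not an example from the paper.

```python
import numpy as np

# Illustrative problem: min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)
lam = 0.1

def soft_threshold(v, t):
    # Proximal map of t*||.||_1 (the "backward" step on the nonsmooth term).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # step size 1/L, L = ||A||^2
x = np.zeros(5)
for _ in range(500):
    grad = A.T @ (A @ x - b)                            # forward (gradient) step
    x = soft_threshold(x - gamma * grad, gamma * lam)   # backward (prox) step

# At convergence, x is a fixed point of the prox-gradient map.
residual = x - soft_threshold(x - gamma * A.T @ (A @ x - b), gamma * lam)
```

The fixed-point residual going to zero is exactly the first-order stationarity condition that the subsequential convergence results in the abstract refer to.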
On the nonexpansive operators based on arbitrary metric: A degenerate analysis
In this paper, we study nonexpansive operators equipped with an arbitrary
metric and investigate the connections between firm nonexpansiveness,
cocoercivity, and averagedness. The convergence of the associated fixed-point
iterations is discussed with particular focus on the case of degenerate metric,
since the degeneracy is often encountered when reformulating many existing
first-order operator splitting algorithms as a metric resolvent. This work
paves the way for analyzing the generalized proximal point algorithm with a
non-trivial relaxation step and a degenerate metric.

Comment: accepted by Results in Mathematics
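The relaxed fixed-point iteration for a nonexpansive operator that this analysis targets can be sketched with a Krasnosel'skii-Mann step. This is a minimal Euclidean illustration with a rotation operator (nonexpansive, unique fixed point at the origin); the paper's setting replaces the Euclidean norm with the seminorm induced by a possibly degenerate metric.

```python
import numpy as np

# Nonexpansive operator T: rotation by pi/3; Fix(T) = {0}.
theta = np.pi / 3
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def km_step(x, relax=0.5):
    # Krasnosel'skii-Mann (averaged) iteration: x+ = (1-relax)*x + relax*T(x).
    return (1 - relax) * x + relax * (T @ x)

x = np.array([1.0, 1.0])
for _ in range(200):
    x = km_step(x)
# x converges to the unique fixed point, the origin
```

The averaging (relaxation) step is essential here: iterating the rotation T alone would circle the origin forever, while the 0.5-averaged map is a strict contraction on this example and converges, which is why averagedness and relaxation parameters are central to the convergence analysis.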