Two-step inertial Bregman proximal alternating linearized minimization algorithm for nonconvex and nonsmooth problems
In this paper, we study an algorithm for solving a class of nonconvex,
nonsmooth, and nonseparable optimization problems. Based on proximal
alternating linearized minimization (PALM), we propose a new iterative
algorithm that combines two-step inertial extrapolation with the Bregman
distance. By constructing an appropriate benefit function and invoking the
Kurdyka--{\L}ojasiewicz property, we establish convergence of the whole
sequence generated by the proposed algorithm. We apply the algorithm to
signal recovery and the quadratic fractional programming problem, and
demonstrate its effectiveness.
Comment: 28 pages, 8 figures, 4 tables. arXiv admin note: text overlap with
arXiv:2306.0420
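The two-step inertial extrapolation described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's algorithm: it specializes to a single block, takes the Euclidean kernel as the Bregman distance (so the proximal map of the l1 norm is soft-thresholding), and uses a convex lasso-type objective with illustrative inertial parameters `alpha` and `beta`.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau*||.||_1 under the Euclidean Bregman kernel."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def two_step_inertial_prox_grad(A, b, lam=0.1, alpha=0.3, beta=0.1,
                                n_iter=200):
    """Sketch: min_x lam*||x||_1 + 0.5*||Ax - b||^2 with two-step inertia.

    The extrapolation point uses the two previous iterates:
        z_k = x_k + alpha*(x_k - x_{k-1}) + beta*(x_{k-1} - x_{k-2}).
    """
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    step = 1.0 / L
    x_prev2 = x_prev = x = np.zeros(n)
    for _ in range(n_iter):
        # two-step inertial extrapolation
        z = x + alpha * (x - x_prev) + beta * (x_prev - x_prev2)
        grad = A.T @ (A @ z - b)       # gradient of the smooth part at z
        x_prev2, x_prev = x_prev, x
        # proximal linearized update at the extrapolated point
        x = soft_threshold(z - step * grad, lam * step)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = two_step_inertial_prox_grad(A, b)
```

With `alpha = beta = 0` this reduces to the ordinary proximal gradient step; the inertial terms reuse momentum from the two previous iterates.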
Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
In this paper, we present a new stochastic algorithm, namely the stochastic
block mirror descent (SBMD) method, for solving large-scale nonsmooth and
stochastic optimization problems. The basic idea of this algorithm is to
incorporate block-coordinate decomposition and an incremental block
averaging scheme into the classic (stochastic) mirror descent method, in
order to significantly reduce the cost per iteration of the latter
algorithm. We establish the rate of convergence of the SBMD method, along
with associated large-deviation results, for solving general nonsmooth and
stochastic optimization problems. We also introduce different variants of
this method and establish their rates of convergence for solving strongly
convex, smooth, and composite optimization problems, as well as certain
nonconvex optimization problems. To the best of our knowledge, all these
developments related to the SBMD method are new in the stochastic
optimization literature. Moreover, some of our results also appear to be
new for block coordinate descent methods in deterministic optimization.
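The core SBMD iteration, updating one randomly sampled coordinate block with a stochastic gradient, can be sketched as follows. This is an illustrative assumption-laden sketch, not the paper's exact scheme: it takes the Euclidean mirror map (so the prox-mapping is a plain gradient step), a least-squares objective whose stochastic gradients come from sampling one data row, and a simple running average in place of the paper's incremental block averaging.

```python
import numpy as np

def sbmd_least_squares(A, b, n_blocks=4, n_iter=2000, step=0.01, seed=0):
    """Sketch of SBMD for min_x 0.5 * E_i (a_i^T x - b_i)^2.

    Each iteration samples one data row (stochastic gradient) and one
    coordinate block, then updates only that block.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    blocks = np.array_split(np.arange(n), n_blocks)
    x = np.zeros(n)
    x_avg = np.zeros(n)                       # running average of iterates
    for k in range(1, n_iter + 1):
        i = rng.integers(m)                   # sample one data point
        blk = blocks[rng.integers(n_blocks)]  # sample one coordinate block
        g = (A[i] @ x - b[i]) * A[i, blk]     # stochastic partial gradient
        x[blk] -= step * g                    # Euclidean prox-mapping step
        x_avg += (x - x_avg) / k              # averaged output sequence
    return x_avg

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
b = A @ rng.standard_normal(20)
x_hat = sbmd_least_squares(A, b)
```

Each iteration touches only `n / n_blocks` coordinates, which is the per-iteration cost reduction over full mirror descent that the abstract refers to; a non-Euclidean mirror map would replace the gradient step with a blockwise prox-mapping.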