Zero-Convex Functions, Perturbation Resilience, and Subgradient Projections for Feasibility-Seeking Methods
The convex feasibility problem (CFP) is at the core of the modeling of many
problems in various areas of science. Subgradient projection methods are
important tools for solving the CFP because they enable the use of subgradient
calculations instead of orthogonal projections onto the individual sets of the
problem. Working in a real Hilbert space, we show that the sequential
subgradient projection method is perturbation resilient. By this we mean that
under appropriate conditions the sequence generated by the method converges
weakly, and sometimes also strongly, to a point in the intersection of the
given subsets of the feasibility problem, despite certain perturbations which
are allowed in each iterative step. Unlike previous works on solving the convex
feasibility problem, the involved functions, which induce the feasibility
problem's subsets, need not be convex. Instead, we allow them to belong to a
wider and richer class of functions satisfying a weaker condition, that we call
"zero-convexity". This class, which is introduced and discussed here, holds a
promise to solve optimization problems in various areas, especially in
non-smooth and non-convex optimization. The relevance of this study to
approximate minimization and to the recent superiorization methodology for
constrained optimization is explained.
Comment: Mathematical Programming Series A, accepted for publication.
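The basic iteration behind the sequential subgradient projection method can be sketched in a simple convex special case (two sets in R^2 of my own choosing; the paper's setting is a real Hilbert space and allows the richer class of zero-convex, not necessarily convex, functions). When f_i(x) > 0, the step moves along a subgradient g of f_i at x, scaled by f_i(x)/||g||^2, instead of computing an exact orthogonal projection:

```python
import numpy as np

# Minimal sketch of one relaxed subgradient projection step for the
# convex feasibility problem: find x with f_i(x) <= 0 for all i.
def subgrad_projection_step(x, f, subgrad):
    """If f(x) > 0, move toward {f <= 0} along a subgradient."""
    fx = f(x)
    if fx <= 0:
        return x
    g = subgrad(x)
    return x - (fx / np.dot(g, g)) * g

# Two illustrative convex sets in R^2 (my own choices, not from the paper):
#   C1 = unit disk:            f1(x) = ||x|| - 1,  subgradient x/||x||
#   C2 = half-plane x0 >= 0.5: f2(x) = 0.5 - x0,   subgradient (-1, 0)
constraints = [
    (lambda x: np.linalg.norm(x) - 1.0, lambda x: x / np.linalg.norm(x)),
    (lambda x: 0.5 - x[0],              lambda x: np.array([-1.0, 0.0])),
]

x = np.array([3.0, 2.0])
for _ in range(200):                      # cyclic sweeps over the sets
    for f, g in constraints:
        x = subgrad_projection_step(x, f, g)

assert all(f(x) <= 1e-6 for f, _ in constraints)   # x is (nearly) feasible
```

The paper's perturbation-resilience result says that convergence of this scheme survives controlled errors added at each iterative step, which is what makes the superiorization methodology applicable.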
On global solvability of a class of possibly nonconvex QCQP problems in Hilbert spaces
We provide conditions ensuring that KKT-type conditions characterize global
optimality for quadratically constrained (possibly nonconvex) quadratic
programming (QCQP) problems in Hilbert spaces. The key property is the
convexity of an image-type set related to the functions appearing in the
formulation of the problem.
The proof of the main result relies on a generalized version of the
(Yakubovich) S-Lemma in Hilbert spaces.
As an application, we consider the class of QCQP problems with a special form
of the quadratic terms of the constraints.
Comment: arXiv admin note: text overlap with arXiv:2206.0061
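The flavor of an S-lemma-based global-optimality certificate can be illustrated in the simplest finite-dimensional special case (R^2, my own construction, not the paper's Hilbert-space setting): for the trust-region QCQP min x'Ax + 2b'x subject to ||x||^2 <= 1, a KKT pair (x*, lam) with lam >= 0 and A + lam*I positive semidefinite is globally optimal even when A is indefinite:

```python
import numpy as np

# Nonconvex trust-region problem: min x'Ax + 2 b'x  s.t.  ||x||^2 <= 1.
A = np.array([[-2.0, 0.0],
              [ 0.0, 1.0]])          # indefinite => nonconvex objective
b = np.array([1.0, 1.0])

def x_of(lam):
    # Stationarity of the Lagrangian: (A + lam*I) x = -b.
    return np.linalg.solve(A + lam * np.eye(2), -b)

# Bisect for lam > -lambda_min(A) = 2 with ||x(lam)|| = 1 (boundary case).
lo, hi = 2.0 + 1e-9, 100.0
for _ in range(200):
    lam = 0.5 * (lo + hi)
    if np.linalg.norm(x_of(lam)) > 1.0:
        lo = lam
    else:
        hi = lam
x_star = x_of(lam)

# S-lemma-type certificate: lam >= 0 and A + lam*I PSD => global optimum.
assert lam >= 0
assert np.all(np.linalg.eigvalsh(A + lam * np.eye(2)) >= -1e-8)

# Sanity check against a dense grid over the feasible disk.
obj = lambda x: x @ A @ x + 2 * b @ x
grid = [np.array([u, v])
        for u in np.linspace(-1, 1, 201)
        for v in np.linspace(-1, 1, 201)
        if u * u + v * v <= 1.0]
assert obj(x_star) <= min(obj(p) for p in grid) + 1e-3
```

The paper's contribution is to identify conditions (convexity of an image-type set) under which this kind of KKT characterization of global optimality extends to infinite-dimensional Hilbert spaces and more general quadratic constraints.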
An Approximate Shapley-Folkman Theorem
The Shapley-Folkman theorem shows that Minkowski averages of uniformly
bounded sets tend to be convex when the number of terms in the sum becomes much
larger than the ambient dimension. In optimization, Aubin and Ekeland [1976]
show that this produces an a priori bound on the duality gap of separable
nonconvex optimization problems involving finite sums. This bound is highly
conservative and depends on unstable quantities, and we relax it in several
directions to show that nonconvexity can have a much milder impact on finite
sum minimization problems such as empirical risk minimization and multi-task
classification. As a byproduct, we show a new version of Maurey's classical
approximate Carathéodory lemma where we sample a significant fraction of the
coefficients, without replacement, as well as a result on sampling constraints
using an approximate Helly theorem, both of independent interest.
Comment: Added constraint sampling result, simplified sampling results, reformat, etc.
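Maurey's classical argument behind approximate Carathéodory can be sketched numerically (a hedged illustration of the with-replacement version; the paper's refinement samples a significant fraction of the coefficients without replacement). If x = sum_i w_i v_i lies in the convex hull of points with ||v_i|| <= R, then averaging k vertices drawn i.i.d. with probabilities w_i approximates x with expected error at most R/sqrt(k):

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 50, 1000
V = rng.normal(size=(n, d))
V /= np.linalg.norm(V, axis=1, keepdims=True)   # vertices on unit sphere, R = 1
w = rng.dirichlet(np.ones(n))                   # convex combination weights
x = w @ V                                       # target point in conv{v_i}

def approx_caratheodory(k):
    """Empirical average of k vertices sampled with probabilities w."""
    idx = rng.choice(n, size=k, p=w)
    return V[idx].mean(axis=0)

errors = [np.linalg.norm(approx_caratheodory(400) - x) for _ in range(50)]
# Expected error is on the order of R / sqrt(k) = 1 / 20 = 0.05.
assert np.mean(errors) < 0.1
```

The point of the lemma is dimension independence: the error bound depends on k and R, not on the ambient dimension d, which is what drives the Shapley-Folkman-type duality-gap bounds in the abstract.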
Gradient flows as a selection procedure for equilibria of nonconvex energies
For atomistic material models, global minimization gives the wrong qualitative behavior; a theory of equilibrium solutions needs to be defined in different terms. In this paper, a concept based on gradient-flow evolutions is presented to describe local minimization for simple atomistic models based on the Lennard-Jones potential. As an application of this technique, it is shown that an atomistic gradient-flow evolution converges to a gradient flow of a continuum energy as the spacing between the atoms tends to zero. In addition, the convergence of the resulting equilibria is investigated in the case of elastic deformation and a simple damaged state.
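The selection mechanism can be illustrated with a toy discretization (my own far simpler construction, not the paper's analysis): forward-Euler steps of the gradient flow x' = -grad E(x) for a 1D chain of atoms with nearest-neighbour Lennard-Jones interactions phi(r) = r^-12 - 2 r^-6, whose minimum is at r = 1. The flow relaxes a slightly stretched chain to a local equilibrium rather than jumping to an unphysical global minimizer:

```python
import numpy as np

def phi_prime(r):
    """Derivative of the Lennard-Jones pair potential r^-12 - 2 r^-6."""
    return -12 * r**-13 + 12 * r**-7

def grad_E(x):
    """Gradient of E(x) = sum_i phi(x_{i+1} - x_i) for a free 1D chain."""
    r = np.diff(x)                       # nearest-neighbour gaps
    f = phi_prime(r)                     # derivative on each bond
    g = np.zeros_like(x)
    g[:-1] -= f                          # contribution d/dx_i   of phi(r_i)
    g[1:]  += f                          # contribution d/dx_{i+1}
    return g

n = 10
x = np.linspace(0.0, n - 1.0, n) * 1.05  # slightly stretched chain
dt = 1e-3                                # forward-Euler gradient-flow step
for _ in range(50000):
    x -= dt * grad_E(x)

# The gradient flow selects the equilibrium with uniform spacing r = 1.
gaps = np.diff(x)
assert np.allclose(gaps, 1.0, atol=1e-3)
```

In the paper this idea is carried out at the level of evolutions: the discrete gradient flow is shown to converge, as atomic spacing tends to zero, to the gradient flow of a continuum energy, and the selected equilibria are then analyzed.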
International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book
The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions.
This book comprises the full conference program. It contains the scientific program both in survey form and in full detail, together with information on the social program, the venue, special meetings, and more.