176 research outputs found
Variational Analysis In Second-Order Cone Programming And Applications
This dissertation conducts a second-order variational analysis for an important class of nonpolyhedral conic programs generated by the so-called second-order/Lorentz/ice-cream cone. These second-order cone programs (SOCPs) are mathematically challenging due to the nonpolyhedrality of the underlying second-order cone, while being important for various applications. The two main devices in our study are the second epi-derivative and the graphical derivative of the normal cone mapping, which are shown to capture vital second-order information about the functions and constraint systems under investigation. Our main contribution is threefold:
- proving the twice epi-differentiability of the indicator function of the second-order cone and of the augmented Lagrangian associated with SOCPs, and deriving explicit formulae for the calculation of the second epi-derivatives of both functions;
- establishing a precise formula, expressed entirely in terms of the initial data, for calculating the graphical derivative of the normal cone mapping generated by the constraint set of SOCPs, without imposing any nondegeneracy condition;
- conducting a complete convergence analysis of the Augmented Lagrangian Method (ALM) for SOCPs, covering solvability, stability, and local convergence of both exact and inexact versions of the ALM under fairly mild assumptions.
These results have strong potential for applications to SOCPs and related problems. Among those presented in this dissertation, we mention characterizations of the uniqueness of Lagrange multipliers together with an error bound estimate for second-order cone constraints; of the isolated calmness property for solution maps of perturbed variational systems associated with SOCPs; and of the (uniform) second-order growth condition for the augmented Lagrangian associated with SOCPs.
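To make the central object concrete, much of the second-order analysis above revolves around the second-order (Lorentz) cone, whose metric projection has a well-known closed form. The following sketch (our own minimal illustration in NumPy, not taken from the dissertation; the name `proj_soc` is ours) computes that projection:

```python
import numpy as np

def proj_soc(z):
    """Project z = (s, t) onto the second-order (Lorentz) cone
    K = {(s, t) : ||s||_2 <= t}, using the standard closed form."""
    s, t = z[:-1], z[-1]
    r = np.linalg.norm(s)
    if r <= t:              # z already lies in the cone
        return z.copy()
    if r <= -t:             # z lies in the polar cone; projection is the origin
        return np.zeros_like(z)
    # otherwise the projection lands on the boundary of the cone
    coef = (t + r) / 2.0
    return coef * np.concatenate([s / r, [1.0]])
```

For example, projecting (3, 4, -1) gives (1.2, 1.6, 2), a point on the cone's boundary since the norm of (1.2, 1.6) equals 2.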
Local properties and augmented Lagrangians in fully nonconvex composite optimization
A broad class of optimization problems can be cast in composite form, that is, as the minimization of the composition of a lower semicontinuous function with a differentiable mapping. This paper discusses the versatile template of composite optimization without any convexity assumptions. First- and second-order optimality conditions are discussed, advancing the variational analysis of compositions. We highlight the difficulties that stem from the lack of convexity when dealing with necessary conditions in a Lagrangian framework and when considering error bounds. Building upon these characterizations, a local convergence analysis is delineated for a recently developed augmented Lagrangian method, deriving rates of convergence in the fully nonconvex setting.
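To illustrate the augmented Lagrangian machinery in its simplest form (our own toy example, not the paper's method or setting): minimize x1² + x2² subject to x1 + x2 = 1. Each ALM subproblem here reduces to a linear system, and the multiplier estimate converges to the known value λ* = -1:

```python
import numpy as np

# min ||x||^2  s.t.  a^T x = b, with a = (1, 1), b = 1.
# Augmented Lagrangian: L(x, lam) = ||x||^2 + lam*(a^T x - b) + (rho/2)*(a^T x - b)^2
a, b, rho = np.array([1.0, 1.0]), 1.0, 10.0
lam = 0.0
for _ in range(25):
    # Exact inner minimization: stationarity 2x + (lam + rho*(a^T x - b)) a = 0
    x = np.linalg.solve(2.0 * np.eye(2) + rho * np.outer(a, a),
                        (rho * b - lam) * a)
    lam += rho * (a @ x - b)        # multiplier update
# Known solution: x* = (0.5, 0.5), lam* = -1
```

For this quadratic example the multiplier error contracts by a factor of 1/(1+ρ) per iteration, a concrete instance of the linear rates the paper derives in far greater generality.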
Calculation of chemical and phase equilibria
The computation of chemical and phase equilibria is an essential aspect of chemical engineering design and development. Important applications range from flash calculations to distillation and pyrometallurgy. Despite the firm theoretical foundations on which the theory of chemical equilibrium is based, there are two major difficulties that prevent the equilibrium state from being accurately determined. The first of these hindrances is the inaccuracy or total absence of pertinent thermodynamic data. The second is the complexity of the required calculation. It is the latter consideration which is the sole concern of this dissertation.
Bibliography on Nondifferentiable Optimization
This is a research bibliography, with all the advantages and shortcomings that this implies. The author has used it as a bibliographical database when writing papers, and it is therefore largely a reflection of his own personal research interests. However, it is hoped that this bibliography will nevertheless be of use to others interested in nondifferentiable optimization.
Duality Results for Conic Convex Programming
This paper presents a unified study of duality properties for the problem of minimizing a linear function over the intersection of an affine space with a convex cone in finite dimensions. Existing duality results are carefully surveyed and some new duality properties are established. Examples are given to illustrate these new properties. The topics covered in this paper include Gordan-Stiemke type theorems, Farkas type theorems, perfect duality, the Slater condition, regularization, Ramana's duality, and approximate dualities. The dual representations of various convex sets, convex cones, and conic convex programs are also discussed.
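The weak duality property underlying these results can be checked numerically in a few lines. The sketch below (our own illustration, using the nonnegative orthant as the simplest self-dual cone) exhibits a primal-dual feasible pair with zero gap, hence an optimal pair:

```python
import numpy as np

# Primal:  min c^T x  s.t.  A x = b, x in K (here K = nonnegative orthant).
# Dual:    max b^T y  s.t.  c - A^T y in K* (the orthant is self-dual: K* = K).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])

x = np.array([1.0, 0.0])   # primal feasible: A x = b and x >= 0
y = np.array([1.0])        # dual feasible: slack c - A^T y = (0, 1) >= 0

# Weak duality guarantees b^T y <= c^T x for any such pair;
# here both values equal 1, so the gap is zero and both points are optimal.
gap = c @ x - b @ y
```

For nonpolyhedral cones the same inequality holds, but a zero gap may fail without a regularity condition such as Slater's, which is precisely the territory the paper surveys.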
Strong Variational Sufficiency for Nonlinear Semidefinite Programming and its Implications
Strong variational sufficiency is a newly proposed property which turns out to be of great use in the convergence analysis of multiplier methods. However, what this property implies for non-polyhedral problems remains a puzzle. In this paper, we prove the equivalence between strong variational sufficiency and the strong second-order sufficient condition (SOSC) for nonlinear semidefinite programming (NLSDP), without requiring the uniqueness of multipliers or any other constraint qualification. Based on this characterization, the local convergence of the augmented Lagrangian method (ALM) for NLSDP can be established under the strong SOSC in the absence of constraint qualifications. Moreover, under the strong SOSC, we can apply the semismooth Newton method to solve the ALM subproblems of NLSDP, since the positive definiteness of the generalized Hessian of the augmented Lagrangian function is satisfied.
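The ALM subproblems in the semidefinite setting are built around the projection onto the positive semidefinite cone, whose generalized Jacobian is what the semismooth Newton method differentiates. The projection itself has a closed form via eigenvalue clipping; a minimal sketch (our own, with the hypothetical name `proj_psd`):

```python
import numpy as np

def proj_psd(M):
    """Project a symmetric matrix onto the cone of positive semidefinite
    matrices by zeroing out its negative eigenvalues (closed form)."""
    w, V = np.linalg.eigh(M)                  # spectral decomposition M = V diag(w) V^T
    return (V * np.clip(w, 0.0, None)) @ V.T  # keep only nonnegative eigenvalues
```

For instance, diag(1, -2) projects to diag(1, 0), and the indefinite matrix [[0, 1], [1, 0]] projects to 0.5 * [[1, 1], [1, 1]], its positive spectral part.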
Good and Bad Optimization Models: Insights from Rockafellians
A basic requirement for a mathematical model is often that its solution (output) shouldn’t change much if the model’s parameters (input) are perturbed. This is important because the exact values of parameters may not be known, and one would like to avoid being misled by an output obtained using incorrect values. Thus, it’s rarely enough to address an application by formulating a model, solving the resulting optimization problem, and presenting the solution as the answer. One would need to confirm that the model is suitable, i.e., “good,” and this can, at least in part, be achieved by considering a family of optimization problems constructed by perturbing parameters as quantified by a Rockafellian function. The resulting sensitivity analysis uncovers troubling situations with unstable solutions, which we refer to as “bad” models, and indicates better model formulations. Embedding an actual problem of interest within a family of problems via Rockafellians is also a primary path to optimality conditions as well as to computationally attractive alternative problems, which, under ideal circumstances and when properly tuned, may even furnish the minimum value of the actual problem. The tuning of these alternative problems turns out to be intimately tied to finding multipliers in optimality conditions and thus emerges as a main component of several optimization algorithms. In fact, the tuning amounts to solving certain dual optimization problems. In this tutorial, we’ll discuss the opportunities and insights afforded by Rockafellians.

Office of Naval Research; Air Force Office of Scientific Research; MIPR F4FGA00350G004; MIPR N0001421WX0149
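The link between perturbation and multipliers can be seen in a one-variable toy problem of our own (not from the tutorial): for min x² subject to 1 - x ≤ u, the value function is v(u) = (1-u)² for u ≤ 1, and its slope at u = 0 equals minus the multiplier λ* = 2 of the unperturbed problem. A quick numerical check:

```python
def v(u):
    """Value function of the perturbed problem: min x^2 s.t. 1 - x <= u.
    For u <= 1 the constraint binds and the minimizer is x = 1 - u."""
    return (1.0 - u) ** 2 if u <= 1.0 else 0.0

# Finite-difference slope of v at u = 0; it should match -lambda* = -2,
# since stationarity of the unperturbed problem gives 2x - lambda = 0 at x = 1.
h = 1e-6
slope = (v(h) - v(-h)) / (2 * h)
```

A model is "good" in the sense above when v behaves stably near u = 0; here v is smooth, so small misspecifications of the constraint move the optimal value only proportionally.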