6 research outputs found

    Optimization and Equilibrium Problems with Equilibrium Constraints in Infinite-Dimensional Spaces

    Get PDF
    The paper is devoted to applications of modern variational analysis to the study of constrained optimization and equilibrium problems in infinite-dimensional spaces. We pay particular attention to the remarkable classes of optimization and equilibrium problems identified as MPECs (mathematical programs with equilibrium constraints) and EPECs (equilibrium problems with equilibrium constraints), treated from the viewpoint of multiobjective optimization. Their underlying feature is that the major constraints are governed by parametric generalized equations/variational conditions in the sense of Robinson. Such problems are intrinsically nonsmooth and can be handled by using an appropriate machinery of generalized differentiation exhibiting a rich/full calculus. The case of infinite-dimensional spaces is significantly more involved in comparison with finite dimensions, requiring in addition a certain sufficient amount of compactness and an efficient calculus of the corresponding sequential normal compactness (SNC) properties.
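    The "parametric generalized equations in the sense of Robinson" mentioned above have a standard form, sketched below; the notation is generic and assumed for illustration, not taken from the paper.

```latex
% Robinson's parametric generalized equation: given a parameter x,
% find y satisfying
0 \in f(x, y) + Q(y)
% where f is a single-valued mapping and Q is a set-valued mapping.
% Taking Q(y) = N_\Omega(y), the normal cone to a convex set \Omega,
% recovers parametric variational inequalities as a special case.
```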

    Robust Stability and Optimality Conditions for Parametric Infinite and Semi-Infinite Programs

    Get PDF
    This paper primarily concerns the study of parametric problems of infinite and semi-infinite programming, where functional constraints are given by systems of infinitely many linear inequalities indexed by an arbitrary set T, where decision variables run over Banach (infinite programming) or finite-dimensional (semi-infinite case) spaces, and where objectives are generally described by nonsmooth and nonconvex cost functions. The parameter space of admissible perturbations in such problems is formed by all bounded functions on T equipped with the standard supremum norm. Unless the index set T is finite, this space is intrinsically infinite-dimensional (nonreflexive and nonseparable) of ℓ∞-type. By using advanced tools of variational analysis and generalized differentiation, and largely exploiting the underlying specific features of linear infinite constraints, we establish complete characterizations of robust Lipschitzian stability (including computation of the exact bound of Lipschitzian moduli) for parametric maps of feasible solutions governed by linear infinite inequality systems, and then derive verifiable necessary optimality conditions for the infinite and semi-infinite programs under consideration expressed in terms of their initial data. A crucial part of our analysis addresses the precise computation of coderivatives and their norms for infinite systems of parametric linear inequalities in general Banach spaces of decision variables. The results obtained are new in both frameworks of infinite and semi-infinite programming.
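    Concretely, the parametric feasible-solution maps studied here can be written as follows; the symbols are a generic sketch, not the paper's exact notation.

```latex
% Feasible-solution map governed by infinitely many linear
% inequalities indexed by an arbitrary set T, with perturbation p:
\mathcal{F}(p) := \bigl\{ x \in X \;\bigm|\; \langle a_t, x \rangle \le b_t + p(t), \quad t \in T \bigr\}
% Here X is a Banach (or finite-dimensional) decision space, and the
% perturbation p ranges over l_\infty(T), the space of bounded
% functions on T equipped with the supremum norm.
```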

    Global Convergence of Damped Newton's Method for Nonsmooth Equations, via the Path Search

    Get PDF
    A natural damping of Newton's method for nonsmooth equations is presented. This damping, via the path search instead of the traditional line search, enlarges the domain of convergence of Newton's method and therefore is said to be globally convergent. Convergence behavior is like that of line-search damped Newton's method for smooth equations, including Q-quadratic convergence rates under appropriate conditions. Applications of the path search include damping Robinson-Newton's method for nonsmooth normal equations corresponding to nonlinear complementarity problems and variational inequalities, hence damping both Wilson's method (sequential quadratic programming) for nonlinear programming and Josephy-Newton's method for generalized equations. Computational examples from nonlinear programming are given.
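    For orientation, here is a minimal sketch of the classical line-search damped Newton iteration for a piecewise-smooth scalar equation, i.e. the baseline that the paper's path-search variant generalizes. The test equation, branch selection, and step rules are illustrative assumptions, not taken from the paper.

```python
# Line-search damped Newton for a piecewise-smooth scalar equation
# f(x) = 0, using an element of the generalized derivative in place
# of the (possibly nonexistent) classical derivative.

def f(x):
    # Piecewise-linear test equation: f(x) = max(x - 1, 2x - 3),
    # whose unique root is x = 1.
    return max(x - 1.0, 2.0 * x - 3.0)

def generalized_derivative(x):
    # Slope of the active (larger) branch; at the kink, either slope
    # is a valid element of the generalized Jacobian.
    return 2.0 if 2.0 * x - 3.0 >= x - 1.0 else 1.0

def damped_newton(x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) <= tol:
            return x
        step = -fx / generalized_derivative(x)
        # Backtracking damping: halve the step until the residual
        # decreases (Armijo-style sufficient decrease omitted for brevity).
        t = 1.0
        while abs(f(x + t * step)) >= abs(fx) and t > 1e-10:
            t *= 0.5
        x += t * step
    return x

root = damped_newton(10.0)
print(root)  # converges to the root x = 1
```

    The path search of the paper replaces the straight-line segment x + t*step with a piecewise-linear path adapted to the nonsmooth structure; the line search above is only the smooth-case analogue.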

    Nondifferentiable Optimization: Motivations and Applications

    Get PDF
    IIASA has been involved in research on nondifferentiable optimization since 1976. The Institute's research in this field has been very productive, leading to many important theoretical, algorithmic and applied results. Nondifferentiable optimization has now become a recognized and rapidly developing branch of mathematical programming. To continue this tradition and to review developments in this field, IIASA held this Workshop in Sopron (Hungary) in September 1984. This volume contains selected papers presented at the Workshop. It is divided into four sections dealing with the following topics: (I) Concepts in Nonsmooth Analysis; (II) Multicriteria Optimization and Control Theory; (III) Algorithms and Optimization Methods; (IV) Stochastic Programming and Applications.

    Nonsmooth dynamic optimization of systems with varying structure

    Get PDF
    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2011. By Mehmet Yunt. Cataloged from PDF version of thesis. Includes bibliographical references (p. 357-365).
    In this thesis, an open-loop numerical dynamic optimization method for a class of dynamic systems is developed. The structure of the governing equations of the systems under consideration changes depending on the values of the states, parameters and controls; such systems are therefore called systems with varying structure. They occur frequently in models of electric and hydraulic circuits, chemical processes, biological networks and machinery, so the determination of parameters and controls yielding optimal performance has been an important research topic. Unlike dynamic optimization problems where the structure of the underlying system is constant, the dynamic optimization of systems with varying structure requires determining the optimal evolution of the system structure in time, in addition to optimal parameters and controls. The underlying varying structure results in nonsmooth and discontinuous optimization problems. The nonsmooth single-shooting method introduced in this thesis uses concepts from nonsmooth analysis and nonsmooth optimization to solve dynamic optimization problems involving systems with varying structure whose dynamics can be described by locally Lipschitz continuous ordinary or differential-algebraic equations. The method converts the infinite-dimensional dynamic optimization problem into a nonlinear program by parameterizing the controls. Unlike the state of the art, the method does not enumerate possible structures explicitly in the optimization and does not depend on a discretization of the dynamics; instead, it uses a special integration algorithm to compute state trajectories and derivative information.
    As a result, the method produces more accurate solutions, for less effort than the state of the art, to problems where the underlying dynamics are highly nonlinear and/or stiff. The thesis develops substitutes for the gradient and the Jacobian of a function in case these quantities do not exist. These substitutes are set-valued maps, and elements of these maps need to be computed for optimization purposes. Differential equations are derived whose solutions furnish the necessary elements. These differential equations have discontinuities in time; a numerical method for their solution is proposed based on state event location algorithms that detect these discontinuities. Necessary conditions of optimality for nonlinear programs are derived using these substitutes, and it is shown that nonsmooth optimization methods called bundle methods can be used to obtain solutions satisfying these necessary conditions. Case studies compare the method to the state of the art and investigate its complexity empirically.
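    As a minimal illustration of the single-shooting idea (parameterize the controls, integrate the dynamics forward, optimize the resulting finite-dimensional program), here is a sketch for a smooth toy problem. The dynamics, cost, and finite-difference gradients below are illustrative assumptions; the thesis's contribution is precisely the nonsmooth generalization of this scheme, which is not reproduced here.

```python
# Single-shooting sketch: piecewise-constant control parameterization,
# forward-Euler integration, and gradient descent on the parameters.
# Toy problem: dynamics x' = u - x, terminal tracking cost (x(T) - 0.5)^2.

def simulate(u, x0=0.0, horizon=1.0, steps_per_interval=25):
    # Integrate the dynamics with forward Euler, holding each control
    # value u[k] constant over its interval.
    dt = horizon / (len(u) * steps_per_interval)
    x = x0
    for uk in u:
        for _ in range(steps_per_interval):
            x += dt * (uk - x)
    return x

def cost(u, target=0.5):
    # Terminal-state tracking cost of the parameterized control.
    return (simulate(u) - target) ** 2

def single_shooting(n_intervals=4, iters=100, lr=4.0, eps=1e-6):
    # Gradient descent on the control parameters, with gradients by
    # forward finite differences -- a crude stand-in for the sensitivity
    # equations (and, in the thesis, their nonsmooth generalizations).
    u = [0.0] * n_intervals
    for _ in range(iters):
        base = cost(u)
        grad = []
        for k in range(n_intervals):
            u[k] += eps
            grad.append((cost(u) - base) / eps)
            u[k] -= eps
        u = [uk - lr * gk for uk, gk in zip(u, grad)]
    return u

u_opt = single_shooting()
print(cost(u_opt))  # terminal cost driven near zero
```

    For a system with varying structure, the integrator inside `simulate` would additionally have to locate the state events where the governing equations switch, which is where the thesis's special integration algorithm and set-valued derivative substitutes come in.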

    Advances in Evolutionary Algorithms

    Get PDF
    With the recent trends towards massive data sets and significant computational power, combined with advances in evolutionary algorithms, evolutionary computation is becoming much more relevant to practice. The aim of the book is to present recent improvements, innovative ideas and concepts from a part of the huge field of evolutionary algorithms.