A sequential semidefinite programming method and an application in passive reduced-order modeling
We consider the solution of nonlinear programs with nonlinear
semidefiniteness constraints. The need for an efficient exploitation of the
cone of positive semidefinite matrices makes the solution of such nonlinear
semidefinite programs more complicated than the solution of standard nonlinear
programs. In particular, a suitable symmetrization procedure needs to be chosen
for the linearization of the complementarity condition. The choice of the
symmetrization procedure can be shifted in a very natural way to certain linear
semidefinite subproblems, and can thus be reduced to a well-studied problem.
The resulting sequential semidefinite programming (SSP) method is a
generalization of the well-known SQP method for standard nonlinear programs. We
present a sensitivity result for nonlinear semidefinite programs, and then
based on this result, we give a self-contained proof of local quadratic
convergence of the SSP method. We also describe a class of nonlinear
semidefinite programs that arise in passive reduced-order modeling, and we
report results of some numerical experiments with the SSP method applied to
problems in that class.
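The SSP iteration mirrors the classical SQP step it generalizes. As a simplified illustration (dropping the semidefinite cone and keeping a single smooth equality constraint), the sketch below takes full Newton/SQP steps by solving the KKT system of each quadratic subproblem. The problem data and all names are hypothetical, not from the paper.

```python
import numpy as np

# Toy equality-constrained NLP: minimize x1 + x2 subject to x1^2 + x2^2 = 1.
# The minimizer is x* = (-1/sqrt(2), -1/sqrt(2)) with multiplier 1/sqrt(2).

def f_grad(x):
    return np.array([1.0, 1.0])          # gradient of f(x) = x1 + x2

def c(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0])

def c_jac(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]]])

def sqp_step(x, lam):
    # Hessian of the Lagrangian L = f + lam * c (f is linear here)
    H = lam[0] * 2.0 * np.eye(2)
    A = c_jac(x)
    # KKT system of the QP subproblem: one Newton step on the KKT conditions
    K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    rhs = np.concatenate([-f_grad(x) - A.T @ lam, -c(x)])
    d = np.linalg.solve(K, rhs)
    return x + d[:2], lam + d[2:]

x, lam = np.array([-0.8, -0.4]), np.array([1.0])
for _ in range(10):
    x, lam = sqp_step(x, lam)
# x converges quadratically to (-1/sqrt(2), -1/sqrt(2))
```

Near a solution satisfying the usual second-order conditions, this iteration exhibits exactly the local quadratic convergence the paper proves for the SSP generalization.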
Interior-point methods for P∗(κ)-linear complementarity problems based on a generalized trigonometric barrier function
Recently, M. Bouafoa et al. investigated a new kernel function which differs from the self-regular kernel functions in that it has a trigonometric barrier term. In this paper we generalize the analysis presented in that paper to linear complementarity problems (LCPs). It is shown that the iteration bounds for primal-dual large-update and small-update interior-point methods based on this function match the currently best known iteration bounds for these types of methods. The analysis for LCPs deviates significantly from the analysis for linear optimization, and several new tools and techniques are derived in this paper.
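For intuition: a kernel function ψ must vanish and be stationary at t = 1 (the point corresponding to the central path) and blow up as t → 0⁺ (the barrier property). The sketch below evaluates one illustrative kernel with a trigonometric barrier term; the exact form used in the cited papers may differ, and this variant is chosen only so that the defining properties can be checked numerically.

```python
import math

def psi(t):
    # Illustrative trigonometric-barrier kernel (hypothetical form, in the
    # spirit of the abstract): quadratic growth term plus a tan() barrier.
    # By construction psi(1) = 0 and psi'(1) = 0, as any kernel function
    # must satisfy, and psi(t) -> infinity as t -> 0+ because the tan
    # argument approaches pi/2.
    return (t**2 - 1.0) / 2.0 \
        + (4.0 / math.pi) * (math.tan(math.pi / (2.0 * t + 2.0)) - 1.0)
```

The induced barrier function Ψ(v) = Σᵢ ψ(vᵢ) then measures proximity to the central path, and the iteration-bound analysis in such papers proceeds by bounding the decrease of Ψ per inner iteration.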
Interior Point Methods for Massive Support Vector Machines
We investigate the use of interior point methods for solving quadratic
programming problems with a small number of linear constraints where
the quadratic term consists of a low-rank update to a positive semidefinite
matrix. Several formulations of the support vector machine fit into this
category. An interesting feature of these particular problems is the volume
of data, which can lead to quadratic programs with between 10 and
100 million variables and a dense Q matrix. We use OOQP, an object-oriented
interior point code, to solve these problems because it allows us
to easily tailor the required linear algebra to the application. Our linear
algebra implementation uses a proximal point modification to the underlying
algorithm, and exploits the Sherman-Morrison-Woodbury formula
and the Schur complement to facilitate efficient linear system solution.
Since we target massive problems, the data is stored out-of-core and we
overlap computation and I/O to reduce overhead. Results are reported
for several linear support vector machine formulations, demonstrating the
reliability and scalability of the method.
Introducing Interior-Point Methods for Introductory Operations Research Courses and/or Linear Programming Courses
In recent years the introduction and development of Interior-Point Methods has had a profound impact on optimization theory as well as practice, influencing the field of Operations Research and related areas. The development of these methods has quickly led to the design of new and efficient optimization codes, particularly for Linear Programming. Consequently, there has been an increasing need to introduce the theory and methods of this new area of optimization into appropriate undergraduate and first-year graduate courses, such as introductory Operations Research and/or Linear Programming courses, Industrial Engineering courses, and Math Modeling courses. The objective of this paper is to discuss ways of simplifying the introduction of Interior-Point Methods for students who have various backgrounds or who are not necessarily mathematics majors.
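For classroom purposes, the essentials of a primal log-barrier interior-point method fit in a few lines: follow the central path by minimizing t·cᵀx − Σ log xᵢ subject to Ax = b with damped Newton steps, then increase t. A minimal sketch on a two-variable LP (all data hypothetical, chosen so the optimum (1, 0) is easy to see):

```python
import numpy as np

# LP: minimize x1 + 2*x2 subject to x1 + x2 = 1, x >= 0.  Optimum: (1, 0).
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

x = np.array([0.5, 0.5])             # strictly feasible interior start
t = 1.0
for _ in range(40):                  # outer loop: increase barrier parameter
    for _ in range(15):              # inner loop: Newton on barrier problem
        g = t * c - 1.0 / x                     # gradient of t*c^T x - sum(log x)
        H = np.diag(1.0 / x**2)                 # Hessian of the barrier term
        K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
        rhs = np.concatenate([-g, np.zeros(1)])
        dx = np.linalg.solve(K, rhs)[:2]        # equality-constrained Newton step
        alpha = 1.0
        while np.any(x + alpha * dx <= 0):      # damping keeps x strictly positive
            alpha *= 0.5
        x = x + alpha * dx
    t *= 1.5
# x approaches the LP optimum (1, 0) as t grows
```

Plotting the iterates for a sequence of t values traces the central path, which is often the most effective visual aid when introducing these methods to students.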
- …