11 research outputs found

    A curvilinear search using tridiagonal secant updates for unconstrained optimization

    The idea of doing a curvilinear search along the Levenberg-Marquardt path s(μ) = -(H + μI)⁻¹g has always been appealing, but the cost of solving a linear system for each trial value of the parameter μ has discouraged its implementation. In this paper, an algorithm for searching along a path which includes s(μ) is studied. The algorithm uses a special inexpensive QTcQᵀ to QT₊Qᵀ Hessian update which trivializes the linear algebra required to compute s(μ). This update is based on earlier work of Dennis-Marwil and Martinez on least-change secant updates of matrix factors. The new algorithm is shown to be locally and q-superlinearly convergent to stationary points, and to be globally q-superlinearly convergent for quasi-convex functions. Computational tests are given that show the new algorithm to be robust and efficient.
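
    A minimal sketch of the search pattern described above, assuming NumPy/SciPy are available. The function names, the fixed grid of trial μ values, and the best-of-grid acceptance rule are illustrative only; the paper's contribution is precisely that its tridiagonal QTcQᵀ update makes each trial solve cheap, which a plain dense solve (used here) does not.

        import numpy as np
        from scipy.linalg import solve

        def lm_path_point(H, g, mu):
            # One point on the Levenberg-Marquardt path s(mu) = -(H + mu*I)^(-1) g.
            n = H.shape[0]
            return -solve(H + mu * np.eye(n), g)

        def curvilinear_search(f, x, H, g, mus=(1e-3, 1e-2, 1e-1, 1.0, 10.0)):
            # Naive curvilinear search: evaluate f at several trial points along
            # the path and keep the best one.  Each trial costs a full dense
            # linear solve here; the paper's tridiagonal update is what makes
            # this solve trivial for every trial value of mu.
            best_x, best_f = x, f(x)
            for mu in mus:
                x_trial = x + lm_path_point(H, g, mu)
                f_trial = f(x_trial)
                if f_trial < best_f:
                    best_x, best_f = x_trial, f_trial
            return best_x, best_f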

    Scaling rank-one updating formula and its application in unconstrained optimization

    This thesis deals with algorithms used to solve unconstrained optimization problems. We analyse the properties of a scaling symmetric rank-one (SSR1) update, prove the convergence of the matrices generated by SSR1 to the true Hessian matrix, and show that algorithm SSR1 possesses the quadratic termination property with inexact line search. A new algorithm (OCSSR1) is presented, in which the scaling parameter in SSR1 is chosen automatically by satisfying Davidon's criterion for an optimally conditioned Hessian estimate. Numerical tests show that the new method compares favourably with BFGS. Using the OCSSR1 update, we propose a hybrid QN algorithm which does not need to store any matrix. Numerical results show that it is a very promising method for solving large scale optimization problems. In addition, some popular techniques in unconstrained optimization are also discussed, for example, the trust region step, the descent direction with supermemory, and the detection of large residuals in nonlinear least squares problems. The thesis consists of two parts. The first part gives a brief survey of unconstrained optimization. It contains four chapters, and introduces basic results on unconstrained optimization, some popular methods and their properties based on quadratic approximations to the objective function, some methods which are suitable for solving large scale optimization problems, and some methods for solving nonlinear least squares problems. The second part outlines the new research results and contains five chapters. In Chapter 5, the scaling rank-one updating formula is analysed and studied. Chapter 6, Chapter 7 and Chapter 8 discuss the applications for the trust region method, large scale optimization problems and nonlinear least squares. A final chapter summarizes the problems used in numerical testing.
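
    The abstract does not reproduce the update itself. As a point of reference only, a symmetric rank-one update with a scaling parameter γ (one plausible reading of "scaling SR1", not necessarily the thesis's exact formula) can be written in LaTeX as

        \[
        B_{+} \;=\; \gamma B \;+\; \frac{(y - \gamma B s)(y - \gamma B s)^{\mathsf{T}}}{(y - \gamma B s)^{\mathsf{T}} s},
        \qquad s = x_{+} - x, \quad y = \nabla f(x_{+}) - \nabla f(x),
        \]

    where γ = 1 recovers the classical SR1 update and the update is skipped whenever the denominator is (near) zero.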

    Cutting planes and a biased Newton direction for minimizing quasiconvex functions

    A biased Newton direction is introduced for minimizing quasiconvex functions with bounded level sets. It is a generalization of the usual Newton direction for strictly convex quadratic functions. This new direction can be derived from the intersection of approximating hyperplanes to the epigraph at points on the boundary of the same level set. Based on that direction, an unconstrained minimization algorithm is presented. It is proved to have global and local quadratic convergence under standard hypotheses. These theoretical results may lead to different methods based on computing search directions using only first-order information at points on the level sets, especially if the computational cost can be reduced by relaxing some of the conditions, for instance along the lines of the results presented in the Appendix. Some tests are presented to show the qualitative behavior of the new direction and with the purpose of stimulating further research on this kind of algorithm.
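
    For orientation, the classical object that the abstract says is being generalized (not the biased direction itself, which the abstract does not reproduce) is the Newton direction

        \[
        d_{N}(x) \;=\; -\left[\nabla^{2} f(x)\right]^{-1} \nabla f(x),
        \]

    which, for a strictly convex quadratic f(x) = ½ xᵀAx + bᵀx, reaches the minimizer in a single step from any starting point.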

    Third International Conference on Inverse Design Concepts and Optimization in Engineering Sciences (ICIDES-3)

    Papers from the Third International Conference on Inverse Design Concepts and Optimization in Engineering Sciences (ICIDES) are presented. The papers discuss current research in the general field of inverse, semi-inverse, and direct design and optimization in engineering sciences. The rapid growth of this relatively new field is due to the availability of faster and larger computing machines.

    MS FT-2-2 7 Orthogonal polynomials and quadrature: Theory, computation, and applications

    Quadrature rules find many applications in science and engineering. Their analysis is a classical area of applied mathematics and continues to attract considerable attention. This seminar brings together speakers with expertise in a large variety of quadrature rules. It is the aim of the seminar to provide an overview of recent developments in the analysis of quadrature rules. The computation of error estimates and novel applications are also described.

    Generalized averaged Gaussian quadrature and applications

    A simple numerical method for constructing the optimal generalized averaged Gaussian quadrature formulas will be presented. These formulas exist in many cases in which real positive Gauss-Kronrod formulas do not exist, and can be used as an adequate alternative in order to estimate the error of a Gaussian rule. We also investigate the conditions under which the optimal averaged Gaussian quadrature formulas and their truncated variants are internal.
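
    A small sketch of the error-estimation role described above, assuming NumPy. A (2n+1)-point Gauss-Legendre rule is substituted for the optimal generalized averaged formula purely as a stand-in, since the paper's construction is not reproduced here; the function names are illustrative.

        import numpy as np

        def gauss_legendre(f, n, a=-1.0, b=1.0):
            # n-point Gauss-Legendre approximation of the integral of f over [a, b].
            x, w = np.polynomial.legendre.leggauss(n)
            t = 0.5 * (b - a) * x + 0.5 * (b + a)
            return 0.5 * (b - a) * np.dot(w, f(t))

        def gauss_with_error_estimate(f, n, a=-1.0, b=1.0):
            # Error of the n-point rule estimated by comparison with a richer rule.
            # The paper uses an averaged Gaussian formula in this role; a
            # (2n+1)-point Gauss rule is used here only to show the pattern.
            q_n = gauss_legendre(f, n, a, b)
            q_ref = gauss_legendre(f, 2 * n + 1, a, b)
            return q_n, abs(q_ref - q_n)

        # Example: integral of exp(x) over [-1, 1] with a 5-point rule.
        value, err_est = gauss_with_error_estimate(np.exp, 5)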