Exploiting damped techniques for nonlinear conjugate gradient methods
In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently reproposed by Al-Baali, and until now have been applied only in the framework of quasi-Newton methods. We extend their use to NCG methods in large-scale unconstrained optimization, aiming at possibly improving the efficiency and the robustness of the latter methods, especially when solving difficult problems. We consider both unpreconditioned and Preconditioned NCG (PNCG). In the latter case, we embed damped techniques within a class of preconditioners based on quasi-Newton updates. Our purpose is to provide efficient preconditioners which approximate, in some sense, the inverse of the Hessian matrix, while still preserving the information provided by the secant equation or some of its modifications. The results of an extensive numerical experience highlight that the proposed approach is quite promising.
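For context, Powell's damping rule, stated here in its standard textbook form (the paper may use a modified variant with different parameter choices), replaces the gradient difference y_k by a convex combination with B_k s_k before the quasi-Newton update:

```latex
% Powell's damped secant modification (standard form; the exact
% parameter choices in the paper may differ). With sigma commonly 0.2:
y_k^D = \varphi_k\, y_k + (1 - \varphi_k)\, B_k s_k, \qquad
\varphi_k =
\begin{cases}
1, & s_k^\top y_k \ge \sigma\, s_k^\top B_k s_k,\\[4pt]
\dfrac{(1-\sigma)\, s_k^\top B_k s_k}{s_k^\top B_k s_k - s_k^\top y_k}, & \text{otherwise}.
\end{cases}
```

This guarantees s_k^T y_k^D >= sigma s_k^T B_k s_k > 0, so the update stays positive definite for any steplength; it is this safeguard that is carried over to the NCG setting.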
Quasi-Newton-Based Preconditioning and Damped Quasi-Newton Schemes for Nonlinear Conjugate Gradient Methods
In this paper, we deal with matrix-free preconditioners for Nonlinear Conjugate Gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates, satisfying either the secant equation or a secant-like equation at some of the previous iterates. Conditions are given proving that, in some sense, the proposed preconditioners also approximate the inverse of the Hessian matrix. In particular, the structure of the preconditioners depends both on low-rank updates and on some specific parameters. The low-rank updates are obtained as a by-product of NCG iterations. Moreover, we consider the possibility of embedding damped techniques within a class of preconditioners based on quasi-Newton updates. Damped methods have proved effective in enhancing the performance of quasi-Newton updates in those cases where the Wolfe linesearch conditions are hardly fulfilled. The purpose is to extend the idea behind damped methods to improve NCG schemes as well, following a novel line of research in the literature. The results, which summarize an extensive numerical experience using large-scale CUTEst problems, are reported, showing that these approaches can considerably improve the performance of NCG methods.
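As a minimal sketch of where a preconditioner enters an NCG iteration: the matrix-free quasi-Newton preconditioners of the paper would replace the `precond_apply` argument below, and the Polak-Ribiere+ beta is just one common choice, not necessarily the paper's. All names here are illustrative.

```python
import numpy as np

def pncg(f, grad, x0, precond_apply, tol=1e-6, max_iter=500):
    """Preconditioned NCG sketch (PR+ beta, Armijo backtracking).

    precond_apply(g) should return M @ g for a preconditioner M
    approximating the inverse Hessian; lambda g: g recovers plain NCG.
    A descent safeguard/restart would be added in a production code.
    """
    x = x0.copy()
    g = grad(x)
    z = precond_apply(g)
    d = -z
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, f0, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > f0 + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        z_new = precond_apply(g_new)
        # preconditioned Polak-Ribiere coefficient, clipped at zero (PR+)
        beta = max(0.0, g_new @ (z_new - z) / (g @ z))
        d = -z_new + beta * d
        x, g, z = x_new, g_new, z_new
    return x
```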
Global Convergence of an Extended Three-Term Conjugate Gradient Method Generating Descent Directions for Unconstrained Optimization Problems (Fundamental Theory and Applications of Optimization)
A Note on Using Partitioning Techniques for Solving Unconstrained Optimization Problems on Parallel Systems
We deal with the design of parallel algorithms that use variable partitioning techniques to solve nonlinear optimization problems. We propose an iterative solution method that is very efficient for separable functions, our aim being to assess its performance for general functions. Experimental results on an illustrative example have suggested some useful modifications that, even though they improve the efficiency of our parallel method, leave some questions open for further investigation.
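A minimal sketch of the variable-partitioning idea, assuming SciPy is available; the function and block structure are illustrative and not the paper's algorithm. Each block subproblem is independent within a sweep, so the block solves can run in parallel.

```python
import numpy as np
from scipy.optimize import minimize

def block_jacobi_step(f, x, blocks):
    """One parallelizable sweep of variable partitioning: each block of
    variables is minimized with the other blocks frozen at their current
    values (Jacobi style, so the block solves are mutually independent).

    For a separable f(x) = sum_i f_i(x_i) a single sweep already yields
    the minimizer; for general f the sweep is repeated until convergence.
    """
    x_new = x.copy()
    for idx in blocks:  # independent subproblems -> parallelizable
        def f_block(xb, idx=idx):
            trial = x.copy()
            trial[idx] = xb
            return f(trial)
        x_new[idx] = minimize(f_block, x[idx]).x
    return x_new

# e.g. blocks = np.array_split(np.arange(n), n_blocks)
```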
Numerical Experience with Damped Quasi-Newton Optimization Methods when the Objective Function is Quadratic
A class of damped quasi-Newton methods for nonlinear optimization has recently been proposed by extending the damped technique of Powell for the BFGS method to the Broyden family of quasi-Newton methods. It has been shown that this damped class has the global and superlinear convergence property that a restricted class of 'undamped' methods has for convex objective functions in unconstrained optimization. To test this result, we applied several members of the Broyden family and their corresponding damped methods to a simple quadratic function and observed several useful features of the damped technique. These observations and other numerical experiences are described in this paper. The important role of the damped technique is shown not only for enforcing the above convergence property, but also for substantially improving the performance of efficient, inefficient and divergent undamped methods (significantly so in the latter case). Thus, some appropriate ways of employing the damped technique are suggested.
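A minimal sketch of the kind of experiment described: damped BFGS (the Broyden member with the Powell damping rule above) applied to a convex quadratic with unit steplengths. The test function and parameter choices are illustrative, not the paper's exact setup.

```python
import numpy as np

def damped_bfgs_quadratic(A, b, x0, sigma=0.2, iters=50):
    """BFGS with Powell damping on f(x) = 0.5 x^T A x - b^T x,
    taking full quasi-Newton steps (no line search)."""
    n = len(x0)
    x, B = x0.copy(), np.eye(n)
    for _ in range(iters):
        g = A @ x - b
        if np.linalg.norm(g) < 1e-10:
            break
        s = -np.linalg.solve(B, g)   # quasi-Newton step, steplength 1
        y = A @ s                    # exact gradient difference on a quadratic
        sBs, sy = s @ B @ s, s @ y
        # Powell damping: keep s^T y >= sigma * s^T B s
        phi = 1.0 if sy >= sigma * sBs else (1 - sigma) * sBs / (sBs - sy)
        y = phi * y + (1 - phi) * (B @ s)
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / (s @ y)
        x = x + s
    return x
```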
Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
Damped Techniques for the Limited Memory BFGS Method for Large-Scale Optimization
This paper aims to extend a certain damped technique, suitable for the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, to the limited memory BFGS method in the case of large-scale unconstrained optimization. It is shown that the proposed technique maintains the global convergence property of the limited memory BFGS method on uniformly convex functions. Some numerical results are described to illustrate the important role of the damped technique. Since this technique safely enforces the positive definiteness of the BFGS update for any value of the steplength, we also consider using only the first Wolfe-Powell condition on the steplength. Then, as in the backtracking framework, only one gradient evaluation is performed at each iteration. It is reported that the proposed damped methods work much better than the limited memory BFGS method in several cases.
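A minimal sketch of how damping might be grafted onto the standard L-BFGS machinery: each (s, y) pair is damped before it is stored, and the usual two-loop recursion is left unchanged. As a simplification, the damping test below uses the scaled identity I/gamma as the reference matrix rather than the full limited-memory B_k; that choice, and all names here, are assumptions, not necessarily the paper's rule.

```python
import numpy as np

def damped_pair(s, y, gamma, sigma=0.2):
    """Damp (s, y) before storing it: with reference matrix B = I/gamma
    (a simplification of damping against the limited-memory B_k),
    enforce s^T y >= sigma * s^T B s so the update stays well defined."""
    sBs = (s @ s) / gamma
    sy = s @ y
    if sy < sigma * sBs:
        phi = (1 - sigma) * sBs / (sBs - sy)
        y = phi * y + (1 - phi) * s / gamma
    return s, y

def lbfgs_direction(g, pairs, gamma):
    """Standard two-loop recursion: returns -H g, where H is the L-BFGS
    inverse Hessian approximation built from the stored (s, y) pairs,
    with initial matrix H_0 = gamma * I."""
    q, alphas = g.copy(), []
    for s, y in reversed(pairs):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    r = gamma * q
    for (s, y), a in zip(pairs, reversed(alphas)):
        b = (y @ r) / (y @ s)
        r += (a - b) * s
    return -r
```

In practice gamma is typically refreshed each iteration as s^T y / y^T y from the newest pair, and only the m most recent damped pairs are kept.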
3rd International Conference on Numerical Analysis and Optimization: Theory, Methods, Applications and Technology Transfer
Presenting the latest findings in the field of numerical analysis and optimization, this volume balances pure research with practical applications of the subject. Accompanied by detailed tables, figures, and examinations of useful software tools, it will equip the reader to perform detailed and layered analysis of complex datasets. Many real-world problems can be formulated as optimization tasks. Such problems can be large-scale, unconstrained, constrained, non-convex, non-differentiable, or discontinuous, and therefore require adequate computational methods, algorithms, and software tools. These same tools are often employed by researchers working on current IT hot topics such as big data, optimization and other complex numerical algorithms in the cloud, and the design of special techniques for supercomputing systems. The topics covered include, but are not limited to: numerical analysis, numerical optimization, numerical linear algebra, numerical differential equations, optimal control, approximation theory, applied mathematics, algorithm and software development, derivative-free optimization methods, and programming models. The volume also examines challenging applications of various types of computational optimization methods, as they typically occur in statistics, econometrics, finance, physics, medicine, biology, engineering and industrial sciences.
