Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
In this paper, we propose spectral gradient methods derived via a variational technique under the log-determinant norm. The spectral parameters satisfy modified weak secant relations inspired by multi-step approximations, for solving large-scale unconstrained optimization. An executable code is developed to test the efficiency of the proposed methods against the spectral gradient method that uses the standard weak secant relation as constraint. Numerical results are presented which suggest that better performance has been achieved.
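For orientation, here is a minimal sketch (Python; the function names and tolerances are illustrative assumptions) of the classical spectral gradient iteration, in which the scalar gamma_k = s^T y / s^T s is the parameter singled out by the standard weak secant relation that the paper uses as its baseline constraint; the proposed modified multi-step relations would replace that one formula.

```python
import numpy as np

def spectral_gradient(grad, x0, tol=1e-6, max_iter=1000):
    """Classical spectral (Barzilai-Borwein) gradient method.

    The spectral parameter gamma_k = s^T y / s^T s is the scalar for
    which gamma*I satisfies the standard weak secant relation
    s^T (gamma I) s = s^T y; the paper's modified multi-step
    relations would replace this formula.
    """
    x = x0.astype(float)
    g = grad(x)
    gamma = 1.0                       # initial spectral parameter
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - g / gamma         # step along the negative gradient
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # fall back to 1.0 when s^T y is tiny, keeping gamma positive
        gamma = sy / (s @ s) if sy > 1e-12 else 1.0
        x, g = x_new, g_new
    return x
```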
Convergence of symmetric rank-one method based on modified Quasi-Newton equation
In this paper we investigate the convergence rate of a modified symmetric rank-one (SR1) method for unconstrained optimization problems. The modified SR1 method incorporates a modified secant equation into the standard SR1 update, and a restart procedure is applied to avoid loss of positive definiteness and a zero denominator. A remarkable feature of the modified SR1 method is that it achieves multi-step q-superlinear and quadratic convergence rates without uniform linear independence assumptions on the steps.
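For reference, below is a minimal sketch of the standard SR1 update with the usual skip safeguard that plays the role of a restart; the paper's modification would feed in a corrected gradient-difference vector from its modified secant equation, which is not reproduced here (the threshold r is an illustrative choice).

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """One SR1 update of the Hessian approximation B.

    B_new = B + (y - Bs)(y - Bs)^T / ((y - Bs)^T s).
    The update is skipped (a simple restart) when the denominator is
    tiny, which also guards against loss of definiteness.
    """
    v = y - B @ s
    denom = v @ s
    if abs(denom) < r * np.linalg.norm(s) * np.linalg.norm(v):
        return B                       # skip: keep the current B
    return B + np.outer(v, v) / denom
```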
A modified secant method for unconstrained minimization
A gradient-secant algorithm for unconstrained optimization problems is presented. The algorithm uses Armijo gradient method iterations until it reaches a region where the Newton method is more efficient, and then switches over to a secant form of operation. It is concluded that an efficient method for unconstrained minimization has been developed, and that any convergent minimization method can be substituted for the Armijo gradient method.
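A minimal sketch of the switching idea follows, under two loud assumptions: a simple gradient-norm threshold stands in for the paper's test for a region where the Newton method is more efficient, and a BFGS-form update stands in for its secant phase.

```python
import numpy as np

def gradient_secant(f, grad, x0, switch_tol=1e-2, tol=1e-8, max_iter=500):
    """Armijo gradient steps, then a secant (quasi-Newton) phase.

    The test ||g|| < switch_tol is an illustrative stand-in for the
    paper's switching criterion, and the BFGS-form update below is an
    assumed choice of secant operation.
    """
    x, n = x0.astype(float), len(x0)
    H = np.eye(n)                          # inverse-Hessian estimate
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        if np.linalg.norm(g) >= switch_tol:
            d = -g                         # Armijo gradient phase
        else:
            d = -H @ g                     # secant phase
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):   # Armijo test
            t *= 0.5
        x_new = x + t * d
        s, y = x_new - x, grad(x_new) - g
        sy = s @ y
        if sy > 1e-12:                     # BFGS-form secant update of H
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x = x_new
    return x
```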
Modifications of the Limited Memory BFGS Algorithm for Large-scale Nonlinear Optimization
In this paper we present two new numerical methods for unconstrained large-scale optimization. These methods apply update formulae, which are derived by considering different techniques of approximating the objective function. Theoretical analysis is given to show the advantages of using these update formulae. It is observed that these update formulae can be employed within the framework of limited memory strategy with only a modest increase in the linear algebra cost. Comparative results with the limited memory BFGS (L-BFGS) method are presented.
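For context, the limited memory framework referred to here is the standard L-BFGS two-loop recursion sketched below; the paper's modified update formulae would change how the stored pairs enter this recursion, which this baseline sketch does not attempt.

```python
def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion.

    Returns the search direction -H_k g built from the m most recent
    (s, y) pairs.  g and each stored s, y are 1-D numpy arrays.
    """
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):  # newest first
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((a, rho, s, y))
    if s_list:                             # initial scaling H_0 = gamma*I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for a, rho, s, y in reversed(alphas):  # oldest first
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```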
Some Unconstrained Optimization Methods
Although it is a very old theme, unconstrained optimization is an area that remains topical for many scientists. Today, the results of unconstrained optimization are applied in different branches of science, as well as in practice generally. Here we present line search techniques. Further, in this chapter we consider some unconstrained optimization methods, presenting both the methods themselves and some contemporary results in this area.
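As one concrete example of the line search techniques surveyed, the following sketch checks the strong Wolfe conditions for a trial step size (the constants c1 and c2 are conventional defaults, not taken from this chapter).

```python
def satisfies_strong_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.9):
    """Check the strong Wolfe conditions for step size t along d.

    x and d are 1-D numpy arrays; the test combines sufficient
    decrease (Armijo) with a bound on the new directional derivative.
    """
    g0 = grad(x) @ d                       # initial directional derivative
    armijo = f(x + t * d) <= f(x) + c1 * t * g0
    curvature = abs(grad(x + t * d) @ d) <= c2 * abs(g0)
    return armijo and curvature
```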
A Dai-Liao hybrid Hestenes-Stiefel and Fletcher-Reeves method for unconstrained optimization
Some problems have no analytical solution or are too difficult for scientists, engineers, and mathematicians to solve, so the development of numerical methods to obtain approximate solutions became necessary. Gradient methods are more efficient when the function to be minimized is continuously differentiable. Therefore, this article presents a new hybrid Conjugate Gradient (CG) method to solve unconstrained optimization problems. The method requires only first-order derivatives; it overcomes the steepest descent method's shortcoming of slow convergence and needs neither to store nor compute the second-order derivatives required by the Newton method. The CG update parameter is derived from the Dai-Liao conjugacy condition as a convex combination of the Hestenes-Stiefel and Fletcher-Reeves algorithms, employing an optimal modulating choice parameter to avoid matrix storage. The numerical computation adopts an inexact line search to obtain a step size that generates the descent property, showing that the algorithm is robust and efficient. The scheme converges globally under the Wolfe line search, and methods of its kind are suitable for compressive sensing problems and M-tensor systems.
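A minimal sketch of the hybrid update parameter described above, assuming the usual Hestenes-Stiefel and Fletcher-Reeves formulas; the modulating parameter theta, which the paper derives optimally from the Dai-Liao conjugacy condition, is simply passed in here.

```python
def hybrid_hs_fr_beta(g_new, g_old, d_old, theta):
    """Convex combination of the Hestenes-Stiefel and Fletcher-Reeves
    CG parameters, with theta in [0, 1].  All inputs are 1-D numpy
    arrays except theta.  The next search direction would then be
    d_new = -g_new + beta * d_old.
    """
    y = g_new - g_old
    beta_hs = (g_new @ y) / (d_old @ y)              # Hestenes-Stiefel
    beta_fr = (g_new @ g_new) / (g_old @ g_old)      # Fletcher-Reeves
    return (1.0 - theta) * beta_hs + theta * beta_fr
```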
A Dai-Liao hybrid conjugate gradient method for unconstrained optimization
One of today's best-performing CG methods is the Dai-Liao (DL) method, which depends on a non-negative parameter and on conjugacy conditions for its computation. Although numerous optimal selections for the parameter have been suggested, the best choice remains a subject of consideration. The pure conjugacy condition assumes an exact line search for numerical experiments and convergence analysis, whereas practical numerical experiments call for an inexact line search to find the step size. To avoid such drawbacks, Dai and Liao replaced the earlier conjugacy condition with an extended conjugacy condition. Therefore, this paper suggests a new hybrid CG method that combines the strengths of the Liu-Storey and Conjugate Descent CG methods while retaining an optimal choice of the Dai-Liao parameter. The theoretical analysis indicates that the search direction of the new CG scheme is a descent direction and satisfies the sufficient descent condition even when the iterates jam, under the strong Wolfe line search. The algorithm is shown to converge globally using standard assumptions. Numerical experimentation demonstrates that the proposed method is more robust and promising than some known methods, using the Dolan and Moré performance profile on 250 unconstrained test problems. The tested CG algorithms are also assessed numerically on sparse signal reconstruction and image restoration in compressive sensing, as well as file restoration, image and video coding, and other applications. The results show that these CG schemes are comparable and can be applied in different fields, such as temperature, fire, seismic, and humidity detectors in forests using wireless sensor network techniques.
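A minimal sketch of the hybridization described above, assuming the standard Liu-Storey and Conjugate Descent formulas; theta again stands in for the paper's optimal Dai-Liao-based choice.

```python
def hybrid_ls_cd_beta(g_new, g_old, d_old, theta):
    """Convex combination of the Liu-Storey and Conjugate Descent CG
    parameters, with theta in [0, 1].  All inputs are 1-D numpy
    arrays except theta; both formulas share the denominator
    -d_old^T g_old.
    """
    y = g_new - g_old
    denom = -(d_old @ g_old)                 # shared LS/CD denominator
    beta_ls = (g_new @ y) / denom            # Liu-Storey
    beta_cd = (g_new @ g_new) / denom        # Conjugate Descent
    return (1.0 - theta) * beta_ls + theta * beta_cd
```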