56 research outputs found
Globalization of Barzilai and Borwein Method for Unconstrained Optimization
The focus of this thesis is on finding the unconstrained minimizer of a function. Specifically, we focus on the Barzilai and Borwein (BB) method, a well-known two-point stepsize gradient method. We first give some brief mathematical background and then discuss the BB method and its importance in optimization. A review of the minimization methods currently available for solving unconstrained optimization problems is also given.
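For reference, the BB method takes steps x_{k+1} = x_k - alpha_k g_k, where the scalar stepsize is chosen from the previous step and gradient change so that (1/alpha_k) I approximates the Hessian in a least-squares sense. The Python sketch below illustrates the first BB stepsize alpha_k = s^T s / s^T y on a generic smooth function; the test function and the safeguarding bounds are illustrative choices, not part of the thesis.

```python
import numpy as np

def rosenbrock_grad(x):
    # Gradient of the 2D Rosenbrock function (illustrative test problem only).
    return np.array([-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
                     200.0 * (x[1] - x[0] ** 2)])

def bb_gradient_method(grad, x0, alpha0=1e-3, tol=1e-6, max_iter=5000):
    """Plain (non-monotone) BB method using the first BB stepsize."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g             # gradient step with scalar stepsize
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g       # step and gradient change
        sty = s @ y
        # BB1 stepsize: alpha = s^T s / s^T y, safeguarded away from 0 and infinity.
        alpha = (s @ s) / sty if sty > 1e-12 else alpha0
        alpha = min(max(alpha, 1e-10), 1e10)
        x, g = x_new, g_new
    return x

x_star = bb_gradient_method(rosenbrock_grad, np.array([-1.2, 1.0]))
```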
Owing to its simplicity, low storage requirements and numerical efficiency, the Barzilai and Borwein method has received a good deal of attention in the optimization community. Despite these advances, however, the BB stepsize is computed from a simple approximation of the Hessian by a scalar multiple of the identity; moreover, the BB method is not monotone, and it is not easy to generalize it to general nonlinear functions. To address these deficiencies, we introduce new gradient-type methods within the BB framework, including a new gradient method via the weak secant equation (quasi-Cauchy relation), an improved Hessian approximation, and a scaling of the diagonal updating.
The proposed methods are fixed-stepsize gradient methods of the same flavour as the Barzilai and Borwein method. In contrast with the Barzilai and Borwein approach, in which the stepsize is computed from a simple approximation of the Hessian by a scalar multiple of the identity, the proposed methods approximate the Hessian by a diagonal matrix. Combined with monotone strategies, the resulting algorithms belong to the class of monotone gradient methods with global convergence. Numerical results suggest that for non-quadratic minimization problems the new methods clearly outperform the Barzilai-Borwein method.
Finally, we comment on the achievements of this research. Possible extensions are also given to conclude this thesis.
An improved multi-step gradient-type method for large scale optimization
In this paper, we propose an improved multi-step diagonal updating method for large-scale unconstrained optimization. Our approach is based on constructing a new gradient-type method by means of interpolating curves. We measure the distances required to parameterize the interpolating polynomials via a norm defined by a positive-definite matrix. By developing an implicit updating approach, we obtain an improved Hessian approximation in diagonal matrix form while avoiding the computational expense of explicitly calculating the improved approximation matrix. The effectiveness of the proposed method is evaluated through a computational comparison with the BB method and its variants. We show that our method is globally convergent and requires only O(n) memory allocations.
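The multi-step idea can be pictured schematically: instead of using only the most recent pair (s_k, y_k), data from the two latest steps are combined into a single pair before the diagonal Hessian approximation is updated, so the storage stays O(n). The Python sketch below shows one such combination with a fixed mixing parameter psi and a quasi-Cauchy-style diagonal update; the paper's parameterization by interpolating polynomials in a weighted norm is more elaborate, so psi and the update rule here should be read as illustrative assumptions.

```python
import numpy as np

def two_step_pair(s_k, y_k, s_km1, y_km1, psi=0.1):
    # Combine data from the two latest steps into a single (s, y) pair.
    # psi is an illustrative fixed mixing weight; the paper derives it from
    # interpolating polynomials parameterized in a weighted norm.
    return s_k - psi * s_km1, y_k - psi * y_km1

def diagonal_update(d, s, y):
    # Least-change update of the diagonal Hessian approximation d (stored as a
    # vector, hence O(n) memory) subject to the weak secant equation s^T D s = s^T y.
    s2 = s * s
    denom = s2 @ s2
    if denom > 1e-12:
        d = d + ((s @ y - (d * s) @ s) / denom) * s2
    return np.maximum(d, 1e-8)   # keep the approximation positive definite
```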
A monotone gradient method via weak secant equation for unconstrained optimization
In this paper we present a new algorithm of steepest descent type. A new technique for steplength computation and a monotone strategy are provided within the framework of the Barzilai and Borwein method. In contrast with the Barzilai and Borwein approach, in which the steplength is computed from a simple approximation of the Hessian by a scalar multiple of the identity and an interpretation of the secant equation, the proposed algorithm considers another approximation of the Hessian based on the weak secant equation. By incorporating a simple monotone strategy, the resulting algorithm belongs to the class of monotone gradient methods with linear convergence. Numerical results suggest that for non-quadratic minimization problems the new method clearly outperforms the Barzilai-Borwein method.
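The weak secant (quasi-Cauchy) condition only requires the new Hessian approximation to match the curvature along the latest step. For a diagonal approximation, the least-change update under the Frobenius norm can be worked out directly; the short derivation below is a standard quasi-Cauchy computation (in the spirit of Zhu, Nazareth and Wolkowicz) and is included for orientation rather than as the paper's exact formula.

```latex
% Weak secant (quasi-Cauchy) condition along the latest step s_k, with y_k = g_{k+1} - g_k:
%   s_k^T B_{k+1} s_k = s_k^T y_k.
% Least-change diagonal update: minimize \|B_{k+1} - B_k\|_F over diagonal matrices
% subject to this condition. Writing B_{k+1} = B_k + \operatorname{diag}(\delta), the
% Lagrange conditions give \delta_i \propto (s_k)_i^2, and the constraint fixes the multiplier:
\[
  B_{k+1} \;=\; B_k \;+\; \frac{s_k^{T} y_k - s_k^{T} B_k s_k}{\operatorname{tr}(E_k^{2})}\, E_k,
  \qquad
  E_k \;=\; \operatorname{diag}\!\left((s_k)_1^{2},\dots,(s_k)_n^{2}\right),
\]
% so \operatorname{tr}(E_k^2) = \sum_i (s_k)_i^4, and the update needs only O(n) work and storage.
```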
Improved Hessian approximation with modified quasi-Cauchy relation for a gradient-type method
In this work we develop a new gradient-type method with an improved Hessian approximation for unconstrained optimization problems. The new method resembles the Barzilai-Borwein (BB) method, except that the Hessian matrix is approximated by a diagonal matrix rather than by a multiple of the identity matrix as in the BB method. The diagonal Hessian approximation is derived from the quasi-Cauchy relation. To further improve the Hessian approximation, we modify the quasi-Cauchy relation so that it carries additional information from the function values and gradients of the objective function. Numerical experiments show that the proposed method yields a desirable improvement.
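One common way in the literature to make the weak secant condition carry function-value information is a Zhang-Deng-Chen type correction, in which the curvature s^T y is adjusted by a scalar built from f and g at the two latest iterates. The block below records that form for orientation only; the exact modification used in the paper may differ.

```latex
% A modified weak secant (quasi-Cauchy) condition that also uses function values
% (a Zhang--Deng--Chen style correction, shown here as an illustrative assumption):
\[
  s_k^{T} B_{k+1} s_k \;=\; s_k^{T} y_k + \theta_k,
  \qquad
  \theta_k \;=\; 6\left(f_k - f_{k+1}\right) + 3\left(g_k + g_{k+1}\right)^{T} s_k,
\]
% where f_k = f(x_k) and g_k = \nabla f(x_k). For quadratic f the correction \theta_k
% vanishes, while for general f it injects higher-order curvature information into the
% diagonal update.
```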
A new gradient method via quasi-Cauchy relation which guarantees descent
We propose a new monotone algorithm for unconstrained optimization within the framework of the Barzilai and Borwein (BB) method and analyze the convergence properties of this new descent method. Motivated by the fact that the BB method does not guarantee descent in the objective function at each iteration, yet performs better than the steepest descent method, we seek a stepsize formula that approximates the Hessian via the quasi-Cauchy equation while retaining the monotone property at each iteration. Practical insights into the effectiveness of the proposed techniques are given by a numerical comparison with the BB method.
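A simple way to enforce monotonicity around a BB- or diagonally scaled gradient step is to accept a trial step only when a sufficient-decrease (Armijo-type) condition holds, and otherwise to backtrack. The Python sketch below shows such a generic safeguard; the constants and the backtracking rule are illustrative and are not claimed to be the specific monotone strategy of the paper.

```python
def monotone_step(f, grad, x, alpha, c=1e-4, shrink=0.5, max_backtracks=30):
    """Take a gradient step of length alpha, backtracking until the Armijo
    sufficient-decrease condition f(x - a*g) <= f(x) - c*a*||g||^2 holds."""
    fx, g = f(x), grad(x)
    gg = g @ g
    a = alpha
    for _ in range(max_backtracks):
        x_trial = x - a * g
        if f(x_trial) <= fx - c * a * gg:   # sufficient decrease achieved
            return x_trial, a
        a *= shrink                          # otherwise shrink the stepsize
    return x - a * g, a                      # fall back to the last trial step
```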
Higher order curvature information and its application in a modified diagonal Secant method
The secant (quasi-Newton) equation plays one of the most important roles in finding an optimal solution in nonlinear optimization. Curvature information must satisfy the usual secant equation to ensure positive definiteness of the Hessian approximation. In this work, we present a new diagonal updating scheme that improves the Hessian approximation through a modified weak secant equation for the diagonal quasi-Newton (DQN) method. Gradient and function evaluations are used to obtain the new weak secant equation and to achieve higher-order accuracy in the curvature information of the proposed method. The modified DQN methods based on the modified weak secant equation are globally convergent. Extensive numerical results indicate the advantages of the modified DQN methods over the usual ones and over some classical conjugate gradient methods.
Diagonal quasi-Newton method via variational principle under generalized Frobenius norm
In this work, we present a new class of diagonal quasi-Newton methods for solving large-scale unconstrained optimization problems. The methods are derived by means of a variational principle under the generalized Frobenius norm. We show global convergence of our methods under a standard line search with the Armijo condition. Numerical experiments are carried out on standard test problems and clearly indicate their superiority over some classical conjugate gradient methods.
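To see how a variational principle generates such diagonal updates, consider the least-change problem under a weighted norm. The derivation below assumes a diagonal weighting for simplicity; the generalized Frobenius norm of the paper may weight the update differently, so this is a sketch of the mechanism rather than the paper's exact formula.

```latex
% Least-change diagonal update under a weighted norm (diagonal weighting assumed):
% choose \Delta = \operatorname{diag}(\delta_1,\dots,\delta_n) to solve
\[
  \min_{\Delta}\; \sum_{i=1}^{n} w_i\,\delta_i^{2}
  \quad\text{s.t.}\quad
  \sum_{i=1}^{n} \delta_i\,(s_k)_i^{2} \;=\; s_k^{T} y_k - s_k^{T} B_k s_k \;=:\; r_k ,
\]
% with weights w_i > 0. The Lagrange conditions give 2 w_i \delta_i = \lambda (s_k)_i^2, hence
\[
  \delta_i \;=\; \frac{r_k\,(s_k)_i^{2}/w_i}{\sum_{j=1}^{n} (s_k)_j^{4}/w_j},
  \qquad B_{k+1} \;=\; B_k + \Delta ,
\]
% which reduces to the unweighted quasi-Cauchy update when w_i \equiv 1.
```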
Accumulative Approach in Multistep Diagonal Gradient-Type Method for Large-Scale Unconstrained Optimization
This paper focuses on developing diagonal gradient-type methods that employ an accumulative approach in multistep diagonal updating to determine a better Hessian approximation at each step. An interpolating curve is used to derive a generalization of the weak secant equation that carries information about the local Hessian. The new parameterization of the interpolating curve in variable space is obtained through an accumulative approach via a norm weighting defined by two positive-definite weighting matrices. We also note that the storage needed for all computations of the proposed method is just O(n). Numerical results show that the proposed algorithm is efficient and superior in comparison with some other gradient-type methods.
Quadruplet Heterotopic Pregnancy Following In Vitro Fertilization and Embryo Transfer with Laparotomic Removal of Ruptured Twin Tubal Ectopic Pregnancy: A Case Report
Heterotopic pregnancy (HP) is a rare occurrence in natural pregnancies. However, it can be a life-threatening condition and should be taken into account in all assisted reproductive treatments. Diagnosis and treatment of ectopic pregnancy are challenging issues in patients with HP. Here, we report a rare case of quadruplet HP following in vitro fertilization-embryo transfer, with a viable twin intrauterine pregnancy and a ruptured live twin left tubal ectopic pregnancy. A 35-year-old woman (gravida 5, para 1, ectopic pregnancies 2, and abortion 1) presented to the Emergency Department of Arash Women’s Hospital (Tehran, Iran) in March 2021 with abdominal pain. The patient was at six weeks and five days of pregnancy following in vitro fertilization-embryo transfer. Transvaginal sonography (TVS) revealed a live twin intrauterine pregnancy with a ruptured live twin left tubal ectopic pregnancy. The latter was removed via laparotomy to preserve the intrauterine pregnancy. The patient subsequently delivered a female infant at 38 weeks of pregnancy.
A new two-step gradient-type method for large-scale unconstrained optimization
In this paper, we propose some improvements to a new gradient-type method for solving large-scale unconstrained optimization problems, in which data from the two previous steps are used to revise the current approximate Hessian. The new method resembles the Barzilai and Borwein (BB) method. The innovative feature of this approach is that the Hessian is approximated in diagonal matrix form, based on the modified weak secant equation, rather than by a multiple of the identity matrix as in the BB method. With this approach, we obtain higher-order accuracy in the Hessian approximation compared with other existing BB-type methods. By incorporating a simple monotone strategy, global convergence of the new method is achieved. Practical insights into the effectiveness of the proposed method are given by a numerical comparison with the BB method and its variants.