    The Davidon-Fletcher-Powell penalty function method: A generalized iterative technique for solving parameter optimization problems

    The Fletcher-Powell version of the Davidon variable metric unconstrained minimization technique is described. Equations that have been used successfully with the Davidon-Fletcher-Powell penalty function technique for solving constrained minimization problems are presented, and the advantages and disadvantages of using them are discussed. Experience gained with the behavior of the method during iteration is also related
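
    The two-level structure of the technique (a DFP variable metric inner solver driven by a penalty-function outer loop) can be sketched in a few lines of numpy. This is a minimal illustration under stated assumptions, not the paper's implementation: it handles only equality constraints with a quadratic penalty, uses Armijo backtracking in place of the original line search, and the names dfp_minimize and penalty_method are hypothetical.

```python
import numpy as np

def dfp_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Davidon-Fletcher-Powell variable metric minimization (unconstrained)."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                          # variable metric search direction
        t = 1.0                             # Armijo backtracking line search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        s = t * d
        x_new, g_new = x + s, grad(x + s)
        y = g_new - g
        Hy = H @ y                          # DFP rank-two update of H
        H = H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x

def penalty_method(f, grad_f, cons, grad_cons, x0, r=1.0, growth=10.0, outer=6):
    """Minimize f subject to c_i(x) = 0 via a growing quadratic penalty."""
    x = x0
    for _ in range(outer):
        P = lambda z, r=r: f(z) + r * sum(c(z) ** 2 for c in cons)
        gP = lambda z, r=r: grad_f(z) + r * sum(
            2.0 * c(z) * gc(z) for c, gc in zip(cons, grad_cons))
        x = dfp_minimize(P, gP, x)          # unconstrained subproblem
        r *= growth                         # tighten the penalty
    return x
```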

    A Dai-Liao Hybrid Hestenes-Stiefel and Fletcher-Reeves Method for Unconstrained Optimization

    Some problems have no analytical solution or are too difficult for scientists, engineers, and mathematicians to solve exactly, so the development of numerical methods that obtain approximate solutions became necessary. Gradient methods are most efficient when the function to be minimized is continuously differentiable. This article therefore presents a new hybrid Conjugate Gradient (CG) method for solving unconstrained optimization problems. The method requires only first-order derivatives: it overcomes the steepest descent method's shortcoming of slow convergence while avoiding the storage and computation of the second-order derivatives needed by Newton's method. The CG update parameter is derived from the Dai-Liao conjugacy condition as a convex combination of the Hestenes-Stiefel and Fletcher-Reeves coefficients, employing an optimal modulating choice parameter to avoid matrix storage. The numerical computation adopts an inexact line search to obtain a step size that preserves the descent property, showing that the algorithm is robust and efficient. The scheme converges globally under the Wolfe line search, and methods of its kind are suitable for compressive sensing problems and M-tensor systems
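
    The convex-combination idea can be sketched as follows. The modulating parameter here is a hypothetical stand-in obtained by matching the combined coefficient to the Dai-Liao coefficient and clipping to [0, 1], and Armijo backtracking replaces the Wolfe line search the paper analyzes:

```python
import numpy as np

def hybrid_hs_fr(f, grad, x0, t=0.1, tol=1e-6, max_iter=5000):
    """Sketch: CG with beta = theta*beta_HS + (1 - theta)*beta_FR."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        a = 1.0                               # Armijo backtracking line search
        while f(x + a * d) > f(x) + 1e-4 * a * (g @ d):
            a *= 0.5
        s = a * d
        x_new, g_new = x + s, grad(x + s)
        y = g_new - g
        beta_hs = (g_new @ y) / (d @ y)       # Hestenes-Stiefel
        beta_fr = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves
        beta_dl = (g_new @ y - t * (g_new @ s)) / (d @ y)  # Dai-Liao target
        # pick theta so the convex combination matches the Dai-Liao coefficient
        denom = beta_hs - beta_fr
        theta = np.clip((beta_dl - beta_fr) / denom, 0.0, 1.0) if abs(denom) > 1e-12 else 0.5
        d = -g_new + (theta * beta_hs + (1.0 - theta) * beta_fr) * d
        if g_new @ d >= 0:                    # safeguard: restart on non-descent
            d = -g_new
        x, g = x_new, g_new
    return x
```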

    A Spectral Dai-Yuan-Type Conjugate Gradient Method for Unconstrained Optimization

    A new spectral conjugate gradient method (SDYCG) is presented for solving unconstrained optimization problems in this paper. Our method provides a new expression of spectral parameter. This formula ensures that the sufficient descent condition holds. The search direction in the SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient. The global convergence of the SDYCG is also obtained. Numerical results show that the SDYCG may be capable of solving large-scale nonlinear unconstrained optimization problems
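
    The direction construction reduces to a few lines. The spectral parameter below is one standard choice that makes the sufficient descent condition hold with equality; the paper's own expression for the spectral parameter may differ:

```python
import numpy as np

def sdy_direction(g_new, g_old, d_old):
    """Spectral Dai-Yuan-type direction (sketch): d = -theta*g + beta_DY*d_old."""
    y = g_new - g_old
    beta_dy = (g_new @ g_new) / (d_old @ y)           # Dai-Yuan coefficient
    # spectral parameter forcing g_new @ d == -||g_new||^2 (sufficient descent)
    theta = 1.0 + beta_dy * (g_new @ d_old) / (g_new @ g_new)
    return -theta * g_new + beta_dy * d_old
```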

    Improved Fletcher-Reeves Methods Based on New Scaling Techniques

    This paper introduces a scaling parameter into the Fletcher-Reeves (FR) nonlinear conjugate gradient method. The main aim is to improve its theoretical and numerical properties when applied with inexact line searches to unconstrained optimization problems. We show that the sufficient descent and global convergence properties established by Al-Baali for the FR method with a fairly accurate line search are maintained. We also consider the possibility of extending this result to less accurate line searches for appropriate values of the scaling parameter. The reported numerical results show that several values of the proposed scaling parameter improve the performance of the FR method significantly
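
    In code, the idea amounts to scaling the FR coefficient before the direction update; nu below is a hypothetical fixed placeholder for the scaling parameter whose admissible values the paper analyzes:

```python
import numpy as np

def scaled_fr_direction(g_new, g_old, d_old, nu=0.5):
    """Scaled Fletcher-Reeves direction (sketch): d = -g + nu*beta_FR*d_old."""
    beta = nu * (g_new @ g_new) / (g_old @ g_old)  # scaled FR coefficient
    return -g_new + beta * d_old
```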

    An Enhanced Fletcher-Reeves-Like Conjugate Gradient Method for Image Restoration

    Noise is an unavoidable aspect of modern camera technology, causing a decline in the overall visual quality of images. Efforts are underway to diminish noise without compromising essential image features such as edges, corners, and other intricate structures. Numerous techniques for noise reduction have already been suggested, each with its own set of benefits and drawbacks. Denoising images is a basic challenge in image processing. In this study, we describe a two-phase approach for removing impulse noise. In the first phase, the adaptive median filter (AMF) for salt-and-pepper noise identifies noise candidates. The second phase minimizes an edge-preserving regularization function using a novel hybrid conjugate gradient approach. To generate the new improved search direction, the new algorithm takes advantage of two well-known, successful conjugate gradient techniques. The descent property and global convergence are proven for the new methods. The obtained numerical results reveal that, when applied to image restoration, the new algorithms are superior to the classical Fletcher-Reeves (FR) method in terms of maintaining image quality and efficiency
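
    Phase one can be illustrated with a simplified adaptive median detector (a sketch under stated assumptions, not the authors' code; window growth is capped at w_max and the image is assumed to be a 2-D grayscale array):

```python
import numpy as np

def amf_detect(img, w_max=3):
    """Flag salt-and-pepper candidates with a simplified adaptive median filter."""
    H, W = img.shape
    mask = np.zeros((H, W), dtype=bool)
    for i in range(H):
        for j in range(W):
            for w in range(1, w_max + 1):     # grow the window until the median
                win = img[max(i - w, 0):i + w + 1, max(j - w, 0):j + w + 1]
                lo, m, hi = win.min(), np.median(win), win.max()
                if lo < m < hi:               # ...is itself not an impulse
                    mask[i, j] = not (lo < img[i, j] < hi)
                    break
            else:                             # window exhausted: treat as noise
                mask[i, j] = True
    return mask
```

    Phase two would then minimize an edge-preserving regularization functional over the flagged pixels only, using a hybrid CG direction such as the Dai-Liao-modulated one sketched earlier.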

    New Algebraic Formulation of Density Functional Calculation

    This article addresses a fundamental problem faced by the ab initio community: the lack of an effective formalism for the rapid exploration and exchange of new methods. To rectify this, we introduce a novel, basis-set independent, matrix-based formulation of generalized density functional theories which reduces the development, implementation, and dissemination of new ab initio techniques to the derivation and transcription of a few lines of algebra. This new framework enables us to concisely demystify the inner workings of fully functional, highly efficient modern ab initio codes and to give complete instructions for the construction of such for calculations employing arbitrary basis sets. Within this framework, we also discuss in full detail a variety of leading-edge ab initio techniques, minimization algorithms, and highly efficient computational kernels for use with scalar as well as shared and distributed-memory supercomputer architectures

    The Global Convergence of a New Mixed Conjugate Gradient Method for Unconstrained Optimization

    We propose and generalize a new nonlinear conjugate gradient method for unconstrained optimization. Global convergence is proved with the Wolfe line search. Numerical experiments are reported that support the theoretical analyses and show the presented methods outperforming the CGDESCENT method

    Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses

    Conjugate gradient methods are important first-order optimization algorithms both in Euclidean spaces and on Riemannian manifolds. However, while various types of conjugate gradient methods have been studied in Euclidean spaces, there are relatively few studies of those on Riemannian manifolds (i.e., Riemannian conjugate gradient methods). This paper proposes a novel general framework that unifies existing Riemannian conjugate gradient methods, such as those that utilize a vector transport or an inverse retraction. The proposed framework also yields methods that have not been covered in previous studies. Furthermore, conditions for the convergence of a class of algorithms in the proposed framework are clarified. Moreover, the global convergence properties of several specific types of algorithms are extensively analyzed. The analysis provides theoretical results for some algorithms in a more general setting than in existing studies, and new developments for other algorithms. Numerical experiments are performed to confirm the validity of the theoretical results, and the experimental results are used to compare the performance of several specific algorithms in the proposed framework
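
    As a concrete instance of the framework, here is a small Riemannian CG sketch on the unit sphere, with normalization as the retraction, tangent-space projection as the vector transport, and a Fletcher-Reeves coefficient. It is an illustrative special case under those assumptions, not the paper's general algorithm:

```python
import numpy as np

def sphere_rcg(A, x0, tol=1e-8, max_iter=500):
    """Riemannian CG on the unit sphere: minimize the Rayleigh quotient x^T A x."""
    proj = lambda x, v: v - (x @ v) * x                 # tangent projection at x
    retract = lambda x, v: (x + v) / np.linalg.norm(x + v)
    x = x0 / np.linalg.norm(x0)
    g = proj(x, 2.0 * A @ x)                            # Riemannian gradient
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx = 1.0, x @ A @ x                          # Armijo backtracking
        while retract(x, t * d) @ A @ retract(x, t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = retract(x, t * d)
        g_new = proj(x_new, 2.0 * A @ x_new)
        beta = (g_new @ g_new) / (g @ g)                # Fletcher-Reeves
        d = -g_new + beta * proj(x_new, d)              # projection as transport
        if g_new @ d >= 0:                              # safeguard restart
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: the minimizer approximates the eigenvector of A for its smallest eigenvalue
A = np.diag([3.0, 2.0, 1.0])
print(sphere_rcg(A, np.array([1.0, 1.0, 1.0])))
```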