
    Hybrid Newton-type method for a class of semismooth equations

    In this paper, we present a hybrid method for solving a class of composite semismooth equations encountered frequently in applications. The method is obtained by combining a generalized finite-difference Newton method with an inexpensive direct search method. We prove that, under standard assumptions, the method is globally convergent with a local rate of convergence that is superlinear or quadratic. We also report several numerical results obtained by applying the method to suitable reformulations of well-known nonlinear complementarity problems.
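
    To make the flavor of such a hybrid concrete, here is a minimal Python sketch: a forward-difference Newton step stands in for the generalized finite-difference Newton method, and a coordinate poll stands in for the direct search safeguard. The function names, the acceptance test, and the step-shrinking rule are assumptions of this sketch, not the algorithm of the paper.

```python
import numpy as np

def fd_newton_step(F, x, h=1e-7):
    """One forward-difference Newton step for F: R^n -> R^n. The
    finite-difference Jacobian stands in for a generalized Jacobian at
    points where F is not differentiable."""
    Fx = F(x)
    n = x.size
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (F(x + e) - Fx) / h
    return x + np.linalg.solve(J, -Fx)

def hybrid_newton(F, x0, tol=1e-10, max_iter=200, step0=0.5):
    """Hybrid iteration (illustrative): try a Newton step; if it fails to
    reduce ||F||, poll coordinate directions and shrink the step on failure."""
    x = np.asarray(x0, dtype=float)
    step = step0
    for _ in range(max_iter):
        if np.linalg.norm(F(x)) < tol:
            break
        try:
            cand = fd_newton_step(F, x)
        except np.linalg.LinAlgError:
            cand = x  # singular finite-difference Jacobian: skip Newton
        if np.linalg.norm(F(cand)) < np.linalg.norm(F(x)):
            x = cand  # fast local step accepted
            continue
        improved = False
        for d in np.vstack([np.eye(x.size), -np.eye(x.size)]):
            if np.linalg.norm(F(x + step * d)) < np.linalg.norm(F(x)):
                x = x + step * d  # direct search fallback
                improved = True
                break
        if not improved:
            step *= 0.5  # no poll direction helped: refine the mesh
    return x
```

    Applied, for example, to a Fischer-Burmeister reformulation of a complementarity problem, where each component has the semismooth form phi(a, b) = sqrt(a^2 + b^2) - a - b, the Newton step supplies the fast local convergence while the poll step safeguards global progress.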

    A class of Steffensen type methods with optimal order of convergence

    In this paper, a family of Steffensen type methods of fourth-order convergence for solving nonlinear smooth equations is suggested. In the proposed methods, a linear combination of divided differences is used to obtain a better approximation to the derivative of the given function. Each derivative-free member of the family requires only three evaluations of the given function per iteration, so this class of methods has efficiency index 4^(1/3) ≈ 1.587. Kung and Traub conjectured that the order of convergence of any multipoint method without memory cannot exceed the bound 2^(d-1), where d is the number of functional evaluations per step. The new class of methods attains this bound for the case d = 3. Numerical examples are given to show the performance of the presented methods, on smooth and nonsmooth equations, and to compare them with other methods. © 2011 Elsevier Inc. All rights reserved. This research was supported by Ministerio de Ciencia y Tecnología grant MTM2010-18539. Cordero Barbero, A.; Torregrosa Sánchez, J.R. (2011). A class of Steffensen type methods with optimal order of convergence. Applied Mathematics and Computation, 217(19):7653-7659. https://doi.org/10.1016/j.amc.2011.02.067
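
    The divided-difference idea behind the family is easiest to see in the classical Steffensen scheme, sketched below in Python. This is the second-order prototype (two evaluations per step, efficiency index 2^(1/2) ≈ 1.414), not the fourth-order family of the paper, whose members combine divided differences to reach order four with three evaluations per step.

```python
def steffensen(f, x0, tol=1e-12, max_iter=50):
    """Classical Steffensen iteration: the derivative f'(x) is replaced by
    the divided difference f[x, x + f(x)] = (f(x + f(x)) - f(x)) / f(x),
    so the method is derivative-free like the family in the paper."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        denom = f(x + fx) - fx  # equals the divided difference times f(x)
        if denom == 0:
            break  # the divided difference degenerated; stop
        x = x - fx * fx / denom
    return x

# Example: solve x^3 - 2 = 0 without derivatives
print(steffensen(lambda x: x**3 - 2, 1.0))  # ~ 1.259921, the cube root of 2
```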

    Strong semismoothness of eigenvalues of symmetric matrices and its application to inverse eigenvalue problems

    SIAM Journal on Numerical Analysis, 40(6):2352-2367. https://doi.org/10.1137/S0036142901393814

    An efficient sieving based secant method for sparse optimization problems with least-squares constraints

    In this paper, we propose an efficient sieving based secant method to address the computational challenges of solving sparse optimization problems with least-squares constraints. A level-set method was introduced in [X. Li, D.F. Sun, and K.-C. Toh, SIAM J. Optim., 28 (2018), pp. 1842-1866] that solves these problems by using the bisection method to find a root of a univariate nonsmooth equation φ(λ) = ϱ for some ϱ > 0, where φ(·) is the value function computed from a solution of the corresponding regularized least-squares optimization problem. When the objective function in the constrained problem is a polyhedral gauge function, we prove that (i) for any positive integer k, φ(·) is piecewise C^k in an open interval containing the solution λ* to the equation φ(λ) = ϱ; (ii) the Clarke Jacobian of φ(·) is always positive. These results allow us to establish the essential ingredients of the fast convergence rates of the secant method. Moreover, an adaptive sieving technique is incorporated into the secant method to effectively reduce the dimension of the level-set subproblems for computing the value of φ(·). The high efficiency of the proposed algorithm is demonstrated by extensive numerical results.
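
    The secant iteration at the heart of the method is easy to state for the scalar equation φ(λ) = ϱ. Below is a minimal Python sketch; here phi is any callable standing in for the value function (in the paper, each evaluation of φ means solving a regularized least-squares subproblem), and the toy example and all names are assumptions of the sketch.

```python
def secant_root(phi, rho, lam0, lam1, tol=1e-10, max_iter=100):
    """Secant iteration for phi(lam) = rho. Each step replaces the
    derivative by the slope through the two most recent iterates,
    so only values of phi are needed."""
    g0, g1 = phi(lam0) - rho, phi(lam1) - rho
    for _ in range(max_iter):
        if abs(g1) < tol or g1 == g0:
            break  # converged, or the secant slope degenerated
        lam0, lam1 = lam1, lam1 - g1 * (lam1 - lam0) / (g1 - g0)
        g0, g1 = g1, phi(lam1) - rho
    return lam1

# Toy stand-in for the value function: piecewise smooth with positive slope,
# mimicking the piecewise-C^k structure established in the paper.
lam_star = secant_root(lambda t: t + max(t - 1.0, 0.0), rho=3.0,
                       lam0=0.5, lam1=4.0)
print(lam_star)  # ~ 2.0, since 2.0 + max(2.0 - 1.0, 0.0) = 3.0
```

    Compared with the bisection scheme of the earlier level-set method, the secant update exploits the local smoothness of φ, which is exactly what the piecewise-C^k result makes rigorous.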

    Nonsmooth and derivative-free optimization based hybrid methods and applications

    "In this thesis, we develop hybrid methods for solving global and in particular, nonsmooth optimization problems. Hybrid methods are becoming more popular in global optimization since they allow to apply powerful smooth optimization techniques to solve global optimization problems. Such methods are able to efficiently solve global optimization problems with large number of variables. To date global search algorithms have been mainly applied to improve global search properties of the local search methods (including smooth optimization algorithms). In this thesis we apply rather different strategy to design hybrid methods. We use local search algorithms to improve the efficiency of global search methods. The thesis consists of two parts. In the first part we describe hybrid algorithms and in the second part we consider their various applications." -- taken from Abstract.Operational Research and Cybernetic