341 research outputs found

    The convergence of a one-step smoothing Newton method for P0-NCP based on a new smoothing NCP-function

    The nonlinear complementarity problem (denoted NCP(F)) can be reformulated as a nonsmooth system of equations. By introducing a new smoothing NCP-function, the problem is approximated by a family of parameterized smooth equations. A one-step smoothing Newton method is proposed for solving the nonlinear complementarity problem with a P0-function (P0-NCP), based on the new smoothing NCP-function. The proposed algorithm solves only one linear system of equations and performs only one line search per iteration. Without requiring the strict complementarity assumption at the P0-NCP solution, the algorithm is proved to be globally and superlinearly convergent under suitable assumptions. Furthermore, it has local quadratic convergence under mild conditions.
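    The abstract does not reproduce the paper's new NCP-function, so the Python sketch below is a rough illustration only: it substitutes the classical smoothed Fischer-Burmeister function phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2*mu^2) for the paper's function and simply halves the smoothing parameter mu each step (a genuine one-step method would update mu through the Newton system itself). It does, however, show the advertised per-iteration cost: one linear solve and one line search.

        import numpy as np

        def phi_mu(a, b, mu):
            # Smoothed Fischer-Burmeister function; a stand-in for the
            # paper's (unstated) new smoothing NCP-function.
            return a + b - np.sqrt(a**2 + b**2 + 2.0 * mu**2)

        def smoothing_newton(F, JF, x, mu=1.0, tol=1e-10, max_iter=100):
            for _ in range(max_iter):
                y = F(x)
                Phi = phi_mu(x, y, mu)
                merit = 0.5 * Phi @ Phi
                if merit < tol and mu < tol:
                    break
                # Jacobian of x -> phi_mu(x, F(x)) via the chain rule;
                # mu > 0 keeps the square root differentiable.
                r = np.sqrt(x**2 + y**2 + 2.0 * mu**2)
                J = np.diag(1.0 - x / r) + np.diag(1.0 - y / r) @ JF(x)
                d = np.linalg.solve(J, -Phi)   # the single linear system
                t = 1.0                        # the single Armijo line search
                while t > 1e-12:
                    Phi_t = phi_mu(x + t * d, F(x + t * d), mu)
                    if 0.5 * Phi_t @ Phi_t <= (1 - 1e-4 * t) * merit:
                        break
                    t *= 0.5
                x = x + t * d
                mu *= 0.5                      # drive the smoothing to zero
            return x

        # Example: a monotone (hence P0) affine map F(x) = Ax + q
        A = np.array([[2.0, 1.0], [1.0, 2.0]])
        q = np.array([-1.0, -1.0])
        print(smoothing_newton(lambda x: A @ x + q, lambda x: A, np.zeros(2)))
        # approx [1/3, 1/3]: x >= 0, F(x) >= 0, x.F(x) = 0 up to tolerance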

    A New Inexact Non-Interior Continuation Algorithm for Second-Order Cone Programming

    Second-order cone programming has received considerable attention in the past decades because of its wide range of applications. The non-interior continuation method is one of the most popular and efficient methods for solving second-order cone programming, partly due to its superior numerical performance. In this paper, a new smoothing form of the well-known Fischer-Burmeister function is given. Based on the new smoothing function, an inexact non-interior continuation algorithm is proposed. Attractively, the new algorithm can start from an arbitrary point, and it solves only one system of linear equations inexactly and performs only one line search at each iteration. Moreover, under a mild assumption, the new algorithm has globally linear and locally Q-quadratic convergence. Finally, some preliminary numerical results are reported which show the effectiveness of the presented algorithm.
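    The abstract does not state the new smoothing form. For orientation, the standard Fischer-Burmeister function on the second-order cone K^n = {x = (x_0, \bar{x}) : x_0 \ge \|\bar{x}\|} and one common smoothing template (an assumed template, not necessarily the paper's variant) can be written in the Jordan algebra of K^n:

        \phi_{FB}(x, y) = x + y - \sqrt{x^2 + y^2}, \qquad
        \phi_{\mu}(x, y) = x + y - \sqrt{x^2 + y^2 + 2\mu^2 e},

    where x^2 = x \circ x with the Jordan product x \circ y = (x^\top y,\; x_0 \bar{y} + y_0 \bar{x}), e = (1, 0, \dots, 0), and the square root is taken in this algebra. Then \phi_{FB}(x, y) = 0 \iff x \in K^n,\; y \in K^n,\; x^\top y = 0, and any \mu > 0 makes \phi_\mu continuously differentiable, which is what a non-interior continuation method exploits.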

    Convergence analysis of generalized iteratively reweighted least squares algorithms on convex function spaces

    The computation of robust regression estimates often relies on minimization of a convex functional on a convex set. In this paper we discuss a general technique, closely related to majorization-minimization algorithms, for computing the minimizers of a large class of convex functionals iteratively. Our approach is based on a quadratic approximation of the functional to be minimized and includes the iteratively reweighted least squares algorithm as a special case. We prove convergence on convex function spaces for general coercive and convex functionals F and derive geometric convergence in certain unconstrained settings. The algorithm is applied to TV-penalized quantile regression and is compared with a step-size-corrected Newton-Raphson algorithm. It is found that in the first steps the iteratively reweighted least squares algorithm typically performs significantly better, whereas the Newton-type method outpaces it only after many iterations. Finally, in the setting of bivariate regression with unimodality constraints, we illustrate how this algorithm allows one to utilize highly efficient algorithms for special quadratic programs in more complex settings.
    Keywords: regression analysis, monotone regression, quantile regression, shape constraints, L1 regression, nonparametric regression, total variation semi-norm, reweighted least squares, Fermat's problem, convex approximation, quadratic approximation, pool adjacent violators algorithm
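    The classical special case of this scheme is IRLS for L1 (median) regression, where the loss |r| is majorized at the current residual r_hat by the quadratic r^2/(2|r_hat|) + |r_hat|/2, so each iteration reduces to a weighted least squares solve. A minimal numpy sketch of that special case (the eps safeguard against zero residuals is a standard implementation detail, not part of the paper):

        import numpy as np

        def irls_l1(X, y, n_iter=50, eps=1e-8):
            # Iteratively reweighted least squares for L1 regression.
            # Each step minimizes the quadratic majorizer
            #     sum_i r_i^2 / (2 * max(|r_hat_i|, eps))
            # of the L1 loss around the current residuals r_hat.
            beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS start
            for _ in range(n_iter):
                r = y - X @ beta
                w = 1.0 / np.maximum(np.abs(r), eps)      # majorization weights
                Xw = X * w[:, None]                       # rows scaled by w
                beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)
            return beta

        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(200), rng.normal(size=200)])
        y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=1, size=200)
        print(irls_l1(X, y))   # close to [1, 2] despite heavy-tailed noise

    For a quantile level tau other than 0.5, the same construction applies with asymmetric weights on positive and negative residuals.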

    Conic Optimization Theory: Convexification Techniques and Numerical Algorithms

    Optimization is at the core of control theory and appears in several areas of this field, such as optimal control, distributed control, system identification, robust control, state estimation, model predictive control and dynamic programming. Recent advances in various topics of modern optimization have also been revamping the area of machine learning. Motivated by the crucial role of optimization theory in the design, analysis, control and operation of real-world systems, this tutorial paper offers a detailed overview of some major advances in this area, namely conic optimization and its emerging applications. First, we discuss the importance of conic optimization in different areas. Then, we explain seminal results on the design of hierarchies of convex relaxations for a wide range of nonconvex problems. Finally, we study different numerical algorithms for large-scale conic optimization problems.
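    As one concrete instance of the convexification techniques such tutorials cover (the first level of the standard SDP relaxation hierarchy, a textbook example rather than anything specific to this paper), a nonconvex QCQP is lifted by replacing the rank-one matrix xx^T with a positive semidefinite variable X:

        \min_{x \in \mathbb{R}^n} \; x^\top A_0 x + b_0^\top x
        \quad \text{s.t.} \quad x^\top A_i x + b_i^\top x + c_i \le 0
        \qquad \longrightarrow \qquad
        \min_{x, X} \; \langle A_0, X \rangle + b_0^\top x
        \quad \text{s.t.} \quad \langle A_i, X \rangle + b_i^\top x + c_i \le 0,
        \quad \begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0.

    Dropping the nonconvex constraint rank(X) = 1 yields a semidefinite relaxation solvable by conic algorithms; the relaxation is exact whenever the optimal X happens to be rank one.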

    A squared smoothing Newton method for nonsmooth matrix equations and its applications in semidefinite optimization problems

    SIAM Journal on Optimization, 14(3), 783-806. DOI: 10.1137/S1052623400379620