66 research outputs found

    Using gradient directions to get global convergence of Newton-type methods

    The renewed interest in Steepest Descent (SD) methods following the work of Barzilai and Borwein [IMA Journal of Numerical Analysis, 8 (1988)] has driven us to consider a globalization strategy based on SD, which is applicable to any line-search method. In particular, we combine Newton-type directions with scaled SD steps to obtain suitable descent directions. Scaling the SD directions with a suitable step length makes a significant difference with respect to similar globalization approaches, in terms of both theoretical features and computational behavior. We apply our strategy to Newton's method and the BFGS method, with computational results that appear interesting compared with the results of well-established globalization strategies devised ad hoc for those methods. (Comment: 22 pages, 11 figures)
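    To make the abstract's idea concrete, below is a minimal Python sketch of one globalized Newton iteration that falls back to a gradient direction scaled by a given step length when the Newton direction fails a descent test. This is an illustrative simplification under stated assumptions, not the authors' exact combination rule; the names f, grad, hess, and alpha_sd are hypothetical inputs chosen for the example.

```python
import numpy as np

def globalized_newton_step(f, grad, hess, x, alpha_sd, c1=1e-4, tau=0.5):
    """One Newton-type iteration with a scaled steepest-descent fallback.

    Hypothetical sketch: if the Newton direction is unavailable or is not a
    descent direction, use the negative gradient scaled by alpha_sd
    (e.g., a Barzilai-Borwein-like step length) instead.
    """
    g = grad(x)
    try:
        d = np.linalg.solve(hess(x), -g)      # Newton direction
    except np.linalg.LinAlgError:
        d = -alpha_sd * g                     # singular Hessian: scaled SD step
    if g @ d >= 0:                            # not a descent direction
        d = -alpha_sd * g
    # Backtracking (Armijo) line search along the chosen direction
    t, fx = 1.0, f(x)
    while f(x + t * d) > fx + c1 * t * (g @ d):
        t *= tau
    return x + t * d
```

    The same step can be wrapped around a BFGS direction instead of the exact Newton direction; only the computation of d changes.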

    Distributed Coordinate Descent for L1-regularized Logistic Regression

    Solving logistic regression with L1-regularization in distributed settings is an important problem. It arises when the training dataset is very large and cannot fit in the memory of a single machine. We present d-GLMNET, a new algorithm for solving L1-regularized logistic regression in distributed settings. We empirically show that it is superior to distributed online learning via truncated gradient.
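    For background on the building block involved, here is a minimal single-machine Python sketch of cyclic coordinate descent for L1-regularized logistic regression using a soft-threshold proximal update. It is not d-GLMNET and not distributed; the function and parameter names (cd_l1_logreg, soft_threshold, lam) are illustrative assumptions, and labels y are assumed to be in {-1, +1}.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def cd_l1_logreg(X, y, lam, n_iters=100):
    """Cyclic coordinate descent for L1-regularized logistic regression.

    Illustrative sketch: uses the standard 0.25 * sum_i x_ij^2 upper bound
    on the per-coordinate curvature of the logistic loss, then applies a
    proximal (soft-threshold) update to each coordinate in turn.
    """
    n, d = X.shape
    w = np.zeros(d)
    margins = X @ w                           # cached values of x_i^T w
    H = 0.25 * np.sum(X ** 2, axis=0)         # curvature upper bounds per coordinate
    for _ in range(n_iters):
        for j in range(d):
            if H[j] == 0.0:
                continue
            p = 1.0 / (1.0 + np.exp(y * margins))   # sigma(-y_i * x_i^T w)
            g_j = -np.sum(y * X[:, j] * p)          # partial derivative of the loss
            w_new = soft_threshold(w[j] - g_j / H[j], lam / H[j])
            if w_new != w[j]:
                margins += (w_new - w[j]) * X[:, j]  # keep the margin cache in sync
                w[j] = w_new
    return w
```

    A distributed variant such as the one described in the abstract would partition the coordinates (features) across machines and coordinate the updates between them, but the abstract does not give enough detail to sketch that part faithfully.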