
    Data and performance of an active-set truncated Newton method with non-monotone line search for bound-constrained optimization

    In this data article, we report data and experiments related to the research article “A Two-Stage Active-Set Algorithm for Bound-Constrained Optimization” by Cristofari et al. (2017). The method proposed in Cristofari et al. (2017) tackles optimization problems with bound constraints by properly combining an active-set estimate with a truncated Newton strategy. Here, we report the detailed numerical experience obtained on a commonly used test set, namely CUTEst (Gould et al., 2015). First, the algorithm ASA-BCP proposed in Cristofari et al. (2017) is compared with the related method NMBC (De Santis et al., 2012). Then, a comparison with the well-known methods ALGENCAN (Birgin and Martínez, 2002) and LANCELOT B (Gould et al., 2003) is reported.
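    To make the two ingredients concrete, here is a minimal Python sketch of (i) a crude active-set guess for bound constraints and (ii) a truncated Newton direction computed by conjugate gradients over the estimated free variables. This is not the ASA-BCP algorithm of Cristofari et al. (2017); the function names, the tolerance eps, and the simple gradient-sign test below are illustrative assumptions only.

```python
import numpy as np

def estimate_active_set(x, g, lb, ub, eps=1e-6):
    """Crude active-set guess for bound constraints: a variable is
    estimated active when it sits within eps of a bound and the
    gradient pushes it further against that bound. (Illustrative only;
    the paper uses a more refined estimate.)"""
    at_lower = (x - lb <= eps) & (g > 0)
    at_upper = (ub - x <= eps) & (g < 0)
    return at_lower | at_upper  # boolean mask of estimated-active variables

def truncated_newton_direction(hess_vec, g, free, cg_tol=1e-2, max_cg=50):
    """Approximately solve H[free, free] d = -g[free] by conjugate
    gradients, stopped early at a loose relative tolerance (the
    'truncated Newton' ingredient). hess_vec(v) must return the
    full-space product H @ v."""
    d = np.zeros(int(free.sum()))
    r = -g[free].astype(float)
    p = r.copy()
    rs0 = rs = r @ r
    for _ in range(max_cg):
        v = np.zeros_like(g, dtype=float)
        v[free] = p
        Hp = hess_vec(v)[free]
        curv = p @ Hp
        if curv <= 0.0:                # negative curvature: stop CG early
            return d if np.any(d) else r
        alpha = rs / curv
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if rs_new <= (cg_tol ** 2) * rs0:   # truncation test
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d
```

    In a full method, the resulting direction would be combined with a projection onto the bounds and a (non-monotone) line search, which the sketch omits.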

    An Active-Set Algorithmic Framework for Non-Convex Optimization Problems over the Simplex

    In this paper, we describe a new active-set algorithmic framework for minimizing a non-convex function over the unit simplex. At each iteration, the method makes use of a rule for identifying active variables (i.e., variables that are zero at a stationary point) and specific directions (which we name active-set gradient related directions) satisfying a new "nonorthogonality"-type condition. We prove global convergence to stationary points when using an Armijo line search in the given framework. We further describe three different examples of active-set gradient related directions that guarantee a linear convergence rate (under suitable assumptions). Finally, we report numerical experiments showing the effectiveness of the approach. Comment: 29 pages, 3 figures.
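    As a toy illustration of how an Armijo line search operates over the simplex, the following sketch takes the Frank-Wolfe direction d = e_j - x (with j minimizing the gradient), which is feasible and gradient related, and backtracks on the stepsize. This is not one of the paper's active-set gradient related directions; the parameters gamma and delta are assumed values.

```python
import numpy as np

def armijo_feasible_step(f, grad, x, gamma=1e-4, delta=0.5, max_back=50):
    """One iteration over the unit simplex: move along d = e_j - x with
    j = argmin of the gradient (a feasible, gradient related direction)
    and choose the stepsize by Armijo backtracking."""
    g = grad(x)
    j = int(np.argmin(g))
    d = -x.astype(float)
    d[j] += 1.0              # d = e_j - x; x + alpha*d stays feasible for alpha in [0, 1]
    slope = g @ d            # directional derivative; < 0 unless x is stationary
    if slope >= 0.0:
        return x             # no descent along this direction
    alpha, fx = 1.0, f(x)
    for _ in range(max_back):
        y = x + alpha * d
        if f(y) <= fx + gamma * alpha * slope:   # sufficient decrease test
            return y
        alpha *= delta
    return x
```

    Starting from the barycenter x = np.ones(n) / n and iterating this step on a smooth f drives the iterates toward a stationary point under standard assumptions.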

    A Fast Active Set Block Coordinate Descent Algorithm for ℓ1-regularized least squares

    The problem of finding sparse solutions to underdetermined systems of linear equations arises in several applications (e.g., signal and image processing, compressive sensing, statistical inference). A standard tool for dealing with sparse recovery is the ℓ1-regularized least-squares approach, which has recently been attracting the attention of many researchers. In this paper, we describe an active set estimate (i.e., an estimate of the indices of the zero variables in the optimal solution) for the considered problem that tries to quickly identify as many active variables as possible at a given point, while guaranteeing that some approximate optimality conditions are satisfied. A relevant feature of the estimate is that it gives a significant reduction of the objective function when all the variables estimated active are set to zero. This makes it easy to embed the estimate into a given globally convergent algorithmic framework. In particular, we include our estimate in a block coordinate descent algorithm for ℓ1-regularized least squares, analyze the convergence properties of this new active set method, and prove that its basic version converges at a linear rate. Finally, we report some numerical results showing the effectiveness of the approach. Comment: 28 pages, 5 figures.
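    The pattern described above — estimate which variables are zero at the optimum, freeze them, and update the rest — can be sketched in a few lines of Python for the lasso problem min 0.5||Ax - b||^2 + lam*||x||_1. The freezing test |A_i^T r| <= lam used below is only the plain subgradient optimality check at zero, not the paper's estimate, and all parameter names are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.|, applied componentwise."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def active_set_cd_lasso(A, b, lam, n_iter=100, tol=1e-8):
    """Coordinate descent for min 0.5*||Ax - b||^2 + lam*||x||_1 with a
    crude active-set guess: zero coordinates already satisfying the
    subgradient optimality condition are skipped in each sweep."""
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)       # per-coordinate curvature ||A_i||^2
    r = b - A @ x                        # residual, kept up to date
    for _ in range(n_iter):
        grad0 = A.T @ r                  # = -gradient of the smooth part
        free = (x != 0) | (np.abs(grad0) > lam)   # estimated non-active set
        x_old = x.copy()
        for i in np.where(free)[0]:
            if col_sq[i] == 0.0:
                continue
            rho = A[:, i] @ r + col_sq[i] * x[i]      # partial correlation
            x_new = soft_threshold(rho, lam) / col_sq[i]
            r += A[:, i] * (x[i] - x_new)             # update residual
            x[i] = x_new
        if np.max(np.abs(x - x_old)) < tol:
            break
    return x
```

    The paper's estimate and its linear-rate analysis are more refined; the sketch only shows the basic "identify zeros, then update the rest" mechanics.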

    Hybrid Random/Deterministic Parallel Algorithms for Nonconvex Big Data Optimization

    We propose a decomposition framework for the parallel optimization of the sum of a differentiable (possibly nonconvex) function and a nonsmooth (possibly nonseparable) convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. The main contribution of this work is a novel parallel, hybrid random/deterministic decomposition scheme wherein, at each iteration, a subset of (block) variables is updated at the same time by minimizing local convex approximations of the original nonconvex function. To tackle huge-scale problems, the (block) variables to be updated are chosen according to a mixed random and deterministic procedure, which captures the advantages of both purely deterministic and purely random update-based schemes. Almost sure convergence of the proposed scheme is established. Numerical results show that on huge-scale problems the proposed hybrid random/deterministic algorithm outperforms both random and deterministic schemes. Comment: The order of the authors is alphabetical.
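    A minimal sketch of the mixed selection rule: at each iteration, greedily pick the blocks with the largest optimality-violation score (deterministic part) and add a uniformly drawn sample of the remaining blocks (random part). The scoring function, block counts, and parameter names here are assumptions for illustration; the paper's framework is considerably more general.

```python
import numpy as np

def select_blocks(score, n_greedy, n_random, rng):
    """Hybrid random/deterministic block selection sketch.
    score[i] measures how far block i is from block-wise optimality."""
    n_blocks = score.size
    greedy = np.argsort(score)[::-1][:n_greedy]       # deterministic part
    rest = np.setdiff1d(np.arange(n_blocks), greedy)
    random_part = rng.choice(rest, size=min(n_random, rest.size),
                             replace=False)           # random part
    return np.concatenate([greedy, random_part])

# Example: 100 blocks, pick the 4 worst plus 4 random others each round.
rng = np.random.default_rng(0)
score = rng.random(100)
blocks = select_blocks(score, n_greedy=4, n_random=4, rng=rng)
```

    The selected blocks would then be updated in parallel by minimizing local strongly convex approximations of the original function, which the sketch does not show.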

    Active-set identification with complexity guarantees of an almost cyclic 2-coordinate descent method with Armijo line search

    In this paper, we establish finite active-set identification for an almost cyclic 2-coordinate descent method applied to problems with one linear coupling constraint and simple bounds. First, general active-set identification results are stated for non-convex objective functions. Then, under convexity and a quadratic growth condition (satisfied by any strongly convex function), complexity results on the number of iterations required to identify the active set are given. In our analysis, a simple Armijo line search is used to compute the stepsize, thus requiring neither exact minimizations nor additional information.
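    For intuition, here is a sketch of a single 2-coordinate move for min f(x) subject to sum(x) = const and lb <= x <= ub, where the coupling vector is assumed to be all ones for simplicity: mass t is shifted between two coordinates so the constraint is preserved, with t chosen by Armijo backtracking. The method's actual pair-selection rule and parameter values are not reproduced here.

```python
import numpy as np

def two_coordinate_step(f, grad, x, i, j, lb, ub, gamma=1e-4, delta=0.5):
    """One 2-coordinate move along d = e_i - e_j, which keeps sum(x)
    unchanged. The shift t starts at the largest feasible value and is
    reduced by Armijo backtracking. Illustrative sketch only."""
    g = grad(x)
    slope = g[i] - g[j]        # directional derivative of f along e_i - e_j
    if slope > 0.0:            # swap so the move is a descent direction
        i, j = j, i
        slope = -slope
    if slope == 0.0:
        return x               # no first-order improvement for this pair
    t_max = min(ub[i] - x[i], x[j] - lb[j])   # keep both bounds feasible
    if t_max <= 0.0:
        return x
    t, fx = t_max, f(x)
    while t > 1e-16:
        y = x.copy()
        y[i] += t
        y[j] -= t
        if f(y) <= fx + gamma * t * slope:    # Armijo sufficient decrease
            return y
        t *= delta
    return x
```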