51 research outputs found

    Feature selection combining linear support vector machines and concave optimization

    In this work we consider feature selection for two-class linear models, a challenging task arising in several real-world applications. Given an unknown functional dependency that assigns each input to the class to which it belongs, and that can be modelled by a linear machine, we aim to find the relevant features of the input space, namely to detect the smallest number of input variables without any loss in classification accuracy. Our main motivation lies in the fact that detecting the relevant features provides a better understanding of the underlying phenomenon, which can be of great interest in fields such as medicine and biology. Feature selection involves two competing objectives: the prediction capability of the linear classifier (to be maximized) and the number of features it employs (to be minimized). To take both objectives into account, we propose a feature selection strategy based on combining support vector machines (for obtaining good classifiers) with a concave optimization approach (for finding sparse solutions). We report the results of extensive computational experiments showing the efficiency of the proposed methodology.
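
    As a rough illustration of the idea (not the exact algorithm of the paper), the sketch below trains a hinge-loss linear classifier while replacing the l0 norm with a concave exponential approximation, handled by successive reweighted-l1 subproblems solved inexactly by subgradient descent; the function name, parameters and synthetic data are purely illustrative.

    import numpy as np

    def concave_svm_selection(X, y, alpha=5.0, lam=0.1, outer=5, inner=200, lr=0.01):
        # Hinge-loss linear classifier with a concave (exponential) approximation
        # of the l0 penalty, handled by successive reweighted-l1 subproblems that
        # are solved inexactly by subgradient descent. Illustrative sketch only.
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(outer):
            # Linearizing sum_i (1 - exp(-alpha*|w_i|)) gives per-feature l1 weights.
            weights = alpha * np.exp(-alpha * np.abs(w))
            for _ in range(inner):
                margins = y * (X @ w + b)
                active = margins < 1.0                    # hinge-loss subgradient
                g_w = -(X[active].T @ y[active]) / n + lam * weights * np.sign(w)
                g_b = -np.sum(y[active]) / n
                w -= lr * g_w
                b -= lr * g_b
            w[np.abs(w) < 1e-4] = 0.0                     # prune negligible features
        return w, b

    # Toy data: only the first two of twenty features are informative (illustrative).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))
    y = np.sign(X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=200))
    w, b = concave_svm_selection(X, y)
    print("selected features:", np.flatnonzero(w))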

    On the Global Convergence of Derivative Free Methods for Unconstrained Optimization.

    In this paper, starting from a study of the common elements shared by some globally convergent direct search methods, a general convergence theory is established for unconstrained minimization methods that employ only function values. The convergence conditions introduced are useful for developing and analyzing new derivative-free algorithms with guaranteed global convergence. As examples, we describe three new algorithms that combine pattern search and line search approaches.
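
    A minimal sketch of the kind of scheme covered by such conditions, assuming a sufficient-decrease test of the form f(x + a d) <= f(x) - gamma*a^2 along the coordinate directions with step expansion on success; this is an illustrative toy, not one of the three algorithms described in the paper.

    import numpy as np

    def df_coordinate_search(f, x0, alpha0=1.0, gamma=1e-6, theta=0.5, delta=2.0,
                             tol=1e-8, max_iter=2000):
        # Derivative-free search along +/- e_i: a tentative step is accepted only
        # under the sufficient-decrease test f(x + a d) <= f(x) - gamma*a**2,
        # accepted steps are expanded, rejected ones shrink the step size.
        x = np.asarray(x0, dtype=float)
        n = x.size
        alphas = np.full(n, alpha0)                 # one tentative step per coordinate
        for _ in range(max_iter):
            if np.max(alphas) < tol:                # all steps small: stop
                break
            for i in range(n):
                d = np.zeros(n)
                d[i] = 1.0
                moved = False
                for s in (+1.0, -1.0):
                    a = alphas[i]
                    if f(x + s * a * d) <= f(x) - gamma * a * a:
                        # expansion phase: enlarge while sufficient decrease holds
                        while f(x + s * delta * a * d) <= f(x) - gamma * (delta * a) ** 2:
                            a *= delta
                        x = x + s * a * d
                        alphas[i] = a
                        moved = True
                        break
                if not moved:
                    alphas[i] *= theta              # shrink the tentative step
        return x

    # Illustrative run on the Rosenbrock function.
    rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    print(df_coordinate_search(rosen, np.array([-1.2, 1.0])))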

    Nonmonotone globalization of the finite-difference Newton-GMRES method for nonlinear equations.

    In this paper, we study nonmonotone globalization strategies in connection with the finite-difference inexact Newton-GMRES method for nonlinear equations. We first define a globalization algorithm that combines nonmonotone watchdog rules and nonmonotone derivative-free linesearches related to a merit function, and prove its global convergence under the assumption that the Jacobian is nonsingular and that the iterations of the GMRES subspace method can be completed at each step. Then we introduce a hybrid stabilization scheme employing occasional line searches along positive bases, and establish global convergence towards a solution of the system under the less demanding condition that the Jacobian is nonsingular at stationary points of the merit function. Through a set of numerical examples, we show that the proposed techniques may constitute useful options to be added to solvers for nonlinear systems of equations.
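
    The sketch below illustrates the main ingredients under simplifying assumptions: finite-difference Jacobian-vector products fed to GMRES, and a nonmonotone derivative-free backtracking on the merit function 0.5*||F(x)||^2 that compares against the worst of the last few merit values; the watchdog rules and the positive-basis searches of the paper are omitted.

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def fd_newton_gmres(F, x0, max_iter=50, mem=5, tol=1e-10, eps=1e-7):
        # Inexact Newton steps from GMRES with finite-difference Jacobian-vector
        # products, globalized by a nonmonotone derivative-free backtracking on
        # the merit function 0.5*||F(x)||^2 (reference value = worst of the last
        # `mem` merit values).
        x = np.asarray(x0, dtype=float)
        merit = lambda z: 0.5 * np.dot(F(z), F(z))
        history = [merit(x)]
        for _ in range(max_iter):
            Fx = F(x)
            if np.linalg.norm(Fx) < tol:
                break
            Jv = lambda v: (F(x + eps * v) - Fx) / eps      # J(x) v without forming J
            J = LinearOperator((x.size, x.size), matvec=Jv)
            d, _ = gmres(J, -Fx)
            ref = max(history[-mem:])                       # nonmonotone reference
            t = 1.0
            while merit(x + t * d) > ref - 1e-4 * t * t * np.dot(d, d) and t > 1e-12:
                t *= 0.5
            x = x + t * d
            history.append(merit(x))
        return x

    # Illustrative 2x2 nonlinear system.
    F = lambda z: np.array([z[0] ** 2 + z[1] ** 2 - 4.0, np.exp(z[0]) + z[1] - 1.0])
    print(fd_newton_gmres(F, np.array([1.0, 1.0])))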

    Globally convergent block-coordinate techniques for unconstrained optimization.

    In this paper we define new classes of globally convergent block-coordinate techniques for the unconstrained minimization of a continuously differentiable function. More specifically, we first describe conceptual models of decomposition algorithms based on the interconnection of elementary operations performed on the block components of the variable vector. Then we characterize the elementary operations, defined through a suitable line search or through global minimization in a component subspace. Using these models, we establish new results on the convergence of the nonlinear Gauss–Seidel method and prove that this method with a two-block decomposition is globally convergent towards stationary points, even in the absence of convexity or uniqueness assumptions. In the general case of a nonconvex objective function and an arbitrary decomposition, we define new globally convergent line-search-based schemes that may also include partial global minimizations with respect to some components. Computational aspects are discussed and, in particular, an application to a learning problem in a Radial Basis Function neural network is illustrated.
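
    For concreteness, a minimal sketch of the two-block nonlinear Gauss-Seidel model (the case proved globally convergent above), with each block subproblem solved by a generic unconstrained optimizer; the test function and tolerances are illustrative.

    import numpy as np
    from scipy.optimize import minimize

    def two_block_gauss_seidel(f, x0, y0, sweeps=50, tol=1e-8):
        # Two-block nonlinear Gauss-Seidel: alternately minimize f with respect
        # to one block of variables while the other block is held fixed.
        x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
        for _ in range(sweeps):
            x_new = minimize(lambda u: f(u, y), x).x          # update first block
            y_new = minimize(lambda v: f(x_new, v), y).x      # update second block
            step = np.linalg.norm(x_new - x) + np.linalg.norm(y_new - y)
            x, y = x_new, y_new
            if step < tol:
                break
        return x, y

    # Illustrative nonconvex coupling between the two blocks.
    f = lambda u, v: (u[0] - 1.0) ** 2 + (v[0] + 2.0) ** 2 + 0.5 * np.sin(u[0] * v[0])
    print(two_block_gauss_seidel(f, np.array([0.0]), np.array([0.0])))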

    Global convergence technique for the Newton method with periodic Hessian evaluation.

    The problem of globalizing the Newton method when the actual Hessian matrix is not used at every iteration is considered. A stabilization technique is studied that employs a new line search strategy to ensure global convergence under mild assumptions. Moreover, an implementable algorithmic scheme is proposed in which the evaluation of the second derivatives is conditioned on the behavior of the algorithm during the minimization process and on the local convexity properties of the objective function. This is done in order to obtain a significant computational saving while keeping the unavoidable degradation in convergence speed acceptable. The numerical results reported indicate that the method described may be employed advantageously in all applications where the computation of the Hessian matrix is highly time consuming.
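
    A minimal sketch of the general idea, assuming an Armijo backtracking linesearch and a fixed refresh period for the Hessian; the actual scheme in the paper conditions the Hessian evaluation on the observed behavior of the algorithm rather than on a fixed period.

    import numpy as np

    def newton_periodic_hessian(f, grad, hess, x0, refresh=5, max_iter=200, tol=1e-8):
        # Newton-type iteration that recomputes the Hessian only every `refresh`
        # iterations, with an Armijo backtracking linesearch and a steepest-descent
        # fallback whenever the reused Hessian does not give a descent direction.
        x = np.asarray(x0, dtype=float)
        H = hess(x)
        for k in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            if k % refresh == 0:
                H = hess(x)                         # periodic, not per-iteration
            try:
                d = np.linalg.solve(H, -g)
                if g @ d >= 0.0:                    # not a descent direction
                    d = -g
            except np.linalg.LinAlgError:
                d = -g
            t = 1.0
            while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
                t *= 0.5
            x = x + t * d
        return x

    # Illustrative run on the Rosenbrock function.
    f = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    grad = lambda z: np.array([-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0] ** 2),
                               200 * (z[1] - z[0] ** 2)])
    hess = lambda z: np.array([[2 - 400 * (z[1] - 3 * z[0] ** 2), -400.0 * z[0]],
                               [-400.0 * z[0], 200.0]])
    print(newton_periodic_hessian(f, grad, hess, np.array([-1.2, 1.0])))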

    Use of the "minimum norm" search direction in a nonmonotone version of the Gauss-Newton method.


    A derivative-free algorithm for bound constrained optimization.

    In this work, we propose a new globally convergent derivative-free algorithm for the minimization of a continuously differentiable function in the case that some (or all) of the variables are bounded. The algorithm investigates the local behaviour of the objective function on the feasible set by sampling it along the coordinate directions. Whenever a "suitable" feasible descent coordinate direction is detected, a new point is produced by performing a linesearch along this direction. The information progressively obtained during the iterations of the algorithm can be used to build an approximation model of the objective function. The minimum of such a model is accepted if it produces an improvement of the objective function value. We also derive a bound on the limit accuracy of the algorithm in the minimization of noisy functions. Finally, we report the results of preliminary numerical experiments.
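
    The following sketch captures the sampling-plus-linesearch idea for simple bound constraints (projection onto the box and a sufficient-decrease acceptance test along the coordinate directions); it leaves out the approximation-model step and is not the exact algorithm of the paper.

    import numpy as np

    def df_box_search(f, x0, lb, ub, alpha0=1.0, gamma=1e-6, theta=0.5,
                      tol=1e-8, max_iter=1000):
        # Derivative-free search along the coordinate directions for box
        # constraints lb <= x <= ub: tentative points are projected onto the box
        # and accepted only under a sufficient-decrease test.
        proj = lambda z: np.minimum(np.maximum(z, lb), ub)
        x = proj(np.asarray(x0, dtype=float))
        n = x.size
        alphas = np.full(n, alpha0)
        for _ in range(max_iter):
            if np.max(alphas) < tol:
                break
            for i in range(n):
                d = np.zeros(n)
                d[i] = 1.0
                accepted = False
                for s in (+1.0, -1.0):
                    trial = proj(x + s * alphas[i] * d)       # keep feasibility
                    if f(trial) <= f(x) - gamma * alphas[i] ** 2:
                        x, accepted = trial, True
                        break
                if not accepted:
                    alphas[i] *= theta
        return x

    # Illustrative run: the unconstrained minimizer (3, -2) lies outside the box.
    f = lambda z: (z[0] - 3.0) ** 2 + (z[1] + 2.0) ** 2
    print(df_box_search(f, np.array([0.0, 0.0]),
                        lb=np.array([-1.0, -1.0]), ub=np.array([1.0, 1.0])))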

    On the convergence of the block nonlinear Gauss-Seidel method under convex constraints.

    We give new convergence results for the block Gauss–Seidel method for problems where the feasible set is the Cartesian product of m closed convex sets, under the assumption that the sequence generated by the method has limit points. We show that the method is globally convergent for m = 2 and that, for m > 2, convergence can be established both when the objective function f is componentwise strictly quasiconvex with respect to m − 2 components and when f is pseudoconvex. Finally, we consider a proximal point modification of the method and state convergence results without any convexity assumption on the objective function.
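
    A minimal sketch of the proximal point modification, assuming simple bound constraints for each block and a generic bound-constrained solver for the block subproblems; names, parameters and data are illustrative.

    import numpy as np
    from scipy.optimize import minimize

    def proximal_block_gauss_seidel(f, blocks, bounds, tau=1.0, sweeps=30):
        # Proximal point modification of the block Gauss-Seidel method: each block
        # update minimizes f plus the quadratic proximal term (tau/2)*||z - x_i||^2
        # over that block's convex feasible set (here, simple bound constraints).
        x = [np.asarray(b, dtype=float) for b in blocks]
        for _ in range(sweeps):
            for i in range(len(x)):
                xi_old = x[i].copy()

                def sub(z, i=i, xi_old=xi_old):
                    point = [z if j == i else x[j] for j in range(len(x))]
                    return f(point) + 0.5 * tau * np.sum((z - xi_old) ** 2)

                x[i] = minimize(sub, xi_old, bounds=bounds[i]).x
        return x

    # Illustrative 3-block problem with box constraints and a nonconvex objective.
    f = lambda v: (v[0][0] - 2.0) ** 2 + (v[1][0] * v[2][0] - 1.0) ** 2 + np.cos(v[1][0])
    blocks = [np.array([0.0]), np.array([0.5]), np.array([0.5])]
    bounds = [[(-1.0, 1.0)], [(0.0, 2.0)], [(0.0, 2.0)]]
    print(proximal_block_gauss_seidel(f, blocks, bounds))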
    • …