A range division and contraction approach for nonconvex quadratic program with quadratic constraints
Dictionary learning for fast classification based on soft-thresholding
Classifiers based on sparse representations have recently been shown to
provide excellent results in many visual recognition and classification tasks.
However, the high cost of computing sparse representations at test time is a
major obstacle that limits the applicability of these methods in large-scale
problems, or in scenarios where computational power is restricted. We consider
in this paper a simple yet efficient alternative to sparse coding for feature
extraction. We study a classification scheme that applies the soft-thresholding
nonlinear mapping in a dictionary, followed by a linear classifier. A novel
supervised dictionary learning algorithm tailored for this low complexity
classification architecture is proposed. The dictionary learning problem, which
jointly learns the dictionary and linear classifier, is cast as a difference of
convex (DC) program and solved efficiently with an iterative DC solver. We
conduct experiments on several datasets, and show that our learning algorithm
that leverages the structure of the classification problem outperforms generic
learning procedures. Our simple classifier based on soft-thresholding also
competes with recent sparse coding classifiers when the dictionary is
learned appropriately. The adopted classification scheme further requires less
computational time at the testing stage than other classifiers. The
proposed scheme demonstrates the potential of an adequately trained soft-thresholding
mapping for classification and paves the way towards the development of very
efficient classification methods for vision problems.
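As a rough sketch of the test-time pipeline described above — a dictionary projection, an elementwise soft-thresholding nonlinearity, and a linear decision — the following uses an illustrative random dictionary, threshold, and classifier; in the paper these would instead be learned jointly via the DC program:

```python
import numpy as np

def soft_threshold_features(x, D, alpha):
    """One-sided soft-thresholding feature map: max(D^T x - alpha, 0),
    applied elementwise. D has one atom per column."""
    return np.maximum(D.T @ x - alpha, 0.0)

def classify(x, D, alpha, w, b):
    """Linear classifier on top of the soft-thresholded features."""
    return np.sign(w @ soft_threshold_features(x, D, alpha) + b)

# Tiny illustrative instance (not learned): 4-dim signals, 6 atoms.
rng = np.random.default_rng(0)
D = rng.standard_normal((4, 6))
D /= np.linalg.norm(D, axis=0)   # unit-norm dictionary atoms
w = rng.standard_normal(6)
x = rng.standard_normal(4)
print(classify(x, D, alpha=0.25, w=w, b=0.0))  # predicted label
```

Note that test-time cost is a single matrix-vector product plus an elementwise maximum, which is what makes the scheme cheap compared to solving a sparse coding problem per sample.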
Outcome-Based Branch and Bound Algorithm for Optimization over the Efficient Set and Its Application
Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations
We introduce a proximal bundle method for the numerical minimization of a nonsmooth difference-of-convex (DC) function. Exploiting some classic ideas from cutting-plane approaches for the convex case, we iteratively build two separate piecewise-affine approximations of the component functions, grouping the corresponding information in two separate bundles. In the bundle of the first component, only information related to points close to the current iterate is maintained, while the second bundle refers only to a global model of the corresponding component function. We combine the two convex piecewise-affine approximations and generate a DC piecewise-affine model, which can also be seen as the pointwise maximum of several concave piecewise-affine functions. Such a nonconvex model is locally approximated by means of an auxiliary quadratic program, whose solution is used either to certify approximate criticality or to generate a descent search direction, along with a predicted reduction, that is next explored in a line-search setting. To improve the approximation properties at points far from the current iterate, a supplementary quadratic program is also introduced to generate an alternative, more promising search direction. We discuss the main convergence issues of the line-search-based proximal bundle method and provide computational results on a set of academic benchmark test problems. © 2017, Springer Science+Business Media, LLC
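The two-bundle DC piecewise-affine model described above can be sketched on a toy one-dimensional problem. Here f = f1 - f2 with f1(x) = x² and f2(x) = |x|; the bundle points and test point are illustrative choices, not taken from the paper:

```python
def cutting_plane_model(x, points, f, g):
    """Convex piecewise-affine (max-of-cuts) model of f built from bundle
    points, using f's values and subgradients at those points."""
    return max(f(p) + g(p) * (x - p) for p in points)

f1, g1 = lambda x: x * x, lambda x: 2 * x          # f1 and its gradient
f2, g2 = abs, lambda x: 1.0 if x >= 0 else -1.0    # f2 and a subgradient

bundle1 = [-1.0, 0.0, 1.0]   # local bundle for f1 (points near the iterate)
bundle2 = [-2.0, 0.0, 2.0]   # global bundle for f2

def dc_model(x):
    """DC piecewise-affine model: difference of the two convex cut models.
    Equivalently, the pointwise max of concave piecewise-affine functions,
    since max_i c_i(x) - max_j d_j(x) = max_i (c_i(x) - max_j d_j(x))."""
    return (cutting_plane_model(x, bundle1, f1, g1)
            - cutting_plane_model(x, bundle2, f2, g2))

x = 0.5
print(dc_model(x), f1(x) - f2(x))  # model value vs. true f(x) = x^2 - |x|
```

In the actual method this nonconvex model is not evaluated pointwise but handled through the auxiliary quadratic programs that certify criticality or produce a descent direction.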