Nonsmoothness in Machine Learning: specific structure, proximal identification, and applications
Nonsmoothness is often a curse for optimization; but it is sometimes a
blessing, in particular for applications in machine learning. In this paper, we
present the specific structure of nonsmooth optimization problems appearing in
machine learning and illustrate how to leverage this structure in practice, for
compression, acceleration, or dimension reduction. We pay special attention
to the presentation, keeping it concise and easily accessible, with both
simple examples and general results.
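As a minimal illustration (our sketch, not code from the paper) of the kind of nonsmooth structure the survey discusses: the proximal operator of the l1 norm is component-wise soft-thresholding, which sets small entries *exactly* to zero; this exact sparsity is what enables compression and dimension reduction. The vector and threshold below are illustrative assumptions.

```python
import numpy as np

def prox_l1(x, t):
    # prox of t*||.||_1: component-wise soft-thresholding.
    # Entries with |x_i| <= t are mapped to exactly 0.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

w = np.array([3.0, -0.5, 0.2, -2.0])
w_sparse = prox_l1(w, 1.0)
print(w_sparse)  # → [ 2.  0.  0. -1.]  (small entries are exactly zero)
```

Because the zeros are exact rather than merely small, the sparsity pattern can be stored and exploited directly, e.g. for compressing a learned model.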
Generic optimality conditions for semi-algebraic convex programs
We consider linear optimization over a nonempty convex semi-algebraic feasible region F. Semidefinite programming is an example. If F is compact, then for almost every linear objective there is a unique optimal solution, lying on a unique "active" manifold, around which F is "partly smooth", and the second-order sufficient conditions hold. Perturbing the objective results in smooth variation of the optimal solution. The active manifold consists, locally, of these perturbed optimal solutions; it is independent of the representation of F, and is eventually identified by a variety of iterative algorithms such as proximal and projected gradient schemes. These results extend to unbounded sets F.
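The finite identification phenomenon described above can be sketched numerically. The following is our illustrative example, not code from the paper: a proximal-gradient (ISTA) iteration on a tiny Lasso problem, minimize 0.5*||Ax - b||^2 + lam*||x||_1, whose second coordinate is driven to exactly zero after finitely many iterations, i.e. the iterates land on the active manifold {x : x_2 = 0}. The problem data (A, b, lam) are assumptions chosen so the optimal support is easy to verify by hand.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
b = np.array([2.0, 0.2])
lam = 0.5
step = 0.25  # 1/L, where L = largest eigenvalue of A.T @ A (here 4)

x = np.ones(2)
for _ in range(100):
    grad = A.T @ (A @ x - b)                  # gradient of the smooth part
    x = soft_threshold(x - step * grad, step * lam)

# x[1] is *exactly* 0 after finitely many iterations (identification of the
# active manifold), while x[0] converges to its optimal value 0.875.
print(x)
```

Once the support is identified, the problem is locally smooth along the manifold, which is what lets the perturbed optimal solutions vary smoothly as the abstract states.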