Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
We provide new tools for worst-case performance analysis of the gradient (or
steepest descent) method of Cauchy for smooth strongly convex functions, and
Newton's method for self-concordant functions, including the case of inexact
search directions. The analysis uses semidefinite programming performance
estimation, as pioneered by Drori and Teboulle [Mathematical Programming,
145(1-2):451-482, 2014], and extends recent performance estimation results for
the method of Cauchy by the authors [Optimization Letters, 11(7), 1185-1199,
2017]. To illustrate the applicability of the tools, we demonstrate a novel
complexity analysis of short step interior point methods using inexact search
directions. As an example in this framework, we sketch how to give a rigorous
worst-case complexity analysis of a recent interior point method by Abernethy
and Hazan [PMLR, 48:2520-2528, 2016].
Comment: 22 pages, 1 figure. The title of an earlier version was "Worst-case convergence analysis of gradient and Newton methods through semidefinite programming performance estimation".
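The worst-case flavour of this analysis can be illustrated with a toy computation. The sketch below (plain Python, and not the paper's SDP performance-estimation machinery) brute-forces the worst-case one-step contraction of gradient descent over one-dimensional quadratics, which for L-smooth, mu-strongly convex functions attains the known closed-form bound:

```python
# Toy worst-case analysis (not the paper's SDP machinery): for L-smooth,
# mu-strongly convex functions, the exact one-step contraction of gradient
# descent x+ = x - t*f'(x) is max(|1 - t*mu|, |1 - t*L|), and it is
# attained on one-dimensional quadratics f(x) = (lam/2) * x**2.

def worst_case_contraction(mu, L, t, samples=10_000):
    """Brute-force max |x+| / |x| over quadratics with curvature in [mu, L]."""
    worst = 0.0
    for i in range(samples + 1):
        lam = mu + (L - mu) * i / samples
        worst = max(worst, abs(1.0 - t * lam))
    return worst

mu, L = 1.0, 10.0
t = 2.0 / (mu + L)                          # classical optimal step size
emp = worst_case_contraction(mu, L, t)      # empirical worst case
thm = max(abs(1 - t * mu), abs(1 - t * L))  # known closed form
print(emp, thm)                             # both equal (L - mu)/(L + mu)
```

The paper's contribution is, roughly, a principled generalization of this kind of question: the semidefinite programming performance-estimation approach searches over all admissible functions, not just quadratics, and handles inexact directions.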
Polynomial Primal-Dual Cone Affine Scaling for Semidefinite Programming
In this paper we generalize the primal-dual cone affine scaling algorithm of Sturm and Zhang to semidefinite programming. We show that the underlying ideas of the cone affine scaling algorithm can be naturally applied to semidefinite programming, resulting in a new algorithm. Compared to other primal-dual affine scaling algorithms for semidefinite programming, our algorithm enjoys the lowest computational complexity.
Keywords: semidefinite programming; affine scaling; primal-dual interior point methods
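For intuition, here is a hedged sketch of the classical primal affine-scaling step for linear programming (Dikin's method), the kind of algorithm that cone affine scaling generalizes. The function name, step parameter gamma, and the toy LP are all illustrative, and this is not the primal-dual SDP algorithm of the paper:

```python
import numpy as np

# Classical primal affine-scaling for LP (Dikin): rescale by the current
# iterate, take a steepest-descent-like step in the scaled space, and
# damp the step to stay strictly inside the positive orthant.

def affine_scaling_lp(A, b, c, x, iters=50, gamma=0.5):
    """min c@x s.t. A@x = b, x > 0, from a strictly feasible start x."""
    for _ in range(iters):
        D2 = np.diag(x**2)                              # scaling by iterate
        y = np.linalg.solve(A @ D2 @ A.T, A @ D2 @ c)   # dual estimate
        s = c - A.T @ y                                 # reduced costs
        dx = -D2 @ s                                    # scaling direction
        neg = dx < 0
        if not neg.any():
            break                                       # no blocking bound
        step = gamma * np.min(-x[neg] / dx[neg])        # keep x > 0
        x = x + step * dx
    return x

# Toy LP: min x1 + 2*x2 s.t. x1 + x2 = 1, x >= 0; optimum is (1, 0).
A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
x = affine_scaling_lp(A, b, c, np.array([0.5, 0.5]))
print(x, c @ x)   # approaches the vertex (1, 0) with value 1
```

Each step only solves one small linear system in the scaled space; the cone affine-scaling algorithms carry this structure over to semidefinite (and general symmetric-cone) programming.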
Self-scaled barriers for irreducible symmetric cones
Self-scaled barrier functions are fundamental objects in the theory of
interior-point methods for linear optimization over symmetric cones, of which
linear and semidefinite programming are special cases. We classify all
self-scaled barriers over irreducible symmetric cones and show that these
functions are merely homothetic transformations of the universal barrier
function. Together with a decomposition theorem for self-scaled barriers this
concludes the algebraic classification theory of these functions. After
introducing the reader to the concepts relevant to the problem and tracing the
history of the subject, we start by deriving our result from first principles
in the important special case of semidefinite programming. We then generalise
these arguments to irreducible symmetric cones by invoking results from the
theory of Euclidean Jordan algebras.
Comment: 12 pages
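A minimal numerical check (not the classification proof) of two standard identities for the barrier in the semidefinite-programming special case, assuming NumPy: for F(X) = -log det X on positive definite matrices, logarithmic homogeneity gives <-grad F(X), X> = n (the barrier parameter), and the Hessian action H(X)[V] = X^{-1} V X^{-1} satisfies H(X)[X] = -grad F(X):

```python
import numpy as np

# Self-scaled barrier for the p.s.d. cone: F(X) = -log det X, with
# gradient -X^{-1} and Hessian action H(X)[V] = X^{-1} V X^{-1}.

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
X = M @ M.T + n * np.eye(n)         # a point in the interior of the cone

Xinv = np.linalg.inv(X)
grad = -Xinv                        # gradient of -log det at X
hess_X = Xinv @ X @ Xinv            # Hessian applied to X itself

param = float(np.trace(-grad @ X))  # barrier parameter: equals n
print(param)                        # 4.0 (up to rounding)
print(np.allclose(hess_X, -grad))   # True
```

The decomposition and homothety results in the paper say, in effect, that up to such scalings this log-determinant-type function is the only self-scaled barrier on each irreducible cone.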
Efficient Semidefinite Spectral Clustering via Lagrange Duality
We propose an efficient approach to semidefinite spectral clustering (SSC),
which addresses the Frobenius normalization with the positive semidefinite
(p.s.d.) constraint for spectral clustering. Compared with the original
Frobenius norm approximation based algorithm, the proposed algorithm can more
accurately find the closest doubly stochastic approximation to the affinity
matrix by considering the p.s.d. constraint. In this paper, SSC is formulated
as a semidefinite programming (SDP) problem. To avoid the high computational
cost of solving the SDP directly, we present a dual algorithm based on the
Lagrange dual formulation. Two versions of the proposed algorithm are
offered: one with lower memory usage and the other with a faster convergence
rate. The proposed algorithm has much lower time complexity than that of the
standard interior-point based SDP solvers. Experimental results on both UCI
data sets and real-world image data sets demonstrate that 1) compared with the
state-of-the-art spectral clustering methods, the proposed algorithm achieves
better clustering performance; and 2) our algorithm is much more efficient and
can solve larger-scale SSC problems than those standard interior-point SDP
solvers.
Comment: 13 pages
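The two constraint sets in play, doubly stochastic structure and positive semidefiniteness, can be sketched with off-the-shelf projections. The following is a naive illustration, not the paper's Lagrange-dual algorithm: a Sinkhorn-style normalization enforces unit row and column sums, and an eigenvalue clip projects onto the p.s.d. cone:

```python
import numpy as np

# Naive sketch of the two SSC constraint sets (not the dual algorithm):
# Sinkhorn scaling toward doubly stochastic structure, and a spectral
# projection onto the p.s.d. cone by clipping negative eigenvalues.

def psd_project(S):
    """Project a symmetric matrix onto the p.s.d. cone."""
    w, V = np.linalg.eigh((S + S.T) / 2)
    return (V * np.clip(w, 0, None)) @ V.T      # V diag(w_+) V^T

def sinkhorn(S, iters=200):
    """Scale a positive matrix toward unit row and column sums."""
    S = np.clip(S, 1e-12, None)
    for _ in range(iters):
        S = S / S.sum(axis=1, keepdims=True)
        S = S / S.sum(axis=0, keepdims=True)
    return (S + S.T) / 2                        # re-symmetrize

rng = np.random.default_rng(1)
K = rng.random((5, 5)); K = (K + K.T) / 2      # toy affinity matrix
D = sinkhorn(K)                                 # doubly stochastic surrogate
P = psd_project(D)                              # then the p.s.d. side

print(np.allclose(D.sum(axis=1), 1.0, atol=1e-6))   # True
print(np.linalg.eigvalsh(P).min() >= -1e-10)        # True
```

Working in the Lagrange dual, as the paper does, avoids alternating between the sets and keeps the per-iteration cost dominated by a single eigen-decomposition.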
Worst-Case Linear Discriminant Analysis as Scalable Semidefinite Feasibility Problems
In this paper, we propose an efficient semidefinite programming (SDP)
approach to worst-case linear discriminant analysis (WLDA). Compared with the
traditional LDA, WLDA considers the dimensionality reduction problem from the
worst-case viewpoint, which is in general more robust for classification.
However, the original problem of WLDA is non-convex and difficult to optimize.
In this paper, we reformulate the optimization problem of WLDA into a sequence
of semidefinite feasibility problems. To efficiently solve the semidefinite
feasibility problems, we design a new scalable optimization method with
quasi-Newton methods and eigen-decomposition being the core components. The
proposed method is orders of magnitude faster than standard interior-point
based SDP solvers.
Experiments on a variety of classification problems demonstrate that our
approach achieves better performance than standard LDA. Our method is also
much faster and more scalable than WLDA solved with standard interior-point
SDP solvers.
The computational complexity for an SDP with m constraints and matrices of
size d by d is roughly reduced from O(m^3 + m d^3 + m^2 d^2) to O(d^3)
(m >> d in our case).
Comment: 14 pages
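The eigen-decomposition core component can be illustrated in isolation. A hedged sketch, with plain gradient descent standing in for the quasi-Newton method of the paper: an infeasibility penalty on negative eigenvalues has a closed-form gradient that costs a single eigen-decomposition, i.e. O(d^3), per evaluation:

```python
import numpy as np

# Semidefinite feasibility via smooth optimization: penalize negative
# eigenvalues with p(X) = sum_i min(lambda_i(X), 0)^2, whose gradient is
# 2 * V diag(min(lambda, 0)) V^T -- one eigh call per step.

def psd_penalty_and_grad(X):
    w, V = np.linalg.eigh(X)
    neg = np.minimum(w, 0.0)
    return float(np.sum(neg**2)), 2.0 * (V * neg) @ V.T

d = 6
rng = np.random.default_rng(2)
X = rng.standard_normal((d, d)); X = (X + X.T) / 2   # indefinite start

for _ in range(500):
    p, G = psd_penalty_and_grad(X)
    if p < 1e-12:
        break
    X = X - 0.4 * G          # fixed step; line search omitted for brevity

print(np.linalg.eigvalsh(X).min() >= -1e-5)   # True: X is (near) p.s.d.
```

In the full method, the linear constraints are handled alongside this penalty and a quasi-Newton update replaces the fixed-step descent, but the O(d^3) eigen-decomposition remains the dominant cost per iteration.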
Sum of squares generalizations for conic sets
In polynomial optimization problems, nonnegativity constraints are typically
handled using the sum of squares condition. This can be efficiently enforced
using semidefinite programming formulations, or as more recently proposed by
Papp and Yildiz [18], using the sum of squares cone directly in a nonsymmetric
interior point algorithm. Beyond nonnegativity, more complicated polynomial
constraints (in particular, generalizations of the positive semidefinite,
second order and ℓ1-norm cones) can also be modeled through structured
sum of squares programs. We take a different approach and propose using more
specialized polynomial cones instead. This can result in lower dimensional
formulations, more efficient oracles for interior point methods, or
self-concordant barriers with smaller parameters. In most cases, these
algorithmic advantages also translate to faster solving times in practice
- …
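The sum of squares condition mentioned above reduces to a semidefinite constraint on a Gram matrix. A minimal sketch of that correspondence for a fixed univariate example; here a Gram matrix Q is supplied by hand, whereas an SDP (or a specialized conic) solver would search for one:

```python
import numpy as np

# SOS-to-SDP correspondence: a univariate polynomial p of degree 2k is a
# sum of squares iff p(x) = z(x)^T Q z(x) for some p.s.d. Gram matrix Q,
# where z(x) = (1, x, ..., x^k). Example: p(x) = x^4 - 2x^2 + 1
# = (x^2 - 1)^2, with monomial basis z = (1, x, x^2).

Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])     # Gram matrix over z = (1, x, x^2)

# Coefficient of x^s in z^T Q z is the anti-diagonal sum over i + j = s.
coeffs = [float(sum(Q[i, s - i] for i in range(3) if 0 <= s - i < 3))
          for s in range(5)]
print(coeffs)                         # [1.0, 0.0, -2.0, 0.0, 1.0]

eigmin = np.linalg.eigvalsh(Q).min()
print(eigmin >= -1e-10)               # True: Q is p.s.d., so p is SOS
```

The specialized polynomial cones the abstract advocates work with such polynomials directly instead of introducing the full Gram matrix, which is where the lower-dimensional formulations and cheaper barrier oracles come from.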