A geodesic interior-point method for linear optimization over symmetric cones
We develop a new interior-point method for symmetric-cone optimization, a
common generalization of linear, second-order-cone, and semidefinite
programming. Our key idea is updating iterates with a geodesic of the cone
instead of the kernel of the linear constraints. This approach yields a
primal-dual-symmetric, scale-invariant, and line-search-free algorithm that
uses just half the variables of a standard primal-dual method. With elementary
arguments, we establish polynomial-time convergence matching the standard
square-root-n bound. Finally, we prove global convergence of a long-step
variant and compare the approaches computationally. For linear programming, our
algorithms reduce to central-path tracking in the log domain.
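As an illustrative sketch (not the paper's algorithm, and with a made-up direction vector): on the positive orthant, the cone of linear programming, the geodesic through a point x in direction d under the standard affine-invariant metric is the coordinate-wise curve x * exp(t*d/x). A step along it stays strictly inside the cone for any step length, unlike a Euclidean update:

```python
import numpy as np

x = np.array([2.0, 0.5, 1.0])     # current strictly positive iterate
d = np.array([-3.0, 0.2, -1.5])   # hypothetical search direction
t = 1.0                           # step length

euclid = x + t * d                # Euclidean step: may leave the cone
geo = x * np.exp(t * d / x)       # geodesic step: always strictly positive

assert np.any(euclid <= 0)       # the straight-line step exits the orthant here
assert np.all(geo > 0)           # the geodesic step does not
```

This positivity-for-free property is one reason geodesic updates remove the need for a line search that guards the cone boundary.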
Continuous Multiclass Labeling Approaches and Algorithms
We study convex relaxations of the image labeling problem on a continuous
domain with regularizers based on metric interaction potentials. The generic
framework ensures existence of minimizers and covers a wide range of
relaxations of the originally combinatorial problem. We focus on two specific
relaxations that differ in flexibility and simplicity -- one can be used to
tightly relax any metric interaction potential, while the other one only covers
Euclidean metrics but requires less computational effort. For solving the
nonsmooth discretized problem, we propose a globally convergent
Douglas-Rachford scheme, and show that a sequence of dual iterates can be
recovered in order to provide a posteriori optimality bounds. In a quantitative
comparison to two other first-order methods, the approach shows competitive
performance on synthetic and real-world images. By combining the method with
an improved binarization technique for nonstandard potentials, we were able to
routinely recover discrete solutions within 1%--5% of the global optimum for
the combinatorial image labeling problem.
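The Douglas-Rachford scheme used above can be sketched in a minimal, generic form (this is a toy instance, not the discretized labeling problem): to minimize f(x) + g(x) for nonsmooth convex f, g, alternate the proximal maps of the two terms. Here f(x) = (1/2)||x - b||^2 and g(x) = ||x||_1, whose joint minimizer is soft-thresholding of b:

```python
import numpy as np

def prox_l1(v, lam):
    # prox of lam*||.||_1: soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_quad(v, b, lam):
    # prox of (lam/2)*||. - b||^2: weighted average with b
    return (v + lam * b) / (1.0 + lam)

b = np.array([3.0, -0.2, 1.5])
lam = 1.0
z = np.zeros_like(b)
for _ in range(200):              # Douglas-Rachford iteration
    x = prox_quad(z, b, lam)      # step on the smooth quadratic term
    y = prox_l1(2 * x - z, lam)   # reflected step on the l1 term
    z = z + y - x                 # update the governing sequence

# x converges to the soft-thresholded solution [2.0, 0.0, 0.5]
```

The governing sequence z is not itself feasible; the primal iterate is recovered through prox_quad, mirroring how the paper recovers dual iterates to obtain a posteriori optimality bounds.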
A study of search directions in primal-dual interior-point methods for semidefinite programming
On the Nesterov-Todd Direction in Semidefinite Programming
Gordon's inequality and condition numbers in conic optimization
The probabilistic analysis of condition numbers has traditionally been
approached from different angles; one is based on Smale's program in complexity
theory and features integral geometry, while the other is motivated by
geometric functional analysis and makes use of the theory of Gaussian
processes. In this note we explore connections between the two approaches in
the context of the biconic homogeneous feasibility problem and the condition
numbers motivated by conic optimization theory. Key tools in the analysis are
Slepian's and Gordon's comparison inequalities for Gaussian processes,
interpreted as monotonicity properties of moment functionals, and their
interplay with ideas from conic integral geometry.
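Slepian's inequality can be illustrated numerically (a Monte Carlo sketch for intuition only, not part of the paper's analysis): for centered Gaussian vectors with equal variances, raising the pairwise correlations can only decrease the expected maximum.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, rho = 5, 200_000, 0.8

# covariance (1-rho)*I + rho*J: unit variances, pairwise correlation rho
L = np.linalg.cholesky((1 - rho) * np.eye(n) + rho * np.ones((n, n)))

g = rng.standard_normal((trials, n))
indep_max = g.max(axis=1).mean()          # E max of independent N(0,1)'s
corr_max = (g @ L.T).max(axis=1).mean()   # E max of the correlated vector

assert corr_max < indep_max               # Slepian: more correlation, smaller E max
```

In the conic setting, such comparison bounds on expected suprema are what translate into tail estimates for condition numbers.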
Interior-point algorithms for convex optimization based on primal-dual metrics
We propose and analyse primal-dual interior-point algorithms for convex
optimization problems in conic form. The families of algorithms we analyse are
so-called short-step algorithms, and they match the current best iteration
complexity bounds for the primal-dual symmetric interior-point algorithms of
Nesterov and Todd, for symmetric cone programming problems with given
self-scaled barriers. Our results apply to any self-concordant barrier for any
convex cone. We also prove that certain specializations of our algorithms to
hyperbolic cone programming problems (which lie strictly between symmetric cone
programming and general convex optimization problems in terms of generality)
can take advantage of the favourable special structure of hyperbolic barriers.
We make new connections to Riemannian geometry, integrals over operator spaces,
Gaussian quadrature, and strengthen the connection of our algorithms to
quasi-Newton updates, and hence to first-order methods in general.
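A toy sketch of barrier path-following (using the positive-orthant log barrier rather than the self-scaled or hyperbolic barriers of the paper; the problem data are made up): minimize c^T x + mu*F(x) with F(x) = -sum_i log x_i by damped Newton steps while shrinking mu. The central-path minimizer for each mu is x_i = mu/c_i, and the Newton decrement governs the damping, as in standard self-concordance theory.

```python
import numpy as np

c = np.array([1.0, 2.0, 0.5])   # hypothetical cost vector
x = np.ones(3)                  # strictly feasible starting point

for mu in [1.0, 0.5, 0.25, 0.125]:               # shrink the barrier parameter
    for _ in range(30):                          # damped Newton inner loop
        grad = c - mu / x                        # gradient of c.x - mu*sum(log x)
        dx = -(x**2 / mu) * grad                 # Newton step: Hessian is mu*diag(1/x^2)
        lam = np.sqrt(mu * np.sum((dx / x)**2))  # Newton decrement
        x = x + dx / (1.0 + lam)                 # damped step stays in the cone

# x now approximates the central-path point mu/c for the final mu = 0.125
```

The short-step algorithms in the paper follow the same template, but with a general self-concordant barrier F and complexity controlled by its barrier parameter.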
…