10 research outputs found
Semi-proximal Mirror-Prox for Nonsmooth Composite Minimization
We propose a new first-order optimization algorithm to solve high-dimensional
non-smooth composite minimization problems. Typical examples of such problems
have an objective that decomposes into a non-smooth empirical risk part and a
non-smooth regularization penalty. The proposed algorithm, called Semi-Proximal
Mirror-Prox, leverages the Fenchel-type representation of one part of the
objective while handling the other part of the objective via linear
minimization over the domain. The algorithm stands in contrast with more
classical proximal gradient algorithms with smoothing, which require the
computation of proximal operators at each iteration and can therefore be
impractical for high-dimensional problems. We establish the theoretical
convergence rate of Semi-Proximal Mirror-Prox, which exhibits the optimal
complexity bounds, i.e. O(1/ε²), for the number of calls to the linear
minimization oracle. We present promising experimental results demonstrating
the merits of the approach in comparison to competing methods.
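The contrast the abstract draws, calling a linear minimization oracle (LMO) rather than a proximal operator, can be illustrated on the ℓ1 ball. The sketch below is not the paper's algorithm; the function names `lmo_l1` and `prox_l1` are illustrative, showing only that the LMO returns a single signed vertex while the prox is a full soft-thresholding operation:

```python
import numpy as np

def lmo_l1(g, radius=1.0):
    """Linear minimization oracle over the l1 ball:
    argmin_{||x||_1 <= radius} <g, x>.
    The minimizer is a signed vertex: -radius*sign(g_i) at the largest |g_i|."""
    i = np.argmax(np.abs(g))
    x = np.zeros_like(g)
    x[i] = -radius * np.sign(g[i])
    return x

def prox_l1(v, lam):
    """Proximal operator of lam*||.||_1 (soft-thresholding), for contrast:
    a dense vector operation, versus the single-coordinate LMO above."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

g = np.array([0.5, -2.0, 1.0])
print(lmo_l1(g))          # vertex of the l1 ball: [0., 1., 0.]
print(prox_l1(g, 0.75))   # soft-thresholded vector: [0., -1.25, 0.25]
```

For high-dimensional structured domains (e.g. nuclear-norm balls) the LMO can be far cheaper than the prox, which is the motivation the abstract appeals to.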
Hessian barrier algorithms for linearly constrained optimization problems
In this paper, we propose an interior-point method for linearly constrained
optimization problems (possibly nonconvex). The method - which we call the
Hessian barrier algorithm (HBA) - combines a forward Euler discretization of
Hessian Riemannian gradient flows with an Armijo backtracking step-size policy.
In this way, HBA can be seen as an alternative to mirror descent (MD), and
contains as special cases the affine scaling algorithm, regularized Newton
processes, and several other iterative solution methods. Our main result is
that, modulo a non-degeneracy condition, the algorithm converges to the
problem's set of critical points; hence, in the convex case, the algorithm
converges globally to the problem's minimum set. In the case of linearly
constrained quadratic programs (not necessarily convex), we also show that the
method's convergence rate is O(1/k^ρ) for some ρ ∈ (0,1]
that depends only on the choice of kernel function (i.e., not on the problem's
primitives). These theoretical results are validated by numerical experiments
in standard non-convex test functions and large-scale traffic assignment
problems.
Comment: 27 pages, 6 figures
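The core recipe of the abstract (a forward Euler discretization of a Hessian Riemannian gradient flow, with an Armijo backtracking step-size policy) can be sketched concretely. This is an illustration under assumed choices, not the paper's exact method: the kernel is taken to be the log-barrier h(x) = -Σ log xᵢ on the positive orthant, whose Hessian inverse is diag(x²), and `hba_step` is a hypothetical helper name:

```python
import numpy as np

def hba_step(f, grad_f, x, t0=1.0, sigma=1e-4, beta=0.5, max_bt=50):
    """One forward-Euler step of the Hessian Riemannian gradient flow
    with Armijo backtracking (a sketch of the HBA idea).
    Assumed kernel: log-barrier h(x) = -sum(log x) on x > 0,
    so inv(Hess h)(x) = diag(x**2)."""
    g = grad_f(x)
    d = (x ** 2) * g                      # Riemannian gradient direction
    t = t0
    for _ in range(max_bt):
        x_new = x - t * d
        # accept only interior points with sufficient decrease (Armijo)
        if np.all(x_new > 0) and f(x_new) <= f(x) - sigma * t * g.dot(d):
            return x_new
        t *= beta                          # backtrack
    return x

# minimize f(x) = ||x - c||^2 over the positive orthant
c = np.array([0.3, 1.5])
f = lambda x: np.sum((x - c) ** 2)
grad_f = lambda x: 2.0 * (x - c)
x = np.array([1.0, 1.0])
for _ in range(200):
    x = hba_step(f, grad_f, x)
print(x)   # converges to c = [0.3, 1.5]
```

Note how the diag(x²) scaling automatically damps steps near the boundary x = 0, which is what lets the interior-point iterates stay feasible without explicit projection.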
Generalized self-concordant Hessian-barrier algorithms
Many problems in statistical learning, imaging, and computer vision involve the optimization of a non-convex objective function with singularities at the boundary of the feasible set. For such challenging instances, we develop a new interior-point technique building on the Hessian-barrier algorithm recently introduced in Bomze, Mertikopoulos, Schachinger and Staudigl, [SIAM J. Opt. 2019 29(3), pp. 2100-2127], where the Riemannian metric is induced by a generalized self-concordant function. This class of functions is sufficiently general to include most of the commonly used barrier functions in the literature of interior point methods. We prove global convergence to an approximate stationary point of the method, and in cases where the feasible set admits an easily computable self-concordant barrier, we verify worst-case optimal iteration complexity of the method. Applications in non-convex statistical estimation and Lp-minimization are discussed to illustrate the efficiency of the method.
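The self-concordance property underlying these barriers is the inequality |h'''(x)| ≤ 2 h''(x)^{3/2}. As a small numerical illustration (not from the paper), the classic log-barrier h(x) = -log x on x > 0 satisfies it with equality:

```python
import numpy as np

# Check that h(x) = -log(x) is self-concordant on x > 0:
# |h'''(x)| <= 2 * h''(x)**1.5, with equality for the log-barrier.
def d2(x): return 1.0 / x**2          # h''(x)  = 1/x^2
def d3(x): return -2.0 / x**3         # h'''(x) = -2/x^3

xs = np.linspace(0.1, 10.0, 1000)
assert np.allclose(np.abs(d3(xs)), 2.0 * d2(xs) ** 1.5)
print("self-concordance holds (with equality) for the log-barrier")
```

Generalized self-concordance relaxes the exponent and constant in this inequality, which is how the abstract's class captures barriers beyond the classical logarithmic one.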
2018 Faculty Excellence Showcase, AFIT Graduate School of Engineering & Management
Excerpt:
As an academic institution, we strive to meet and exceed the expectations for graduate programs and laud our values and contributions to the academic community. At the same time, we must recognize, appreciate, and promote the unique non-academic values and accomplishments that our faculty team brings to the national defense, which is a priority of the Federal Government. In this respect, through our diverse and multi-faceted contributions, our faculty, as a whole, excels not only along the metrics of civilian academic expectations, but also along the metrics of military requirements and national priorities.
Academic Year 2019-2020 Faculty Excellence Showcase, AFIT Graduate School of Engineering & Management
An excerpt from the Dean's Message:
There is no place like the Air Force Institute of Technology (AFIT). There is no academic group like AFIT’s Graduate School of Engineering and Management. Although we run an educational institution similar to many other institutions of higher learning, we are different and unique because of our defense-focused graduate-research-based academic programs. Our programs are designed to be relevant and responsive to national defense needs. Our programs are aligned with the prevailing priorities of the US Air Force and the US Department of Defense. Our faculty team has the requisite critical mass of service-tested faculty members. The unique composition of pure civilian faculty, military faculty, and service-retired civilian faculty makes AFIT truly unique, unlike any other academic institution anywhere.