Optimal designs for rational function regression
We consider optimal non-sequential designs for a large class of (linear and
nonlinear) regression models involving polynomials and rational functions with
heteroscedastic noise also given by a polynomial or rational weight function.
The proposed method treats D-, E-, A-, and other standard classes of optimal designs in a unified
unified manner, and generates a polynomial whose zeros are the support points
of the optimal approximate design, generalizing a number of previously known
results of the same flavor. The method is based on a mathematical optimization
model that can incorporate various criteria of optimality and can be solved
efficiently by well established numerical optimization methods. In contrast to
previous optimization-based methods proposed for similar design problems, it
also comes with a theoretical guarantee of algorithmic efficiency; in fact, the
running times of all numerical examples considered in the paper are negligible.
The stability of the method is demonstrated in an example involving high degree
polynomials. After discussing linear models, applications for finding locally
optimal designs for nonlinear regression models involving rational functions
are presented, then extensions to robust regression designs, and trigonometric
regression are shown. As a corollary, an upper bound on the size of the support
set of the minimally-supported optimal designs is also found. The method is of
considerable practical importance, with the potential for instance to impact
design software development. Further study of the optimality conditions of the
main optimization model might also yield new theoretical insights.
Comment: 25 pages. Previous version updated with more details in the theory and an additional example.
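The abstract's central object, a design whose support points are the zeros of a polynomial, can be illustrated with a standard textbook case. The sketch below uses Titterington's classical multiplicative algorithm on a candidate grid, not the paper's method: for quadratic regression on [-1, 1], the D-optimal approximate design is known to be supported on {-1, 0, 1} with equal weights 1/3, and the iteration recovers it.

```python
import numpy as np

# Candidate grid on [-1, 1] for a quadratic model f(x) = (1, x, x^2).
xs = np.linspace(-1.0, 1.0, 21)
F = np.vstack([np.ones_like(xs), xs, xs**2]).T   # rows are f(x_i)
p = F.shape[1]                                   # number of model parameters

# Titterington's multiplicative algorithm for the D-criterion:
# w_i <- w_i * d_i / p, where d_i = f(x_i)^T M(w)^{-1} f(x_i)
# is the variance function of the current design w.
w = np.full(len(xs), 1.0 / len(xs))              # start from the uniform design
for _ in range(2000):
    M = F.T @ (w[:, None] * F)                   # information matrix M(w)
    d = np.einsum('ij,jk,ik->i', F, np.linalg.inv(M), F)
    w *= d / p
    w /= w.sum()                                 # guard against float drift

support = np.sort(xs[w > 1e-2])                  # support of the limiting design
```

The weights concentrate on the three support points; for higher-degree models the same loop recovers the corresponding minimally supported designs, though far less efficiently than the paper's optimization-based approach.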
Definable Ellipsoid Method, Sums-of-Squares Proofs, and the Isomorphism Problem
The ellipsoid method is an algorithm that solves the (weak) feasibility and
linear optimization problems for convex sets by making oracle calls to their
(weak) separation problem. We observe that the previously known method for
showing that this reduction can be done in fixed-point logic with counting
(FPC) for linear and semidefinite programs applies to any family of explicitly
bounded convex sets. We use this observation to show that the exact feasibility
problem for semidefinite programs is expressible in the infinitary version of
FPC. As a corollary we get that, for the graph isomorphism problem, the
Lasserre/Sums-of-Squares semidefinite programming hierarchy of relaxations
collapses to the Sherali-Adams linear programming hierarchy, up to a small loss
in the degree.
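The oracle viewpoint described in the abstract can be sketched numerically; the toy feasible set and all names below are illustrative, not from the paper. The central-cut ellipsoid method shrinks an enclosing ellipsoid using only separating hyperplanes returned by the oracle, until the center itself is feasible:

```python
import numpy as np

def ellipsoid_feasibility(oracle, n, R=2.0, max_iter=200):
    """Central-cut ellipsoid method for the feasibility problem.

    oracle(x) returns None if x is feasible, otherwise a vector a with
    a @ y <= a @ x for every feasible y (a separating hyperplane).
    The initial ball of radius R must contain the feasible set."""
    c = np.zeros(n)                      # ellipsoid center
    P = (R ** 2) * np.eye(n)             # ellipsoid shape matrix
    for _ in range(max_iter):
        a = oracle(c)
        if a is None:
            return c                     # oracle accepts the center
        g = (P @ a) / np.sqrt(a @ P @ a)
        c = c - g / (n + 1)              # move center into the kept half
        P = (n * n / (n * n - 1.0)) * (P - (2.0 / (n + 1)) * np.outer(g, g))
    return None

# Toy feasible set: a small ball around `target` inside the initial ball.
target = np.array([0.5, 0.3])

def oracle(x):
    gap = x - target
    if np.linalg.norm(gap) <= 0.1:
        return None                      # x is feasible
    return gap                           # separates x from the ball

x = ellipsoid_feasibility(oracle, n=2)
```

Each cut shrinks the ellipsoid volume by a factor of at least exp(-1/(2(n+1))), which is the source of the polynomial iteration bound the reduction relies on.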
On semidefinite representations of plane quartics
This note focuses on the problem of representing convex sets as projections
of the cone of positive semidefinite matrices, in the particular case of sets
generated by bivariate polynomials of degree four. Conditions are given for the
convex hull of a plane quartic to be exactly semidefinite representable with at
most 12 lifting variables. If the quartic is rationally parametrizable, an
exact semidefinite representation with 2 lifting variables can be obtained.
Various numerical examples illustrate the techniques and suggest further
research directions.
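To make the lifting idea concrete, here is a standard toy certificate, not one of the paper's quartics: the "TV-screen" region x^4 + y^4 <= 1 admits a lifted semidefinite description with two extra variables u, v via the constraints u >= x^2, v >= y^2, and u^2 + v^2 <= 1, each encoded as a small PSD block.

```python
import numpy as np

def lift_certificate(x, y):
    """Smallest eigenvalue over the three PSD blocks of a lifted
    representation of {(x, y): x**4 + y**4 <= 1}, evaluated at the
    natural lift u = x**2, v = y**2.  The first two blocks are always
    PSD at this lift, so the sign (up to roundoff) is decided by the
    arrow block, which is PSD exactly when u**2 + v**2 <= 1."""
    u, v = x * x, y * y
    blocks = [
        np.array([[1.0, x], [x, u]]),    # encodes u >= x^2 (tight here)
        np.array([[1.0, y], [y, v]]),    # encodes v >= y^2 (tight here)
        np.array([[1.0, u, v],           # arrow matrix: PSD iff
                  [u, 1.0, 0.0],         # u^2 + v^2 <= 1
                  [v, 0.0, 1.0]]),
    ]
    return min(np.linalg.eigvalsh(B).min() for B in blocks)
```

A point inside the region yields a nonnegative certificate (up to floating-point error), while a point outside makes the arrow block indefinite; the paper's results concern how few such lifting variables suffice for general plane quartics.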
Hyperbolic Polynomials and Generalized Clifford Algebras
We consider the problem of realizing hyperbolicity cones as spectrahedra,
i.e. as linear slices of cones of positive semidefinite matrices. The
generalized Lax conjecture states that this is always possible. We use
generalized Clifford algebras for a new approach to the problem. Our main
result is that if -1 is not a sum of hermitian squares in the Clifford algebra
of a hyperbolic polynomial, then its hyperbolicity cone is spectrahedral. Our
result also has computational applications, since this sufficient condition can
be checked with a single semidefinite program.
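A minimal worked example of the realization problem, standard rather than taken from the paper: the Lorentz polynomial p(x) = x1^2 - x2^2 - x3^2 is hyperbolic with respect to e = (1, 0, 0), since p(t*e - x) = (t - x1)^2 - x2^2 - x3^2 has the real roots t = x1 ± sqrt(x2^2 + x3^2), and its hyperbolicity cone is visibly spectrahedral because p is the determinant of a linear matrix pencil.

```python
import numpy as np

def in_hyperbolicity_cone(x):
    # x lies in the closed hyperbolicity cone iff both roots
    # t = x1 +/- sqrt(x2^2 + x3^2) of p(t*e - x) are nonnegative.
    return x[0] - np.hypot(x[1], x[2]) >= 0

def in_spectrahedron(x):
    # Linear pencil with det = x1^2 - x2^2 - x3^2; its PSD slice
    # is exactly the hyperbolicity cone of p.
    A = np.array([[x[0] + x[1], x[2]],
                  [x[2], x[0] - x[1]]])
    return np.linalg.eigvalsh(A).min() >= -1e-12

# Sanity check: (2, 1, 1) is in the cone since 2 >= sqrt(2).
assert in_hyperbolicity_cone(np.array([2.0, 1.0, 1.0]))
```

The generalized Lax conjecture asks whether every hyperbolicity cone admits such a pencil description; the paper's Clifford-algebra condition certifies this for polynomials beyond determinantal ones.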
Design of First-Order Optimization Algorithms via Sum-of-Squares Programming
In this paper, we propose a framework based on sum-of-squares programming to
design iterative first-order optimization algorithms for smooth and strongly
convex problems. Our starting point is to develop a polynomial matrix
inequality as a sufficient condition for exponential convergence of the
algorithm. The entries of this matrix are polynomial functions of the unknown
parameters (exponential decay rate, stepsize, momentum coefficient, etc.). We
then formulate a polynomial optimization, in which the objective is to optimize
the exponential decay rate over the parameters of the algorithm. Finally, we
use sum-of-squares programming as a tractable relaxation of the proposed
polynomial optimization problem. We illustrate the utility of the proposed
framework by designing a first-order algorithm that shares the same structure
as Nesterov's accelerated gradient method.
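A scalar analogue of the parameter search that the paper automates with sum-of-squares programming, using hypothetical constants mu = 1 and L = 10: for gradient descent on a mu-strongly convex, L-smooth quadratic, the worst-case contraction factor per step is rho(alpha) = max(|1 - alpha*mu|, |1 - alpha*L|), and minimizing it over the stepsize recovers the classical alpha* = 2/(mu + L) with rate (L - mu)/(L + mu).

```python
import numpy as np

mu, L = 1.0, 10.0                        # hypothetical problem constants

# Sweep stepsizes and evaluate the worst-case per-step contraction.
alphas = np.linspace(1e-4, 2.0 / L, 4001)
rho = np.maximum(np.abs(1 - alphas * mu), np.abs(1 - alphas * L))
best = alphas[np.argmin(rho)]            # should approach 2 / (mu + L)

# Verify the rate on a diagonal quadratic with curvatures mu and L.
A = np.diag([mu, L])
x = np.array([1.0, 1.0])
for _ in range(50):
    x = x - best * (A @ x)               # gradient step on f(x) = x^T A x / 2
```

In the paper this one-dimensional sweep is replaced by a polynomial matrix inequality in all the algorithm parameters (decay rate, stepsize, momentum coefficient), relaxed to a sum-of-squares program.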