Stochastic approximation of score functions for Gaussian processes
We discuss the statistical properties of a recently introduced unbiased
stochastic approximation to the score equations for maximum likelihood
calculation for Gaussian processes. Under certain conditions, including bounded
condition number of the covariance matrix, the approach achieves O(n) storage
and nearly O(n) computational effort per optimization step, where n is the
number of data sites. Here, we prove that if the condition number of the
covariance matrix is bounded, then the approximate score equations are nearly
optimal in a well-defined sense. Therefore, not only is the approximation
efficient to compute, but it also has comparable statistical properties to the
exact maximum likelihood estimates. We discuss a modification of the stochastic
approximation in which design elements of the stochastic terms mimic patterns
from a factorial design. We prove these designs are always at least as
good as the unstructured design, and we demonstrate through simulation that
they can produce a substantial improvement over random designs. Our findings
are validated by numerical experiments on simulated data sets of up to 1
million observations. We apply the approach to fit a space-time model to over
80,000 observations of total column ozone contained in the latitude band
40°-50°N during April 2012.
Comment: Published at http://dx.doi.org/10.1214/13-AOAS627 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
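The stochastic approximation described above replaces the expensive trace term in the Gaussian-process score equations with a Hutchinson-type estimator built from random probe vectors. A minimal NumPy sketch of that idea follows; the function name, the toy dense solves, and the Rademacher probes are illustrative assumptions, not the paper's code, which pairs the estimator with iterative solvers so each probe is cheap at large n:

```python
import numpy as np

rng = np.random.default_rng(0)

def gp_score_stochastic(y, K, dK, n_probes=32, rng=rng):
    """Unbiased stochastic approximation of one GP score component.

    Exact score: -0.5*tr(K^{-1} dK) + 0.5 * y^T K^{-1} dK K^{-1} y.
    The trace is replaced by a Hutchinson estimator with Rademacher
    probes u, using E[u^T K^{-1} dK u] = tr(K^{-1} dK).  Dense solves
    are used here only for clarity.
    """
    n = len(y)
    Kinv_y = np.linalg.solve(K, y)
    quad = 0.5 * Kinv_y @ dK @ Kinv_y              # exact quadratic term
    U = rng.choice([-1.0, 1.0], size=(n, n_probes))  # Rademacher probes
    # per-probe values u_j^T K^{-1} dK u_j, averaged over probes
    trace_est = np.mean(np.einsum('ij,ij->j', U, np.linalg.solve(K, dK @ U)))
    return -0.5 * trace_est + quad
```

Because the estimator is unbiased, averaging over more probes trades computation for variance; the factorial-style probe designs discussed in the abstract are aimed at reducing exactly that variance.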
Differentiable Frank-Wolfe Optimization Layer
Differentiable optimization has received a significant amount of attention
due to its foundational role in the domain of machine learning based on neural
networks. Existing methods leverage the optimality conditions and the implicit
function theorem to obtain the Jacobian matrix of the output, which increases
the computational cost and limits the application of differentiable
optimization. In addition, some non-differentiable constraints lead to more
challenges when using prior differentiable optimization layers. This paper
proposes a differentiable layer, named Differentiable Frank-Wolfe Layer
(DFWLayer), by rolling out the Frank-Wolfe method, a well-known optimization
algorithm which can solve constrained optimization problems without projections
and Hessian matrix computations, thus leading to an efficient way of dealing
with large-scale problems. Theoretically, we establish a bound on the
suboptimality gap of the DFWLayer in the context of l1-norm constraints.
Experimental assessments demonstrate that the DFWLayer not only attains
competitive accuracy in solutions and gradients but also consistently adheres
to constraints. Moreover, it surpasses the baselines in both forward and
backward computational speeds.
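The roll-out idea behind the DFWLayer can be illustrated on a small l1-constrained quadratic program. The sketch below (a hypothetical function in plain NumPy, forward pass only) shows why no projection or Hessian is needed: each step only calls the l1-ball linear minimization oracle, and the same fixed-length loop written in an autodiff framework is differentiable end to end without the implicit function theorem:

```python
import numpy as np

def frank_wolfe_l1(b, radius=1.0, n_steps=50):
    """Unrolled Frank-Wolfe solve of
       min_x 0.5*||x - b||^2   s.t.  ||x||_1 <= radius.
    Each iteration calls the l1-ball linear minimization oracle,
    whose answer is a signed, scaled coordinate vector, so no
    projection or Hessian computation is ever required.
    """
    x = np.zeros_like(b)
    for k in range(n_steps):
        g = x - b                        # gradient of the objective
        i = np.argmax(np.abs(g))         # LMO over the l1 ball:
        s = np.zeros_like(b)             # vertex -radius*sign(g_i)*e_i
        s[i] = -radius * np.sign(g[i])
        gamma = 2.0 / (k + 2.0)          # standard Frank-Wolfe step size
        x = (1.0 - gamma) * x + gamma * s
    return x
```

Every iterate is a convex combination of l1-ball vertices, so the constraint is satisfied at every step of the roll-out, mirroring the constraint-adherence property reported for the DFWLayer.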
MATMPC - A MATLAB Based Toolbox for Real-time Nonlinear Model Predictive Control
In this paper we introduce MATMPC, an open source software built in MATLAB
for nonlinear model predictive control (NMPC). It is designed to facilitate
modelling, controller design and simulation for a wide class of NMPC
applications. MATMPC has a number of algorithmic modules, including automatic
differentiation, direct multiple shooting, condensing, linear quadratic program
(QP) solver and globalization. It also supports a unique Curvature-like Measure
of Nonlinearity (CMoN) MPC algorithm. MATMPC has been designed to provide
state-of-the-art performance while making the prototyping easy, also with
limited programming knowledge. This is achieved by writing each module directly
in the MATLAB API for C. As a result, MATMPC modules can be compiled into MEX
functions with performance comparable to plain C/C++ solvers. MATMPC has been
successfully used on operating systems including Windows, Linux, and OS X.
Selected examples are shown to highlight the effectiveness of MATMPC.
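Among the modules listed, condensing is the easiest to sketch compactly. The snippet below is a generic illustration of the standard condensing step for linear(ized) prediction dynamics, not MATMPC's MATLAB/MEX code; the function name and toy dimensions are hypothetical:

```python
import numpy as np

def condense(A, B, N):
    """Condensing for multiple-shooting MPC (generic sketch).

    For prediction dynamics x_{k+1} = A x_k + B u_k over N steps,
    build G and H such that the stacked states X = [x_1; ...; x_N]
    satisfy X = G @ x0 + H @ U with U = [u_0; ...; u_{N-1}].
    Eliminating the states this way turns the sparse stage-wise QP
    into a small dense QP in the inputs U alone.
    """
    nx, nu = B.shape
    G = np.zeros((N * nx, nx))       # free response blocks: A^{k+1}
    H = np.zeros((N * nx, N * nu))   # forced response: A^{k-j} B
    for k in range(N):
        G[k*nx:(k+1)*nx, :] = np.linalg.matrix_power(A, k + 1)
        for j in range(k + 1):
            H[k*nx:(k+1)*nx, j*nu:(j+1)*nu] = (
                np.linalg.matrix_power(A, k - j) @ B)
    return G, H
```

In an NMPC loop the pair (A, B) would come from the automatic-differentiation module at each shooting node; the dense QP in U is then handed to the QP solver module.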