
    Contracting Nonlinear Observers: Convex Optimization and Learning from Data

    A new approach to the design of nonlinear observers (state estimators) is proposed. The main idea is to (i) construct a convex set of dynamical systems which are contracting observers for a particular system, and (ii) optimize over this set for one which minimizes a bound on state-estimation error on a simulated noisy data set. We construct convex sets of continuous-time and discrete-time observers, as well as contracting sampled-data observers for continuous-time systems. Convex bounds for learning are constructed using Lagrangian relaxation. The utility of the proposed methods is verified using numerical simulation. Comment: conference submission
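    The abstract describes the convexity trick only at a high level. As an illustration, here is a minimal sketch of step (i) for the special case of a linear time-invariant system: with the substitution Y = P L, the condition that a Luenberger observer be contracting with rate lambda becomes a linear matrix inequality in (P, Y), i.e. a convex set of observers. The matrices A and C and the rate lam below are hypothetical examples, and the paper's actual constructions cover nonlinear, discrete-time, and sampled-data observers.

    ```python
    # Sketch only: convex (LMI) parameterization of contracting Luenberger
    # observers for a hypothetical LTI system x' = A x, y = C x.
    # Observer: xhat' = A xhat + L (y - C xhat); error e' = (A - L C) e.
    # Contraction with metric P: (A-LC)^T P + P (A-LC) << -2*lam*P,
    # which is linear in (P, Y) after the substitution Y = P L.
    import numpy as np
    import cvxpy as cp

    A = np.array([[0.0, 1.0], [-2.0, -0.5]])  # example dynamics (assumption)
    C = np.array([[1.0, 0.0]])                # example output map (assumption)
    n, p = A.shape[0], C.shape[0]
    lam = 0.5                                 # desired contraction rate (assumption)

    P = cp.Variable((n, n), symmetric=True)   # contraction metric
    Y = cp.Variable((n, p))                   # Y = P L, the linearizing substitution

    LMI = A.T @ P + P @ A - Y @ C - (Y @ C).T + 2 * lam * P
    constraints = [P >> 1e-6 * np.eye(n), LMI << 0]

    # Any convex objective over this set works; trace(P) is a common choice.
    prob = cp.Problem(cp.Minimize(cp.trace(P)), constraints)
    prob.solve()

    L = np.linalg.solve(P.value, Y.value)     # recover observer gain L = P^{-1} Y
    print("observer gain L =\n", L)
    ```

    In the paper's setting, step (ii) would replace the placeholder objective with a learned bound on state-estimation error over simulated noisy data.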

    Metric-Free Natural Gradient for Joint-Training of Boltzmann Machines

    This paper introduces the Metric-Free Natural Gradient (MFNG) algorithm for training Boltzmann Machines. Similar in spirit to the Hessian-Free method of Martens [8], our algorithm belongs to the family of truncated Newton methods and exploits an efficient matrix-vector product to avoid explicitly storing the natural gradient metric L. This metric is shown to be the expected second derivative of the log-partition function (under the model distribution), or, equivalently, the variance of the vector of partial derivatives of the energy function. We evaluate our method on the task of jointly training a 3-layer Deep Boltzmann Machine and show that MFNG does indeed have faster per-epoch convergence compared to Stochastic Maximum Likelihood with centering, though wall-clock performance is currently not competitive.
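    The matrix-free idea can be made concrete with a short sketch. Since the abstract characterizes the metric L as the variance of the energy-gradient vector under the model distribution, its product with any vector can be formed from per-sample gradients without ever materializing the d x d matrix, and the Newton system is then solved with a few truncated conjugate-gradient iterations. The stand-in matrix G, the vector loss_grad, and all sizes below are hypothetical, not the paper's implementation.

    ```python
    # Sketch only: matrix-free natural-gradient step in the style of a
    # truncated Newton method. L = Cov[dE/dtheta] under the model; we only
    # ever compute (L + damping*I) @ v, in O(S*d) time per product.
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(0)
    S, d = 256, 1000                     # model samples, parameter dimension
    G = rng.normal(size=(S, d))          # row s: dE/dtheta at model sample s (stand-in)
    loss_grad = rng.normal(size=d)       # ordinary gradient of the objective (stand-in)
    damping = 1e-3                       # Tikhonov damping, usual in truncated Newton

    g_bar = G.mean(axis=0)

    def metric_vec(v):
        """(L + damping*I) v, with L the covariance of energy gradients."""
        Gv = G @ v                       # S inner products g_s . v
        return G.T @ Gv / S - g_bar * (g_bar @ v) + damping * v

    L_op = LinearOperator((d, d), matvec=metric_vec)
    step, info = cg(L_op, loss_grad, maxiter=50)  # truncated CG: L step = grad
    theta_update = -0.1 * step           # natural-gradient step, learning rate 0.1
    ```

    The covariance plus damping is symmetric positive definite, so conjugate gradient applies, and capping maxiter is what makes the Newton solve "truncated".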