18,067 research outputs found

    Online semi-parametric learning for inverse dynamics modeling

    This paper presents a semi-parametric algorithm for online learning of a robot inverse dynamics model. It combines the strengths of parametric and non-parametric modeling: the former exploits the rigid body dynamics equation, while the latter exploits a suitable kernel function. We provide an extensive comparison with other methods from the literature using real data from the iCub humanoid robot. In doing so we also compare two different techniques, namely cross validation and marginal likelihood optimization, for estimating the hyperparameters of the kernel function.
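    A minimal batch sketch of the semi-parametric idea described in this abstract, assuming a rigid-body-dynamics regressor matrix Y (so that tau ≈ Y·pi) and a Gaussian-process model for the residual; the function names and the use of scikit-learn are illustrative, not the paper's implementation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def fit_semiparametric(Y, tau, X):
    # Parametric part: least-squares estimate of the inertial parameters pi
    # from the rigid-body-dynamics regressor Y (tau ~= Y @ pi).
    pi, *_ = np.linalg.lstsq(Y, tau, rcond=None)
    residual = tau - Y @ pi
    # Non-parametric part: a kernel (GP) model of what the rigid-body model
    # misses; hyperparameters are set by marginal-likelihood optimization
    # inside fit() (the paper also compares cross validation for this step).
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-3)
    gp.fit(X, residual)
    return pi, gp

def predict_torque(pi, gp, Y_new, X_new):
    # Predicted torques: parametric prediction plus the learned correction.
    return Y_new @ pi + gp.predict(X_new)
```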

    Discrete-time multi-scale systems

    We introduce multi-scale filtering by way of certain double convolution systems. We prove stability theorems for these systems and make connections with function theory in the poly-disc. Finally, we compare the framework developed here with the white noise space framework, within which a similar class of double convolution systems has been defined earlier.
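    One possible reading of a discrete-time double convolution system, offered only as an illustrative sketch: signal values are themselves coefficient sequences, and the system convolves in time while multiplying values by a Cauchy product (a second convolution). This representation is an assumption, not the paper's construction:

```python
import numpy as np

def cauchy_product(a, b):
    # The "inner" convolution: product of two coefficient sequences.
    return np.convolve(a, b)

def double_convolution(h, u):
    # h, u: lists of coefficient arrays indexed by discrete time.
    # y[n] = sum_{m <= n} h[n-m] * u[m], with * the Cauchy product above,
    # i.e. a convolution in time on top of a convolution in the coefficients.
    y = []
    for n in range(len(u)):
        acc = np.zeros(1)
        for m in range(n + 1):
            term = cauchy_product(h[n - m], u[m])
            size = max(len(acc), len(term))
            acc = (np.pad(acc, (0, size - len(acc)))
                   + np.pad(term, (0, size - len(term))))
        y.append(acc)
    return y
```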

    Sparse approximation of multilinear problems with applications to kernel-based methods in UQ

    We provide a framework for the sparse approximation of multilinear problems and show that several problems in uncertainty quantification fit within this framework. In these problems, the value of a multilinear map has to be approximated using approximations of its arguments that differ in accuracy and computational work. We propose and analyze a generalized version of Smolyak's algorithm, which provides sparse approximation formulas with convergence rates that mitigate the curse of dimension appearing in multilinear approximation problems with a large number of arguments. We apply the general framework to response surface approximation and optimization under uncertainty for parametric partial differential equations using kernel-based approximation. The theoretical results are supplemented by numerical experiments.
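    To make the combination idea concrete, here is a minimal sketch of a Smolyak-type sparse approximation of a bilinear map M(u, v), assuming precomputed approximations of each argument at increasing accuracy levels; the names and the bilinear (two-argument) restriction are illustrative simplifications of the general multilinear setting:

```python
def smolyak_bilinear(M, u_levels, v_levels, q):
    # Difference ("detail") of each argument between consecutive accuracy levels.
    def du(i):
        return u_levels[i] if i == 0 else u_levels[i] - u_levels[i - 1]
    def dv(j):
        return v_levels[j] if j == 0 else v_levels[j] - v_levels[j - 1]
    # By bilinearity, M(u, v) = sum_{i,j} M(du_i, dv_j); the sparse rule keeps
    # only the index simplex i + j <= q instead of the full tensor grid,
    # trading a little accuracy for much less work on the expensive levels.
    total = 0.0
    for i in range(q + 1):
        for j in range(q + 1 - i):
            total += M(du(i), dv(j))
    return total

# Example with scalar stand-ins for argument approximations of growing accuracy:
# smolyak_bilinear(lambda a, b: a * b, [1.0, 1.4, 1.41], [2.0, 2.7, 2.71], q=2)
```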

    On the class SI of J-contractive functions intertwining solutions of linear differential equations

    The class SI of J-contractive functions, depending on a parameter and arising as transfer functions of overdetermined conservative 2D systems invariant in one direction, was defined in the PhD thesis of the second author, written under the supervision of the third author. In this paper we extend and solve, in the class SI, a number of problems originally set for the class SC of functions contractive in the open right half-plane and unitary on the imaginary line with respect to some preassigned signature matrix J. The problems we consider include the Schur algorithm, the partial realization problem and the Nevanlinna-Pick interpolation problem. The arguments rely on a correspondence between elements in a given subclass of SI and elements in SC. Another important tool in the arguments is a new result pertaining to the classical tangential Schur algorithm. Comment: 46 pages.
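    For orientation only, a sketch of the classical scalar Schur algorithm on the unit disc; the paper works with the tangential, J-contractive analogue in the class SI, and the disc setting plus the numerical handling of z = 0 below are simplifying assumptions:

```python
def schur_step(s):
    # One step of the classical Schur algorithm: extract the Schur parameter
    # rho = s(0) and return the next Schur function.
    rho = s(0.0)
    def s_next(z):
        if z == 0.0:
            z = 1e-12  # numerical stand-in for the removable singularity at 0
        return (s(z) - rho) / (z * (1.0 - rho.conjugate() * s(z)))
    return rho, s_next

def schur_parameters(s, n):
    # First n Schur parameters of a Schur function s
    # (a callable with |s| <= 1 on the open unit disc).
    params = []
    for _ in range(n):
        rho, s = schur_step(s)
        params.append(rho)
    return params

# Example with a Moebius-type Schur function:
# schur_parameters(lambda z: complex(z + 0.5) / (1 + 0.5 * z), 3)
```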

    Large-scale Nonlinear Variable Selection via Kernel Random Features

    We propose a new method for input variable selection in nonlinear regression. The method is embedded into a kernel regression machine that can model general nonlinear functions, not being a priori limited to additive models. This is the first kernel-based variable selection method applicable to large datasets. It sidesteps the typical poor scaling properties of kernel methods by mapping the inputs into a relatively low-dimensional space of random features. The algorithm discovers the variables relevant for the regression task together with learning the prediction model, through learning the appropriate nonlinear random feature maps. We demonstrate the outstanding performance of our method on a set of large-scale synthetic and real datasets. Comment: Final version for proceedings of ECML/PKDD 201
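    An illustrative sketch of the general idea (not the authors' exact formulation): random Fourier features with a learned per-variable relevance scale, where a sparsity penalty on the scales drives irrelevant inputs toward zero. The function names, the RBF-type frequency sampling, and the crude finite-difference update are assumptions made for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(X, W, b, s):
    # z(x) = sqrt(2/D) * cos(W (s * x) + b); s rescales each input dimension,
    # so a near-zero s[k] effectively removes variable k from the model.
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos((X * s) @ W.T + b)

def fit_select(X, y, D=200, steps=200, lr=1e-3, lam=1e-2):
    n, d = X.shape
    W = rng.normal(size=(D, d))             # random frequencies (RBF-type kernel)
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    s = np.ones(d)                          # per-variable relevance scales
    for _ in range(steps):
        Z = random_features(X, W, b, s)
        # Ridge solve for the regression weights given the current scales.
        beta = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(D), Z.T @ y)
        base = np.sum((Z @ beta - y) ** 2)
        # Crude finite-difference gradient of the L1-penalized loss w.r.t. s.
        grad = np.empty(d)
        for k in range(d):
            sp = s.copy()
            sp[k] += 1e-4
            Zp = random_features(X, W, b, sp)
            grad[k] = (np.sum((Zp @ beta - y) ** 2) - base) / 1e-4 + lam * np.sign(s[k])
        s -= lr * grad
    return s  # variables with large |s[k]| are flagged as relevant
```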