
    A Reproducing Kernel Perspective of Smoothing Spline Estimators

    Spline functions have a long history as smoothers of noisy time series data, and several equivalent kernel representations have been proposed in terms of the Green's function solving the related boundary value problem. In this study we use the reproducing kernel property of the Green's function to obtain a hierarchy of time-invariant spline kernels of different order. The reproducing kernels give a good representation of smoothing splines for medium- and long-length filters, with better performance of the asymmetric weights in terms of signal passing, noise suppression and revisions. Empirical comparisons of the time-invariant filters are made with the classical nonlinear ones. The former are shown to lose part of their optimal properties when we fix the length of the filter according to the noise-to-signal ratio, as done in nonparametric seasonal adjustment procedures.
    Keywords: equivalent kernels, nonparametric regression, Hilbert spaces, time series filtering, spectral properties
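    The smoothing-spline filtering that the abstract studies can be illustrated with a generic cubic smoothing spline applied to a noisy series; this is a minimal sketch using SciPy's `UnivariateSpline`, not the paper's kernel hierarchy, and the noise level and smoothing parameter are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
signal = np.sin(t)                                  # underlying trend
noisy = signal + rng.normal(scale=0.3, size=t.size)  # observed series

# Cubic smoothing spline; s trades fidelity against smoothness
# (a common heuristic is s ~ n * sigma^2 when the noise variance is known)
spline = UnivariateSpline(t, noisy, k=3, s=t.size * 0.3 ** 2)
smoothed = spline(t)

# The spline suppresses noise: its MSE against the true signal
# is well below that of the raw observations
mse_raw = np.mean((noisy - signal) ** 2)
mse_smooth = np.mean((smoothed - signal) ** 2)
```

    In equivalent-kernel terms, each smoothed value is a weighted average of nearby observations; the paper's contribution is characterizing those weights via the reproducing kernel of the associated Green's function.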

    A volume-averaged nodal projection method for the Reissner-Mindlin plate model

    We introduce a novel meshfree Galerkin method for the solution of Reissner-Mindlin plate problems that is written in terms of the primitive variables only (i.e., rotations and transverse displacement) and is devoid of shear locking. The proposed approach uses linear maximum-entropy approximations and is built variationally on a two-field potential energy functional wherein the shear strain, written in terms of the primitive variables, is computed via a volume-averaged nodal projection operator constructed from the Kirchhoff constraint of the three-field mixed weak form. The stability of the method is ensured by adding bubble-like enrichment to the rotation degrees of freedom. Benchmark problems are presented to demonstrate the accuracy and performance of the proposed method for a wide range of plate thicknesses.
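    The maximum-entropy approximants underpinning the method can be sketched in 1D: the basis values at an evaluation point maximize entropy relative to a Gaussian prior, subject to partition of unity and linear reproduction. This is a generic illustration (nodes, prior width, and the Newton solve are assumptions), not the paper's plate formulation.

```python
import numpy as np

def maxent_1d(nodes, x, beta=2.0, tol=1e-12, maxit=50):
    """Linear max-ent basis functions at point x (1D).

    Gaussian prior weights w_i = exp(-beta * (x_i - x)^2); the Lagrange
    multiplier enforcing first-order consistency is found by Newton's method.
    """
    nodes = np.asarray(nodes, dtype=float)
    d = nodes - x
    w = np.exp(-beta * d ** 2)
    lam = 0.0
    for _ in range(maxit):
        p = w * np.exp(lam * d)
        p /= p.sum()                  # partition of unity by construction
        g = p @ d                     # residual of linear-reproduction constraint
        if abs(g) < tol:
            break
        h = p @ d ** 2 - g ** 2       # derivative (variance of d under p)
        lam -= g / h                  # Newton update
    return p

phi = maxent_1d([0.0, 0.5, 1.0], 0.3)
```

    The resulting basis is non-negative and reproduces linear fields exactly, which is what allows the rotations and transverse displacement to be interpolated directly from scattered nodes without a mesh.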

    Quantum machine learning: a classical perspective

    Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning techniques to impressive results in regression, classification, data-generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical machine learning algorithms. Here we review the literature in quantum machine learning and discuss perspectives for a mixed readership of classical machine learning and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in machine learning are identified as promising directions for the field. Practical questions, like how to upload classical data into quantum form, will also be addressed.
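    One standard answer to the data-uploading question the abstract raises is amplitude encoding: a classical vector is stored in the amplitudes of a quantum state, so n qubits hold 2^n values. The sketch below shows only the classical preprocessing (padding and normalization); the function name and padding convention are assumptions for illustration.

```python
import numpy as np

def amplitude_encode(x):
    """Map a classical vector to valid quantum-state amplitudes.

    Pads the vector to the next power of two (one amplitude per basis
    state of ceil(log2(len(x))) qubits) and normalizes to unit L2 norm,
    as required for a quantum state.
    """
    x = np.asarray(x, dtype=float)
    dim = 1 << max(1, (len(x) - 1).bit_length())  # next power of two
    padded = np.zeros(dim)
    padded[: len(x)] = x
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return padded / norm

state = amplitude_encode([3.0, 4.0])  # one qubit, amplitudes [0.6, 0.8]
```

    The exponential compactness is the appeal, but preparing such a state on hardware generally costs time proportional to the vector length, which is one of the caveats the review emphasizes when assessing claimed quantum speedups.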