
    Identifiability and data-adaptive RKHS Tikhonov regularization in nonparametric learning problems

    We provide an identifiability analysis for two learning problems: (1) nonparametric learning of kernels in operators and (2) unsupervised learning of observation functions in state space models (SSMs). We show that in either case the function space of identifiability (FSOI) from the quadratic loss functional is the closure of a system-intrinsic data-adaptive reproducing kernel Hilbert space (SIDA-RKHS). We introduce a new method, the Data-Adaptive RKHS Tikhonov Regularization method (DARTR). The regularized estimator is robust to noise and converges as the data refine. The effectiveness of DARTR is demonstrated on (1) nonparametric learning of kernels in linear/nonlinear/nonlocal operators and (2) homogenization of wave propagation in meta-materials. We also introduce a nonparametric generalized moment method to estimate non-invertible observation functions in nonlinear SSMs. Numerical results show that the first two moments and temporal correlations, along with upper and lower bounds, can identify functions ranging from piecewise polynomials to smooth functions. Limitations, such as non-identifiability due to symmetry and stationarity, are also discussed.
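    The core computational building block behind an RKHS Tikhonov-regularized estimator is a penalized least-squares solve with a norm matrix supplied by the regularizer. The following is a minimal sketch of that generic step only; the matrix `B`, the problem sizes, and the data here are illustrative stand-ins, not the authors' DARTR implementation (which constructs a data-adaptive RKHS norm from the system itself).

    ```python
    import numpy as np

    def tikhonov(A, y, B, lam):
        """Solve min_c ||A c - y||^2 + lam * c^T B c.

        B is the norm matrix of the regularizer; DARTR would use a
        data-adaptive RKHS norm here, while this sketch uses plain L2.
        """
        lhs = A.T @ A + lam * B
        rhs = A.T @ y
        return np.linalg.solve(lhs, rhs)

    # Synthetic illustration: recover coefficients from noisy data.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    c_true = rng.standard_normal(10)
    y = A @ c_true + 0.1 * rng.standard_normal(50)
    B = np.eye(10)  # placeholder norm matrix (identity = ridge regression)
    c_hat = tikhonov(A, y, B, lam=0.1)
    ```

    Swapping the identity for a norm matrix estimated from the data is what makes the regularization "data-adaptive" in the sense of the abstract.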

    Stochastic expansions using continuous dictionaries: Lévy adaptive regression kernels

    This article describes a new class of prior distributions for nonparametric function estimation. The unknown function is modeled as a limit of weighted sums of kernels or generator functions indexed by continuous parameters that control local and global features such as their translation, dilation, modulation and shape. Lévy random fields and their stochastic integrals are employed to induce prior distributions for the unknown functions or, equivalently, for the number of kernels and for the parameters governing their features. Scaling, shape, and other features of the generating functions are location-specific to allow quite different function properties in different parts of the space, as with wavelet bases and other methods employing overcomplete dictionaries. We provide conditions under which the stochastic expansions converge in specified Besov or Sobolev norms. Under a Gaussian error model, this may be viewed as a sparse regression problem, with regularization induced via the Lévy random field prior distribution. Posterior inference for the unknown functions is based on a reversible jump Markov chain Monte Carlo algorithm. We compare the Lévy Adaptive Regression Kernel (LARK) method to wavelet-based methods using some of the standard test functions, and illustrate its flexibility and adaptability in nonstationary applications. Published in the Annals of Statistics (http://dx.doi.org/10.1214/11-AOS889) by the Institute of Mathematical Statistics (http://www.imstat.org/aos/).
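    The expansion underlying LARK is a weighted sum of kernels, each carrying its own translation and dilation so that the function can behave differently in different regions. A minimal sketch of evaluating one such expansion with Gaussian generator functions (the kernel choice, weights, and parameter values are illustrative; a full LARK fit would place a Lévy random field prior on the number of kernels and their parameters and sample them by reversible jump MCMC):

    ```python
    import numpy as np

    def kernel_expansion(x, weights, centers, scales):
        """Evaluate f(x) = sum_j w_j * exp(-(x - t_j)^2 / (2 s_j^2)).

        Each kernel has its own translation t_j and dilation s_j,
        mirroring the location-specific features of a LARK expansion.
        """
        x = np.asarray(x)[:, None]  # broadcast x against the kernels
        return np.sum(weights * np.exp(-(x - centers) ** 2 / (2 * scales ** 2)),
                      axis=1)

    # One realization with two kernels of different widths and signs.
    x = np.linspace(0.0, 1.0, 5)
    f = kernel_expansion(x,
                         weights=np.array([1.0, -0.5]),
                         centers=np.array([0.2, 0.8]),
                         scales=np.array([0.1, 0.05]))
    ```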

    Uncovering Causality from Multivariate Hawkes Integrated Cumulants

    We design a new nonparametric method that allows one to estimate the matrix of integrated kernels of a multivariate Hawkes process. This matrix not only encodes the mutual influences of each node of the process, but also disentangles the causality relationships between them. Our approach is the first that leads to an estimation of this matrix without any parametric modeling and estimation of the kernels themselves. A consequence is that it can give an estimation of causality relationships between nodes (or users), based on their activity timestamps (on a social network, for instance), without knowing or estimating the shape of the activities' lifetimes. For that purpose, we introduce a moment matching method that fits the third-order integrated cumulants of the process. We show on numerical experiments that our approach is indeed very robust to the shape of the kernels, and gives appealing results on the MemeTracker database.
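    The cumulant-matching idea rests on closed-form relations between the matrix of integrated kernels and the integrated cumulants of a stable Hawkes process. The first-order relation is the easiest to state: with G the matrix of integrated kernels and mu the baseline intensities, the mean intensities satisfy Lambda = (I - G)^{-1} mu. The sketch below only illustrates this first-order identity with made-up numbers; the method in the abstract additionally matches the second- and third-order integrated cumulants to recover G itself.

    ```python
    import numpy as np

    def mean_intensity(G, mu):
        """First-order Hawkes identity: Lambda = (I - G)^{-1} mu.

        G[i, j] is the integral of the kernel phi_ij; stability requires
        the spectral radius of G to be below 1.
        """
        d = G.shape[0]
        return np.linalg.solve(np.eye(d) - G, mu)

    # Illustrative 2-node example (values chosen so the process is stable).
    G = np.array([[0.3, 0.1],
                  [0.0, 0.4]])
    mu = np.array([0.5, 0.2])
    Lam = mean_intensity(G, mu)
    ```

    Inverting such relations in the other direction, from empirical cumulants back to G, is what lets the method bypass estimating the kernel shapes.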