    Inverse Density as an Inverse Problem: The Fredholm Equation Approach

    In this paper we address the problem of estimating the ratio q/p, where p is a density function and q is another density or, more generally, an arbitrary function. Knowing or approximating this ratio is needed in various problems of inference and integration, in particular when one needs to average a function with respect to one probability distribution given a sample from another. It is often referred to as importance sampling in statistical inference and is also closely related to the problem of covariate shift in transfer learning as well as to various MCMC methods. It may also be useful for separating the underlying geometry of a space, say a manifold, from the density function defined on it. Our approach is based on reformulating the problem of estimating q/p as an inverse problem in terms of an integral operator corresponding to a kernel, and thus reducing it to an integral equation known as the Fredholm problem of the first kind. This formulation, combined with the techniques of regularization and kernel methods, leads to a principled kernel-based framework for constructing algorithms and for analyzing them theoretically. The resulting family of algorithms (FIRE, for Fredholm Inverse Regularized Estimator) is flexible, simple and easy to implement. We provide a detailed theoretical analysis, including concentration bounds and convergence rates for the Gaussian kernel, in the case of densities defined on ℝ^d, compact domains in ℝ^d, and smooth d-dimensional sub-manifolds of the Euclidean space. We also show experimental results, including applications to classification and semi-supervised learning within the covariate shift framework, and demonstrate some encouraging experimental comparisons. We also show how the parameters of our algorithms can be chosen in a completely unsupervised manner.
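
    The core computation lends itself to a compact sketch. Below is a minimal discretize-then-regularize version of the Fredholm formulation (not the authors' exact FIRE algorithm): the equation K_p f = K q is restricted to the sample from p and solved by Tikhonov (ridge) regularization. The Gaussian kernel, bandwidth, and regularization parameter are illustrative assumptions.

```python
# Minimal sketch of density-ratio estimation via a discretized Fredholm
# equation of the first kind, solved with Tikhonov regularization.
import numpy as np

def gaussian_kernel(X, Y, bandwidth):
    """Gram matrix k(x, y) = exp(-|x - y|^2 / (2 h^2)) for 1-D samples."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def fredholm_ratio(xp, xq, bandwidth=0.3, lam=1e-3):
    """Estimate f ~ q/p at the p-sample points xp.

    Empirically (K_p f)(x) ~ (1/n) sum_j k(x, xp_j) f(xp_j) and
    (K q)(x) ~ (1/m) sum_s k(x, xq_s), so we solve the regularized
    least-squares problem  min_f ||A f - b||^2 + lam ||f||^2
    with A = K(xp, xp)/n and b = K(xp, xq) 1/m.
    """
    n, m = len(xp), len(xq)
    A = gaussian_kernel(xp, xp, bandwidth) / n
    b = gaussian_kernel(xp, xq, bandwidth).sum(axis=1) / m
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy check: p = N(0, 1), q = N(1, 1); the true ratio is exp(x - 1/2).
rng = np.random.default_rng(0)
xp = rng.normal(0.0, 1.0, 500)
xq = rng.normal(1.0, 1.0, 500)
f_hat = fredholm_ratio(xp, xq)
print(np.corrcoef(f_hat, np.exp(xp - 0.5))[0, 1])  # should be high
```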

    Nonparametric Instrumental Regression

    The focus of the paper is the nonparametric estimation of an instrumental regression function ϕ defined by conditional moment restrictions stemming from a structural econometric model: E[Y − ϕ(Z) | W] = 0, and involving endogenous variables Y and Z and instruments W. The function ϕ is the solution of an ill-posed inverse problem and we propose an estimation procedure based on Tikhonov regularization. The paper analyses identification and overidentification of this model and presents asymptotic properties of the estimated nonparametric instrumental regression function.
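
    To make the procedure concrete, here is a minimal sketch assuming Nadaraya-Watson smoothing as the conditional expectation estimator (the paper's exact construction may differ): the operator Tϕ = E[ϕ(Z) | W] and the target r = E[Y | W] are replaced by a kernel-smoothing matrix, and ϕ solves the Tikhonov-regularized normal equations. Bandwidth and λ are illustrative.

```python
# Minimal sketch of Tikhonov-regularized nonparametric IV regression.
import numpy as np

def nw_weights(w, h):
    """Row-stochastic Nadaraya-Watson weight matrix on the sample points w."""
    K = np.exp(-((w[:, None] - w[None, :]) ** 2) / (2 * h ** 2))
    return K / K.sum(axis=1, keepdims=True)

def tikhonov_iv(y, z, w, h=0.2, lam=1e-2):
    """Values of the estimated structural function phi at the sample z's.

    T discretizes E[. | W]; we solve (lam I + T'T) phi = T' r with r = T y.
    """
    T = nw_weights(w, h)
    r = T @ y
    n = len(y)
    return np.linalg.solve(lam * np.eye(n) + T.T @ T, T.T @ r)

# Toy simulation with endogeneity: instrument W, Z = W + V, Y = sin(Z) + U,
# where the errors U and V are correlated.
rng = np.random.default_rng(1)
n = 400
w = rng.normal(size=n)
u, v = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=n).T
z = w + v
y = np.sin(z) + u
phi_hat = tikhonov_iv(y, z, w)
print(np.sqrt(np.mean((phi_hat - np.sin(z)) ** 2)))  # RMSE vs true phi
```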

    An adaptive RKHS regularization for Fredholm integral equations

    Regularization is a long-standing challenge for ill-posed linear inverse problems, and a prototype is the Fredholm integral equation of the first kind. We introduce a practical RKHS regularization algorithm adaptive to the discrete noisy measurement data and the underlying linear operator. This RKHS arises naturally in a variational approach, and its closure is the function space in which we can identify the true solution. We prove that the RKHS-regularized estimator has a mean-square error converging linearly as the noise scale decreases, with a multiplicative factor smaller than that of the commonly used L²-regularized estimator. Furthermore, numerical results demonstrate that the RKHS regularizer significantly outperforms the L² regularizer when the noise level decays or the observation mesh is refined.
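
    As an illustration of the comparison (this is not the paper's adaptive construction, which builds the RKHS from the data and the operator), the sketch below applies an L² penalty and a fixed Gaussian-RKHS penalty to the same discretized first-kind equation; the kernel, bandwidth, noise level, and λ are assumed choices for the toy problem.

```python
# Toy comparison of L^2 vs RKHS regularization for a Fredholm equation
# (A f)(x) = int_0^1 exp(-|x - t|) f(t) dt = b(x), with noisy b.
import numpy as np

n = 100
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]
A = np.exp(-np.abs(t[:, None] - t[None, :])) * dt   # quadrature of exp(-|x-t|)
f_true = np.sin(2 * np.pi * t)
rng = np.random.default_rng(2)
b = A @ f_true + 1e-3 * rng.normal(size=n)          # noisy right-hand side

lam = 1e-4
# L^2 regularizer: min ||A f - b||^2 + lam ||f||^2
f_l2 = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# RKHS regularizer: represent f = G a with a Gaussian Gram matrix G and
# penalize the RKHS norm a'Ga; the normal equations reduce to
# (A'A G + lam I) a = A'b.
G = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.1 ** 2))
a = np.linalg.solve(A.T @ A @ G + lam * np.eye(n), A.T @ b)
f_rkhs = G @ a

for name, f in [("L2  ", f_l2), ("RKHS", f_rkhs)]:
    print(name, "RMSE:", np.sqrt(np.mean((f - f_true) ** 2)))
```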

    IDENTIFICATION AND ESTIMATION OF NONPARAMETRIC STRUCTURAL MODELS

    This paper concerns a new statistical approach to the instrumental variables (IV) method for nonparametric structural models with additive errors. A general identifying condition for the model is proposed, based on the richness of the space generated by marginal discretizations of joint density functions. For consistent estimation, we develop statistical regularization theory to solve a random Fredholm integral equation of the first kind. A minimal set of conditions is given for consistency of a general regularization method. Using an abstract smoothness condition, we derive some optimal bounds, given the accuracies of preliminary estimates, and show the convergence rates of various regularization methods, including (ordinary/iterated/generalized) Tikhonov and Showalter's methods. An application of the general regularization theory is discussed with a focus on a kernel smoothing method. We show an exact closed form, as well as the optimal convergence rate, of the kernel IV estimates for the various regularization methods. The finite sample properties of the estimates are investigated via a small-scale Monte Carlo experiment.
    Keywords: nonparametric structural models, IV estimation, statistical inverse problems
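
    As one concrete instance of the methods covered by this regularization theory, here is a minimal sketch of iterated Tikhonov applied to a generic discretized first-kind equation A f = b (not to the kernel IV estimator itself; the test problem and parameters are assumptions).

```python
# Minimal sketch of iterated Tikhonov regularization.
import numpy as np

def iterated_tikhonov(A, b, lam=1e-3, iters=5):
    """f_{k+1} solves (A'A + lam I) f = A'b + lam f_k, starting from f_0 = 0.

    Ordinary Tikhonov is the case iters = 1; more iterations raise the
    qualification of the method, i.e. how much solution smoothness the
    convergence rate can exploit.
    """
    n = A.shape[1]
    M = A.T @ A + lam * np.eye(n)
    f = np.zeros(n)
    for _ in range(iters):
        f = np.linalg.solve(M, A.T @ b + lam * f)
    return f

# Demo on a discretized integral operator with a smooth true solution.
rng = np.random.default_rng(3)
s = np.linspace(0.0, 1.0, 60)
A = np.exp(-np.abs(s[:, None] - s[None, :])) / 60
f_true = s ** 2
b = A @ f_true + 1e-4 * rng.normal(size=60)
for k in (1, 5):
    err = np.linalg.norm(iterated_tikhonov(A, b, iters=k) - f_true)
    print(f"iters={k}: error {err:.3f}")
```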

    Regularization of statistical inverse problems and the Bakushinskii veto

    In the deterministic context, Bakushinskii's theorem excludes the existence of purely data-driven convergent regularization for ill-posed problems. We prove in the present work that in the statistical setting we can either construct a counterexample or develop an equivalent formulation, depending on the considered class of probability distributions. Hence, Bakushinskii's theorem does not generalize to the statistical context, although this has often been assumed in the past. To arrive at this conclusion, we deduce new concepts from the classic theory for a general study of statistical inverse problems and perform a systematic clarification of the key ideas of statistical regularization.

    Asymptotic Normality of Support Vector Machine Variants and Other Regularized Kernel Methods

    In nonparametric classification and regression problems, regularized kernel methods, in particular support vector machines, attract much attention in theoretical and applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions, it is shown that the difference between the estimator, i.e. the empirical SVM, and the theoretical SVM is asymptotically normal with rate √n. That is, the standardized difference converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter may depend on the data. The proof is done by an application of the functional delta-method and by showing that the SVM functional is suitably Hadamard-differentiable.
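
    A small simulation can make the statement tangible. The sketch below (not from the paper) uses kernel ridge regression, i.e. the SVM variant with the smooth squared loss, approximates the theoretical SVM by a fit on one large sample, and inspects the Monte Carlo distribution of √n (f̂_n(x0) − f_∞(x0)) at a fixed point; the kernel, λ, x0, and the data-generating process are all assumptions.

```python
# Monte Carlo sketch of sqrt(n)-asymptotic normality for kernel ridge
# regression evaluated at a single point x0.
import numpy as np

def krr_at(x, y, x0, lam=0.1, h=0.5):
    """Kernel ridge prediction at x0: minimizer of
    (1/n) sum_i (y_i - f(x_i))^2 + lam ||f||_H^2 with a Gaussian kernel."""
    n = len(x)
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * h ** 2))
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return np.exp(-(x - x0) ** 2 / (2 * h ** 2)) @ alpha

def draw(n, rng):
    x = rng.uniform(-2.0, 2.0, n)
    return x, np.sin(x) + 0.3 * rng.normal(size=n)

rng = np.random.default_rng(4)
x0, n = 0.5, 200
f_inf = krr_at(*draw(2000, rng), x0)   # large-sample proxy for the
                                       # theoretical SVM at x0
diffs = np.array([np.sqrt(n) * (krr_at(*draw(n, rng), x0) - f_inf)
                  for _ in range(300)])
print("mean:", diffs.mean(), "std:", diffs.std())  # roughly Gaussian spread
```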