
    A new approach to estimator selection

    In the framework of an abstract statistical model, we discuss how to use the solution of one estimation problem (Problem A) in order to construct an estimator in another, completely different, Problem B. By a solution of Problem A we mean a data-driven selection from a given family of estimators A(H) = {Â_h, h ∈ H}, together with a so-called oracle inequality established for the selected estimator. If ĥ ∈ H is the selected parameter and B(H) = {B̂_h, h ∈ H} is a collection of estimators built in Problem B, we suggest using the estimator B̂_ĥ. We present a very general selection rule leading to the selector ĥ and find conditions under which the estimator B̂_ĥ is reasonable. Our approach is illustrated by several examples related to adaptive estimation.
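The plug-in idea of this abstract (select ĥ in Problem A, reuse it in Problem B) can be sketched numerically. Everything below is an illustrative assumption rather than the paper's actual rule: the moving-average estimator family, the split-sample selector, and the choice of Problems A (pointwise regression) and B (estimating an integral of f).

```python
import numpy as np

rng = np.random.default_rng(0)

def running_mean(y, h):
    """Moving-average estimator with window half-width h
    (a toy stand-in for the family {A_h, h in H})."""
    n = len(y)
    return np.array([y[max(0, i - h):i + h + 1].mean() for i in range(n)])

# Problem A: estimate the regression function from noisy data.
n = 200
x = np.linspace(0.0, 1.0, n)
f = np.sin(2 * np.pi * x)
y = f + 0.3 * rng.standard_normal(n)

# Data-driven selection of h via split-sample validation (an assumed,
# simple selector; the paper's rule is more general and comes with an
# oracle inequality).
H = [1, 2, 4, 8, 16, 32]
train, val = y[::2], y[1::2]
h_hat = min(H, key=lambda h: np.mean((running_mean(train, h) - val) ** 2))

# Problem B: a different task -- estimating the integral of f over [0, 1]
# (true value 0 here) -- reusing the selected h_hat, i.e. using B_{h_hat}.
integral_estimate = running_mean(y, h_hat).mean()
```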

    Adaptive non-parametric estimation of smooth multivariate functions

    Adaptive pointwise estimation of smooth functions f(x), x ∈ R^d, is studied in the white Gaussian noise model of a given intensity. It is assumed that the Fourier transform of f belongs to a large class of rapidly vanishing functions but is otherwise unknown. Optimal adaptation in higher dimensions presents several challenges. First, the number of essentially different estimates having a given variance S increases polynomially, as S^d. Second, the set of possible estimators, totally ordered when d = 1, becomes only partially ordered when d > 1. We demonstrate how these challenges can be met. The first is met by a meticulous choice of the net of estimators. The key to solving the second problem lies in a new method of spectral majorants introduced in this paper. Extending the approach used in our earlier work, we restrict ourselves to a family of estimators that are rate-efficient in an offbeat case of partially parametric functional classes. The proposed adaptive procedure is shown to be asymptotically minimax simultaneously for any ample regular nonparametric family of underlying functions.
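A one-dimensional toy version of choosing a spectral cutoff from a geometric net can be written in the Gaussian sequence form of the white-noise model. The coefficient sequence, the net, and the unbiased-risk criterion below are all assumptions for illustration; the paper's spectral-majorant method for d > 1 is substantially more involved.

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 0.05  # noise intensity of the white-noise model

# Sequence-space form of the model: observe theta_k + eps * xi_k.
k = np.arange(1, 201)
theta = np.exp(-0.1 * k)  # a rapidly vanishing "Fourier transform" (assumed)
obs = theta + eps * rng.standard_normal(len(k))

# Geometric net of spectral-cutoff estimators: the estimator with cutoff N
# has variance eps^2 * N, so a geometric net keeps the number of
# essentially different variances small when d = 1.
net = [2 ** j for j in range(8)]  # N = 1, 2, 4, ..., 128

def mallows_risk(N):
    """Mallows-type unbiased-risk criterion for the cutoff estimator."""
    return -np.sum(obs[:N] ** 2) + 2 * eps ** 2 * N

N_hat = min(net, key=mallows_risk)
theta_hat = np.where(k <= N_hat, obs, 0.0)
```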

    Test for Symmetry of Regression Curves

    The minimax properties of a test verifying the symmetry of an unknown regression function f from n independent observations are studied. The underlying design is assumed to be random and independent of the noise in the observations. The function f belongs to a ball in a Hölder space of regularity β. The null hypothesis states that f is symmetric. We test this hypothesis versus the alternative that the L2 distance from f to the set of symmetric functions exceeds √(r_n)/2. As shown, these hypotheses can be tested consistently when r_n = O(n^(-4β/(4β+1))).

    1 Introduction. As far as we know, the problem of testing symmetry of a curve in nonparametric estimation has not been treated in the literature, except in the case of a density. A test for symmetry of the density in i.i.d. sampling was proposed by Y. Ingster (1984). This paper can be considered as the first step in a research program, and as the development of the mathematical tools to approach the real problem of interest: ...
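The distance being tested can be made concrete: for a design symmetric about 0, the L2 distance from f to the set of symmetric functions equals the L2 norm of the antisymmetric part f_a(t) = (f(t) - f(-t))/2. A simple binning estimate of that norm is sketched below; the design, bin count, and noise level are assumptions for illustration, and the paper's actual test statistic and threshold are its real content.

```python
import numpy as np

rng = np.random.default_rng(2)

# Regression with a random design on [-1, 1] and independent noise.
n = 500
x = rng.uniform(-1.0, 1.0, n)
y = x ** 2 + 0.5 * x + 0.2 * rng.standard_normal(n)
# True f: symmetric part x^2, antisymmetric part 0.5 * x.

# Estimate ||f_a||_2^2 by pairing mirror-image bins of |x|: within each
# bin, (mean of right half - mean of left half) / 2 estimates f_a there.
n_bins = 20
edges = np.linspace(0.0, 1.0, n_bins + 1)
idx = np.digitize(np.abs(x), edges) - 1
fa_sq = 0.0
for b in range(n_bins):
    right = y[(idx == b) & (x > 0)]
    left = y[(idx == b) & (x < 0)]
    if len(right) and len(left):
        fa_sq += ((right.mean() - left.mean()) / 2.0) ** 2 / n_bins

distance = np.sqrt(fa_sq)  # to be compared with a threshold of order sqrt(r_n)
```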

    Optimal Pointwise Adaptive Methods In Nonparametric Estimation

    The problem of optimal adaptive estimation of a function at a given point from noisy data is considered. Two procedures are proved to be asymptotically optimal in different settings. First, we study the problem of bandwidth selection for nonparametric pointwise kernel estimation with a given kernel. We propose a bandwidth selection procedure and prove its optimality in the asymptotic sense. Moreover, this optimality holds not only among kernel estimators with a variable bandwidth: the resulting estimator is optimal among all feasible estimators. The important feature of this procedure is that no prior information about the smoothness of the estimated function is used, i.e. the procedure is completely adaptive and "works" for the class of all functions. The attainable accuracy of estimation then depends on the function itself and is expressed in terms of the "ideal" bandwidth corresponding to this function. The second procedure can be considered as a specification of the first one under the qualitative assumption that the function to be estimated belongs to some Hölder class Σ(β, L) with unknown parameters β, L.
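A minimal sketch of a Lepski-type bandwidth selector at a point: walk through an increasing net of bandwidths and keep the largest one whose estimate still agrees, up to the noise level, with every smaller-bandwidth estimate. The box kernel, the net, and the tuning constant kappa are assumptions; the paper's procedure and its optimality constants are finer than this.

```python
import numpy as np

rng = np.random.default_rng(3)

# Noisy samples of f on a grid; target: estimate f at the point t0.
n, sigma, t0 = 1000, 0.5, 0.5
x = np.linspace(0.0, 1.0, n)
y = np.sin(6 * x) + sigma * rng.standard_normal(n)

def box_estimate(h):
    """Box-kernel estimate at t0 and its noise standard deviation."""
    mask = np.abs(x - t0) <= h
    return y[mask].mean(), sigma / np.sqrt(mask.sum())

net = [0.01 * 2 ** j for j in range(6)]  # h = 0.01, 0.02, ..., 0.32
kappa = 3.0                              # tuning constant (assumed)
ests = [box_estimate(h) for h in net]

h_hat, f_hat = net[0], ests[0][0]
for i in range(1, len(net)):
    fi = ests[i][0]
    # Accept h_i only if it agrees with every smaller-bandwidth estimate
    # up to that estimate's noise level.
    if any(abs(fi - ests[j][0]) > kappa * ests[j][1] for j in range(i)):
        break
    h_hat, f_hat = net[i], fi
```

The selected h_hat adapts to the unknown local smoothness: for a smooth f the loop runs to large bandwidths, while a kink near t0 would trigger the disagreement test early.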

    Asymptotically Exact Nonparametric Hypothesis Testing in Sup-Norm and At a Fixed Point

    For the signal in Gaussian white noise model we consider the problem of testing the hypothesis H_0 : f ≡ 0 (the signal f is zero) against the nonparametric alternative H_1 : f ∈ Λ_ε, where Λ_ε is a set of functions on R^1 of the form Λ_ε = {f : f ∈ F, φ(f) ≥ C ψ_ε}. Here F is a Hölder or Sobolev class of functions, φ(f) is either the sup-norm of f or the value of f at a fixed point, C > 0 is a constant, ψ_ε is the minimax rate of testing, and ε → 0 is the asymptotic parameter of the model. We find exact separation constants C* > 0 such that a test with a given sum of asymptotic errors of the first and second type is possible for C > C* and is not possible for C < C*. We propose asymptotically minimax test statistics.

    1 Introduction. Consider the stochastic process Y(t) defined on [0, 1] and satisfying the stochastic differential equation

    dY(t) = f(t) dt + ε dW(t),    (1)

    where W(t) is the standard Wiener process on [0, 1], f is an unknown real-valued function and 0 < ε < 1. Sup...
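The sup-norm testing problem can be simulated on a grid. The box kernel, bandwidth, Monte Carlo calibration, and bump alternative below are all assumptions for illustration; the paper's contribution is the exact separation constant C*, which this sketch does not compute.

```python
import numpy as np

rng = np.random.default_rng(4)

# Discretized white-noise model (1): dY_i ~ f(t_i)/m + eps * N(0, 1/m).
m, eps, h = 1024, 0.05, 32
t = (np.arange(m) + 0.5) / m

def simulate(f_vals):
    return f_vals / m + eps * rng.standard_normal(m) / np.sqrt(m)

def sup_statistic(dY):
    """Sup-norm of a box-kernel estimate of f built from the increments."""
    kernel = np.ones(h) * (m / h)  # box kernel, integrates to 1 in dt units
    return np.max(np.abs(np.convolve(dY, kernel, mode="same")))

# Calibrate the rejection threshold under H_0 by Monte Carlo.
null_stats = [sup_statistic(simulate(np.zeros(m))) for _ in range(200)]
threshold = np.quantile(null_stats, 0.95)

# An alternative well separated from 0 in sup-norm (assumed bump).
f_alt = 2.0 * np.exp(-((t - 0.5) ** 2) / 0.01)
reject = sup_statistic(simulate(f_alt)) > threshold
```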

    Optimal pointwise adaptive methods in nonparametric estimation

    The problem of optimal adaptive estimation of a function at a given point from noisy data is considered. Two procedures are proved to be asymptotically optimal in different settings. First we study the problem of bandwidth selection for nonparametric pointwise kernel estimation with a given kernel. We propose a bandwidth selection procedure and prove its optimality in the asymptotic sense. Moreover, this optimality holds not only among kernel estimators with a variable bandwidth: the resulting estimator is optimal among all feasible estimators. The important feature of this procedure is that no prior information about the smoothness properties of the estimated function is used, i.e. the procedure is completely adaptive and 'works' for the class of all functions. The attainable accuracy of estimation then depends on the function itself and is expressed in terms of the 'ideal' bandwidth corresponding to this function. The second procedure can be considered as a specification of the first one under the qualitative assumption that the function to be estimated belongs to some Hölder class Σ(β, L) with unknown parameters β, L. This assumption allows one to choose a family of kernels in an optimal way, and the resulting procedure appears to be asymptotically optimal in the adaptive sense. (orig.)

    SIGLE record. Available from TIB Hannover: RR 5549(229)+a / FIZ - Fachinformationszentrum Karlsruhe / TIB - Technische Informationsbibliothek. Germany, German.

    Competing against the best nearest neighbor filter in regression

    Abstract. Designing statistical procedures that are provably almost as accurate as the best one in a given family is one of the central topics in statistics and learning theory. Oracle inequalities then offer a convenient theoretical framework for evaluating different strategies, which can be roughly classified into two classes: selection and aggregation strategies. The ultimate goal is to design strategies satisfying oracle inequalities with leading constant one and a rate-optimal residual term. In many recent papers, this problem is addressed in the case where the aim is to beat the best procedure from a given family of linear smoothers. However, the theory developed so far either does not cover the important case of nearest-neighbor smoothers or provides a suboptimal oracle inequality with a leading constant considerably larger than one. In this paper, we prove a new oracle inequality with leading constant one that is valid under a general assumption on linear smoothers, allowing one, for instance, to compete against the best nearest-neighbor filter.
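For linear smoothers such as k-NN filters, the smoothing matrix is explicit, and a Mallows-type unbiased risk estimate gives a simple selector over the family. This sketch assumes a known noise variance and a specific k-net; it illustrates the selection setting of the abstract, not its leading-constant-one result.

```python
import numpy as np

rng = np.random.default_rng(5)

# Regression data with known noise variance (assumed).
n, sigma2 = 300, 0.09
x = np.sort(rng.uniform(0.0, 1.0, n))
f = np.cos(4 * np.pi * x)
y = f + np.sqrt(sigma2) * rng.standard_normal(n)

def knn_matrix(k):
    """k-NN filter as a linear smoother: row i averages the k nearest x_j."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, np.argsort(np.abs(x - x[i]))[:k]] = 1.0 / k
    return A

# Mallows-type unbiased risk estimate for a linear smoother A:
# ||y - A y||^2 / n + 2 * sigma2 * trace(A) / n, and trace(A) = n / k here
# since each point is its own nearest neighbor.
ks = [2, 4, 8, 16, 32, 64]
def cp_risk(k):
    fit = knn_matrix(k) @ y
    return np.mean((y - fit) ** 2) + 2.0 * sigma2 / k

k_hat = min(ks, key=cp_risk)
best_fit = knn_matrix(k_hat) @ y
```

The selected k balances the filter's variance sigma2/k against the bias of averaging over a neighborhood of width about k/n.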