    A Smooth Estimator of Regression Function for Non-Negative Dependent Random Variables

    Commonly used kernel regression estimators may not provide admissible values of the regression function or its functionals at the boundaries when the regression has restricted support. Any smoothing method becomes less accurate near the boundary of the observation interval because fewer observations can be averaged, which inflates the variance or the bias. Here, we adapt the density estimation method of Chaubey et al. (2007) for non-negative random variables to define a smooth estimator of the regression function. The estimator is based on a generalization of Hille's lemma and a perturbation idea. Its uniform consistency and asymptotic normality are obtained, for the sake of generality, under a stationary ergodic process assumption on the data. The asymptotic mean squared error is derived, and the optimal value of the smoothing parameter is also discussed. Graphical illustrations of the proposed estimator are provided on simulated as well as real-life data.
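    The Hille's-lemma construction itself is not reproduced in the abstract, so the following is only a minimal Python sketch of the general idea of boundary-friendly smoothing on [0, ∞), using a gamma-kernel weighted average in the style of Chen (2000) rather than the paper's estimator; the function name, the bandwidth b, and the simulated data are illustrative assumptions.

        import numpy as np
        from scipy.stats import gamma

        def gamma_kernel_regression(x_grid, X, Y, b):
            """Gamma-kernel regression for non-negative covariates.

            The gamma kernel's support matches [0, inf), so no kernel mass
            spills past the boundary at 0, the usual source of boundary
            bias for symmetric kernels.  Illustrative stand-in only.
            """
            est = np.empty(len(x_grid), dtype=float)
            for j, x in enumerate(x_grid):
                # Gamma(shape = x/b + 1, scale = b) density evaluated at each X_i
                k = gamma.pdf(X, a=x / b + 1.0, scale=b)
                den = k.sum()
                est[j] = np.sum(k * Y) / den if den > 0 else np.nan
            return est

        # toy usage on simulated non-negative data
        rng = np.random.default_rng(0)
        X = rng.exponential(scale=1.0, size=500)
        Y = np.sqrt(X) + 0.1 * rng.standard_normal(500)
        fit = gamma_kernel_regression(np.linspace(0.0, 3.0, 61), X, Y, b=0.05)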

    Generalized Kernel Regression Estimator for Dependent Size-Biased Data

    This paper considers nonparametric regression estimation in the context of dependent, biased, non-negative data using a generalized asymmetric kernel. It may be applied to a wide variety of practical situations, such as length- and size-biased data. We derive theoretical results through a detailed asymptotic analysis of the behavior of the estimator, providing consistency and asymptotic normality in addition to an evaluation of the asymptotic bias term. The asymptotic mean squared error is also derived in order to obtain the optimal values of the smoothing parameters required in the proposed estimator. The results are stated under a stationary ergodic assumption, without assuming any traditional mixing conditions. A simulation study is carried out to compare the proposed estimator with the local linear regression estimator.
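    The paper's generalized estimator is not given in the abstract; the sketch below only illustrates the standard Horvitz-Thompson-type correction for size-biased sampling, in which each observation is down-weighted by 1/w(X_i) for a known bias function w, combined with the same illustrative gamma kernel as above. All names and parameter values here are assumptions, not the paper's construction.

        import numpy as np
        from scipy.stats import gamma

        def size_biased_kernel_regression(x_grid, X, Y, w, b):
            """Kernel regression under size-biased sampling (hypothetical sketch).

            Observations are assumed sampled with probability proportional
            to w(X), so each point is down-weighted by 1/w(X_i).
            """
            inv_w = 1.0 / w(X)  # Horvitz-Thompson-type weights; requires w > 0 on the sample
            est = np.empty(len(x_grid), dtype=float)
            for j, x in enumerate(x_grid):
                k = gamma.pdf(X, a=x / b + 1.0, scale=b)  # asymmetric kernel on [0, inf)
                den = np.sum(k * inv_w)
                est[j] = np.sum(k * inv_w * Y) / den if den > 0 else np.nan
            return est

        # length-biased data correspond to w(x) = x:
        # fit = size_biased_kernel_regression(grid, X, Y, w=lambda x: x, b=0.05)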

    Nonparametric kernel regression estimation for functional stationary ergodic data: Asymptotic properties

    The aim of this paper is to study asymptotic properties of the kernel regression estimate when functional stationary ergodic data are considered. More precisely, in the ergodic data setting, we consider the regression of a real random variable Y on an explanatory random variable X taking values in some semi-metric abstract space. Estimating the regression function with the well-known Nadaraya-Watson estimator, we establish consistency in probability, with a rate, as well as asymptotic normality, which yields a confidence interval for the regression function that is usable in practice since it does not depend on any unknown quantity. We also give the explicit form of the conditional bias term. Note that the ergodic framework is more convenient in practice since it does not require the verification of any condition, as in the mixing case for example.
    Keywords: Asymptotic normality; Consistency; Ergodic processes; Functional dependent data; Martingale difference; Regression estimation
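    The functional Nadaraya-Watson estimator named here has the well-known form r̂(x) = Σ_i Y_i K(d(x, X_i)/h) / Σ_i K(d(x, X_i)/h), where d is a semi-metric on the function space. A minimal Python sketch follows, assuming curves observed on a common grid; the discretized L2 semi-metric and the Epanechnikov kernel are illustrative choices, since the paper does not prescribe particular d or K.

        import numpy as np

        def functional_nw(x_new, X_curves, Y, h):
            """Nadaraya-Watson prediction for a functional covariate.

            X_curves : (n, p) array, each row a curve on a common grid.
            x_new    : (p,) curve at which to predict.
            Uses the discretized L2 semi-metric and the Epanechnikov kernel
            (its 3/4 normalizing constant cancels in the ratio).
            """
            d = np.sqrt(np.mean((X_curves - x_new) ** 2, axis=1))  # L2 semi-metric
            u = d / h
            k = np.where(u < 1.0, 1.0 - u ** 2, 0.0)               # Epanechnikov kernel
            den = k.sum()
            return np.sum(k * Y) / den if den > 0 else np.nan

        # toy usage: responses driven by the energy of each curve
        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 100)
        X_curves = np.sin(2 * np.pi * np.outer(rng.uniform(0.5, 2.0, 200), t))
        Y = np.mean(X_curves ** 2, axis=1) + 0.01 * rng.standard_normal(200)
        pred = functional_nw(X_curves[0], X_curves, Y, h=0.2)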