
    Multivariate data-driven k-NN function estimation

    In Bhattacharya and Mack (Ann. Statist. 15 (1987), 976–994), it was shown (among other things) that adapting for the optimal choice of k in univariate k-nearest neighbor density and regression estimation is feasible using weak convergence techniques. We now show that the same holds true for the multivariate case. Our results parallel Krieger and Pickands (Ann. Statist. 9 (1981), 1066–1078) and Mack and Müller (J. Multivariate Anal. 23 (1987), 169–182) for adaptive multivariate kernel density and regression estimation, respectively.
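The data-driven choice of k that the abstract refers to can be illustrated with a minimal sketch. The paper's adaptation rests on weak-convergence arguments; here, leave-one-out cross-validation stands in as a simple data-driven selector, and the data are synthetic — none of this reproduces the authors' actual procedure.

```python
import numpy as np

def knn_regress(x0, x, y, k):
    """k-nearest-neighbor regression estimate at a point x0:
    average the responses of the k design points closest to x0."""
    idx = np.argsort(np.abs(x - x0))[:k]
    return y[idx].mean()

def choose_k(x, y, candidates):
    """Pick k by leave-one-out cross-validation -- an illustrative
    stand-in for the paper's weak-convergence-based adaptation."""
    best_k, best_err = None, np.inf
    for k in candidates:
        errs = []
        for i in range(len(x)):
            mask = np.ones(len(x), dtype=bool)
            mask[i] = False
            pred = knn_regress(x[i], x[mask], y[mask], k)
            errs.append((y[i] - pred) ** 2)
        err = float(np.mean(errs))
        if err < best_err:
            best_k, best_err = k, err
    return best_k

# Synthetic regression data (assumption: any smooth signal would do).
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 200)
k_hat = choose_k(x, y, candidates=[5, 10, 20, 40])
```

The same cross-validation loop extends to the multivariate case by replacing the absolute distance with a Euclidean norm over the covariate vectors.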

    A Kernel Regression of Phillips' Data

    Economists have assumed that the Phillips curve, which shows a positive (negative) relation between inflation and the output ratio (unemployment rate), may be mapped off the aggregate demand-aggregate supply apparatus. The paper shows that the Phillips curve requires unlikely restrictions on the form of the aggregate supply and aggregate demand curves. In this case, it is inappropriate to treat data on inflation and capacity utilization as the basis for estimating an underlying formal model. The paper therefore uses a nonparametric, data-driven method to describe the data. This method, kernel regression, shows the inflation-unemployment association in Phillips' sample to be negative on a global scale, yet irregular within particular ranges of unemployment.
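The kernel-regression approach the abstract describes can be sketched with a Nadaraya-Watson estimator, a standard form of kernel regression. The data below are synthetic (Phillips' sample is not reproduced here), built only to mimic a globally negative but locally irregular relation.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel:
    a locally weighted average, with weights decaying in |x - x0|/h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# Hypothetical data: inflation falling in unemployment overall,
# with a wavy local component (assumption, not Phillips' data).
rng = np.random.default_rng(1)
u = rng.uniform(1, 10, 150)                          # "unemployment rate"
infl = 5 - 0.4 * u + np.sin(u) + rng.normal(0, 0.5, 150)

grid = np.linspace(2, 9, 8)
fit = np.array([nadaraya_watson(g, u, infl, h=0.8) for g in grid])
```

Because the method imposes no functional form, the fitted curve is free to be negative globally while bending irregularly within particular ranges of unemployment, which is exactly the pattern the paper reports.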

    Adaptive nonparametric estimation of a multivariate regression function

    We consider the kernel estimation of a multivariate regression function at a point. Theoretical choices of the bandwidth are possible for attaining minimum mean squared error or for local scaling, in the sense of asymptotic distribution. However, these choices are not available in practice. We follow the approach of Krieger and Pickands (Ann. Statist. 9 (1981), 1066–1078) and Abramson (J. Multivariate Anal. 12 (1982), 562–567) in constructing adaptive estimates after demonstrating the weak convergence of some error process. As consequences, efficient data-driven consistent estimation is feasible, and data-driven local scaling is also feasible. In the latter instance, nearest-neighbor-type estimates and variance-stabilizing estimates are obtained as special cases.
    Keywords: multivariate kernel regression estimation; bias; variance; asymptotic normality; mean square error; tightness; weak convergence in C[a,b]; Gaussian process; adaptation
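The nearest-neighbor-type estimates mentioned as a special case arise when the bandwidth at a point is set to the distance to its k-th nearest design point. The sketch below shows this local-bandwidth idea on synthetic data; it is an illustration of the general device, not the paper's adaptive estimator.

```python
import numpy as np

def nn_bandwidth_regress(x0, x, y, k):
    """Kernel regression with a nearest-neighbor-type local bandwidth:
    h(x0) is the distance from x0 to the k-th nearest design point,
    so the window widens where data are sparse and narrows where dense."""
    d = np.abs(x - x0)
    h = np.sort(d)[k - 1]
    w = np.maximum(1.0 - (d / h) ** 2, 0.0)   # Epanechnikov-type weights
    return np.sum(w * y) / np.sum(w)

# Synthetic data (assumption: a smooth quadratic signal plus noise).
rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 300)
y = x ** 2 + rng.normal(0, 0.1, 300)
est = nn_bandwidth_regress(0.5, x, y, k=30)
```

Tying h to the local design density is what makes the bandwidth choice data-driven at each point, which is the practical payoff of the local-scaling results in the abstract.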