Nonparametric Conditional Inference for Regression Coefficients with Application to Configural Polysampling
We consider inference procedures, conditional on an observed ancillary
statistic, for regression coefficients under a linear regression setup where
the unknown error distribution is specified nonparametrically. We establish
conditional asymptotic normality of the regression coefficient estimators under
regularity conditions, and formally justify the approach of plugging in
kernel-type density estimators in conditional inference procedures. Simulation
results show that the approach yields accurate conditional coverage
probabilities when used for constructing confidence intervals. The plug-in
approach can be applied in conjunction with configural polysampling to derive
robust conditional estimators adaptive to a confrontation of contrasting
scenarios. We demonstrate this by investigating the conditional mean squared
error of location estimators under various confrontations in a simulation
study, which successfully extends configural polysampling to a nonparametric
context.
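To make the plug-in idea concrete, here is a minimal toy sketch (our own illustration, not the authors' procedure: the Laplace error model, Silverman's rule-of-thumb bandwidth, and the normal-approximation interval are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.laplace(scale=1.0, size=n)  # non-Gaussian errors

# Ordinary least squares fit
X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

# Kernel-type density estimate of the error distribution
# (Gaussian kernel, Silverman's rule-of-thumb bandwidth -- an assumption)
h = 1.06 * resid.std(ddof=1) * n ** (-1 / 5)

def f_hat(t):
    """Kernel density estimate of the error density at the points t."""
    u = (np.asarray(t) - resid[:, None]) / h
    return np.mean(np.exp(-0.5 * u ** 2), axis=0) / (h * np.sqrt(2 * np.pi))

# Plug the estimated error distribution into a normal-approximation
# interval for the slope.  The KDE is a Gaussian mixture centred at the
# residuals, so its variance is the residual second moment plus h**2.
sigma2_hat = np.mean(resid ** 2) + h ** 2
se = np.sqrt(sigma2_hat * np.linalg.inv(X.T @ X)[1, 1])
ci = (beta_hat[1] - 1.96 * se, beta_hat[1] + 1.96 * se)
```

The point of the plug-in step is that nothing about the error law is assumed known: the kernel estimate `f_hat` stands in for the unknown density wherever the inference procedure needs it.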
Spatial aggregation of local likelihood estimates with applications to classification
This paper presents a new method for spatially adaptive local (constant)
likelihood estimation which applies to a broad class of nonparametric models,
including the Gaussian, Poisson and binary response models. The main idea of
the method is, given a sequence of local likelihood estimates (``weak''
estimates), to construct a new aggregated estimate whose pointwise risk is of
the order of the smallest risk among all ``weak'' estimates. We also propose a new
approach toward selecting the parameters of the procedure by providing the
prescribed behavior of the resulting estimate in the simple parametric
situation. We establish a number of important theoretical results concerning
the optimality of the aggregated estimate. In particular, our ``oracle'' result
claims that its risk is, up to some logarithmic multiplier, equal to the
smallest risk for the given family of estimates. The performance of the
procedure is illustrated by application to the classification problem. A
numerical study demonstrates its reasonable performance in simulated and
real-life examples.
Comment: Published at http://dx.doi.org/10.1214/009053607000000271 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
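The aggregation idea can be sketched with a simple Lepski-type rule (a hedged approximation of the pointwise aggregation described above, not the paper's exact procedure; the box kernel, the bandwidth grid, the known noise level and the critical value z are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
x = np.sort(rng.uniform(0, 1, n))
f_true = np.where(x < 0.5, 1.0, 3.0)      # piecewise-constant signal
y = f_true + rng.normal(scale=0.5, size=n)
sigma = 0.5                               # noise level, assumed known here

def local_constant(x0, h):
    """Local-constant (Nadaraya-Watson, box kernel) estimate at x0."""
    w = np.abs(x - x0) <= h
    m = int(w.sum())
    if m == 0:
        return np.nan, 0
    return y[w].mean(), m

bandwidths = [0.02, 0.04, 0.08, 0.16, 0.32]   # "weak" estimates, small to large
z = 2.0                                        # critical value (assumption)

def aggregate(x0):
    """Accept larger bandwidths while the new estimate stays inside a
    confidence band around the currently accepted one."""
    est, m = local_constant(x0, bandwidths[0])
    for h in bandwidths[1:]:
        new, m_new = local_constant(x0, h)
        if abs(new - est) > z * sigma / np.sqrt(m):
            break                              # oversmoothing detected: stop
        est, m = new, m_new
    return est

grid = np.linspace(0.05, 0.95, 19)
fit = np.array([aggregate(t) for t in grid])
```

Away from the jump the rule runs up to the largest bandwidth and enjoys its small variance; near the jump it stops early, which is the pointwise adaptivity the abstract refers to.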
Optimal cross-validation in density estimation with the L2-loss
We analyze the performance of cross-validation (CV) in the density estimation
framework with two purposes: (i) risk estimation and (ii) model selection. The
main focus is given to the so-called leave-p-out CV procedure (Lpo), where p
denotes the cardinality of the test set. Closed-form expressions are
settled for the Lpo estimator of the risk of projection estimators. These
expressions provide a great improvement upon V-fold cross-validation in terms
of variability and computational complexity. From a theoretical point of view,
closed-form expressions also make it possible to study the Lpo performance in
terms of risk estimation. The optimality of leave-one-out (Loo), that is, Lpo with p = 1,
is proved among CV procedures used for risk estimation. Two model selection
frameworks are also considered: estimation, as opposed to identification. For
estimation with finite sample size n, optimality is achieved for p large
enough [with p/n = o(1)] to balance the overfitting resulting from the
structure of the model collection. For identification, model selection
consistency is settled for Lpo as long as p/n is conveniently related to the
rate of convergence of the best estimator in the collection: (i) p/n tending to 1
as n grows with a parametric rate, and (ii) under a weaker requirement on p/n with some
nonparametric estimators. These theoretical results are validated by simulation
experiments.
Comment: Published at http://dx.doi.org/10.1214/14-AOS1240 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org).
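A concrete special case of such closed-form expressions is well known: the leave-one-out (p = 1) CV risk of a histogram density estimator reduces to the classical formula below, so the bin width can be selected without refitting n times. This toy sketch is ours and does not reproduce the paper's general Lpo formulas:

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(size=500)
n = len(data)

def loo_cv_risk(h):
    """Classical closed-form leave-one-out CV risk of a histogram with
    bin width h:  2/((n-1)h) - (n+1)/(n^2 (n-1) h) * sum_k N_k^2,
    where N_k are the bin counts."""
    edges = np.arange(data.min(), data.max() + h, h)
    counts, _ = np.histogram(data, bins=edges)
    return (2 / ((n - 1) * h)
            - (n + 1) / (n ** 2 * (n - 1) * h) * np.sum(counts ** 2))

bin_widths = np.linspace(0.05, 1.5, 30)
risks = np.array([loo_cv_risk(h) for h in bin_widths])
h_star = bin_widths[np.argmin(risks)]
```

Evaluating the criterion costs one histogram per candidate h, which is the computational advantage the abstract attributes to closed-form Lpo expressions.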
Sequential Data-Adaptive Bandwidth Selection by Cross-Validation for Nonparametric Prediction
We consider the problem of bandwidth selection by cross-validation from a
sequential point of view in a nonparametric regression model. Having in mind
that in applications one often aims at estimation, prediction and change
detection simultaneously, we investigate that approach for sequential kernel
smoothers in order to base these tasks on a single statistic. We provide
uniform weak laws of large numbers and weak consistency results for the
cross-validated bandwidth. Extensions to weakly dependent error terms are
discussed as well. The errors may be α-mixing or L2-near epoch
dependent, which guarantees that the uniform convergence of the cross
validation sum and the consistency of the cross-validated bandwidth hold true
for a large class of time series. The method is illustrated by analyzing
photovoltaic data.
Comment: 26 pages.
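The sequential flavour of the approach can be sketched as follows (our own simplified illustration: the one-sided Gaussian kernel, the fixed bandwidth grid and the warm-up length are assumptions, and the paper's estimators and theory differ in detail). Each observation is predicted from past data only, and the bandwidth minimizing the accumulated squared one-step-ahead prediction errors is retained:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 400
t = np.arange(T)
signal = np.sin(2 * np.pi * t / 200)
y = signal + rng.normal(scale=0.3, size=T)

def one_step_prediction(i, h):
    """One-sided kernel smoother: weighted average of the past y[:i]."""
    w = np.exp(-0.5 * ((t[:i] - i) / h) ** 2)
    return np.sum(w * y[:i]) / np.sum(w)

bandwidths = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
sse = np.zeros(len(bandwidths))          # running cross-validation sums
for i in range(10, T):                   # warm-up of 10 points (assumption)
    for j, h in enumerate(bandwidths):
        sse[j] += (y[i] - one_step_prediction(i, h)) ** 2
h_cv = bandwidths[np.argmin(sse)]
```

Because the criterion is a running sum of prediction errors, the same statistic can serve estimation, prediction and change monitoring as data arrive, which is the motivation stated in the abstract.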
Wavelet Estimators in Nonparametric Regression: A Comparative Simulation Study
Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature of wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study on a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints of how these estimators should be used to analyse real data sets, a detailed practical step-by-step illustration of a wavelet denoising analysis on electricity consumption data is provided. Matlab codes are provided so that all figures and tables in this paper can be reproduced.
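As a minimal self-contained example of the kind of estimator the study compares, here is a Haar transform with the universal soft threshold of Donoho and Johnstone (our sketch; the paper's estimators, test functions and Matlab code are far more extensive):

```python
import numpy as np

def haar_dwt(x):
    """Full Haar decomposition of a length-2^J signal.
    Returns [finest details, ..., coarsest details, approximation]."""
    coeffs = []
    a = x / 1.0
    while len(a) > 1:
        s = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail
        coeffs.append(d)
        a = s
    coeffs.append(a)
    return coeffs

def haar_idwt(coeffs):
    """Invert haar_dwt."""
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

rng = np.random.default_rng(4)
n = 1024
x = np.linspace(0, 1, n)
f = np.where(x < 0.3, 0.0, 1.0) + np.where(x > 0.7, 1.0, 0.0)  # blocky signal
y = f + rng.normal(scale=0.2, size=n)

coeffs = haar_dwt(y)
# Noise level from the finest-scale details via the median absolute deviation
sigma_hat = np.median(np.abs(coeffs[0])) / 0.6745
thresh = sigma_hat * np.sqrt(2 * np.log(n))    # universal threshold
# Soft-threshold every detail level; keep the approximation untouched
den = [np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0) for d in coeffs[:-1]]
f_hat = haar_idwt(den + [coeffs[-1]])
```

Soft thresholding shrinks the mostly-noise detail coefficients to zero while keeping the few large coefficients that carry the jumps, which is why wavelet shrinkage adapts well to spatially-variable signals like this one.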