Nonparametric and Varying Coefficient Modal Regression
In this article, we propose a new nonparametric data analysis tool, which we
call nonparametric modal regression, to investigate the relationship among
variables of interest by estimating the mode of the conditional density of
a response variable Y given predictors X. The nonparametric modal regression is
distinguished from the conventional nonparametric regression in that, instead
of the conditional average or median, it uses the "most likely" conditional
values to measure the center. Better prediction performance and robustness are
two important characteristics of nonparametric modal regression compared to
traditional nonparametric mean regression and nonparametric median regression.
We propose to use local polynomial regression to estimate the nonparametric
modal regression. The asymptotic properties of the resulting estimator are
investigated. To broaden the applicability of the nonparametric modal
regression to high dimensional data or functional/longitudinal data, we further
develop a nonparametric varying coefficient modal regression. A Monte Carlo
simulation study and an analysis of health care expenditure data demonstrate
the superior prediction performance of the proposed nonparametric modal
regression model relative to traditional nonparametric mean and median
regression. Comment: 33 pages
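A minimal sketch of how such a modal regression estimate can be computed, assuming a Gaussian kernel in both directions and a mean-shift style fixed-point iteration (a standard way to climb a kernel estimate of the conditional density to its mode; the paper's own estimator is based on local polynomial regression, and the bandwidths and toy data below are illustrative assumptions):

```python
import numpy as np

def modal_regression(x0, X, Y, hx=0.5, hy=0.5, n_iter=50, tol=1e-8):
    """Local-constant modal regression at x0: iterate a mean-shift
    update that climbs the kernel estimate of f(y | x0) to its mode.
    hx, hy are illustrative bandwidths; in practice they would be
    tuned, e.g. by cross-validation."""
    kx = np.exp(-0.5 * ((X - x0) / hx) ** 2)   # kernel weights in x
    m = np.average(Y, weights=kx)              # start from the local mean
    for _ in range(n_iter):
        ky = np.exp(-0.5 * ((Y - m) / hy) ** 2)
        w = kx * ky
        m_new = np.sum(w * Y) / np.sum(w)      # mean-shift step toward the mode
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

# Toy data with right-skewed noise: the conditional mode and the
# conditional mean differ, which is where modal regression helps.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 3.0, 400)
Y = np.sin(X) + rng.exponential(0.5, 400)
grid = np.linspace(0.0, 3.0, 31)
mode_curve = [modal_regression(x, X, Y) for x in grid]
```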
Nonparametric regression analysis
Nonparametric regression uses nonparametric and flexible methods to analyze complex data with unknown regression relationships, imposing minimal assumptions on the regression function. This report reviews the theory and applications of nonparametric regression methods with an emphasis on kernel regression, smoothing splines, and Gaussian process regression. Two datasets are analyzed to demonstrate and compare the three nonparametric regression models in R.
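As a concrete illustration of the first of the three reviewed methods, a minimal Nadaraya-Watson kernel regression sketch (the report's analyses are in R; Python, the Gaussian kernel, and the fixed bandwidth are illustrative assumptions here):

```python
import numpy as np

def nadaraya_watson(x0, X, Y, h=0.3):
    """Nadaraya-Watson estimate at x0: a locally weighted average of
    the responses with Gaussian kernel weights. The bandwidth h sets
    the bias-variance trade-off and is an illustrative choice."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

# Example: recover a smooth trend from noisy observations.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 2 * np.pi, 200)
Y = np.sin(X) + rng.normal(0.0, 0.3, 200)
grid = np.linspace(0.0, 2 * np.pi, 50)
fit = [nadaraya_watson(x, X, Y) for x in grid]
```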
Sparse Additive Models
We present a new class of methods for high-dimensional nonparametric
regression and classification called sparse additive models (SpAM). Our methods
combine ideas from sparse linear modeling and additive nonparametric
regression. We derive an algorithm for fitting the models that is practical and
effective even when the number of covariates is larger than the sample size.
SpAM is closely related to the COSSO model of Lin and Zhang (2006), but
decouples smoothing and sparsity, enabling the use of arbitrary nonparametric
smoothers. An analysis of the theoretical properties of SpAM is given. We also
study a greedy estimator that is a nonparametric version of forward stepwise
regression. Empirical results on synthetic and real data are presented, showing
that SpAM can be effective in fitting sparse nonparametric models in high
dimensional data.
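A sketch of the sparse backfitting idea behind SpAM, assuming a simple row-normalized Gaussian-kernel smoother as the per-coordinate smoother (any nonparametric smoother could be substituted, which is exactly what decoupling smoothing from sparsity permits); the soft-thresholding step is what zeroes out weak component functions:

```python
import numpy as np

def kernel_smoother(x, r, h=0.3):
    """Stand-in smoother: fitted values of r on x at the sample
    points, using row-normalized Gaussian kernel weights."""
    W = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    W /= W.sum(axis=1, keepdims=True)
    return W @ r

def spam_backfit(X, y, lam=0.1, n_iter=20):
    """Sparse backfitting sketch: smooth each partial residual, then
    soft-threshold the component's empirical norm so that weak
    additive components are set exactly to zero."""
    n, p = X.shape
    f = np.zeros((n, p))                     # additive component fits
    y_c = y - y.mean()
    for _ in range(n_iter):
        for j in range(p):
            r = y_c - f.sum(axis=1) + f[:, j]        # partial residual
            pj = kernel_smoother(X[:, j], r)
            pj -= pj.mean()
            sj = np.sqrt(np.mean(pj ** 2))           # empirical norm
            f[:, j] = max(0.0, 1.0 - lam / sj) * pj if sj > 0 else 0.0
    return f
```

Components whose empirical norm stays below the threshold are exactly zero, which is how sparsity in the fitted additive model arises even when the number of covariates exceeds the sample size.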
Penalized variable selection procedure for Cox models with semiparametric relative risk
We study the Cox models with semiparametric relative risk, which can be
partially linear with one nonparametric component, or multiple additive or
nonadditive nonparametric components. A penalized partial likelihood procedure
is proposed to simultaneously estimate the parameters and select variables for
both the parametric and the nonparametric parts. Two penalties are applied
sequentially. The first penalty, governing the smoothness of the multivariate
nonlinear covariate effect function, provides a smoothing spline ANOVA
framework that is exploited to derive an empirical model selection tool for the
nonparametric part. The second penalty, either the
smoothly-clipped-absolute-deviation (SCAD) penalty or the adaptive LASSO
penalty, achieves variable selection in the parametric part. We show that the
resulting estimator of the parametric part possesses the oracle property, and
that the estimator of the nonparametric part achieves the optimal rate of
convergence. The proposed procedures are shown to work well in simulation
experiments, and then applied to a real data example on sexually transmitted
diseases. Comment: Published at http://dx.doi.org/10.1214/09-AOS780 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
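For reference, a small sketch of the SCAD penalty named above, using the conventional a = 3.7 of Fan and Li; the coupling with the smoothing spline ANOVA penalty and the Cox partial likelihood is not reproduced here:

```python
import numpy as np

def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty, elementwise: linear near zero (like the LASSO),
    then tapering, then constant, so large coefficients are not
    over-shrunk -- the property behind the oracle result."""
    b = np.abs(np.asarray(beta, dtype=float))
    return np.where(
        b <= lam,
        lam * b,
        np.where(
            b <= a * lam,
            (2 * a * lam * b - b ** 2 - lam ** 2) / (2 * (a - 1)),
            lam ** 2 * (a + 1) / 2,
        ),
    )

# Example: the penalty flattens out once |beta| exceeds a * lam.
print(scad_penalty(np.array([0.1, 1.0, 5.0]), lam=0.5))
```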
Heterogeneity and the nonparametric analysis of consumer choice: conditions for invertibility
This paper considers structural nonparametric random utility models for continuous
choice variables. It provides sufficient conditions on random preferences to yield
reduced-form systems of nonparametric stochastic demand functions that allow global
invertibility between demands and random utility components. Invertibility is
essential for global identification of structural consumer demand models, for the
existence of well-specified probability models of choice, and for the nonparametric
analysis of revealed stochastic preference.
Pointwise universal consistency of nonparametric linear estimators
This paper presents sufficient conditions for pointwise universal consistency of nonparametric delta estimators. We show the applicability of these conditions to some classes of nonparametric estimators.
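For context, a delta (linear) estimator is one that is linear in the responses, m_n(x) = Σ_i w_i(x) Y_i; kernel and nearest-neighbor regressors are both of this form. A minimal sketch with two illustrative weight schemes:

```python
import numpy as np

def delta_estimate(x0, X, Y, weights):
    """Generic nonparametric linear (delta) estimator: a weighted
    average of the responses with weights depending only on X."""
    w = weights(x0, X)
    return np.sum(w * Y) / np.sum(w)

def kernel_weights(x0, X, h=0.3):
    return np.exp(-0.5 * ((X - x0) / h) ** 2)

def knn_weights(x0, X, k=10):
    w = np.zeros(len(X))
    w[np.argsort(np.abs(X - x0))[:k]] = 1.0   # uniform weight on k nearest
    return w
```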
Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models
A variable screening procedure via correlation learning was proposed by Fan
and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models.
Even when the true model is linear, the marginal regression can be highly
nonlinear. To address this issue, we further extend the correlation learning to
marginal nonparametric learning. Our nonparametric independence screening is
called NIS, a specific member of the sure independence screening. Several
closely related variable screening procedures are proposed. Under
nonparametric additive models and some mild technical conditions, the proposed
independence screening methods are shown to enjoy a sure screening
property. The extent to which the dimensionality can be reduced by independence
screening is also explicitly quantified. As a methodological extension, an
iterative nonparametric independence screening (INIS) is also proposed to
enhance the finite sample performance for fitting sparse additive models. The
simulation results and a real data analysis demonstrate that the proposed
procedure works well with moderate sample size and large dimension and performs
better than competing methods. Comment: 48 pages
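A sketch of the marginal screening idea, assuming a kernel smoother for each marginal regression (the paper's NIS fits B-spline marginal regressions; the scoring rule below is an illustrative proxy for marginal signal strength):

```python
import numpy as np

def nis_rank(X, y, h=0.3):
    """Rank covariates by how much signal the marginal nonparametric
    fit of y on each single covariate captures; screening keeps only
    the top-ranked columns before fitting a sparse additive model."""
    n, p = X.shape
    scores = np.empty(p)
    for j in range(p):
        xj = X[:, j]
        W = np.exp(-0.5 * ((xj[:, None] - xj[None, :]) / h) ** 2)
        W /= W.sum(axis=1, keepdims=True)
        fit = W @ y                                  # marginal smoother fit
        scores[j] = np.mean((fit - y.mean()) ** 2)   # marginal signal proxy
    return np.argsort(scores)[::-1]                  # strongest first
```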
