
    Robust estimation of dimension reduction space

    Most dimension reduction methods based on nonparametric smoothing are highly sensitive to outliers and to data coming from heavy-tailed distributions. We show that the recently proposed methods by Xia et al. (2002) can be made robust in a way that preserves all the advantages of the original approach. Their extension based on local one-step M-estimators is sufficiently robust to outliers and data from heavy-tailed distributions, it is relatively easy to implement, and, surprisingly, it performs as well as the original methods when applied to normally distributed data. Keywords: Dimension reduction, Nonparametric regression, M-estimation
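
    A minimal, hypothetical sketch of the robustification idea (not the authors' code): a local linear smoother whose least-squares step is followed by a single Huber-reweighted step, in the spirit of local one-step M-estimators. The Gaussian kernel, the tuning constant c = 1.345 and the MAD scale estimate are illustrative choices.

        import numpy as np

        def huber_weights(r, c=1.345):
            """Huber weights psi(r)/r: 1 for small residuals, c/|r| for large ones."""
            a = np.abs(r)
            w = np.ones_like(a)
            big = a > c
            w[big] = c / a[big]
            return w

        def local_linear_onestep(x, y, x0, h):
            """One-step Huber-reweighted local linear estimate of m(x0)."""
            k = np.exp(-0.5 * ((x - x0) / h) ** 2)       # Gaussian kernel weights
            X = np.column_stack([np.ones_like(x), x - x0])
            sw = np.sqrt(k)
            beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)  # pilot LS fit
            r = y - X @ beta                             # residuals of the pilot fit
            s = np.median(np.abs(r)) / 0.6745 + 1e-12    # robust scale (MAD)
            sw = np.sqrt(k * huber_weights(r / s))       # downweight outlying residuals
            beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)  # one M-step
            return beta[0]                               # intercept estimates m(x0)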

    Nonparametric Independence Screening in Sparse Ultra-High Dimensional Varying Coefficient Models

    The varying-coefficient model is an important nonparametric statistical model that allows us to examine how the effects of covariates vary with exposure variables. When the number of covariates is large, the issue of variable selection arises. In this paper, we propose and investigate marginal nonparametric screening methods to screen variables in ultra-high dimensional sparse varying-coefficient models. The proposed nonparametric independence screening (NIS) selects variables by ranking a measure of the nonparametric marginal contributions of each covariate given the exposure variable. The sure independence screening property is established under some mild technical conditions when the dimensionality is of nonpolynomial order, and the dimensionality reduction of NIS is quantified. To enhance practical utility and finite-sample performance, two data-driven iterative NIS methods are proposed for selecting thresholding parameters and variables: conditional permutation and greedy methods, resulting in Conditional-INIS and Greedy-INIS. The effectiveness and flexibility of the proposed methods are further illustrated by simulation studies and real data applications.
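
    As a rough illustration of the screening step (a hypothetical sketch, not the paper's implementation), each covariate can be given a marginal utility by fitting the marginal varying-coefficient model y ~ a_j(U) + b_j(U) X_j with a small basis in the exposure U; the function name nis_ranking and the polynomial basis are illustrative choices.

        import numpy as np

        def nis_ranking(y, X, U, degree=3):
            """Rank covariates by the marginal utility of y ~ a_j(U) + b_j(U) * X_j."""
            n, p = X.shape
            B = np.vander(U, degree + 1)              # polynomial basis in the exposure U
            base_rss = np.sum((y - y.mean()) ** 2)    # null-model residual sum of squares
            utility = np.empty(p)
            for j in range(p):
                D = np.hstack([B, B * X[:, [j]]])     # design for a_j(U) + b_j(U) * X_j
                beta, *_ = np.linalg.lstsq(D, y, rcond=None)
                utility[j] = base_rss - np.sum((y - D @ beta) ** 2)
            return np.argsort(utility)[::-1]          # covariates ordered by marginal utility

    In practice one keeps only the top-ranked covariates (with a data-driven threshold such as the conditional permutation mentioned in the abstract) and iterates, which is the INIS idea.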

    Nonparametric estimation when data on derivatives are available

    We consider settings where data are available on a nonparametric function and various partial derivatives. Such circumstances arise in practice, for example in the joint estimation of cost and input functions in economics. We show that when derivative data are available, local averages can be replaced in certain dimensions by nonlocal averages, thus reducing the nonparametric dimension of the problem. We derive optimal rates of convergence and conditions under which dimension reduction is achieved. Kernel estimators and their properties are analyzed, although other estimators, such as local polynomial, spline and nonparametric least squares, may also be used. Simulations and an application to the estimation of electricity distribution costs are included. Comment: Published at http://dx.doi.org/10.1214/009053606000001127 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
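
    A toy one-dimensional illustration of why derivative data reduce the effective nonparametric dimension (purely hypothetical, not the paper's estimator): noisy observations of the derivative can be combined by a nonlocal average, here a cumulative trapezoidal integral, so that only the level of the function has to be estimated from the function data.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        x = np.sort(rng.uniform(0.0, 1.0, n))
        f_true = lambda t: np.sin(2 * np.pi * t)                               # illustrative truth
        y = f_true(x) + rng.normal(scale=0.3, size=n)                          # function data
        d = 2 * np.pi * np.cos(2 * np.pi * x) + rng.normal(scale=0.3, size=n)  # derivative data

        # Nonlocal average: cumulative trapezoidal integral of the noisy derivatives.
        cum = np.concatenate([[0.0], np.cumsum(0.5 * (d[1:] + d[:-1]) * np.diff(x))])
        intercept = np.mean(y - cum)                 # level of f estimated from the y's
        f_hat = intercept + cum                      # no local smoothing of y is needed

        print("RMSE:", np.sqrt(np.mean((f_hat - f_true(x)) ** 2)))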

    An Adaptive Composite Quantile Approach to Dimension Reduction

    Sufficient dimension reduction [Li 1991] has long been a prominent issue in multivariate nonparametric regression analysis. To uncover the central dimension reduction space, we propose in this paper an adaptive composite quantile approach. Compared to existing methods, (1) it requires minimal assumptions and is capable of revealing all dimension reduction directions; (2) it is robust against outliers; and (3) it is structure-adaptive and thus more efficient. Asymptotic results are proved and numerical examples are provided, including a real data analysis.
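
    To make "composite quantile" concrete, here is a toy sketch only (the paper's estimator is adaptive, nonparametric and recovers the whole central subspace): a single index direction is estimated by minimizing a composite check-loss over several quantile levels, with one intercept per level.

        import numpy as np
        from scipy.optimize import minimize

        def check_loss(r, tau):
            """Quantile (check) loss rho_tau(r) = r * (tau - 1{r < 0})."""
            return np.sum(r * (tau - (r < 0)))

        def composite_quantile_index(X, y, taus=(0.25, 0.5, 0.75)):
            """Toy example: one index direction minimizing the composite check-loss."""
            n, p = X.shape
            K = len(taus)

            def objective(theta):
                intercepts, beta = theta[:K], theta[K:]
                idx = X @ beta
                return sum(check_loss(y - b - idx, t) for b, t in zip(intercepts, taus))

            theta0 = np.zeros(K + p)
            theta0[K] = 1.0                          # crude starting direction
            res = minimize(objective, theta0, method="Nelder-Mead",
                           options={"maxiter": 50000, "fatol": 1e-8, "xatol": 1e-8})
            beta = res.x[K:]
            return beta / np.linalg.norm(beta)       # direction identified only up to scale

    Nelder-Mead is used here only because the check loss is not differentiable; it is a crude stand-in for the optimization actually used in the paper.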

    Inequality and development: Evidence from semiparametric estimation with panel data

    Evidence from nonparametric and semiparametric unbalanced panel data models with fixed effects shows that Kuznets' inverted-U relationship is confirmed when economic development reaches a threshold. Model specification tests justify the semiparametric specification. The integrated net contribution of the control variables to inequality reduction is significant. Keywords: Kuznets' inverted-U, Semiparametric model, Unbalanced panel data

    The Shape and Dimensionality of Phylogenetic Tree-Space Based on Mitochondrial Genomes

    Phylogenetic analyses of large and diverse data sets generally result in large sets of competing phylogenetic trees. Consensus tree methods used to summarize sets of competing trees discard important information regarding the similarity and distribution of competing trees. A finer-grained approach is to use a dimensionality reduction method to project tree-to-tree distances in 2D or 3D space. In this study, we systematically evaluate the performance of several nonlinear dimensionality reduction (NLDR) methods on tree-to-tree distances obtained from independent nonparametric bootstrap analyses of genes from three mid- to large-sized mitochondrial genome alignments.
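
    A minimal sketch of the projection step, assuming a precomputed symmetric matrix of tree-to-tree distances (for example Robinson-Foulds distances between bootstrap trees); metric MDS is used as a simple stand-in, whereas the study itself compares several nonlinear methods.

        import numpy as np
        from sklearn.manifold import MDS

        def project_tree_space(dist_matrix, n_components=2, seed=0):
            """Embed trees in 2D/3D from a precomputed tree-to-tree distance matrix."""
            mds = MDS(n_components=n_components, dissimilarity="precomputed",
                      random_state=seed)
            return mds.fit_transform(np.asarray(dist_matrix))

        # coords = project_tree_space(D)   # D: (n_trees, n_trees) distances among bootstrap trees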