
    Minimum Average Deviance Estimation for Sufficient Dimension Reduction

    Sufficient dimension reduction reduces the dimensionality of data while preserving relevant regression information. In this article, we develop Minimum Average Deviance Estimation (MADE) methodology for sufficient dimension reduction. It extends the Minimum Average Variance Estimation (MAVE) approach of Xia et al. (2002) from continuous responses to exponential family distributions, including Binomial and Poisson responses. Local likelihood regression is used to learn the form of the regression function from the data. The main parameter of interest is a dimension reduction subspace that projects the covariates to a lower dimension while preserving their relationship with the outcome. To estimate this parameter within its natural space, we consider an iterative algorithm in which one step uses a Stiefel manifold optimizer. We empirically evaluate the performance of three prediction methods, two intrinsic to local likelihood estimation and one based on the Nadaraya-Watson estimator. Initial results show that, as expected, MADE can outperform MAVE when there is a departure from the assumption of additive errors.
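    As a hedged illustration of the Nadaraya-Watson prediction route mentioned in the abstract: once an estimated basis B of the dimension reduction subspace is available, the response at a new point can be predicted by kernel-weighting the observed responses, with distances measured in the reduced space. The names (B, bandwidth h) and the Gaussian kernel are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nadaraya_watson_reduced(X, y, B, x_new, h=0.5):
    """Nadaraya-Watson estimate of E[y | x_new], with the kernel applied to
    the projected covariates B^T x rather than to x itself."""
    Z = X @ B                          # project training covariates (n x d)
    z_new = x_new @ B                  # project the query point (d,)
    d2 = np.sum((Z - z_new) ** 2, axis=1)
    w = np.exp(-d2 / (2 * h ** 2))     # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)   # kernel-weighted average of responses
```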

    Testing predictor contributions in sufficient dimension reduction

    We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function, and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower-dimensional version without loss of information on the regression. Methodology using sliced inverse regression is developed in detail.
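    For intuition, a minimal sliced inverse regression (SIR) estimate of the dimension reduction subspace can be sketched as follows. This is a generic sketch, not the paper's testing procedure; the slice count and target dimension d are illustrative choices.

```python
import numpy as np

def sir_directions(X, y, d=1, n_slices=10):
    """Estimate d sufficient-dimension-reduction directions by SIR."""
    n, p = X.shape
    mu, Sigma = X.mean(axis=0), np.cov(X, rowvar=False)
    # Standardize the predictors: Z = (X - mu) Sigma^{-1/2}
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_half
    # Slice the response, average Z within each slice, and pool the slice means
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original predictor scale
    _, vecs = np.linalg.eigh(M)
    return Sigma_inv_half @ vecs[:, -d:][:, ::-1]
```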

    Semiparametric Causal Sufficient Dimension Reduction Of High Dimensional Treatments

    Cause-effect relationships are typically evaluated by comparing the outcome responses to binary treatment values, representing two arms of a hypothetical randomized controlled trial. However, in certain applications, treatments of interest are continuous and high dimensional. For example, understanding the causal relationship between severity of radiation therapy, represented by a high dimensional vector of radiation exposure values and post-treatment side effects is a problem of clinical interest in radiation oncology. An appropriate strategy for making interpretable causal conclusions is to reduce the dimension of treatment. If individual elements of a high dimensional treatment vector weakly affect the outcome, but the overall relationship between the treatment variable and the outcome is strong, careless approaches to dimension reduction may not preserve this relationship. Moreover, methods developed for regression problems do not transfer in a straightforward way to causal inference due to confounding complications between the treatment and outcome. In this paper, we use semiparametric inference theory for structural models to give a general approach to causal sufficient dimension reduction of a high dimensional treatment such that the cause-effect relationship between the treatment and outcome is preserved. We illustrate the utility of our proposal through simulations and a real data application in radiation oncology
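    One way to formalize the target of such a reduction (our notation, offered only as a sketch and not necessarily the paper's formulation) is to require that the counterfactual mean outcome depends on the high dimensional treatment only through a low-dimensional projection:

```latex
\[
  \mathbb{E}\!\left[ Y(a) \right] \;=\; g\!\left( B^{\top} a \right)
  \quad \text{for every treatment value } a \in \mathbb{R}^{q},
  \qquad B \in \mathbb{R}^{q \times d},\; d \ll q,
\]
```

    so that causal contrasts between treatment values a and a' depend on them only through B^T a and B^T a'. The estimation problem is then to recover the column space of B while adjusting for confounding between the treatment and the outcome.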

    Sufficient dimension reduction based on an ensemble of minimum average variance estimators

    We introduce a class of dimension reduction estimators based on an ensemble of the minimum average variance estimates of functions that characterize the central subspace, such as the characteristic functions, the Box-Cox transformations and wavelet bases. The ensemble estimators exhaustively estimate the central subspace without imposing restrictive conditions on the predictors, and have the same convergence rate as the minimum average variance estimates. They are flexible and easy to implement, and allow repeated use of the available sample, which enhances accuracy. They are applicable to both univariate and multivariate responses in a unified form. We establish the consistency and convergence rate of these estimators, and the consistency of a cross-validation criterion for order determination. We compare the ensemble estimators with other estimators in a wide variety of models and establish their competent performance. Published in the Annals of Statistics (http://dx.doi.org/10.1214/11-AOS950) by the Institute of Mathematical Statistics (http://www.imstat.org/aos/).
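    The ensemble idea can be summarized (in our notation, as a sketch rather than the paper's exact statement) as recovering the central subspace by pooling the central mean subspaces of transformed responses, each of which MAVE can estimate:

```latex
\[
  \mathcal{S}_{Y \mid X}
  \;=\;
  \operatorname{span}\!\left\{ \, \mathcal{S}_{\,\mathbb{E}\left[ f_t(Y) \mid X \right]} \;:\; t \in \mathcal{T} \, \right\},
\]
```

    where {f_t : t in T} is a sufficiently rich family of transformations of the response, such as the characteristic functions, Box-Cox transformations, or a wavelet basis named in the abstract.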

    LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    We introduce a new Matlab software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is mainly oriented toward command-line operation, a graphical user interface is also provided for prototype computations.