    Dynamic Linear Discriminant Analysis in High Dimensional Space

    High-dimensional data that evolve dynamically are a prominent feature of the modern data era. In partial response, recent years have seen increasing emphasis on addressing the dimensionality challenge; the non-static nature of these datasets, however, is largely ignored. This paper addresses both challenges by proposing a novel yet simple dynamic linear programming discriminant (DLPD) rule for binary classification. Unlike static linear discriminant analysis, the new method captures the changing distributions of the underlying populations by modeling their means and covariances as smooth functions of the covariates of interest. Under an approximate sparsity condition, we show that the conditional misclassification rate of the DLPD rule converges to the Bayes risk in probability, uniformly over the range of the variables used for modeling the dynamics, when the dimensionality is allowed to grow exponentially with the sample size. The minimax lower bound for estimation of the Bayes risk is also established, implying that the misclassification rate of the proposed rule is minimax-rate optimal. The promising performance of the DLPD rule is illustrated via extensive simulation studies and the analysis of a breast cancer dataset. (34 pages, 3 figures)
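
    To make the idea concrete, here is a minimal sketch of a covariate-indexed linear discriminant, assuming Gaussian-kernel smoothing of the class means and pooled covariance over a scalar covariate. It replaces the paper's sparse linear-programming estimator with a simple ridge-regularized plug-in rule; the function names and defaults (e.g. dynamic_lda_classify, bandwidth h) are illustrative, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(t, h):
    """Gaussian kernel weights with bandwidth h."""
    return np.exp(-0.5 * (t / h) ** 2)

def dynamic_lda_classify(x_new, u_new, X, U, y, h=0.5, ridge=1e-3):
    """Classify x_new observed at covariate value u_new.

    X : (n, p) features, U : (n,) dynamic covariate, y : (n,) labels in {0, 1}.
    Class means and a pooled covariance are estimated by Nadaraya-Watson
    smoothing over U and plugged into a linear discriminant evaluated at u_new.
    (Illustrative plug-in rule, not the paper's sparse LP estimator.)
    """
    w = gaussian_kernel(U - u_new, h)                  # local weights around u_new
    mus = []
    for k in (0, 1):
        wk = w * (y == k)
        mus.append(wk @ X / wk.sum())                  # kernel-smoothed class mean
    mu0, mu1 = mus
    resid = X - np.where((y == 1)[:, None], mu1, mu0)  # center each point at its class mean
    Sigma = (w[:, None] * resid).T @ resid / w.sum()
    Sigma += ridge * np.eye(X.shape[1])                # ridge keeps Sigma invertible
    beta = np.linalg.solve(Sigma, mu1 - mu0)           # discriminant direction at u_new
    score = (x_new - 0.5 * (mu0 + mu1)) @ beta
    return int(score > 0)
```

    In this toy version the discriminant direction is re-estimated at every query value of the covariate, which is what lets the rule track the changing means and covariances described in the abstract.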

    Bootstrapping frequency domain tests in multivariate time series with an application to comparing spectral densities

    We propose a general bootstrap procedure to approximate the null distribution of nonparametric frequency domain tests about the spectral density matrix of a multivariate time series. Under a set of easy-to-verify conditions, we establish asymptotic validity of the proposed bootstrap procedure. We apply a version of this procedure, together with a new statistic, to test the hypothesis that the spectral densities of not necessarily independent time series are equal. The proposed test statistic is based on an L2-distance between the nonparametrically estimated individual spectral densities and an overall, 'pooled' spectral density, the latter being obtained from the whole set of m time series considered. The effects of the dependence between the time series on the power behavior of the test are investigated. Some simulations are presented and a real-life data example is discussed.
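
    A rough sketch of the pooled L2-type statistic, for the special case of m univariate series, is given below. It uses a simple Daniell (moving-average) smoother of the periodogram and approximates the integral by a Riemann sum over the Fourier frequencies; the paper's test concerns spectral density matrices and a bootstrap null distribution, neither of which is reproduced here, and all names are illustrative.

```python
import numpy as np

def smoothed_periodogram(x, span=11):
    """Daniell-smoothed periodogram of a univariate series at Fourier frequencies."""
    n = len(x)
    x = x - x.mean()
    per = (np.abs(np.fft.rfft(x)) ** 2) / (2 * np.pi * n)  # raw periodogram
    kernel = np.ones(span) / span                           # moving-average smoother
    pad = span // 2
    padded = np.r_[per[pad:0:-1], per, per[-2:-pad - 2:-1]]  # reflect at the boundaries
    return np.convolve(padded, kernel, mode="valid")

def pooled_l2_statistic(series, span=11):
    """L2-type distance between individual spectral estimates and their pooled average.

    series : list of m equal-length univariate time series.
    Returns a quantity proportional to sum_j integral (f_j - f_pooled)^2 dlambda,
    approximated by a Riemann sum over the Fourier frequencies.
    """
    specs = np.array([smoothed_periodogram(x, span) for x in series])
    pooled = specs.mean(axis=0)                             # 'pooled' spectral estimate
    dlam = 2 * np.pi / len(series[0])                       # frequency grid spacing
    return ((specs - pooled) ** 2).sum() * dlam
```

    Under the null of equal spectral densities this distance should be small; the paper's contribution is a bootstrap scheme for calibrating it, which the sketch does not attempt.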

    Variable-Dependent Partial Dimension Reduction

    Sufficient dimension reduction reduces the dimension of a regression model without loss of information by replacing the original predictor with its lower-dimensional linear combinations. Partial (sufficient) dimension reduction arises when the predictors naturally fall into two sets, X and W, and pursues a dimension reduction of X only. Although partial dimension reduction is a very general problem, few results are available when W is continuous, and to the best of our knowledge none can handle the situation where the reduced lower-dimensional subspace of X varies with W. To address this issue, we propose a novel variable-dependent partial dimension reduction framework and adapt classical sufficient dimension reduction methods to this general paradigm. The asymptotic consistency of our method is investigated. Extensive numerical studies and real data analysis show that our variable-dependent partial dimension reduction method has superior performance compared to existing methods.
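
    The sketch below illustrates one plausible instance of the idea: a kernel-weighted sliced inverse regression in which the estimated directions for X depend on the value of a scalar W. This is an assumption-laden toy version (Gaussian kernel, fixed bandwidth, quantile slices), not the estimator studied in the paper; local_sir_directions is a hypothetical name.

```python
import numpy as np

def local_sir_directions(w0, X, W, y, n_slices=5, d=1, h=0.5):
    """Estimate d directions of a W-dependent reduction subspace of X at W = w0.

    A kernel-weighted sliced inverse regression: observations are weighted by a
    Gaussian kernel in W, so the recovered directions can change with w0.
    (Illustrative sketch only; not the estimator proposed in the paper.)
    """
    wts = np.exp(-0.5 * ((W - w0) / h) ** 2)
    wts = wts / wts.sum()
    mu = wts @ X                                       # locally weighted mean of X
    Xc = X - mu
    Sigma = (wts[:, None] * Xc).T @ Xc                 # locally weighted covariance
    evals, evecs = np.linalg.eigh(Sigma)
    evals = np.maximum(evals, 1e-8)                    # guard against near-singularity
    Sigma_inv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_half                            # locally whitened predictors
    edges = np.quantile(y, np.linspace(0, 1, n_slices + 1))
    M = np.zeros((X.shape[1], X.shape[1]))
    for s in range(n_slices):                          # weighted between-slice covariance
        in_s = (y >= edges[s]) & (y < edges[s + 1]) if s < n_slices - 1 else (y >= edges[s])
        p_s = wts[in_s].sum()
        if p_s <= 0:
            continue
        m_s = wts[in_s] @ Z[in_s] / p_s                # weighted slice mean of Z
        M += p_s * np.outer(m_s, m_s)
    _, vecs = np.linalg.eigh(M)
    return Sigma_inv_half @ vecs[:, ::-1][:, :d]       # leading directions, back on the X scale
```

    Calling the function on a grid of w0 values would produce a subspace estimate that varies with W, which is the behaviour the abstract highlights as missing from earlier partial dimension reduction methods.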
