28 research outputs found
Test for mean matrix in GMANOVA model under heteroscedasticity and non-normality for high-dimensional data
This paper is concerned with testing a bilateral linear hypothesis on the
mean matrix in the generalized multivariate analysis of variance (GMANOVA)
model when the dimension of the observed vector may exceed the sample size,
the design may be unbalanced, the population may be non-normal, or the true
covariance matrices may be unequal. The suggested testing methodology treats
many problems, such as the one- and two-way MANOVA tests and the test for
parallelism in profile analysis, as special cases. We propose a
bias-corrected estimator of the Frobenius norm of the mean matrix, which is a
key component of the test statistic. The null and non-null distributions are
derived under a general high-dimensional asymptotic framework that allows the
dimensionality to arbitrarily exceed the sample size of a group, thereby
establishing consistency of the testing criterion. The accuracy of the
proposed test in finite samples is investigated through simulations conducted
for several high-dimensional scenarios and various underlying population
distributions in combination with different within-group covariance
structures. Finally, the proposed test is applied to a high-dimensional
two-way MANOVA problem for DNA microarray data. Supplementary material is
available as an ancillary file.
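The bias-correction idea behind the test statistic can be illustrated in a much simpler two-sample setting. The naive squared norm of the mean difference is biased upward by the trace terms tr(Σ₁)/n₁ + tr(Σ₂)/n₂, so subtracting plug-in estimates of them removes the bias. This is a minimal hypothetical sketch in that spirit (Bai–Saranadasa style), not the paper's actual GMANOVA statistic; the function name and setup are illustrative assumptions.

```python
import numpy as np

def bias_corrected_sq_norm(X1, X2):
    """Illustrative bias-corrected estimator of ||mu1 - mu2||^2.

    Since E||xbar1 - xbar2||^2 = ||mu1 - mu2||^2 + tr(Sigma1)/n1 + tr(Sigma2)/n2,
    subtracting the trace terms of the unbiased sample covariances yields an
    unbiased estimator, even when the dimension p exceeds n1 or n2.
    """
    n1, n2 = X1.shape[0], X2.shape[0]
    diff = X1.mean(axis=0) - X2.mean(axis=0)
    S1 = np.cov(X1, rowvar=False)  # unbiased sample covariance (rows = observations)
    S2 = np.cov(X2, rowvar=False)
    return float(diff @ diff - np.trace(S1) / n1 - np.trace(S2) / n2)
```

Note that the corrected estimate can be negative in a given sample (it is unbiased, not nonnegative), which is exactly why the distribution of the resulting statistic must be derived rather than assumed.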
Feature Informativeness, Curse-of-Dimensionality and Error Probability in Discriminant Analysis
This thesis is based on four papers on high-dimensional discriminant analysis. Throughout, the curse-of-dimensionality effect on the precision of the discrimination performance is emphasized. A growing-dimension asymptotic approach is used for assessing this effect, and the limiting error probabilities are taken as the performance criterion. The combined effect of high dimensionality and feature informativeness on the discrimination performance is evaluated.
Graphical posterior predictive classifier: Bayesian model averaging with particle Gibbs
In this study, we present a multi-class graphical Bayesian predictive classifier that incorporates the uncertainty in the model selection into the standard Bayesian formalism. For each class, the dependence structure underlying the observed features is represented by a set of decomposable Gaussian graphical models. Emphasis is then placed on Bayesian model averaging, which takes full account of the class-specific model uncertainty by averaging over the posterior graph model probabilities. An explicit evaluation of the model probabilities is well known to be infeasible. To address this issue, we consider the particle Gibbs strategy of Olsson et al. (2016) for posterior sampling from decomposable graphical models, which utilizes the Christmas tree algorithm of Olsson et al. (2017) as a proposal kernel. We also derive a strong hyper Markov law, which we call the hyper normal Wishart law, that allows the resultant Bayesian calculations to be performed locally. The proposed predictive graphical classifier shows superior performance compared to the ordinary Bayesian predictive rule that does not account for the model uncertainty, as well as to a number of out-of-the-box classifiers.
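The model-averaging step can be sketched abstractly: for each class c, the predictive density of a new observation is the average of the per-graph predictive densities p(x | c, m), weighted by the posterior graph probabilities p(m | data, c), and combining with the class prior and normalising gives p(c | x). The function and argument names below are illustrative assumptions; the paper obtains the graph weights via particle Gibbs sampling rather than taking them as given.

```python
import numpy as np

def bma_class_posterior(class_priors, graph_weights, model_liks):
    """Posterior class probabilities via Bayesian model averaging.

    class_priors[c]  : prior probability of class c
    graph_weights[c] : posterior probabilities of the graphs for class c
    model_liks[c]    : per-graph predictive densities p(x | c, m) at the new x
    """
    scores = np.array([
        prior * np.dot(w, lik)  # pi_c * sum_m p(m|data,c) * p(x|c,m)
        for prior, w, lik in zip(class_priors, graph_weights, model_liks)
    ])
    return scores / scores.sum()  # normalise to p(c | x)
```

A classifier that ignores model uncertainty corresponds to replacing each weight vector with a point mass on a single selected graph; the averaged rule hedges across graphs instead.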