Sufficient Dimension Reduction (SDR) aims to identify the central subspace (CS), the subspace onto which the predictors can be projected while retaining all information about the response variable, thereby achieving dimension reduction. SDR methods typically fall into two main frameworks: the inverse approach and the forward approach. In practice, the basis matrix of the CS is the primary object of interest. The inverse approach naturally satisfies the orthogonality constraint on the basis matrix through eigenvalue decomposition. In contrast, the forward approach often requires extra steps to meet this requirement. We propose new forward SDR methods that preserve the orthogonality constraint via manifold optimization. The Grassmann least squares Dimension Reduction (\textbf{glsDR}) algorithm estimates the central mean subspace (CMS) under a semiparametric regression framework. Unlike existing forward methods that require re-orthogonalization, our approach is based on adaptive gradient descent on the Grassmann manifold, ensuring orthogonality at each iteration. As a first-order method, \textbf{glsDR} is computationally more efficient than second-order alternatives. The Central Subspace Grassmann least squares Dimension Reduction (\textbf{CS-glsDR}) algorithm extends the method to exhaustive estimation of the CS by replacing the observed response with an empirical estimate of its conditional distribution. Additionally, we propose the Grassmann ensemble expectile Dimension Reduction (\textbf{geeDR}) algorithm, which employs an ensemble strategy, recovering the CS by modeling a sequence of conditional expectiles. The effectiveness of the proposed methods is demonstrated through extensive simulation studies and real data applications.
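To make the geometric idea concrete, the sketch below illustrates the generic first-order ingredient the abstract describes: the Euclidean gradient of a loss is projected onto the tangent space of the Grassmann manifold, and the iterate is retracted by a thin QR factorization, so the basis matrix keeps exactly orthonormal columns at every step with no separate re-orthogonalization. This is a minimal illustration, not the thesis's \textbf{glsDR} implementation: the function name \texttt{grassmann\_gradient\_step}, the step-size rule, and the toy trace objective (whose optimum is a known eigenspace) are our assumptions standing in for the semiparametric least-squares criterion.

\begin{verbatim}
import numpy as np

def grassmann_gradient_step(B, euclid_grad, step):
    """One first-order step on the Grassmann manifold Gr(p, d).

    B           : (p, d) matrix with orthonormal columns (current basis).
    euclid_grad : (p, d) Euclidean gradient of the loss at B.
    step        : fixed step size (a stand-in for an adaptive rule).
    """
    # Project onto the tangent (horizontal) space: components inside
    # span(B) do not change the subspace, so they are removed.
    riem_grad = euclid_grad - B @ (B.T @ euclid_grad)
    # Retract onto the manifold with a thin QR factorization, so the
    # new iterate has exactly orthonormal columns.
    Q, R = np.linalg.qr(B - step * riem_grad)
    # Fix column signs so the retraction varies continuously in B.
    return Q * np.sign(np.diag(R))

# Toy check (hypothetical data): the top-d eigenspace of a PSD matrix M
# maximizes tr(B' M B) over the Grassmannian, so descent on the negated
# trace should recover it while staying orthonormal throughout.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
M = A @ A.T
B = np.linalg.qr(rng.standard_normal((8, 2)))[0]
for _ in range(500):
    B = grassmann_gradient_step(B, -2.0 * M @ B, step=0.01)

print(np.allclose(B.T @ B, np.eye(2)))        # orthonormality preserved
top = np.linalg.eigh(M)[1][:, -2:]            # true top-2 eigenspace
print(np.linalg.norm(B @ B.T - top @ top.T))  # small once converged
\end{verbatim}

One reason such first-order schemes are attractive, as the abstract notes, is cost per iteration: the projection and QR retraction are $O(pd^2)$, whereas second-order alternatives must form or invert curvature information on the manifold.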