We propose a supervised principal component regression method for relating
a functional response to high-dimensional predictors. Unlike conventional
principal component analysis, the proposed method builds on a newly defined
expected integrated residual sum of squares, which directly makes use of the
association between the functional response and the predictors. Minimizing this
criterion yields the supervised principal components, a task equivalent to
solving a sequence of nonconvex generalized Rayleigh
quotient optimization problems. We reformulate these nonconvex problems as a
simultaneous linear regression with a sparse penalty to handle
high-dimensional predictors. Theoretically, we show that the reformulated
regression problem can recover the same supervised principal subspace under
suitable conditions. Statistically, we establish non-asymptotic error bounds
for the proposed estimators. We demonstrate the advantages of the proposed
method through both numerical experiments and an application to the Human
Connectome Project fMRI data.
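To give a concrete sense of the kind of optimization involved, the following is a minimal illustrative sketch, not the paper's actual estimator: it extracts a single supervised direction by maximizing a generalized Rayleigh quotient w.T A w / w.T B w, where A is built from the cross-covariance between the predictors and a discretized functional response and B is the (regularized) predictor covariance. The synthetic data, the grid size, and the ridge term are all assumptions made for illustration; the paper's method additionally handles sparsity and a sequence of such problems.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n, p, T = 200, 10, 50  # samples, predictors, grid points (illustrative sizes)

# Synthetic data: functional response driven by one predictor direction
X = rng.standard_normal((n, p))
t = np.linspace(0, 1, T)
beta = np.zeros(p)
beta[0] = 1.0  # true supervised direction (assumed for this toy example)
Y = np.outer(X @ beta, np.sin(2 * np.pi * t)) + 0.1 * rng.standard_normal((n, T))

# Center predictors and response
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)

# Association matrix from the predictor-response cross-covariance,
# and a regularized predictor covariance
Sxy = Xc.T @ Yc / n                   # p x T cross-covariance on the grid
A = Sxy @ Sxy.T                       # p x p, encodes response association
B = Xc.T @ Xc / n + 1e-6 * np.eye(p)  # small ridge keeps B well-conditioned

# The leading generalized eigenvector maximizes w.T A w / w.T B w
vals, vecs = eigh(A, B)
w = vecs[:, -1]  # first supervised direction (eigenvalues are ascending)
```

In this toy setup the recovered direction w should align closely with the true loading vector beta, since the response variation is concentrated along that single predictor direction.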