We suggest a new method, called Functional Additive Regression, or FAR, for
efficiently performing high-dimensional functional regression. FAR extends the
usual linear regression model involving a functional predictor, X(t), and a
scalar response, Y, in two key respects. First, FAR uses a penalized least
squares optimization approach to efficiently deal with high-dimensional
problems involving a large number of functional predictors. Second, FAR extends
beyond the standard linear regression setting to fit general nonlinear additive
models. We demonstrate that FAR can be implemented with a wide range of penalty
functions using a highly efficient coordinate descent algorithm. Theoretical
results are developed which provide motivation for the FAR optimization
criterion. Finally, we show through simulations and two real data sets that FAR
can significantly outperform competing methods.

Comment: Published at http://dx.doi.org/10.1214/15-AOS1346 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
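The abstract's mention of penalized least squares solved by coordinate descent can be illustrated in a much simpler, non-functional setting: the classic cyclic coordinate descent with soft-thresholding for the lasso. This is a generic sketch of that standard technique, not the FAR algorithm itself; the function name and all details are illustrative.

```python
import numpy as np

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso penalty (illustrative sketch).

    Minimizes (1/2)||y - X beta||^2 + lam * ||beta||_1 by updating one
    coordinate at a time via the closed-form soft-thresholding rule.
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)  # precompute ||x_j||^2 for each column
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j's contribution removed.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # Soft-thresholding: shrink toward zero by lam, then rescale.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta
```

Each coordinate update is a one-dimensional penalized problem with a closed-form solution, which is what makes coordinate descent attractive when the number of predictors is large; FAR's actual updates operate on fitted functions of the predictors rather than scalar coefficients.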