Penalized Orthogonal-Components Regression for Large p Small n Data
We propose a penalized orthogonal-components regression (POCRE) for large p
small n data. Orthogonal components are sequentially constructed to maximize,
upon standardization, their correlation to the response residuals. A new
penalization framework, implemented via empirical Bayes thresholding, is
presented to effectively identify sparse predictors of each component. POCRE is
computationally efficient owing to its sequential construction of leading
sparse principal components. In addition, such construction offers other
properties such as grouping highly correlated predictors and allowing for
collinear or nearly collinear predictors. With multivariate responses, POCRE
can construct common components and thus build up latent-variable models for
large p small n data.
Comment: 12 pages
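The abstract's construction (sequentially build sparse components most correlated with the current response residuals, then deflate) can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: in particular, the empirical Bayes thresholding step is approximated here by simple soft-thresholding, and all function and parameter names are hypothetical.

```python
import numpy as np

def soft_threshold(w, lam):
    # Stand-in for empirical Bayes thresholding: shrink small loadings to zero.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def pocre_sketch(X, y, n_components=3, lam=0.1):
    """Hypothetical sketch of the POCRE idea for a univariate response:
    sequentially construct sparse, mutually orthogonal components that
    are maximally correlated with the current response residuals."""
    Xc = X - X.mean(axis=0)          # center predictors
    r = y - y.mean()                 # current response residuals
    loadings, scores = [], []
    for _ in range(n_components):
        w = Xc.T @ r                 # direction most correlated with residuals
        w = soft_threshold(w, lam * np.max(np.abs(w)))  # sparsify loadings
        if not np.any(w):
            break                    # nothing survives thresholding; stop
        w /= np.linalg.norm(w)
        t = Xc @ w                   # component scores
        beta = (t @ r) / (t @ t)     # regress residuals on the component
        r = r - beta * t             # update residuals
        # Deflate X so the next component is orthogonal to this one.
        Xc = Xc - np.outer(t, (t @ Xc) / (t @ t))
        loadings.append(w)
        scores.append(t)
    return np.array(loadings).T, np.array(scores).T, r
```

The deflation step makes successive score vectors exactly orthogonal, which is what gives the sequential construction its computational efficiency relative to jointly estimating all components.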
Functional Regression
Functional data analysis (FDA) involves the analysis of data whose ideal
units of observation are functions defined on some continuous domain, and the
observed data consist of a sample of functions taken from some population,
sampled on a discrete grid. Ramsay and Silverman's 1997 textbook sparked the
development of this field, which has accelerated in the past 10 years to become
one of the fastest growing areas of statistics, fueled by the growing number of
applications yielding this type of data. One unique characteristic of FDA is
the need to combine information both across and within functions, which Ramsay
and Silverman called replication and regularization, respectively. This article
will focus on functional regression, the area of FDA that has received the most
attention in applications and methodological development. First will be an
introduction to basis functions, key building blocks for regularization in
functional regression methods, followed by an overview of functional regression
methods, split into three types: [1] functional predictor regression
(scalar-on-function), [2] functional response regression (function-on-scalar)
and [3] function-on-function regression. For each, the role of replication and
regularization will be discussed and the methodological development described
in a roughly chronological manner, at times deviating from the historical
timeline to group together similar methods. The primary focus is on modeling
and methodology, highlighting the modeling structures that have been developed
and the various regularization approaches employed. At the end is a brief
discussion describing potential areas of future development in this field.
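The role of basis functions as building blocks for regularization can be illustrated with a minimal scalar-on-function (functional predictor) regression: the coefficient function is expanded in a basis and a roughness penalty controls its smoothness. This sketch is an assumption-laden illustration, not any specific published method; the basis choice, penalty, and all names are illustrative, and the grid is assumed to be evenly spaced on [0, 1].

```python
import numpy as np

def scalar_on_function_sketch(Xfun, y, tgrid, n_basis=7, lam=1.0):
    """Hypothetical sketch of penalized scalar-on-function regression:
    y_i = integral of x_i(t) * beta(t) dt + noise, with beta(t) expanded
    in a Fourier-style basis and a second-difference roughness penalty."""
    ks_sin = range(1, (n_basis + 1) // 2)
    ks_cos = range(1, n_basis // 2 + 1)
    # Basis matrix evaluated on the grid (constant + sines + cosines).
    B = np.column_stack([np.ones_like(tgrid)]
                        + [np.sin(2 * np.pi * k * tgrid) for k in ks_sin]
                        + [np.cos(2 * np.pi * k * tgrid) for k in ks_cos])
    dt = tgrid[1] - tgrid[0]
    # Design matrix: numerical integrals of each curve against each basis function.
    Z = Xfun @ B * dt
    # Roughness penalty on basis coefficients via second differences.
    D = np.diff(np.eye(B.shape[1]), 2, axis=0)
    P = D.T @ D
    # Penalized least squares for the basis coefficients of beta(t).
    coef = np.linalg.solve(Z.T @ Z + lam * P, Z.T @ y)
    beta_t = B @ coef            # estimated coefficient function on the grid
    return beta_t, coef
```

The same two ingredients, a basis expansion to combine information within functions and a penalty to regularize across them, recur throughout the functional regression methods the article surveys.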