
    Penalized Likelihood and Bayesian Function Selection in Regression Models

    Challenging research in various fields has driven a wide range of methodological advances in variable selection for regression models with high-dimensional predictors. In comparison, selection of nonlinear functions in models with additive predictors has been considered only more recently. Several competing suggestions have been developed at about the same time and often do not refer to each other. This article provides a state-of-the-art review on function selection, focusing on penalized likelihood and Bayesian concepts, relating various approaches to each other in a unified framework. In an empirical comparison that also includes boosting, we evaluate several methods through applications to simulated and real data, thereby providing some guidance on their performance in practice.
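    The specific methods surveyed are not reproduced here, but the core mechanism behind penalized function selection — a group penalty that can shrink the coefficient block of an entire nonlinear function to exactly zero — can be sketched with a hand-rolled group lasso on a spline basis expansion. Everything below (the truncated-power basis, the penalty level `lam`, and the two-predictor toy data) is an illustrative assumption, not the setup of any paper in the review:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy additive model: y depends nonlinearly on x1 and not at all on x2.
    n = 200
    x1 = rng.uniform(-1, 1, n)
    x2 = rng.uniform(-1, 1, n)
    y = np.sin(3 * x1) + 0.1 * rng.standard_normal(n)

    def spline_basis(x, n_knots=6):
        """Standardized truncated-power cubic spline basis (no intercept)."""
        knots = np.linspace(-1, 1, n_knots)[1:-1]
        cols = [x, x**2, x**3] + [np.clip(x - k, 0, None) ** 3 for k in knots]
        B = np.column_stack(cols)
        return (B - B.mean(axis=0)) / B.std(axis=0)

    blocks = [spline_basis(x1), spline_basis(x2)]
    X = np.hstack(blocks)
    sizes = [b.shape[1] for b in blocks]
    starts = np.cumsum([0] + sizes)

    def group_lasso(X, y, starts, lam, n_iter=2000):
        """Proximal gradient (ISTA) for 0.5*||y - X b||^2 + lam * sum_g ||b_g||."""
        beta = np.zeros(X.shape[1])
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz const. of gradient
        for _ in range(n_iter):
            b = beta - step * (X.T @ (X @ beta - y))
            # Group soft-threshold: scales each block, zeroing small ones whole.
            for s, e in zip(starts[:-1], starts[1:]):
                nrm = np.linalg.norm(b[s:e])
                b[s:e] *= max(0.0, 1.0 - step * lam / nrm) if nrm > 0 else 0.0
            beta = b
        return beta

    beta = group_lasso(X, y, starts, lam=40.0)  # lam chosen by hand for this toy
    norms = [np.linalg.norm(beta[s:e]) for s, e in zip(starts[:-1], starts[1:])]
    # norms[0] (the x1 function) stays active; norms[1] (x2) is zeroed entirely.
    ```

    The penalty on the Euclidean norm of each coefficient block, rather than on individual coefficients, is what turns variable selection into function selection: either a predictor's whole spline expansion enters the model or none of it does.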

    Smoothing ℓ1-penalized estimators for high-dimensional time-course data

    When a series of (related) linear models has to be estimated, it is often appropriate to combine the different data sets to construct more efficient estimators. We use ℓ1-penalized estimators such as the Lasso or the Adaptive Lasso, which can simultaneously perform parameter estimation and model selection. We show that for a time course of high-dimensional linear models, the convergence rates of the Lasso and of the Adaptive Lasso can be improved by combining the different time points in a suitable way. Moreover, the Adaptive Lasso still enjoys oracle properties and consistent variable selection. The finite-sample properties of the proposed methods are illustrated on simulated data and on a real problem of motif finding in DNA sequences.

    Comment: Published in the Electronic Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of Mathematical Statistics (http://www.imstat.org) at http://dx.doi.org/10.1214/07-EJS103.
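    The paper's combination across time points is not reproduced here, but the two-stage Adaptive Lasso it builds on can be sketched for a single time point using scikit-learn's `Lasso`. The data, the `alpha` values, and the weight construction below are illustrative assumptions; the key idea is that stage-two weights `1/|init_j|` penalize coefficients more heavily where the initial Lasso found little signal:

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)

    # Hypothetical stand-in for one time point of a sparse high-dimensional
    # model: only the first 3 of 30 predictors are active.
    n, p = 100, 30
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:3] = [2.0, -1.5, 1.0]
    y = X @ beta_true + 0.5 * rng.standard_normal(n)

    # Stage 1: an ordinary Lasso supplies initial estimates.
    init = Lasso(alpha=0.1).fit(X, y).coef_

    # Stage 2: Adaptive Lasso = Lasso with per-coefficient weights 1/|init_j|,
    # implemented by rescaling the columns before a second Lasso fit.
    w = 1.0 / (np.abs(init) + 1e-8)              # huge weight where init is ~0
    coef = Lasso(alpha=0.1).fit(X / w, y).coef_ / w

    support = np.flatnonzero(np.abs(coef) > 1e-6)
    # support recovers the active predictors {0, 1, 2}.
    ```

    The column-rescaling trick works because penalizing `|b_j|` with weight `w_j` is equivalent to fitting an unweighted Lasso on `X_j / w_j` and dividing the resulting coefficient by `w_j` afterwards; it is this data-driven reweighting that gives the Adaptive Lasso its oracle behavior.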

    Optimal designs for smoothing splines

    In the common nonparametric regression model we consider the problem of constructing optimal designs when the unknown curve is estimated by a smoothing spline. A new basis for the space of natural splines is derived, and the local minimax property for these splines is used to derive two optimality criteria for the construction of optimal designs. The first criterion determines the design for a most precise estimation of the coefficients in the spline representation and corresponds to D-optimality, while the second criterion is the G-criterion and corresponds to an accurate prediction of the curve. Several properties of the optimal designs are derived. In general, D- and G-optimal designs are not equivalent. Optimal designs are determined numerically and compared with the uniform design.

    Keywords: smoothing spline, nonparametric regression, D- and G-optimal designs, saturated designs
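    A minimal numerical sketch of the D-criterion comparison can be given with a coordinate-exchange search over a candidate grid. The cubic truncated-power basis and the grid below are illustrative assumptions (the paper derives its own natural-spline basis, which is not reproduced here); the sketch only shows that an exchange-improved saturated design beats the uniform design on the D-criterion:

    ```python
    import numpy as np

    # Illustrative stand-in basis: cubic truncated powers with three interior
    # knots on [0, 1] -- 7 basis functions in total.
    knots = np.array([0.25, 0.5, 0.75])

    def basis(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        cols = [np.ones_like(x), x, x**2, x**3]
        cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]
        return np.column_stack(cols)

    def log_det_info(design):
        """log det of the information matrix M = B'B / n of an exact design."""
        B = basis(design)
        sign, logdet = np.linalg.slogdet(B.T @ B / len(design))
        return logdet if sign > 0 else -np.inf

    p = 4 + len(knots)                  # 7 functions -> saturated design size
    grid = np.linspace(0.0, 1.0, 101)   # candidate design points
    uniform = np.linspace(0.0, 1.0, p)
    design = uniform.copy()

    # Coordinate exchange: repeatedly move one design point to the candidate
    # point that increases the D-criterion, until no exchange helps.
    improved = True
    while improved:
        improved = False
        for i in range(p):
            current = log_det_info(design)
            trial = design.copy()
            for g in grid:
                trial[i] = g
                if log_det_info(trial) > current + 1e-9:
                    design = trial.copy()
                    current = log_det_info(design)
                    improved = True
    ```

    Running the same search with a G-type criterion (minimizing the maximum prediction variance over the grid) would generally yield a different design, which is one way to see the abstract's point that the exact D- and G-optimal designs need not coincide here.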