
    Exact block-wise optimization in group lasso and sparse group lasso for linear regression

    The group lasso is a penalized regression method, used in regression problems where the covariates are partitioned into groups, to promote sparsity at the group level. Existing methods for finding the group lasso estimator either use gradient projection methods to update the entire coefficient vector simultaneously at each step, or update one group of coefficients at a time, using an inexact line search to approximate the optimal value for that group when all other groups' coefficients are fixed. We present a new method of computation for the group lasso in the linear regression case, the Single Line Search (SLS) algorithm, which computes the exact optimal value for each group (with all other coefficients fixed) via one univariate line search. We perform simulations demonstrating that the SLS algorithm is often more efficient than existing computational methods. We also extend the SLS algorithm to the sparse group lasso problem via the Signed Single Line Search (SSLS) algorithm, and give theoretical results to support both algorithms.
    Comment: We have been made aware of the earlier work by Puig et al. (2009), which derives the same result for the (non-sparse) group lasso setting. We leave this manuscript available as a technical report, to serve as a reference for the previously untreated sparse group lasso case, and for timing comparisons of various methods in the group lasso setting. The manuscript is updated to include this reference.
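    For context, the sketch below shows one common inexact block-wise scheme of the kind the abstract contrasts with: block coordinate descent for (1/2)||y - X beta||_2^2 + lam * sum_g ||beta_g||_2, where each block takes a proximal gradient (group soft-thresholding) step rather than the paper's exact SLS univariate line search. The function names, group partition, and iteration count are illustrative, not the paper's implementation.

```python
import numpy as np

def group_soft_threshold(v, t):
    """Shrink the block v toward zero; the block is exactly zero when ||v||_2 <= t."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def group_lasso_bcd(X, y, groups, lam, n_iter=200):
    """Block coordinate descent for (1/2)||y - X beta||^2 + lam * sum_g ||beta_g||_2.
    `groups` is a list of index arrays partitioning the columns of X.
    Each block takes one proximal gradient step (a standard inexact update),
    not the exact SLS line search from the paper."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        for g in groups:
            Xg = X[:, g]
            Lg = np.linalg.norm(Xg, 2) ** 2   # block Lipschitz constant (spectral norm squared)
            grad_g = Xg.T @ (X @ beta - y)    # gradient of the smooth part on block g
            beta[g] = group_soft_threshold(beta[g] - grad_g / Lg, lam / Lg)
    return beta

# Illustrative usage on random data: whole blocks are zeroed at once.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
y = rng.standard_normal(100)
beta = group_lasso_bcd(X, y, groups=[np.arange(3), np.arange(3, 6)], lam=5.0)
```

    The paper's SLS algorithm replaces this inexact per-block step with the exact minimizer over each block, obtained from a single univariate line search.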

    Extended Bayesian Information Criteria for Gaussian Graphical Models

    Gaussian graphical models with sparsity in the inverse covariance matrix are of significant interest in many modern applications. For the problem of recovering the graphical structure, information criteria provide useful optimization objectives for algorithms searching through sets of graphs, or for selection of tuning parameters of other methods such as the graphical lasso, which is a likelihood penalization technique. In this paper we establish the consistency of an extended Bayesian information criterion for Gaussian graphical models in a scenario where both the number of variables p and the sample size n grow. Compared to earlier work on the regression case, our treatment allows for growth in the number of non-zero parameters in the true model, which is necessary in order to cover connected graphs. We demonstrate the performance of this criterion on simulated data when used in conjunction with the graphical lasso, and verify that the criterion indeed performs better than either cross-validation or the ordinary Bayesian information criterion when p and the number of non-zero parameters q both scale with n.
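    To make the criterion concrete, one standard form of the extended BIC for a graph with |E| edges is -2*loglik + |E|*log(n) + 4*gamma*|E|*log(p), where gamma in [0,1] controls the extra penalty on large graphs (gamma = 0 recovers the ordinary BIC). Below is a hedged sketch of using it to tune the graphical lasso penalty; the alpha grid, the tolerance for declaring an entry zero, and gamma = 0.5 are illustrative choices, and scikit-learn's GraphicalLasso is assumed to be available.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso  # assumed available

def ebic(precision, S, n, gamma=0.5, tol=1e-8):
    """Extended BIC: -2*loglik + |E|*log(n) + 4*gamma*|E|*log(p),
    with |E| the number of edges (off-diagonal nonzeros) in `precision`."""
    p = S.shape[0]
    # Gaussian log-likelihood, up to an additive constant
    loglik = 0.5 * n * (np.linalg.slogdet(precision)[1] - np.trace(S @ precision))
    n_edges = (np.count_nonzero(np.abs(precision) > tol) - p) // 2
    return -2.0 * loglik + n_edges * (np.log(n) + 4.0 * gamma * np.log(p))

def select_alpha(X, alphas, gamma=0.5):
    """Pick the graphical lasso penalty on data X (n samples x p variables)
    by minimizing the extended BIC over a grid of candidate alphas."""
    n = X.shape[0]
    S = np.cov(X, rowvar=False)
    scores = [ebic(GraphicalLasso(alpha=a).fit(X).precision_, S, n, gamma)
              for a in alphas]
    return alphas[int(np.argmin(scores))]
```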

    Contraction and uniform convergence of isotonic regression

    We consider the problem of isotonic regression, where the underlying signal $x$ is assumed to satisfy a monotonicity constraint, that is, $x$ lies in the cone $\{x \in \mathbb{R}^n : x_1 \leq \dots \leq x_n\}$. We study the isotonic projection operator (projection onto this cone), and find a necessary and sufficient condition characterizing all norms with respect to which this projection is contractive. This enables a simple and non-asymptotic analysis of the convergence properties of isotonic regression, yielding uniform confidence bands that adapt to the local Lipschitz properties of the signal.
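    For illustration, the isotonic projection operator studied above can be computed exactly by the classical pool adjacent violators algorithm (PAVA); this is a minimal sketch of the operator itself, not of the paper's contraction analysis.

```python
import numpy as np

def isotonic_projection(x):
    """Euclidean projection of x onto the monotone cone
    {z in R^n : z_1 <= ... <= z_n} via pool adjacent violators."""
    x = np.asarray(x, dtype=float)
    sums, counts = [], []  # each merged block is tracked by (sum, size)
    for v in x:
        sums.append(v)
        counts.append(1)
        # merge the last two blocks while their means violate monotonicity
        while len(sums) > 1 and sums[-2] / counts[-2] > sums[-1] / counts[-1]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c
    # expand each merged block to its common mean
    return np.concatenate([np.full(c, s / c) for s, c in zip(sums, counts)])

print(isotonic_projection([3.0, 1.0, 2.0]))  # -> [2. 2. 2.]
```

    The same projection (with optional observation weights) is available in scikit-learn as sklearn.isotonic.IsotonicRegression.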