
    An offline/online procedure for dual norm calculations of parameterized functionals: empirical quadrature and empirical test spaces

    We present an offline/online computational procedure for computing the dual norm of parameterized linear functionals. The key elements of the approach are (i) an empirical test space for the manifold of Riesz elements associated with the parameterized functional, and (ii) an empirical quadrature procedure to efficiently deal with parametrically non-affine terms. We present a number of theoretical results to identify the different sources of error and to motivate the technique. Finally, we show the effectiveness of our approach in reducing both offline and online costs associated with the computation of the time-averaged residual indicator proposed in [Fick, Maday, Patera, Taddei, Journal of Computational Physics, 2018 (accepted)].
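    The quantity at the heart of this abstract can be illustrated with a short sketch (this is background intuition, not the paper's offline/online method): over a discrete space with symmetric positive-definite Gram matrix M, the dual norm of a linear functional with coefficient vector f is sqrt(f^T M^{-1} f), attained by the Riesz element r = M^{-1} f. All names below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    A = rng.standard_normal((n, n))
    M = A @ A.T + n * np.eye(n)      # symmetric positive-definite Gram matrix

    def dual_norm(f, M):
        # Riesz representative r solves M r = f; dual norm is sqrt(f^T r).
        r = np.linalg.solve(M, f)
        return np.sqrt(f @ r)

    # A parameterized family of functionals f(mu). An empirical test space,
    # as in the paper, would replace the full solve with a projection onto a
    # few precomputed Riesz snapshots to cut the online cost.
    for mu in (0.1, 1.0, 10.0):
        f = np.sin(mu * np.arange(n))
        print(mu, dual_norm(f, M))
    ```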

    Optimization with Sparsity-Inducing Penalties

    Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models. They were first dedicated to linear variable selection, but numerous extensions have now emerged, such as structured sparsity or kernel selection. It turns out that many of the related estimation problems can be cast as convex optimization problems by regularizing the empirical risk with appropriate non-smooth norms. The goal of this paper is to present, from a general perspective, optimization tools and techniques dedicated to such sparsity-inducing penalties. We cover proximal methods, block-coordinate descent, reweighted ℓ2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provide an extensive set of experiments to compare various algorithms from a computational point of view.
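    The proximal methods mentioned here can be sketched in a few lines (a minimal ISTA-style illustration for the ℓ1 case, not the paper's benchmarked implementations; all names are made up for this example): the proximal operator of the ℓ1 norm is coordinate-wise soft-thresholding, alternated with a gradient step on the smooth loss.

    ```python
    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t*||.||_1: shrink each coordinate toward zero.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(X, y, lam, n_iter=500):
        # Proximal gradient for min_w 0.5*||X w - y||^2 + lam*||w||_1.
        L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
        w = np.zeros(X.shape[1])
        for _ in range(n_iter):
            grad = X.T @ (X @ w - y)
            w = soft_threshold(w - grad / L, lam / L)
        return w

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.5, 1.0]              # sparse ground truth
    y = X @ w_true + 0.01 * rng.standard_normal(100)
    w_hat = ista(X, y, lam=0.5)
    ```

    Block-coordinate descent and homotopy methods, also surveyed in the paper, exploit the same soft-thresholding operator but update coordinates or follow the regularization path instead of taking full gradient steps.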

    The solution path of the generalized lasso

    We present a path algorithm for the generalized lasso problem. This problem penalizes the ℓ1 norm of a matrix D times the coefficient vector, and has a wide range of applications, dictated by the choice of D. Our algorithm is based on solving the dual of the generalized lasso, which greatly facilitates computation of the path. For D=I (the usual lasso), we draw a connection between our approach and the well-known LARS algorithm. For an arbitrary D, we derive an unbiased estimate of the degrees of freedom of the generalized lasso fit. This estimate turns out to be quite intuitive in many applications. Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/) at http://dx.doi.org/10.1214/11-AOS878 by the Institute of Mathematical Statistics (http://www.imstat.org).
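    The dual view this abstract relies on can be illustrated at a single fixed penalty level (a sketch of the dual formulation only, not the paper's path algorithm; names are illustrative): for the signal-approximation problem min_b 0.5*||y - b||^2 + lam*||D b||_1, the dual is a box-constrained least-squares problem in u, and the primal fit is recovered as b = y - D^T u. Taking D as the first-difference matrix gives the fused lasso, solved here by projected gradient on the dual.

    ```python
    import numpy as np

    def fused_lasso_fit(y, lam, n_iter=2000):
        n = len(y)
        D = np.diff(np.eye(n), axis=0)           # (n-1) x n first-difference matrix
        u = np.zeros(n - 1)
        step = 1.0 / np.linalg.norm(D, 2) ** 2   # safe step size for the dual problem
        for _ in range(n_iter):
            # Projected gradient ascent on the dual, with box constraint |u_i| <= lam.
            u = np.clip(u + step * (D @ (y - D.T @ u)), -lam, lam)
        return y - D.T @ u                        # primal fit from the dual solution

    # A two-level signal: the fit is piecewise constant with a shrunken jump.
    y = np.concatenate([np.full(20, 0.0), np.full(20, 3.0)])
    b = fused_lasso_fit(y, lam=1.0)
    ```

    The paper's contribution is to track the dual solution u as lam varies, which yields the entire solution path rather than the fit at one penalty level.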