    Bounded variation and relaxed curvature of surfaces

    We consider a relaxed notion of energy of non-parametric codimension-one surfaces that takes into account area, mean curvature, and Gauss curvature. It is given by the best value obtained by approximation with inscribed polyhedral surfaces. The BV and measure properties of functions with finite relaxed energy are studied. Concerning the total mean and Gauss curvature, the classical Schwarz-Peano counterexample to the definition of area is also analyzed.
    Comment: 25 pages
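
    A minimal sketch of the relaxation scheme described above, in our own notation rather than the paper's: writing Sigma_u for the graph of a function u and F(P) for the elementary energy of a polyhedral surface P (collecting its area, total mean curvature, and total Gauss curvature), the relaxed energy is the best value attainable along inscribed polyhedral approximations,

    \[
      \overline{\mathcal{F}}(u) \;=\; \inf\Big\{ \liminf_{h\to\infty} \mathcal{F}(P_h) \;:\; P_h \text{ polyhedral, inscribed in } \Sigma_u,\ P_h \to \Sigma_u \Big\}.
    \]

    Functions u with finite relaxed energy are then the objects whose BV and measure-theoretic properties the paper studies.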

    On Lasso refitting strategies

    A well-known drawback of l_1-penalized estimators is the systematic shrinkage of large coefficients towards zero. A simple remedy is to treat the Lasso as a model-selection procedure and to perform a second refitting step on the selected support. In this work we formalize the notion of refitting and provide oracle bounds for arbitrary refitting procedures applied to the Lasso solution. One of the most widely used refitting techniques, based on Least-Squares, may raise an interpretability issue, since the signs of the refitted estimator can be flipped with respect to the original estimator. This problem arises because Least-Squares refitting considers only the support of the Lasso solution, discarding any information about signs or amplitudes. To address this, we define sign-consistent refitting as an arbitrary refitting procedure that preserves the signs of the first-step Lasso solution, and we provide oracle inequalities for such estimators. We then consider two particular refitting strategies: Bregman Lasso and Boosted Lasso. Bregman Lasso has the useful property of converging to the Sign-Least-Squares refitting (Least-Squares with sign constraints), which provides greater interpretability. We additionally study Bregman Lasso refitting in the case of orthogonal design, providing simple intuition behind the proposed method. Boosted Lasso, in contrast, takes into account the magnitudes of the first Lasso step and allows us to derive better oracle rates for prediction. Finally, we conduct an extensive numerical study to show the advantages of each approach over the others in different synthetic and semi-real scenarios.
    Comment: revised version
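
    A minimal sketch of the two refitting notions discussed above, written with scikit-learn and SciPy; the function name, the regularization level alpha, and the data X, y are placeholders, and this is an illustration of the general idea rather than the authors' implementation. Least-Squares refitting re-estimates the coefficients on the Lasso support and may flip their signs, while the sign-constrained variant keeps the Lasso signs by solving a non-negative least-squares problem on sign-flipped columns.

    import numpy as np
    from scipy.optimize import nnls
    from sklearn.linear_model import Lasso, LinearRegression

    def refit_lasso(X, y, alpha=0.1):
        """First-step Lasso, then two refitting strategies on its support:
        plain Least-Squares and sign-constrained Least-Squares."""
        lasso = Lasso(alpha=alpha, fit_intercept=False).fit(X, y)
        beta = lasso.coef_
        S = np.flatnonzero(beta)                  # selected support
        ls_refit = np.zeros_like(beta)
        sls_refit = np.zeros_like(beta)
        if S.size:
            # Least-Squares refitting: unshrinks coefficients, signs may flip.
            ls_refit[S] = LinearRegression(fit_intercept=False).fit(X[:, S], y).coef_
            # Sign-constrained Least-Squares: keep the Lasso signs by solving
            # a non-negative least-squares problem on sign-flipped columns.
            s = np.sign(beta[S])
            c, _ = nnls(X[:, S] * s, y)
            sls_refit[S] = s * c
        return beta, ls_refit, sls_refit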