
    On-the-fly Approximation of Multivariate Total Variation Minimization

    In the context of change-point detection, addressed by total variation minimization strategies, an efficient on-the-fly algorithm has been designed that yields exact solutions for univariate data. In this contribution, an extension of this on-the-fly strategy to multivariate data is investigated. The proposed algorithm relies on the local validation of the Karush-Kuhn-Tucker conditions on the dual problem. Showing that the non-local nature of the multivariate setting precludes an exact on-the-fly solution, we devise an on-the-fly algorithm that delivers an approximate solution whose quality is controlled by a practitioner-tunable parameter, acting as a trade-off between quality and computational cost. Performance assessment shows that high-quality solutions are obtained on the fly at computational costs several orders of magnitude lower than those of standard iterative procedures. The proposed algorithm thus provides practitioners with an efficient on-the-fly multivariate change-point detection procedure.
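    For orientation, a standard form of the multivariate total variation minimization problem behind such change-point detectors is sketched below; this particular formulation is an assumption on my part, not quoted from the paper.

        % Assumed multivariate TV denoising objective: y_t in R^d are the
        % observations, x_t the piecewise-constant estimate, and lambda > 0
        % trades data fidelity against the number of change points.
        \min_{x \in \mathbb{R}^{d \times T}} \;
          \frac{1}{2} \sum_{t=1}^{T} \lVert y_t - x_t \rVert_2^2
          \; + \; \lambda \sum_{t=1}^{T-1} \lVert x_{t+1} - x_t \rVert_2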

    Prediction bounds for higher order total variation regularized least squares

    We establish adaptive results for trend filtering: least squares estimation with a penalty on the total variation of $(k-1)^{\text{th}}$ order differences. Our approach is based on combining a general oracle inequality for the $\ell_1$-penalized least squares estimator with "interpolating vectors" to upper-bound the "effective sparsity". This allows one to show that the $\ell_1$-penalty on the $k^{\text{th}}$ order differences leads to an estimator that can adapt to the number of jumps in the $(k-1)^{\text{th}}$ order differences of the underlying signal, or an approximation thereof. We show the result for $k \in \{1,2,3,4\}$ and indicate how it could be derived for general $k \in \mathbb{N}$.
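    As a concrete illustration, the trend-filtering estimator studied here can be computed with an off-the-shelf convex solver. The sketch below is mine, not the authors' code; the function name and the choice of cvxpy are assumptions.

        # Minimal trend-filtering sketch: least squares with an l1 penalty
        # on k-th order differences (illustrative only, not the paper's code).
        import numpy as np
        import cvxpy as cp

        def trend_filter(y, k=2, lam=5.0):
            """Solve min_x 0.5*||y - x||_2^2 + lam*||D^(k) x||_1."""
            x = cp.Variable(len(y))
            objective = (0.5 * cp.sum_squares(y - x)
                         + lam * cp.norm1(cp.diff(x, k=k)))
            cp.Problem(cp.Minimize(objective)).solve()
            return x.value

        # Toy example: a noisy piecewise-linear signal; k=2 adapts to kinks,
        # i.e. jumps in the first-order differences.
        y = np.concatenate([np.linspace(0, 1, 100), np.linspace(1, 0, 100)])
        y += 0.05 * np.random.randn(200)
        x_hat = trend_filter(y, k=2)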

    Robust Poisson Surface Reconstruction

    We propose a method to reconstruct surfaces from oriented point clouds with non-uniform sampling and noise by formulating the problem as a convex minimization that reconstructs the indicator function of the surface's interior. Compared to previous models, our reconstruction is robust to noise and outliers because it substitutes the least-squares fidelity term with a robust Huber penalty; this allows us to recover sharp corners and avoids the shrinking bias of least squares. We choose an implicit parametrization to reconstruct surfaces of unknown topology and close large gaps in the point cloud. For an efficient representation, we approximate the implicit function by a hierarchy of locally supported basis elements adapted to the geometry of the surface. Unlike ad-hoc bases over an octree, our hierarchical B-splines from isogeometric analysis locally adapt the mesh and degree of the splines during reconstruction. The hierarchical structure of the basis speeds up the minimization and efficiently represents clustered data. We also advocate for convex optimization, instead of isogeometric finite-element techniques, to efficiently solve the minimization and allow for non-differentiable functionals. Experiments show state-of-the-art performance within a more flexible framework.
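    The Huber penalty invoked above is the standard one: for a residual $r$ and threshold $\delta > 0$ it is quadratic near zero and linear in the tails, which is what removes the outlier sensitivity and shrinking bias of least squares. (Standard textbook definition, not quoted from the paper.)

        % Huber penalty: least-squares behaviour for small residuals,
        % l1-like (robust) behaviour for large ones.
        \rho_\delta(r) =
          \begin{cases}
            \tfrac{1}{2} r^2, & |r| \le \delta, \\[2pt]
            \delta \bigl( |r| - \tfrac{\delta}{2} \bigr), & |r| > \delta.
          \end{cases}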

    Continuous-Domain Solutions of Linear Inverse Problems with Tikhonov vs. Generalized TV Regularization

    We consider linear inverse problems that are formulated in the continuous domain. The object of recovery is a function that is assumed to minimize a convex objective functional. The solutions are constrained by imposing a continuous-domain regularization. We derive the parametric form of the solution (representer theorems) for Tikhonov (quadratic) and generalized total-variation (gTV) regularizations. We show that, in both cases, the solutions are splines that are intimately related to the regularization operator. In the Tikhonov case, the solution is smooth and constrained to live in a fixed subspace that depends on the measurement operator. By contrast, the gTV regularization results in a sparse solution composed of only a few dictionary elements, whose number is upper-bounded by the number of measurements and independent of the measurement operator. Our findings for the gTV regularization resonate with the minimization of the $\ell_1$ norm, which is its discrete counterpart and also produces sparse solutions. Finally, we find the solutions experimentally for some measurement models in one dimension. We discuss the special case where the gTV regularization results in multiple solutions and devise an algorithm to find an extreme point of the solution set that is guaranteed to be sparse.
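    Schematically, the two regularized problems contrasted here take the following forms, with $L$ the regularization operator, $\nu_m$ the measurement functionals, and $\lVert \cdot \rVert_{\mathcal{M}}$ the total-variation norm on measures; this write-up is my own paraphrase, and the paper's exact data-fidelity terms may differ.

        % Tikhonov (quadratic): smooth solution in a fixed,
        % measurement-dependent subspace.
        \min_{f} \; \sum_{m=1}^{M} \bigl( y_m - \langle \nu_m, f \rangle \bigr)^2
          + \lambda \, \lVert L f \rVert_{L_2}^2

        % Generalized TV: sparse spline solution with at most M knots.
        \min_{f} \; \sum_{m=1}^{M} \bigl( y_m - \langle \nu_m, f \rangle \bigr)^2
          + \lambda \, \lVert L f \rVert_{\mathcal{M}}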