125,881 research outputs found

    Total variation regularization for manifold-valued data

    Full text link
    We consider total variation minimization for manifold-valued data. We propose a cyclic proximal point algorithm and a parallel proximal point algorithm to minimize TV functionals with ℓ^p-type data terms in the manifold case. These algorithms are based on iterative geodesic averaging, which makes them easily applicable to a large class of data manifolds. As an application, we consider denoising images which take their values in a manifold. We apply our algorithms to diffusion tensor images, interferometric SAR images, as well as sphere- and cylinder-valued images. For the class of Cartan-Hadamard manifolds (which includes the data space in diffusion tensor imaging) we show the convergence of the proposed TV minimizing algorithms to a global minimizer.
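
    To make the geodesic-averaging idea concrete, here is a minimal sketch of a cyclic proximal point iteration for TV denoising of circle-valued (S^1) signals, where geodesic steps reduce to wrapped angular interpolation. The function names, step-size schedule, and toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def wrap(a):
    """Wrap angles to [-pi, pi)."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def geodesic_step(x, y, t):
    """Move from angle x toward angle y by the fraction t of the geodesic on S^1."""
    return wrap(x + t * wrap(y - x))

def cyclic_ppa_tv_s1(f, lam=0.5, n_iter=200):
    """Cyclic proximal point sketch for min_x 0.5*d(x,f)^2 + lam*TV(x), x in S^1."""
    x = f.copy()
    for k in range(1, n_iter + 1):
        sigma = 2.0 / k                               # diminishing step sizes
        # proximal step of the quadratic data term: slide toward the data
        x = geodesic_step(x, f, sigma / (1.0 + sigma))
        # proximal steps of the pairwise TV terms d(x_i, x_{i+1}),
        # applied cyclically to the even and then the odd pairs
        for parity in (0, 1):
            i = np.arange(parity, len(x) - 1, 2)
            d = wrap(x[i + 1] - x[i])
            step = np.minimum(sigma * lam, np.abs(d) / 2.0) * np.sign(d)
            x[i] = wrap(x[i] + step)
            x[i + 1] = wrap(x[i + 1] - step)
    return x

# toy example: a noisy piecewise-constant angle signal
rng = np.random.default_rng(0)
truth = np.where(np.arange(100) < 50, 0.3, -2.5)
noisy = wrap(truth + 0.2 * rng.standard_normal(100))
print(float(np.abs(wrap(cyclic_ppa_tv_s1(noisy) - truth)).mean()))
```

    On a general data manifold the same updates apply with `wrap` and `geodesic_step` replaced by the manifold's logarithm and exponential maps.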

    Combining patch-based estimation and total variation regularization for 3D InSAR reconstruction

    No full text
    In this paper we propose a new approach to height retrieval using multi-channel SAR interferometry. It combines patch-based estimation and total variation regularization to provide a regularized height estimate. The non-local adaptation of the likelihood term relies on the NL-SAR method, and the global optimization is carried out through graph-cut minimization. The method is evaluated on both synthetic and real data.
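
    The paper couples an NL-SAR-adapted (patch-based) likelihood with a TV prior and minimizes the resulting energy by graph cuts. As a rough stand-in, the sketch below regularizes a pre-estimated height map with a weighted TV-L2 model solved by a Chambolle-Pock primal-dual scheme; the per-pixel `weights` play the role of a hypothetical patch-based confidence, and all names and parameters are assumptions rather than the authors' code.

```python
import numpy as np

def grad(u):
    """Forward-difference gradient (zero at the far edges)."""
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    d = np.zeros_like(px)
    d[:-1, :] += px[:-1, :]
    d[1:, :] -= px[:-1, :]
    d[:, :-1] += py[:, :-1]
    d[:, 1:] -= py[:, :-1]
    return d

def weighted_tv_l2(f, w, lam=1.0, n_iter=300):
    """Chambolle-Pock scheme for min_u 0.5*sum(w*(u - f)**2) + lam*TV(u)."""
    u = f.copy()
    u_bar = u.copy()
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    tau = sigma = 1.0 / np.sqrt(8.0)        # step sizes with tau*sigma*L^2 <= 1
    for _ in range(n_iter):
        # dual ascent, then projection onto the radius-lam ball
        gx, gy = grad(u_bar)
        px, py = px + sigma * gx, py + sigma * gy
        scale = np.maximum(1.0, np.sqrt(px**2 + py**2) / lam)
        px, py = px / scale, py / scale
        # primal descent: closed-form prox of the weighted quadratic data term
        u_old = u
        u = (u + tau * div(px, py) + tau * w * f) / (1.0 + tau * w)
        u_bar = 2.0 * u - u_old
    return u

# toy example: noisy ramp "height map" with spatially varying confidence weights
rng = np.random.default_rng(0)
truth = np.tile(np.linspace(0.0, 10.0, 64), (64, 1))
noisy = truth + rng.standard_normal((64, 64))
weights = rng.uniform(0.5, 2.0, size=(64, 64))   # stand-in for patch-based confidence
height = weighted_tv_l2(noisy, weights, lam=0.8)
print(float(np.abs(height - truth).mean()))
```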

    VIVA: An Online Algorithm for Piecewise Curve Estimation Using ℓ⁰ Norm Regularization

    Get PDF
    Many processes deal with piecewise input functions, which occur naturally as a result of digital commands, user interfaces requiring a confirmation action, or discrete-time sampling. Examples include the assembly of protein polymers and hourly adjustments to the infusion rate of IV fluids during treatment of burn victims. Estimation of the input is straightforward regression when the observer has access to the timing information. More work is needed if the input can change at unknown times. Successful recovery of the change timing depends largely on the choice of cost function minimized during parameter estimation. Optimal estimation of a piecewise input will often proceed by minimization of a cost function which includes an estimation error term (most commonly mean square error) and the number (cardinality) of input changes (number of commands). Because the cardinality (ℓ0 norm) is not convex, the ℓ2 norm (quadratic smoothing) and ℓ1 norm (total variation minimization) are often substituted, since they permit the use of convex optimization algorithms. However, these penalize the magnitude of input changes and therefore bias the piecewise estimates. Another disadvantage is that global optimization methods must be run after the end of data collection. One approach to unbiasing the piecewise parameter fits would apply total variation minimization to recover the timing, followed by piecewise parameter fitting. Another method is presented herein: a dynamic programming approach which iteratively develops populations of candidate estimates of increasing length, pruning those proven to be dominated. Because the usage of input data is entirely causal, the algorithm recovers timing and parameter values online. A functional definition of the algorithm, which extends Viterbi decoding and integrates the pruning concept from branch-and-bound, is presented, along with modifications that improve handling of non-uniform sampling, non-uniform confidence, and burst errors. Performance tests using synthesized data sets, as well as volume data from a research system recording fluid infusions, show a five-fold (piecewise-constant data) and a 20-fold (piecewise-linear data) reduction in error compared to total variation minimization, along with improved sparsity and reduced sensitivity to the regularization parameter. Algorithmic complexity and delay are also considered.
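
    The cost function described above (a squared-error term plus a penalty on the number of changes) can be minimized exactly offline by dynamic programming over change points. The sketch below does that for piecewise-constant data; it is not the authors' online VIVA algorithm (there is no causal processing and no dominance pruning), and the function name, `beta`, and toy data are illustrative assumptions.

```python
import numpy as np

def l0_piecewise_constant(y, beta):
    """Exact offline DP for: min over segmentations of
    (sum of per-segment squared errors) + beta * (number of change points)."""
    n = len(y)
    # prefix sums allow O(1) evaluation of each segment's squared error
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y**2)))

    def seg_cost(i, j):
        """Squared error of fitting y[i:j] by its mean."""
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / (j - i)

    best = np.full(n + 1, np.inf)   # best[j]: optimal cost of the prefix y[:j]
    best[0] = -beta                 # cancels the penalty charged to the first segment
    prev = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):          # i is the start of the last segment
            c = best[i] + beta + seg_cost(i, j)
            if c < best[j]:
                best[j], prev[j] = c, i
    # backtrack the segmentation and fill each segment with its mean
    fit = np.empty(n)
    j = n
    while j > 0:
        i = prev[j]
        fit[i:j] = y[i:j].mean()
        j = i
    return fit

# toy example: one jump at sample 40; beta trades fit error against change count
rng = np.random.default_rng(1)
y = np.r_[np.full(40, 1.0), np.full(60, 3.0)] + 0.3 * rng.standard_normal(100)
print(np.unique(np.round(l0_piecewise_constant(y, beta=2.0), 2)))
```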

    Total Variation as a local filter

    Get PDF
    In the Rudin-Osher-Fatemi (ROF) image denoising model, Total Variation (TV) is used as a global regularization term. However, as we observe, the local interactions induced by Total Variation do not propagate much at long distances in practice, so that the ROF model is not far from being a local filter. In this paper, we propose to build a purely local filter by considering the ROF model in a given neighborhood of each pixel. We show that appropriate weights are required to avoid aliasing-like effects, and we provide an explicit convergence criterion for an associated dual minimization algorithm based on Chambolle's work. We study theoretical properties of the obtained local filter, and show that this localization of the ROF model brings an interesting optimization of the bias-variance trade-off and a strong reduction of a ROF drawback known as the "staircasing effect". We finally present a new denoising algorithm, TV-means, that efficiently combines the idea of local TV filtering with the non-local means patch-based method.
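
    A minimal sketch of the localization idea: solve the ROF model on a window around each pixel (here with Chambolle's dual projection algorithm) and keep only the centre value. The paper's window weights, convergence criterion, and the TV-means combination are omitted, and the helper names, window radius, and parameters below are illustrative assumptions.

```python
import numpy as np

def grad(u):
    """Forward-difference gradient (zero at the far edges)."""
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    d = np.zeros_like(px)
    d[:-1, :] += px[:-1, :]
    d[1:, :] -= px[:-1, :]
    d[:, :-1] += py[:, :-1]
    d[:, 1:] -= py[:, :-1]
    return d

def chambolle_rof(f, lam, n_iter=50, tau=0.125):
    """Chambolle's dual projection algorithm for the ROF model
    min_u ||u - f||^2 / (2*lam) + TV(u); the solution is u = f - lam*div(p)."""
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = grad(div(px, py) - f / lam)
        denom = 1.0 + tau * np.sqrt(gx**2 + gy**2)
        px = (px + tau * gx) / denom
        py = (py + tau * gy) / denom
    return f - lam * div(px, py)

def local_tv_filter(f, lam, radius=5):
    """Solve a small ROF problem on a window around each pixel and keep the
    centre value (the paper's anti-aliasing weights are omitted in this sketch)."""
    out = np.empty_like(f)
    pad = np.pad(f, radius, mode="reflect")
    for i in range(f.shape[0]):
        for j in range(f.shape[1]):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            out[i, j] = chambolle_rof(patch, lam)[radius, radius]
    return out

# toy usage on a small noisy square image
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
noisy = img + 0.2 * rng.standard_normal(img.shape)
print(float(np.abs(local_tv_filter(noisy, lam=0.3) - img).mean()))
```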