Confidence driven TGV fusion
We introduce a novel model for spatially varying variational data fusion,
driven by point-wise confidence values. The proposed model allows for the joint
estimation of the data and the confidence values based on the spatial coherence
of the data. We discuss the main properties of the introduced model as well as
suitable algorithms for estimating the solution of the corresponding biconvex
minimization problem and their convergence. The performance of the proposed
model is evaluated considering the problem of depth image fusion by using both
synthetic and real data from publicly available datasets.
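The alternating structure typical of such biconvex fusion energies can be illustrated with a toy sketch. This is my own simplification, not the paper's TGV model: the fused estimate u and the per-pixel confidences c are updated in turn, each sub-problem being easy on its own (here both steps are in closed form; the softmin temperature beta is a hypothetical parameter).

```python
import numpy as np

def fuse(depths, n_iter=20, beta=5.0):
    """Toy confidence-driven fusion of K noisy depth maps.

    Alternates between:
      u-step: confidence-weighted average (closed form),
      c-step: confidences from per-pixel residuals (softmin).
    Mimics the alternating scheme used for biconvex fusion
    energies; it is not the paper's exact model.
    """
    depths = np.stack(depths)            # (K, H, W)
    K = depths.shape[0]
    c = np.full_like(depths, 1.0 / K)    # uniform initial confidences
    for _ in range(n_iter):
        u = (c * depths).sum(0) / c.sum(0)   # u-step: weighted average
        r2 = (depths - u) ** 2               # per-map squared residuals
        c = np.exp(-beta * r2)               # c-step: softmin of residuals
        c /= c.sum(0, keepdims=True)         # normalize per pixel
    return u, c
```

On data where two maps agree and one is an outlier, the outlier's confidence collapses and the fused result follows the consistent maps.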
Combining Contrast Invariant L1 Data Fidelities with Nonlinear Spectral Image Decomposition
This paper focuses on multi-scale approaches for variational methods and
corresponding gradient flows. Recently, for convex regularization functionals
such as total variation, new theory and algorithms for nonlinear eigenvalue
problems via nonlinear spectral decompositions have been developed. Those
methods open new directions for advanced image filtering. However, for an
effective use in image segmentation and shape decomposition, a clear
interpretation of the spectral response regarding size and intensity scales is
needed but lacking in current approaches. In this context, L1 data
fidelities are particularly helpful due to their interesting multi-scale
properties such as contrast invariance. Hence, the novelty of this work is the
combination of L1-based multi-scale methods with nonlinear spectral
decompositions. We compare L1 with L2 scale-space methods in view of
spectral image representation and decomposition. We show that the contrast
invariant multi-scale behavior of L1 promotes sparsity in the spectral
response, providing more informative decompositions. We provide a numerical
method and analyze synthetic and biomedical images for which decomposition leads
to improved segmentation. Comment: 13 pages, 7 figures, conference SSVM 201
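For orientation, the nonlinear spectral framework this abstract refers to is usually stated as follows (a standard formulation from the nonlinear spectral decomposition literature, not reproduced from this specific paper). Given the gradient flow of a one-homogeneous regularizer J, the spectral response and reconstruction read:

```latex
% Gradient flow: \partial_t u(t) = -p(t), \quad p(t)\in\partial J(u(t)), \quad u(0)=f.
\phi(t) = t\,\partial_{tt} u(t), \qquad
f = \int_0^\infty \phi(t)\,\mathrm{d}t + \bar f, \qquad
S(t) = \|\phi(t)\|_{L^1},
```

where the spectrum S(t) peaks at the scales present in f, and band-pass filtering amounts to integrating phi(t) over a sub-interval of scales.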
TV-Stokes And Its Variants For Image Processing
The total variation minimization with a Stokes constraint, also known as the TV-Stokes model, has been considered one of the most successful models in image processing, especially in image restoration and sparse-data-based 3D surface reconstruction. This thesis studies the TV-Stokes model and its existing variants, and proposes new, more effective variants of the model, together with algorithms for them, applied to some of the most interesting image processing problems.
We first review some of the variational models that already exist, in particular the TV-Stokes model and its variants. Common techniques like the augmented Lagrangian and the dual formulation are also introduced. We then present our models as new variants of the TV-Stokes.
The main focus of the work has been on the sparse reconstruction of 3D surfaces. A model (WTR) with a vector fidelity, namely a gradient-vector fidelity, has been proposed and applied to both 3D cartoon design and height-map reconstruction. The model employs second-order total variation minimization, where the curl-free condition is satisfied automatically. Because the model couples both the height and the gradient vector representing the surface in the same minimization, it reconstructs the surface correctly. A variant of this model is then introduced, which includes a vector matching term. This matching term gives the model the capability to accurately represent the shape of a geometry in the reconstruction. Experiments show a significant improvement over state-of-the-art models, such as the TV model, higher-order TV models, and the anisotropic third-order regularization model, when applied to some general applications.
In another work, the thesis generalizes the TV-Stokes model from two dimensions to an arbitrary number of dimensions, introducing a convenient form of the constraint so that it extends to higher dimensions.
The thesis also explores the idea of feature accumulation through iterative regularization, introducing a Richardson-like iteration for the TV-Stokes. This is then followed by a more general, combined model based on the modified variant of the TV-Stokes. The resulting model is found to be equivalent to the well-known TGV model.
The thesis introduces some interesting numerical strategies for the solution of the TV-Stokes model and its variants. Higher-order PDEs are turned into inhomogeneous modified Helmholtz equations through transformations. These equations are then solved using the preconditioned conjugate gradient method or the fast Fourier transform. The thesis proposes a simple but quite general approach to finding closed-form solutions to a general L1 minimization problem, and applies it to design algorithms for our models. (Doctoral dissertation)
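The FFT strategy mentioned above can be sketched in a few lines: with periodic boundary conditions the discrete Laplacian is diagonal in Fourier space, so the modified Helmholtz equation (alpha*I - Laplacian) u = f is solved by one forward and one inverse transform. A minimal sketch assuming unit grid spacing; it is not the thesis's full solver.

```python
import numpy as np

def solve_modified_helmholtz(f, alpha):
    """Solve (alpha*I - Laplacian) u = f on a periodic grid by
    diagonalizing the 5-point Laplacian with the 2-D FFT.
    Assumes alpha > 0 and grid spacing h = 1."""
    ny, nx = f.shape
    ky = 2.0 * np.pi * np.fft.fftfreq(ny)
    kx = 2.0 * np.pi * np.fft.fftfreq(nx)
    # Fourier symbol of the 5-point discrete Laplacian (non-positive)
    lam = (2.0 * np.cos(ky)[:, None] - 2.0) + (2.0 * np.cos(kx)[None, :] - 2.0)
    u_hat = np.fft.fft2(f) / (alpha - lam)   # divide by the operator's symbol
    return np.real(np.fft.ifft2(u_hat))
```

The residual alpha*u - Laplacian(u) - f vanishes to machine precision, which is easy to verify by applying the stencil with `np.roll`.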
BM3D Frames and Variational Image Deblurring
A family of Block Matching 3-D (BM3D) algorithms for various imaging
problems has recently been proposed within the framework of nonlocal patch-wise
image modeling [1], [2]. In this paper we construct analysis and synthesis
frames, formalizing the BM3D image modeling, and use these frames to develop
novel iterative deblurring algorithms. We consider two formulations
of the deblurring problem: one given by minimization of a single objective
function and another based on the Nash equilibrium balance of two objective
functions. The latter results in an algorithm where the denoising and
deblurring operations are decoupled. The convergence of the developed
algorithms is proved. Simulation experiments show that the decoupled algorithm
derived from the Nash equilibrium formulation demonstrates the best numerical
and visual results and shows superiority with respect to the state of the art
in the field, confirming the valuable potential of BM3D-frames as an advanced
image modeling tool. Comment: Submitted to IEEE Transactions on Image Processing on May 18, 2011. An
implementation of the proposed algorithm is available as part of the BM3D
package at http://www.cs.tut.fi/~foi/GCF-BM3
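The decoupling idea described above can be sketched generically: one step inverts the blur in the Fourier domain, the other applies a black-box denoiser standing in for the frame-based shrinkage. This is a plug-and-play-style toy, not the paper's BM3D-frame algorithm; `denoise` and the coupling weight `mu` are my own placeholders, and periodic blur is assumed.

```python
import numpy as np

def decoupled_deblur(z, psf_fft, denoise, n_iter=30, mu=0.01):
    """Toy decoupled deblurring loop in the spirit of a
    Nash-equilibrium formulation: a Tikhonov-regularized inverse
    filter handles the blur; `denoise` (any image -> image map,
    standing in for BM3D-frame shrinkage) handles the prior."""
    u = z.copy()
    Hc = np.conj(psf_fft)
    Z = np.fft.fft2(z)
    for _ in range(n_iter):
        # deblurring step: regularized inversion anchored at current u
        U = (Hc * Z + mu * np.fft.fft2(u)) / (np.abs(psf_fft) ** 2 + mu)
        v = np.real(np.fft.ifft2(U))
        # denoising step: decoupled prior update
        u = denoise(v)
    return u
```

With an identity blur and an identity denoiser the loop is a fixed point at the observation, which makes the two roles of the steps easy to check in isolation.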
Doctor of Philosophy dissertation
Dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is a powerful tool to detect cardiac diseases and tumors, and both spatial resolution and temporal resolution are important for disease detection. Sampling less in each time frame and applying sophisticated reconstruction methods to overcome image degradations is a common strategy in the literature. In this thesis, the temporal TV constrained reconstruction that was successfully applied to DCE myocardial perfusion imaging by our group was extended to three-dimensional (3D) DCE breast and 3D myocardial perfusion imaging, and the extension includes different forms of constraint terms and various sampling patterns. We also explored some other popular reconstruction algorithms at a theoretical level and showed that they can be included in a unified framework. Current 3D Cartesian DCE breast tumor imaging is limited in spatiotemporal resolution, as high temporal resolution is desired to track the contrast enhancement curves and high spatial resolution is desired to discern tumor morphology. Here, temporal TV constrained reconstruction was extended and different forms of temporal TV constraints were compared on 3D Cartesian DCE breast tumor data with simulated undersampling. Kinetic parameter analysis was used to validate the methods.
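The temporal TV constraint at the heart of this abstract amounts to penalizing differences between consecutive time frames. A minimal sketch of that proximal sub-problem, using a Chambolle-style dual iteration along the time axis; it illustrates the constraint only, not the thesis's full undersampled-reconstruction algorithm, and the step size `tau` and weight `lam` are illustrative choices.

```python
import numpy as np

def temporal_tv_denoise(x, lam=0.5, n_iter=300, tau=0.2):
    """Dual ascent for  min_u 0.5*||u - x||^2 + lam * sum_t |u[t+1] - u[t]|
    applied along the first (time) axis of a (T, H, W) series."""
    p = np.zeros((x.shape[0] - 1,) + x.shape[1:])   # dual variable on frame pairs
    div = np.zeros_like(x)
    for _ in range(n_iter):
        g = np.diff(div - x / lam, axis=0)          # temporal gradient of residual
        p = (p + tau * g) / (1.0 + tau * np.abs(g)) # keeps |p| <= 1 automatically
        div = np.zeros_like(x)                      # divergence of p along time
        div[:-1] += p
        div[1:] -= p
    return x - lam * div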
Generating structured non-smooth priors and associated primal-dual methods
The purpose of the present chapter is to bind together and extend some recent developments regarding data-driven non-smooth regularization techniques in image processing by means of a bilevel minimization scheme. The scheme, considered in function space, takes advantage of a dualization framework and is designed to produce spatially varying regularization parameters adapted to the data for well-known regularizers, e.g. Total Variation and Total Generalized Variation, leading to automated (monolithic) image reconstruction workflows. A presentation of the theory of bilevel optimization and the theoretical background of the dualization framework, as well as a brief review of the aforementioned regularizers and their parameterization, makes this chapter self-contained. Aspects of the numerical implementation of the scheme are discussed and numerical examples are provided.
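The lower-level problem in such schemes is typically a weighted TV model solved by a primal-dual method. A minimal sketch of a Chambolle-Pock iteration with a pixel-wise weight `alpha`, assuming periodic boundaries; it illustrates the primal-dual machinery the chapter builds on, while the bilevel scheme that actually chooses `alpha` is not reproduced here.

```python
import numpy as np

def weighted_tv_denoise(f, alpha, n_iter=300, tau=0.25, sigma=0.25):
    """Primal-dual iteration for  min_u 0.5*||u - f||^2 + sum alpha*|grad u|
    with a spatially varying (per-pixel) weight alpha >= 0.
    Step sizes satisfy tau*sigma*||grad||^2 <= 1 on a periodic grid."""
    u = f.copy()
    ub = f.copy()                       # extrapolated primal iterate
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    for _ in range(n_iter):
        # dual ascent, then pointwise projection onto {|p| <= alpha}
        gx = np.roll(ub, -1, 1) - ub
        gy = np.roll(ub, -1, 0) - ub
        px += sigma * gx
        py += sigma * gy
        scale = np.maximum(1.0, np.hypot(px, py) / np.maximum(alpha, 1e-12))
        px /= scale
        py /= scale
        # primal descent plus proximal step of the quadratic fidelity
        u_old = u
        divp = (px - np.roll(px, 1, 1)) + (py - np.roll(py, 1, 0))
        u = (u + tau * divp + tau * f) / (1.0 + tau)
        ub = 2.0 * u - u_old
    return u
```

Setting `alpha = 0` recovers the data exactly, while a large uniform `alpha` flattens the image toward its mean, which is exactly the behavior a spatially varying weight lets the bilevel scheme trade off per pixel.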