On Algorithms Based on Joint Estimation of Currents and Contrast in Microwave Tomography
This paper deals with improvements to the contrast source inversion method
which is widely used in microwave tomography. First, the method is reviewed and
weaknesses of both the criterion form and the optimization strategy are
underlined. Then, two new algorithms are proposed. Both are based on the
same criterion, which is similar to, but more robust than, the one used in
contrast source inversion. The first technique keeps the main characteristics of the
contrast source inversion optimization scheme but is based on a better
exploitation of the conjugate gradient algorithm. The second technique is based
on a preconditioned conjugate gradient algorithm and performs simultaneous
updates of sets of unknowns that are normally processed sequentially. Both
techniques are shown to be more efficient than the original contrast source
inversion.
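The sequential-versus-simultaneous distinction above can be illustrated on a toy quadratic criterion. The sketch below is a generic comparison of block-coordinate gradient steps against a simultaneous gradient step over all unknowns, splitting the variables into two blocks by analogy with currents and contrast; it is not the paper's algorithm, and no scattering model is involved.

```python
import numpy as np

# Toy criterion J(x) = ||b - M x||^2 with x split into two blocks.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((40, 2 * n))
b = rng.standard_normal(40)

def J(x):
    r = b - M @ x
    return r @ r

def grad(x):
    return -2.0 * M.T @ (b - M @ x)

step = 1.0 / (2.0 * np.linalg.norm(M, 2) ** 2)  # 1 / Lipschitz constant

# Sequential (block-coordinate) updates: first block, then second,
# recomputing the gradient in between.
x_seq = np.zeros(2 * n)
for _ in range(200):
    g = grad(x_seq)
    x_seq[:n] -= step * g[:n]
    g = grad(x_seq)
    x_seq[n:] -= step * g[n:]

# Simultaneous update: one gradient step over all unknowns at once.
x_sim = np.zeros(2 * n)
for _ in range(200):
    x_sim -= step * grad(x_sim)

print(J(x_seq), J(x_sim))
```

Both schemes decrease the criterion from J(0); on a genuinely coupled bilinear criterion, as in contrast source inversion, the two strategies can behave quite differently.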
Scaled Projected-Directions Methods with Application to Transmission Tomography
Statistical image reconstruction in X-ray computed tomography yields
large-scale regularized linear least-squares problems with nonnegativity
bounds, where the memory footprint of the operator is a concern. Discretizing
images in cylindrical coordinates results in significant memory savings, and
allows parallel operator-vector products without on-the-fly computation of the
operator, without necessarily decreasing image quality. However, it
deteriorates the conditioning of the operator. We improve the Hessian
conditioning by way of a block-circulant scaling operator and we propose a
strategy to handle nondiagonal scaling in the context of projected-directions
methods for bound-constrained problems. We describe our implementation of the
scaling strategy using two algorithms: TRON, a trust-region method with exact
second derivatives, and L-BFGS-B, a linesearch method with a limited-memory
quasi-Newton Hessian approximation. We compare our approach with one where a
change of variable is made in the problem. On two reconstruction problems, our
approach converges faster than the change of variable approach, and achieves
much tighter accuracy in terms of optimality residual than a first-order
method.
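As a minimal illustration of the problem class above, the following sketch solves a nonnegativity-bounded regularized least-squares problem with projected gradient steps. This is a generic first-order stand-in, not the paper's TRON or L-BFGS-B implementation, and it omits the block-circulant scaling; the matrix and data are synthetic.

```python
import numpy as np

# min_x 0.5 ||A x - b||^2 + 0.5 * lam * ||x||^2   subject to  x >= 0
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 30))
x_true = np.clip(rng.standard_normal(30), 0.0, None)  # nonnegative truth
b = A @ x_true + 0.01 * rng.standard_normal(60)
lam = 1e-2

step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)  # 1 / Lipschitz constant
x = np.zeros(30)
for _ in range(500):
    g = A.T @ (A @ x - b) + lam * x    # gradient of the criterion
    x = np.maximum(x - step * g, 0.0)  # projection onto the bounds
print(np.linalg.norm(A @ x - b))
```

The projection keeps every iterate feasible; a nondiagonal scaling operator, as proposed in the paper, would replace the plain gradient direction with a scaled one, which is what complicates the projection step.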
Unsupervised image segmentation using a telegraph parameterization of Pickard random fields
This communication presents an unsupervised three-dimensional segmentation method based on a discrete-level unilateral Markov field model for the labels and conditionally Gaussian densities for the observed voxels. Such models have been shown to yield numerically efficient algorithms, both for segmentation and for estimation of the model parameters. Our contribution is twofold. First, we deal with the degeneracy of the likelihood function with respect to the parameters of the Gaussian densities, a well-known problem for such mixture models. We introduce a bounded penalized likelihood function that has recently been shown to provide a consistent estimator in the simpler case of independent Gaussian mixtures; at the same time, implementation with EM reestimation formulas remains possible with only limited changes with respect to the standard case. Second, we propose a telegraph parameterization of the unilateral Markov field. On a theoretical level, this parameterization ensures that some important properties of the field (e.g., stationarity) hold. On a practical level, it reduces the computational complexity of the algorithm used in the segmentation and parameter estimation stages of the procedure. In addition, it decreases the number of model parameters to be estimated, thereby improving the convergence speed and accuracy of the corresponding estimation method.
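The penalized-likelihood idea above can be sketched on a one-dimensional two-component Gaussian mixture. The inverse-gamma-style penalty constants `a` and `b` below are an illustrative choice standing in for the paper's bounded penalty, and the model is independent (no Markov field); the point is that the penalized M-step keeps the variances away from the degenerate value zero while leaving the EM structure intact.

```python
import numpy as np

# Synthetic data from two well-separated Gaussians.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])
n = x.size

pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])
a, b = 0.1, 1.0  # penalty hyperparameters (assumed for illustration)

for _ in range(50):
    # E-step: posterior responsibilities of each component.
    dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = pi * dens
    resp /= resp.sum(axis=1, keepdims=True)
    nk = resp.sum(axis=0)
    # M-step: standard updates for pi and mu; the penalized variance
    # update adds 2a to the numerator and 2b to the denominator, which
    # bounds each variance strictly above zero.
    pi = nk / n
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = ((resp * (x[:, None] - mu) ** 2).sum(axis=0) + 2 * a) / (nk + 2 * b)

print(mu, var)
```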
On the stopping criterion for numerical methods for linear systems with additive Gaussian noise
We consider the inversion of a linear operator with centered Gaussian white noise by MAP
estimation with a Gaussian prior distribution on the solution. The actual estimator is
computed approximately by a numerical method. We propose a relation between the stationarity
measure of this approximate solution and the mean square error of the exact solution. This
relation enables the formulation of a stopping test for the numerical method, met only by
iterates that satisfy chosen statistical properties. We extend this development to Gibbs
priors using a quadratic extrapolation of the log-likelihood maximized by the MAP estimator.
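The shape of such a stopping rule can be sketched as follows: iterate a numerical method on the Gaussian MAP criterion and stop once a stationarity measure (here the gradient norm) falls below a tolerance. The relative tolerance used below is an assumed heuristic standing in for the paper's statistically derived test.

```python
import numpy as np

# Gaussian MAP criterion for y = A x + noise, noise variance sig2,
# zero-mean Gaussian prior with variance tau2:
#   J(x) = 0.5 ||y - A x||^2 / sig2 + 0.5 ||x||^2 / tau2
rng = np.random.default_rng(3)
A = rng.standard_normal((50, 30))
x_true = rng.standard_normal(30)
sig2, tau2 = 0.01, 1.0
y = A @ x_true + np.sqrt(sig2) * rng.standard_normal(50)

def grad(x):
    return A.T @ (A @ x - y) / sig2 + x / tau2

step = 1.0 / (np.linalg.norm(A, 2) ** 2 / sig2 + 1.0 / tau2)
tol = 1e-3 * np.linalg.norm(grad(np.zeros(30)))  # assumed relative tolerance

x = np.zeros(30)
for k in range(10000):
    g = grad(x)
    if np.linalg.norm(g) <= tol:  # stopping test on the stationarity measure
        break
    x -= step * g
print(k, np.linalg.norm(grad(x)))
```

The iteration count at which the test fires depends on the conditioning of the operator; the paper's contribution is to tie the threshold to statistical properties of the exact estimator rather than to an arbitrary constant.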
Restoration by 3D Markov Fields with Convex Potentials Applied to Tomographic Images
We present a 3D Markov random field image restoration method developed to achieve the very high accuracy required for manufacturing custom knee prostheses. In the Bayesian maximum a posteriori framework, the femur slice images to be restored are modeled by Markov random fields, which capture the local characteristics of the images. Because of the particular geometry of the problem, we have developed a prior model in which the interaction between successive slices is also taken into account, in the form of a 3D Markov field. Thus, the behavior of a pixel is conditioned by the 8 pixels surrounding it in the slice plane and by the 2 nearest pixels in the neighboring slices. The choice of convex Gibbs potentials and the assumption of centered white Gaussian noise then lead to a convex 3D criterion whose optimization can be performed directly by a method specially adapted to reduce the number of operations. The accuracy of the restoration and, even more, of the resulting segmentation is paramount in our case. The method was therefore tested on tomographic images of phantoms of known dimensions and, finally, on real images of femur slices. The presented method converges quickly with a low computational cost, and the restoration is more accurate than the methods currently in use.
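A miniature version of such a convex 3D MAP criterion can be written down directly. The sketch below assumes quadratic Gibbs potentials on nearest neighbors along each axis (one convex choice; the original method's potentials, neighborhood weights, and specialized optimizer may differ) and minimizes the criterion by plain gradient descent on a small synthetic volume.

```python
import numpy as np

# Small 3-D phantom (slices, rows, cols) plus white Gaussian noise.
rng = np.random.default_rng(4)
clean = np.zeros((6, 16, 16))
clean[:, 4:12, 4:12] = 1.0
y = clean + 0.3 * rng.standard_normal(clean.shape)

lam = 0.5  # regularization weight (assumed)

def DtD(x, axis):
    # D^T D x for the forward-difference operator D along one axis,
    # i.e. the gradient contribution of the quadratic potentials.
    d = np.diff(x, axis=axis)
    pad = [(0, 0)] * 3
    pad[axis] = (1, 1)
    return -np.diff(np.pad(d, pad), axis=axis)

def grad(x):
    # Gradient of J(x) = 0.5||x - y||^2 + 0.5*lam*sum_axis ||D x||^2.
    g = x - y
    for axis in range(3):
        g += lam * DtD(x, axis)
    return g

step = 1.0 / (1.0 + 12.0 * lam)  # 1 / Lipschitz constant of grad
x = y.copy()
for _ in range(100):
    x -= step * grad(x)
```

Because the criterion is convex and smooth, any descent scheme reaches the same restored volume; the paper's specialized optimizer targets the same minimizer with fewer operations.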
Coronary stent artifact reduction with an edge-enhancing reconstruction kernel: a prospective cross-sectional study with 256-slice CT
Purpose
Metallic artifacts can result in an artificial thickening of the coronary stent wall which can significantly impair computed tomography (CT) imaging in patients with coronary stents. The objective of this study is to assess in vivo visualization of coronary stent wall and lumen with an edge-enhancing CT reconstruction kernel, as compared to a standard kernel.
Methods
This is a prospective cross-sectional study involving the assessment of 71 coronary stents (24 patients), with blinded observers. After 256-slice CT angiography, image reconstruction was done with medium-smooth and edge-enhancing kernels. Stent wall thickness was measured with both orthogonal and circumference methods, averaging thickness from diameter and circumference measurements, respectively. Image quality was assessed quantitatively using objective parameters (noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios), as well as visually using a 5-point Likert scale.
Results
Stent wall thickness was decreased with the edge-enhancing kernel in comparison to the standard kernel, either with the orthogonal (0.97 ± 0.02 versus 1.09 ± 0.03 mm, respectively; p<0.001) or the circumference method (1.13 ± 0.02 versus 1.21 ± 0.02 mm, respectively; p = 0.001). The edge-enhancing kernel generated less overestimation from nominal thickness compared to the standard kernel, both with the orthogonal (0.89 ± 0.19 versus 1.00 ± 0.26 mm, respectively; p<0.001) and the circumference (1.06 ± 0.26 versus 1.13 ± 0.31 mm, respectively; p = 0.005) methods. The edge-enhancing kernel was associated with lower SNR and CNR, as well as higher background noise (all p < 0.001), in comparison to the medium-smooth kernel. Stent visual scores were higher with the edge-enhancing kernel (p<0.001).
Conclusion
In vivo 256-slice CT assessment of coronary stents shows that the edge-enhancing CT reconstruction kernel generates thinner stent walls, less overestimation from nominal thickness, and better image quality scores than the standard kernel.
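The two measurement conventions in the Methods can be made concrete with a back-of-the-envelope calculation. The values below are hypothetical, and the formulas are a plausible reading of the abstract (thickness from the diameter difference versus from the circumference difference), not the study's exact protocol.

```python
import math

# Hypothetical stent measurements in mm, assuming a circular cross-section.
d_outer, d_lumen = 4.0, 2.0
c_outer, c_lumen = math.pi * d_outer, math.pi * d_lumen

# Orthogonal method: wall thickness from diameter measurements.
t_orthogonal = (d_outer - d_lumen) / 2.0
# Circumference method: wall thickness from circumference measurements.
t_circumference = (c_outer - c_lumen) / (2.0 * math.pi)

print(t_orthogonal, t_circumference)
```

For a perfectly circular cross-section the two agree (1.0 mm here); in vivo, blooming artifacts distort the two measurements differently, which is why the study reports both.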