25 research outputs found

    An Accelerated Iterative Reweighted Least Squares Algorithm for Compressed Sensing MRI

    Compressed sensing for MRI (CS-MRI) attempts to recover an object from undersampled k-space data by minimizing sparsity-promoting regularization criteria. The iterative reweighted least squares (IRLS) algorithm can perform the minimization task by recursively solving iteration-dependent linear systems. However, this process can be slow because the associated linear system is often poorly conditioned for ill-posed problems. We propose a new scheme based on the matrix inversion lemma (MIL) to accelerate the solving process. We demonstrate numerically for CS-MRI that our method provides a significant speed-up compared to linear and nonlinear conjugate gradient algorithms, thus making it a promising alternative for such applications.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85957/1/Fessler250.pd

    Parallel MR Image Reconstruction Using Augmented Lagrangian Methods

    Magnetic resonance image (MRI) reconstruction using SENSitivity Encoding (SENSE) requires regularization to suppress noise and aliasing effects. Edge-preserving and sparsity-based regularization criteria can improve image quality, but they demand computation-intensive nonlinear optimization. In this paper, we present novel methods for regularized MRI reconstruction from undersampled sensitivity-encoded data (SENSE reconstruction) using the augmented Lagrangian (AL) framework for solving large-scale constrained optimization problems. We first formulate regularized SENSE reconstruction as an unconstrained optimization task and then convert it to a set of (equivalent) constrained problems using variable splitting. We then attack these constrained versions in an AL framework using an alternating minimization method, leading to algorithms that can be implemented easily. The proposed methods are applicable to a general class of regularizers that includes popular edge-preserving (e.g., total-variation) and sparsity-promoting (e.g., ℓ1-norm of wavelet coefficients) criteria and combinations thereof. Numerical experiments with synthetic and in vivo human data illustrate that the proposed AL algorithms converge faster than both general-purpose optimization algorithms such as nonlinear conjugate gradient (NCG) and state-of-the-art MFISTA.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85846/1/Fessler4.pd
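The variable-splitting-plus-AL recipe can be sketched on the simplest case: ℓ1-regularized least squares with the trivial split x = z, which yields the familiar ADMM iteration. This is a toy sketch under that assumption, not the paper's SENSE reconstruction; `al_lasso` is an illustrative name.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding, the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def al_lasso(A, y, lam, rho=1.0, n_iter=500):
    """Augmented Lagrangian / alternating-minimization sketch for
        min_x 0.5 * ||A x - y||^2 + lam * ||x||_1
    via the split x = z: alternate an easy quadratic solve in x,
    a closed-form shrinkage in z, and a dual (multiplier) update.
    """
    m, n = A.shape
    Aty = A.T @ y
    # Factor once; every x-update is then two cheap triangular solves.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    z = np.zeros(n)
    u = np.zeros(n)          # scaled Lagrange multiplier
    for _ in range(n_iter):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Aty + rho * (z - u)))
        z = soft(x + u, lam / rho)
        u = u + x - z
    return z
```

The one-time factorization is what makes each AL iteration cheap, which is the structural reason such schemes can outrun NCG-type methods on problems with easy quadratic subproblems.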

    Monte Carlo SURE‐based parameter selection for parallel magnetic resonance imaging reconstruction

    Purpose: Regularizing parallel magnetic resonance imaging (MRI) reconstruction significantly improves image quality but requires tuning parameter selection. We propose a Monte Carlo method for automatic parameter selection based on Stein's unbiased risk estimate that minimizes the multichannel k-space mean squared error (MSE). We automatically tune parameters for image reconstruction methods that preserve the undersampled acquired data, which cannot be accomplished using existing techniques. Theory: We derive a weighted MSE criterion appropriate for data-preserving regularized parallel imaging reconstruction and the corresponding weighted Stein's unbiased risk estimate. We describe a Monte Carlo approximation of the weighted Stein's unbiased risk estimate that uses two evaluations of the reconstruction method per candidate parameter value. Methods: We reconstruct images using denoising sparse images from GRAPPA using the nullspace method (DESIGN) and L1 iterative self-consistent parallel imaging (L1-SPIRiT). We validate Monte Carlo Stein's unbiased risk estimate against the weighted MSE. We select the regularization parameter using these methods for various noise levels and undersampling factors and compare the results to those using MSE-optimal parameters. Results: Our method selects nearly MSE-optimal regularization parameters for both DESIGN and L1-SPIRiT over a range of noise levels and undersampling factors. Conclusion: The proposed method automatically provides nearly MSE-optimal choices of regularization parameters for data-preserving nonlinear parallel MRI reconstruction methods. Magn Reson Med 71:1760–1770, 2014. © 2013 Wiley Periodicals, Inc.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/106872/1/mrm24840.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/106872/2/mrm24840-sup-0001-suppinfo.pd
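The "two evaluations of the reconstruction method per candidate parameter value" refers to a randomized divergence estimate: probe the black-box reconstruction with a small noise perturbation and correlate the response with the probe. A minimal sketch of that idea (not the paper's weighted variant; the linear test map is only there so the exact answer is known):

```python
import numpy as np

def mc_divergence(f, y, eps=1e-3, n_probes=20, rng=None):
    """Monte-Carlo estimate of div f(y) = sum_i d f_i / d y_i using only
    two evaluations of f per probe:
        E_b [ b^T (f(y + eps * b) - f(y)) / eps ],  b ~ N(0, I).
    This divergence is exactly the quantity a SURE-type risk estimate
    needs from a black-box reconstruction method."""
    rng = np.random.default_rng(0) if rng is None else rng
    f0 = f(y)
    est = 0.0
    for _ in range(n_probes):
        b = rng.standard_normal(y.shape)
        est += b @ (f(y + eps * b) - f0) / eps
    return est / n_probes

# For the linear map f(y) = 0.5 * y the divergence is exactly 0.5 * len(y),
# which gives a ground truth to sanity-check the estimator against.
```

In the paper's setting, `f` would be the regularized parallel-imaging reconstruction evaluated at one candidate parameter value.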

    Use of MOSFET dosimeters to validate Monte Carlo radiation treatment calculation in an anthropomorphic phantom

    Radiation therapy treatment planning based on Monte Carlo simulation provides a very accurate dose calculation compared to deterministic systems. Nowadays, Metal-Oxide-Semiconductor Field Effect Transistor (MOSFET) dosimeters are increasingly used in radiation therapy to verify the dose received by patients. In the present work, we used MCNP6 (Monte Carlo N-Particle transport code) to simulate the irradiation of an anthropomorphic phantom (RANDO) with a medical linear accelerator. A detailed model of the Elekta Precise multileaf collimator with a 6 MeV photon beam was designed and validated against different beam sizes and shapes in previous work. To include the RANDO phantom geometry in the simulation, a set of computed tomography (CT) images of the phantom was obtained and formatted. The slices are input into the PLUNC software, which performs segmentation by defining anatomical structures, and a Matlab algorithm writes the phantom information in MCNP6 input-deck format. The simulation, and hence the phantom model and irradiation setup, was validated by comparing High-Sensitivity MOSFET dosimeter (Best Medical Canada) measurements at different points inside the phantom with the simulation results. On-line wireless MOSFETs provide dose estimates in an extremely thin sensitive volume, so a meticulous and accurate validation was performed. The comparison shows good agreement between the MOSFET measurements and the Monte Carlo calculations, confirming the validity of the developed procedure for including patient CT data in simulations and supporting the use of Monte Carlo simulation for accurate treatment planning.
    The authors wish to thank the Consorci Hospitalari Provincial de Castello (Spain) for their kind support in doing this work. This work was partially supported through "Programa para la innovacion e incentivacion" of the Universitat Politecnica de Valencia "INNOVA 2012". This work has been done under the project: Monte Carlo Treatment Planning System: Software for the high precision dosimetry calculations in radiotherapy (SP20120824).
    Juste Vidal, BJ.; Miró Herrero, R.; Abella Aranda, V.; Santos Serra, A.; Verdú Martín, GJ. (2015). Use of MOSFET dosimeters to validate Monte Carlo radiation treatment calculation in an anthropomorphic phantom. Radiation Physics and Chemistry. 116:208-213. https://doi.org/10.1016/j.radphyschem.2015.04.025

    Nonideal sampling and regularized interpolation of noisy data

    Conventional sampling (Shannon's sampling formulation and its approximation-theoretic counterparts) and interpolation theories provide effective solutions to the problem of reconstructing a signal from its samples, but they are primarily restricted to the noise-free scenario. The purpose of this thesis is to extend the standard techniques so as to be able to handle noisy data. First, we consider a realistic setting where a multidimensional signal is prefiltered prior to sampling, and the samples are corrupted by additive noise. In order to counterbalance the effect of noise, the reconstruction problem is formulated in a variational framework where the solution is obtained by minimizing a continuous-domain Tikhonov-like L2-regularization subject to an ℓp-based data fidelity constraint. We present theoretical justification for the minimization of this cost functional and show that the global-minimum solution belongs to a shift-invariant space generated by a function that is generally not bandlimited. The optimal reconstruction space is characterized by a condition that links the generating function to the regularization operator and implies the existence of a B-spline-like basis. We also consider stochastic formulations – min-max and minimum mean-squared error (MMSE/Wiener) formulations – of the nonideal sampling problem and show that they yield the same type of estimators and point towards the existence of optimal shift-invariant spaces for certain classes of stochastic processes. In the stochastic context, we also derive an exact formula for the error of approximating a stationary stochastic signal in the presence of discrete additive noise and justify the noise-reducing effect of regularization through illustrations. Next, we focus on the use of a much wider class of non-quadratic regularization functionals for the problem of interpolation in the presence of noise. 
    Starting from the affine-invariance of the solution, we show that the Lp-norm (p ≠ 2) is the most suitable type of non-quadratic regularization for our purpose. We give monotonically convergent numerical algorithms to carry out the minimization of the non-quadratic cost criterion. We also demonstrate experimentally that the proposed regularized interpolation scheme provides superior interpolation performance compared to standard methods in the presence of noise. Finally, we address the problem of selecting an appropriate value for the regularization parameter, which is most crucial for the working of variational methods in general, including those discussed in this thesis. We propose a practical scheme that is based on the concept of risk estimation to achieve minimum MSE performance. In this context, we first review a well-known result due to Stein (Stein's unbiased risk estimate — SURE) that is applicable for data corrupted by additive Gaussian noise and also derive a new risk estimate for a Poisson-Gaussian mixture model that is appropriate for certain biomedical imaging applications. Next, we introduce a novel and efficient Monte-Carlo technique to compute SURE for arbitrary nonlinear algorithms. We demonstrate experimentally that the proposed Monte-Carlo SURE yields regularization parameter values that are close to the oracle-optimum (minimum MSE) for all methods considered in this work. We also present results that illustrate the applicability of our technique to a wide variety of algorithms in denoising and deconvolution.
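A discrete toy analogue of the quadratic (Tikhonov-regularized) reconstruction described above: denoise samples by penalizing a second-difference operator, which plays the role of the continuous-domain regularization operator. This is a sketch under that discretization, not the thesis's continuous-domain, shift-invariant-space formulation.

```python
import numpy as np

def tikhonov_smooth(y, lam):
    """Discrete sketch of regularized reconstruction from noisy samples:
        min_x ||x - y||^2 + lam * ||D x||^2,
    where D is the second-difference operator (a discrete stand-in for a
    Tikhonov / smoothing-spline regularization operator).  Closed form:
        x = (I + lam * D^T D)^{-1} y.
    """
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
```

The regularization parameter `lam` is exactly the quantity the thesis's risk-estimation (SURE-based) machinery is designed to select automatically.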

    Improved Bearing Estimation in Ocean Using Wavelet Denoising

    Passive localisation and bearing estimation of underwater acoustic sources is a problem of great interest in the area of ocean acoustics. Bearing estimation techniques often perform poorly due to the low signal-to-noise ratio (SNR) at the sensor array. This paper proposes signal enhancement by wavelet denoising to improve the performance of the bearing estimation techniques. Methods have been developed in the recent past to effectively perform wavelet denoising in the multisensor scenario (wavelet array denoising). Following one such approach, the acoustic signal received at the array is spatially decorrelated and then denoised. The denoised and recorrelated signal is then used for bearing estimation employing known bearing estimation techniques (MUSIC and Subspace Intersection). It is shown that wavelet array denoising improves the performance of the bearing estimators significantly. The case of perturbed arrays is also considered as a special case for application of wavelet array denoising. Simulation results show that the denoising estimator has lower mean square error and higher resolution.
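The per-channel denoising step can be sketched with a hand-rolled orthonormal Haar transform and soft thresholding at the universal threshold. This is a single-channel sketch only; the paper's pipeline adds spatial decorrelation across sensors before denoising and recorrelation after, which is omitted here, and the function names are illustrative.

```python
import numpy as np

def haar_fwd(x, levels):
    """Orthonormal Haar DWT: coarse approximation + list of detail bands."""
    details = []
    a = x.astype(float)
    for _ in range(levels):
        a, d = (a[0::2] + a[1::2]) / np.sqrt(2), (a[0::2] - a[1::2]) / np.sqrt(2)
        details.append(d)
    return a, details

def haar_inv(a, details):
    """Inverse of haar_fwd."""
    for d in reversed(details):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def wavelet_denoise(x, sigma, levels=3):
    """Soft-threshold the detail coefficients at the universal threshold
    sigma * sqrt(2 log n); leave the coarse approximation untouched."""
    t = sigma * np.sqrt(2.0 * np.log(len(x)))
    a, details = haar_fwd(x, levels)
    details = [np.sign(d) * np.maximum(np.abs(d) - t, 0.0) for d in details]
    return haar_inv(a, details)
```

Because a smooth narrowband signal concentrates in few coefficients while sensor noise spreads over all of them, thresholding raises the effective SNR fed to the bearing estimator.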

    Blind optimization of algorithm parameters for signal denoising by Monte-Carlo sure

    We consider the problem of optimizing the parameters of an arbitrary denoising algorithm by minimizing Stein's Unbiased Risk Estimate (SURE), which provides a means of assessing the true mean-squared error (MSE) purely from the measured data, assuming that it is corrupted by Gaussian noise. To accomplish this, we propose a novel Monte-Carlo technique based on a black-box approach which enables the user to compute SURE for an arbitrary denoising algorithm with some specific parameter setting. Our method only requires the response of the denoising algorithm to additional input noise and does not ask for any information about the functional form of the corresponding denoising operator. This, therefore, permits SURE-based optimization of a wide variety of denoising algorithms (global-iterative, pointwise, etc.). We present experimental results to justify our claims.

    Recursive Risk Estimation For Non-Linear Image Deconvolution With A Wavelet-Domain Sparsity Constraint

    We propose a recursive data-driven risk-estimation method for non-linear iterative deconvolution. Our two main contributions are 1) a solution-domain risk-estimation approach that is applicable to non-linear restoration algorithms for ill-conditioned inverse problems; and 2) a risk estimate for a state-of-the-art iterative procedure, the thresholded Landweber iteration, which enforces a wavelet-domain sparsity constraint. Our method can be used to estimate the SNR improvement at every step of the algorithm; e.g., for stopping the iteration after the highest value is reached. It can also be applied to estimate the optimal threshold level for a given number of iterations.
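The thresholded Landweber iteration itself is simple to state: a gradient step on the data term followed by soft thresholding. A minimal sketch for a small deconvolution problem, assuming sparsity directly in the signal domain rather than the paper's wavelet domain; in the paper's setting, a risk estimate would stand in for any oracle SNR when deciding where to stop.

```python
import numpy as np

def thresholded_landweber(H, y, lam, n_iter=200):
    """Thresholded Landweber sketch for
        min_x 0.5 * ||H x - y||^2 + lam * ||x||_1,
    iterating  x <- soft(x + (1/L) * H^T (y - H x), lam / L)
    with L an upper bound on the largest eigenvalue of H^T H
    (this step size makes the objective monotonically non-increasing).
    Returns the final iterate and the objective value at every step."""
    L = np.linalg.norm(H, 2) ** 2
    x = np.zeros(H.shape[1])
    obj = []
    for _ in range(n_iter):
        x = x + (H.T @ (y - H @ x)) / L                   # Landweber step
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # thresholding
        obj.append(0.5 * np.sum((H @ x - y) ** 2) + lam * np.sum(np.abs(x)))
    return x, obj
```

Tracking a per-iteration quality measure, as `obj` does here for the objective, is exactly the usage pattern the paper proposes for its per-step risk (SNR) estimate.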

    Monte-Carlo Sure: A black-box optimization of regularization parameters for general denoising algorithms

    We consider the problem of optimizing the parameters of a given denoising algorithm for restoration of a signal corrupted by white Gaussian noise. To achieve this, we propose to minimize Stein’s unbiased risk estimate (SURE), which provides a means of assessing the true mean-squared error (MSE) purely from the measured data, without need for any knowledge about the noise-free signal. Specifically, we present a novel Monte-Carlo technique which enables the user to calculate SURE for an arbitrary denoising algorithm characterized by some specific parameter setting. Our method is a black-box approach which solely uses the response of the denoising operator to additional input noise and does not ask for any information about its functional form. This, therefore, permits the use of SURE for optimization of a wide variety of denoising algorithms. We justify our claims by presenting experimental results for SURE-based optimization of a series of popular image-denoising algorithms such as total-variation denoising, wavelet soft-thresholding, and Wiener filtering/smoothing splines. In the process, we also compare the performance of these methods. We demonstrate numerically that SURE computed using the new approach accurately predicts the true MSE for all the considered algorithms. We also show that SURE uncovers the optimal values of the parameters in all cases.
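An end-to-end sketch of the black-box recipe, using wavelet-style soft thresholding as the denoiser and the threshold as the tuning parameter. This is an illustrative reimplementation under simple assumptions (1-D signal, known noise level, one random probe per SURE evaluation), not the paper's code; the true MSE is computed only to verify that minimizing SURE lands near the MSE-optimal parameter.

```python
import numpy as np

def mc_sure(f, y, sigma, eps=1e-4, rng=None):
    """Monte-Carlo SURE for a black-box denoiser f and y = x + N(0, sigma^2 I):
        SURE = ||y - f(y)||^2 / n - sigma^2 + 2 * sigma^2 * div / n,
    with the divergence div estimated from one extra, noise-perturbed
    evaluation of f (the black-box probe)."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = y.size
    fy = f(y)
    b = rng.standard_normal(n)
    div = b @ (f(y + eps * b) - fy) / eps
    return np.sum((y - fy) ** 2) / n - sigma ** 2 + 2 * sigma ** 2 * div / n

# Black-box denoiser: soft thresholding, with the threshold t as the knob.
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(6)
x = np.zeros(4096)
x[:400] = 3.0 * rng.standard_normal(400)      # sparse ground-truth signal
sigma = 1.0
y = x + sigma * rng.standard_normal(4096)

grid = np.linspace(0.2, 3.0, 15)
sure = [mc_sure(lambda v, t=t: soft(v, t), y, sigma) for t in grid]
mse = [np.mean((soft(y, t) - x) ** 2) for t in grid]   # oracle, for checking
t_sure = grid[int(np.argmin(sure))]
```

Note that `mc_sure` never touches `x`: the parameter is chosen from the noisy data alone, and `mse` exists purely to confirm the selection is near-oracle.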