64 research outputs found

    Fast Quasi-Newton Algorithms for Penalized Reconstruction in Emission Tomography and Further Improvements via Preconditioning

    Get PDF
This paper reports on the feasibility of using a quasi-Newton optimization algorithm, limited-memory Broyden-Fletcher-Goldfarb-Shanno with bound constraints (L-BFGS-B), for penalized image reconstruction problems in emission tomography (ET). For further acceleration, an additional preconditioning technique based on a diagonal approximation of the Hessian was introduced. The convergence rate of L-BFGS-B and the proposed preconditioned algorithm (L-BFGS-B-PC) was evaluated on simulated data under various conditions, such as the noise level, penalty type, penalty strength and background level. Data from three ¹⁸F-FDG patient acquisitions were also reconstructed. Results showed that the proposed L-BFGS-B-PC outperforms L-BFGS-B in convergence rate for all simulated conditions and the patient data. Based on these results, L-BFGS-B-PC shows promise for clinical application.
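As a rough illustration of the approach, penalized Poisson reconstruction with non-negativity bounds maps directly onto an off-the-shelf L-BFGS-B solver such as SciPy's. The toy system matrix, quadratic penalty and parameter values below are illustrative assumptions, not the paper's setup, and the sketch omits the paper's Hessian-based preconditioning.

```python
import numpy as np
from scipy.optimize import minimize

# Toy penalized Poisson reconstruction solved with L-BFGS-B (a sketch, not
# the paper's implementation). A: system matrix, y: measured counts,
# b: flat background, beta: penalty strength -- all illustrative.
rng = np.random.default_rng(0)
n_bins, n_vox = 200, 50
A = rng.uniform(0.0, 1.0, (n_bins, n_vox))
b = 5.0
y = rng.poisson(A @ rng.uniform(0.5, 2.0, n_vox) + b).astype(float)
beta = 0.1

def objective(x):
    ybar = A @ x + b                           # expected counts
    f = np.sum(ybar - y * np.log(ybar))        # negative Poisson log-likelihood
    f += 0.5 * beta * np.sum(np.diff(x) ** 2)  # simple quadratic penalty
    g = A.T @ (1.0 - y / ybar)                 # gradient of the likelihood term
    d = np.diff(x)                             # gradient of the penalty term
    g[1:] += beta * d
    g[:-1] -= beta * d
    return f, g

res = minimize(objective, x0=np.ones(n_vox), jac=True,
               method="L-BFGS-B", bounds=[(0.0, None)] * n_vox)
print(res.fun, res.nit)
```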

    An Investigation of Stochastic Variance Reduction Algorithms for Relative Difference Penalised 3D PET Image Reconstruction

    Get PDF
Penalised PET image reconstruction algorithms are often accelerated during early iterations with the use of subsets. However, these methods may exhibit limit cycle behaviour at later iterations due to variations between subsets. Desirable converged images can be achieved for a subclass of these algorithms via the implementation of a relaxed step size sequence, but the heuristic selection of parameters will impact the quality of the image sequence and algorithm convergence rates. In this work, we demonstrate the adaptation and application of a class of stochastic variance reduction gradient algorithms for PET image reconstruction using the relative difference penalty, and numerically compare convergence performance to block sequential regularised expectation maximisation (BSREM). The two investigated algorithms are SAGA and SVRG. These algorithms require the retention in memory of recently computed subset gradients, which are utilised in subsequent updates. We present several numerical studies based on Monte Carlo simulated data and a patient data set for fully 3D PET acquisitions. The impact of the number of subsets, different preconditioners and step size methods on the convergence of region-of-interest values within the reconstructed images is explored. We observe that when using constant preconditioning, SAGA and SVRG demonstrate reduced variations in voxel values between subsequent updates and are less reliant on step size hyper-parameter selection than BSREM reconstructions. Furthermore, SAGA and SVRG can converge significantly faster to the penalised maximum likelihood solution than BSREM, particularly for low-count data.
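For intuition, the SVRG-style variance-reduced subset update can be sketched generically as below. The callable `subset_grad`, the constant step size and the plain non-negativity projection are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def svrg(x0, subset_grad, n_subsets, step_size, n_epochs, seed=0):
    """Generic SVRG sketch (minimisation form). subset_grad(x, s) must
    return the gradient of subset s, scaled so that its average over all
    subsets equals the full gradient."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_epochs):
        x_anchor = x.copy()  # snapshot point
        full_grad = np.mean(
            [subset_grad(x_anchor, s) for s in range(n_subsets)], axis=0)
        for _ in range(n_subsets):
            s = rng.integers(n_subsets)
            # Variance-reduced direction: the current subset gradient,
            # corrected by the snapshot's subset gradient and the full
            # snapshot gradient.
            g = subset_grad(x, s) - subset_grad(x_anchor, s) + full_grad
            x = np.maximum(x - step_size * g, 0.0)  # keep image non-negative
    return x
```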

    Stochastic Optimisation Methods Applied to PET Image Reconstruction

    Get PDF
Positron Emission Tomography (PET) is a medical imaging technique that is used to provide functional information regarding physiological processes. Statistical PET reconstruction attempts to estimate the distribution of radiotracer in the body, but this methodology is generally computationally demanding because of the use of iterative algorithms. These algorithms are often accelerated by the utilisation of data subsets, which may result in convergence to a limit set rather than the unique solution. Methods exist to relax the update step sizes of subset algorithms, but they introduce additional heuristic parameters that may result in extended reconstruction times. This work investigates novel methods to modify subset algorithms to converge to the unique solution while maintaining the acceleration benefits of subset methods. This work begins with a study of an automatic method for increasing subset sizes, called AutoSubsets. This algorithm measures the divergence between two distinct data subset update directions and, if it is significant, the subset size is increased for future updates. The algorithm is evaluated using both projection and list mode data. The algorithm's use of small initial subsets benefits early reconstruction but, unfortunately, at later updates the subset size increases too early, which impedes convergence rates. The main part of this work investigates the application of stochastic variance reduction optimisation algorithms to PET image reconstruction. These algorithms reduce the variance due to the use of subsets by incorporating previously computed subset gradients into the update direction. The algorithms are adapted for application to PET reconstruction. This study evaluates the reconstruction performance of these algorithms when applied to various 3D non-TOF PET simulated, phantom and patient data sets. The impact of a number of algorithm parameters is explored, including subset selection methodologies, the number of subsets, step size methodologies and preconditioners. The results indicate that these stochastic variance reduction algorithms demonstrate superior performance after only a few epochs when compared to a standard PET reconstruction algorithm.
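A minimal sketch of the AutoSubsets idea described above might look as follows; the cosine-similarity test and the subset-doubling rule are illustrative assumptions, not the thesis's exact criterion.

```python
import numpy as np

def maybe_grow_subsets(grad_a, grad_b, n_subsets, threshold=0.5):
    """Compare two update directions computed from distinct data subsets;
    if they disagree too much (low cosine similarity), halve the number of
    subsets, i.e. double the subset size, for future updates."""
    denom = np.linalg.norm(grad_a) * np.linalg.norm(grad_b) + 1e-12
    cosine = float(grad_a @ grad_b) / denom
    if cosine < threshold and n_subsets > 1:
        n_subsets //= 2
    return n_subsets
```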

    Regularization for Uniform Spatial Resolution Properties in Penalized-Likelihood Image Reconstruction

    Full text link
Traditional space-invariant regularization methods in tomographic image reconstruction using penalized-likelihood estimators produce images with nonuniform spatial resolution properties. The local point spread functions that quantify the smoothing properties of such estimators are space-variant, asymmetric, and object-dependent even for space-invariant imaging systems. The authors propose a new quadratic regularization scheme for tomographic imaging systems that yields increased spatial uniformity and is motivated by the least-squares fitting of a parameterized local impulse response to a desired global response. The authors have developed computationally efficient methods for PET systems with shift-invariant geometric responses. They demonstrate the increased spatial uniformity of this new method versus conventional quadratic regularization schemes in simulated PET thorax scans.
Peer reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/85867/1/Fessler79.pd
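To give the flavour of such spatially adaptive quadratic penalties: certainty-based per-voxel factors, in the spirit of this line of work, can be computed from the system matrix and the expected counts, then used to scale pairwise penalty weights between neighbours. The formula below is a commonly used simplified form, assumed here rather than taken from the paper.

```python
import numpy as np

def certainty_factors(A, ybar, eps=1e-12):
    """Per-voxel 'certainty' factors kappa_j built from the system matrix A
    (n_bins x n_vox) and the expected counts ybar (n_bins,). Scaling the
    quadratic penalty between neighbours j and k by kappa[j] * kappa[k]
    makes the local smoothing more spatially uniform."""
    num = (A ** 2).T @ (1.0 / np.maximum(ybar, eps))  # Fisher-information-like term
    den = np.maximum((A ** 2).sum(axis=0), eps)
    return np.sqrt(num / den)

# The pairwise penalty between neighbouring voxels j and k then reads
#   kappa[j] * kappa[k] * (x[j] - x[k])**2 / 2
```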

    A Fast Convergent Ordered-Subsets Algorithm with Subiteration-Dependent Preconditioners for PET Image Reconstruction

    Full text link
We investigated the imaging performance of a fast convergent ordered-subsets algorithm with subiteration-dependent preconditioners (SDPs) for positron emission tomography (PET) image reconstruction. In particular, we considered the use of SDP with the block sequential regularized expectation maximization (BSREM) approach with the relative difference prior (RDP) regularizer, due to its prior clinical adaptation by vendors. Because the RDP regularization promotes smoothness in the reconstructed image, the directions of the gradients in smooth areas more accurately point toward the objective function's minimizer than those in variable areas. Motivated by this observation, two SDPs have been designed to increase iteration step-sizes in the smooth areas and reduce iteration step-sizes in the variable areas relative to a conventional expectation maximization preconditioner. The momentum technique used for convergence acceleration can be viewed as a special case of SDP. We have proved the global convergence of SDP-BSREM algorithms by assuming certain characteristics of the preconditioner. By means of numerical experiments using both simulated and clinical PET data, we have shown that the SDP-BSREM algorithms substantially improve the convergence rate compared to conventional BSREM and a vendor's implementation known as Q.Clear. Specifically, SDP-BSREM algorithms converge 35%-50% faster in reaching the same objective function value than conventional BSREM and commercial Q.Clear algorithms. Moreover, we showed in phantoms with hot, cold and background regions that the SDP-BSREM algorithms approached the values of a highly converged reference image faster than conventional BSREM and commercial Q.Clear algorithms.
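To make the role of the preconditioner concrete, one BSREM-style subset update with the conventional EM preconditioner can be sketched as below; an SDP variant would swap the diagonal `precond` for a subiteration-dependent one that enlarges steps in smooth regions and shrinks them in variable ones. The names and the scalar relaxation scheme are illustrative assumptions.

```python
import numpy as np

def bsrem_step(x, subset_grad, sensitivity, relaxation):
    """One preconditioned subset ascent step on the penalised objective.
    Conventional EM preconditioner: D(x) = x / sensitivity, where
    sensitivity holds the column sums of the system matrix. An SDP
    replaces D with a subiteration-dependent diagonal matrix."""
    precond = x / np.maximum(sensitivity, 1e-12)
    return np.maximum(x + relaxation * precond * subset_grad, 0.0)
```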

    Joint Activity and Attenuation Reconstruction from Multiple Energy Window Data with Photopeak Scatter Re-Estimation in non-TOF 3D PET

    Get PDF
Estimation of attenuation from PET data only is of interest for PET-MR and for systems where CT is not available or recommended. However, when using data from a single energy window, emission-based non-TOF PET attenuation correction (AC) methods suffer from 'cross-talk' artefacts. Based on earlier work, this manuscript explores the hypothesis that cross-talk can be reduced by using more than one energy window. We propose an algorithm for the simultaneous estimation of both activity and attenuation images, as well as the scatter component of the measured data, from a PET acquisition using multiple energy windows. The model for the measurements is 3D and accounts for the finite energy resolution of PET detectors; it is restricted to single scatter. The proposed MLAA-EB-S algorithm is compared with simultaneous estimation from a single energy window (MLAA-S). The evaluation is based on simulations using the characteristics of the Siemens mMR scanner. Phantoms of different complexity were investigated. In particular, a 3D XCAT torso phantom was used to assess the inpainting of attenuation values within the lung region. Results show that the cross-talk present in non-TOF MLAA reconstructions is significantly reduced when using multiple energy windows, and indicate that the proposed approach warrants further investigation.
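Structurally, a joint estimation of this kind alternates between updates of activity, attenuation and scatter from the same multi-window data. The loop below is a hedged skeleton with the three update operators passed in as assumed callables; it is not the paper's MLAA-EB-S algorithm.

```python
def joint_estimation(activity, attenuation, data, update_activity,
                     update_attenuation, estimate_scatter, n_iters=20):
    """Skeleton of an MLAA-style alternation over multiple energy windows.
    Each callable sees the full multi-window data; update_activity could be
    an OSEM-like step and update_attenuation an MLTR-like step."""
    for _ in range(n_iters):
        scatter = estimate_scatter(activity, attenuation)  # re-estimate scatter
        activity = update_activity(activity, attenuation, scatter, data)
        attenuation = update_attenuation(activity, attenuation, scatter, data)
    return activity, attenuation
```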

    Development and implementation of efficient noise suppression methods for emission computed tomography

    Get PDF
In PET and SPECT imaging, iterative reconstruction is now widely used due to its capability of incorporating into the reconstruction process a physics model and the Bayesian statistics involved in photon detection. Iterative reconstruction methods rely on regularization terms to suppress image noise and render the radiotracer distribution with good image quality. The choice of regularization method substantially affects the appearance of reconstructed images, and is thus a critical aspect of the reconstruction process. Major contributions of this work include the implementation and evaluation of various new regularization methods. Previously, our group developed a preconditioned alternating projection algorithm (PAPA) to optimize the emission computed tomography (ECT) objective function with the non-differentiable total variation (TV) regularizer. The algorithm was modified to optimize the proposed reconstruction objective functions. First, two novel TV-based regularizers, high-order total variation (HOTV) and infimal convolution total variation (ICTV), were proposed as alternative choices to the customary TV regularizer in SPECT reconstruction, to reduce the "staircase" artifacts produced by TV. We have evaluated both proposed reconstruction methods (HOTV-PAPA and ICTV-PAPA) and compared them with the TV-regularized reconstruction (TV-PAPA) and the clinical standard, Gaussian post-filtered expectation-maximization reconstruction method (GPF-EM), using both Monte Carlo-simulated data and anonymized clinical data. Model-observer studies using Monte Carlo-simulated data indicate that ICTV-PAPA is able to reconstruct images with similar or better lesion detectability, compared with the clinical standard GPF-EM method, but at lower detected count levels. This implies that switching from GPF-EM to ICTV-PAPA can reduce patient dose while maintaining image quality for diagnostic use. Second, the ℓ1 norm of a discrete cosine transform (DCT)-induced framelet regularization was studied. We decomposed the image into high and low spatial-frequency components, and then preferentially penalized the high spatial-frequency components. The DCT-induced framelet transform of the natural radiotracer distribution image is sparse. By using this property, we were able to effectively suppress image noise without overly compromising spatial resolution or image contrast. Finally, the fractional norm of the first-order spatial gradient was introduced as a regularizer. We implemented ℓ2/3 and ℓ1/2 norms to suppress image spatial variability. Due to the strong penalty on small differences between neighboring pixels, fractional-norm regularizers suffer from cartoon-like artifacts similar to those of the TV regularizer. However, when penalty weights are properly selected, fractional-norm regularizers outperform TV in terms of noise suppression and contrast recovery.
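The family of first-order gradient penalties compared above (TV and the fractional norms) can be written in a few lines; the sketch below uses forward differences and a small smoothing constant, both standard but assumed choices rather than the thesis's exact discretisation.

```python
import numpy as np

def gradient_penalty(img, p=1.0, eps=1e-8):
    """Penalty on the first-order spatial gradient of a 2D image.
    p = 1 gives a smoothed total variation; p = 2/3 or p = 1/2 give the
    fractional-norm regularizers discussed above."""
    gx = np.diff(img, axis=0, append=img[-1:, :])  # forward differences, rows
    gy = np.diff(img, axis=1, append=img[:, -1:])  # forward differences, cols
    magnitude = np.sqrt(gx ** 2 + gy ** 2 + eps)   # isotropic gradient magnitude
    return float(np.sum(magnitude ** p))
```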

    Large Scale Inverse Problems

    Get PDF
This book is the second volume of a three-volume series recording the "Radon Special Semester 2011 on Multiscale Simulation & Analysis in Energy and the Environment" that took place in Linz, Austria, October 3-7, 2011. This volume addresses the common ground in the mathematical and computational procedures required for large-scale inverse problems and data assimilation in forefront applications. The solution of inverse problems is fundamental to a wide variety of applications such as weather forecasting, medical tomography, and oil exploration. Regularisation techniques are needed to ensure that solutions are of sufficient quality to be useful, and soundly theoretically based. This book addresses the common techniques required for all the applications, and is thus truly interdisciplinary. This collection of survey articles focusses on the large inverse problems commonly arising in simulation and forecasting in the earth sciences.

    Detection Efficiency Modelling and Joint Activity and Attenuation Reconstruction in non-TOF 3D PET from Multiple-Energy Window Data

    Get PDF
Emission-based attenuation correction (AC) methods offer the possibility of overcoming quantification errors induced by conventional MR-based approaches in PET/MR imaging. However, the joint problem of determining AC and the activity of interest is strongly ill-posed in non-TOF PET. This can be improved by exploiting the extra information arising from low-energy-window photons, but the feasibility of this approach has so far only been studied with relatively simplistic analytic simulations. This manuscript aims to address some of the remaining challenges in handling realistic measurements; in particular, the detection efficiency ("normalisation") estimation for each energy window is investigated. An energy-dependent detection efficiency model is proposed, accounting for the presence of unscattered events in the lower energy window due to detector scatter. Geometric calibration factors are estimated prior to the reconstruction for both scattered and unscattered events. Different reconstruction methods are also compared. Results show that geometric factors differ markedly between the energy windows and that our analytical model corresponds in good approximation to Monte Carlo simulation; the multiple energy window reconstruction appears sensitive to input/model mismatch. Our method applies to Monte Carlo-generated data but can be extended to measured data. This study is restricted to single scatter events.
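A hedged sketch of the kind of per-window forward model this implies: the expected data in each energy window mix unscattered and scattered events, each with their own geometric normalisation factors. All names below are illustrative, not the paper's API.

```python
def expected_window_data(activity, attenuation, windows,
                         forward_unscattered, forward_scattered,
                         norm_unscattered, norm_scattered):
    """Expected counts per energy window w: a normalisation-weighted sum of
    the unscattered and (single-)scattered forward projections. Detector
    scatter places some unscattered events in the low window, hence the
    separate geometric factors per window and per component."""
    return {w: norm_unscattered[w] * forward_unscattered(activity, attenuation, w)
             + norm_scattered[w] * forward_scattered(activity, attenuation, w)
            for w in windows}
```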

    Improving Quantification in Lung PET/CT for the Evaluation of Disease Progression and Treatment Effectiveness

    Get PDF
Positron Emission Tomography (PET) allows imaging of functional processes in vivo by measuring the distribution of an administered radiotracer. Whilst one of its main uses is directed towards lung cancer, there is increased interest in diffuse lung diseases, whose incidence rises every year, mainly due to environmental factors and population ageing. However, PET acquisitions in the lung are particularly challenging due to several effects, including the inevitable cardiac and respiratory motion and the loss of spatial resolution due to low tissue density, which increases the positron range. This thesis will focus on Idiopathic Pulmonary Fibrosis (IPF), a disease whose aetiology is poorly understood and for which patient survival is limited to only a few years. Contrary to lung tumours, this diffuse lung disease modifies the lung architecture more globally. The changes result in small structures with varying densities. Previous work has developed data analysis techniques addressing some of the challenges of imaging patients with IPF. However, robust reconstruction techniques are still necessary to obtain quantitative measures for such data, where it should be beneficial to exploit recent advances in PET scanner hardware such as Time of Flight (TOF) and respiratory motion monitoring. Firstly, positron range in the lung will be discussed, evaluating its effect in density-varying media such as fibrotic lung. Secondly, the general effect of using incorrect attenuation data in lung PET reconstructions will be assessed. The study will compare TOF and non-TOF reconstructions and quantify the local and global artefacts created by data inconsistencies and respiratory motion. Then, motion compensation will be addressed by proposing a method which takes into account the changes of density and activity in the lungs during respiration, via the estimation of the volume changes using the deformation fields. The method is evaluated on late time frame PET acquisitions using ¹⁸F-FDG, where the radiotracer distribution has stabilised. It is then used as the basis for a method for motion compensation of the early time frames (starting with the administration of the radiotracer), leading to a technique that could be used for motion compensation of kinetic measures. Preliminary results are provided for kinetic parameters extracted from short dynamic data using ¹⁸F-FDG.
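The density-aware motion compensation described above can be caricatured in a few lines: when the deformation locally changes lung volume, the warped activity is rescaled by that volume change, i.e. the Jacobian determinant of the deformation. The resampling call, the sign convention for the Jacobian and the variable names are assumptions for illustration, not the thesis's method.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def motion_compensate(frame, coords, jacobian_det, eps=1e-6):
    """Warp a PET frame to the reference position using a deformation given
    as sampling coordinates (shape: 3 x Z x Y x X), then divide by the local
    volume change so that activity concentration tracks lung density."""
    warped = map_coordinates(frame, coords, order=1, mode="nearest")
    return warped / np.maximum(jacobian_det, eps)
```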