    Development of SPECT and CT Tomographic Image Reconstruction

    The purpose of this study was to contribute to the advancement of statistically-based iterative reconstruction algorithms and protocols for both SPECT and micro-CT data. Major contributions of this work to SPECT reconstruction include the formulation and implementation of a fully three-dimensional, voxel-based system matrix for parallel-beam, fan-beam, and cone-beam collimator geometries while modeling attenuation, system resolution, and sensitivity. This is achieved by casting rays through a volume of voxels and using ray-voxel intersection lengths to determine approximate volume contributions. Qualitative and quantitative analysis of reconstructed Monte Carlo data sets shows that this is a very effective and efficient method. Using this method, three SPECT studies were conducted. First, reconstruction performance was studied for a triple-head cone-beam SPECT system using a helical-orbit acquisition. We examined various subset groupings for the ordered-subsets expectation maximization (OSEM) algorithm, and how rotational and translational sampling affect reconstructed image quality when constrained by total injected dose and scan time. We conclude the following: when reconstructing noiseless datasets, varying the rotational sampling from 90 to 360 views over 360 degrees does not affect the reconstructed activity, regardless of object size, in terms of either convergence or accuracy. When using ordered subsets, the subset group arrangement is important for both image quality and accuracy. The smaller the object being reconstructed, the lower the rate of convergence, the spatial resolution, and the accuracy. Second, we examined a system composed of three, possibly different, converging collimators using a circular orbit. We conclude the following: when reconstructing noiseless datasets, a triple-cone-beam system produced distortion artifacts along the axial direction and diminished resolution along the transaxial direction, while a triple-fan-beam system produced the best reconstructed image quality in terms of bias, noise, and contrast. When noisy datasets were reconstructed, a cone-cone-fan-beam system gave the best image quality in terms of mean-to-actual ratio for small lesions, and a triple-fan-beam system for large lesions. Finally, a two-dimensional mesh-based system matrix for parallel-beam collimation with attenuation and resolution modeling was designed, implemented, and studied. We conclude that no more than two divisions per detector bin width are needed for satisfactory reconstruction; using more than two divisions per detector bin does not significantly improve reconstructed images. A chapter on iterative micro-CT reconstruction is also included. Our contribution to micro-CT reconstruction is the formulation and implementation of a cone-beam system matrix that reduces ring artifacts associated with sampling of the reconstruction space. This new approach reduces the common 3-D ray-tracing technique to 2-D, making it very efficient. Images obtained using our approach are compared to images reconstructed by means of analytical techniques, and we observe a significant improvement in image quality for the images reconstructed using our iterative method.
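
    The system-matrix elements described above are obtained by ray tracing: each ray is parametrized and the length of its intersection with every voxel it crosses is accumulated. The sketch below is a minimal 2-D illustration of that idea (in the spirit of Siddon-style ray tracing), not the dissertation's actual implementation; the function name and grid conventions are ours, and a real SPECT system matrix would additionally weight these lengths by attenuation, resolution, and sensitivity factors.

```python
import numpy as np

def ray_voxel_lengths(p0, p1, nx, ny, dx=1.0, dy=1.0):
    """Intersection lengths of the segment p0 -> p1 with an nx-by-ny voxel
    grid whose lower-left corner sits at the origin.  Returns a dict
    {(ix, iy): length}.  2-D sketch only; a real system matrix uses 3-D rays."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    # Parametric values where the ray crosses each x- and y-grid plane.
    tx = ((np.arange(nx + 1) * dx - p0[0]) / d[0]) if d[0] != 0 else np.array([])
    ty = ((np.arange(ny + 1) * dy - p0[1]) / d[1]) if d[1] != 0 else np.array([])
    # Keep crossings inside the segment, plus its end points.
    t = np.concatenate(([0.0, 1.0], tx, ty))
    t = np.unique(t[(t >= 0.0) & (t <= 1.0)])
    lengths, ray_len = {}, np.linalg.norm(d)
    for ta, tb in zip(t[:-1], t[1:]):
        tm = 0.5 * (ta + tb)                      # midpoint identifies the voxel
        ix = int((p0[0] + tm * d[0]) // dx)
        iy = int((p0[1] + tm * d[1]) // dy)
        if 0 <= ix < nx and 0 <= iy < ny:
            lengths[(ix, iy)] = lengths.get((ix, iy), 0.0) + (tb - ta) * ray_len
    return lengths

# Example: one system-matrix row for a ray crossing a 4x4 grid diagonally.
print(ray_voxel_lengths((0.0, 0.5), (4.0, 3.5), nx=4, ny=4))
```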

    Tomographic image quality of rotating slat versus parallel hole-collimated SPECT

    Parallel and converging hole collimators are most frequently used in nuclear medicine. Less common is the use of rotating slat collimators for single photon emission computed tomography (SPECT). The higher photon collection efficiency inherent to the geometry of rotating slat collimators results in much lower noise in the data. However, plane integrals contain spatial information in only one direction, whereas line integrals provide two-dimensional information. It is not a trivial question whether the initial gain in efficiency compensates for the lower information content of the plane integrals. Therefore, a comparison of the performance of parallel hole and rotating slat collimation is needed. This study compares SPECT with rotating slat and parallel hole collimation, in combination with MLEM reconstruction with accurate system modeling and correction for scatter and attenuation. A contrast-to-noise study revealed an improvement of a factor of 2-3 for hot lesions and more than a factor of 4 for cold lesions. Furthermore, a clinically relevant case of heart lesion detection was simulated for rotating slat and parallel hole collimators. In this case, rotating slat collimators outperform the traditional parallel hole collimators. We conclude that rotating slat collimators are a valuable alternative to parallel hole collimators.
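
    Both collimator geometries are reconstructed with MLEM; only the system matrix differs, holding line integrals for parallel hole collimation and plane integrals for rotating slat collimation. The snippet below is a generic, minimal MLEM update for reference only; it omits the accurate system modeling and the scatter and attenuation corrections used in the study, and all names are ours.

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """Plain MLEM: x_{k+1} = x_k / (A^T 1) * A^T (y / (A x_k)).
    A is the (n_bins x n_voxels) system matrix (line or plane integrals),
    y the measured projections."""
    x = np.ones(A.shape[1])
    sens = np.maximum(A.T @ np.ones(A.shape[0]), eps)  # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                            # forward projection
        ratio = y / np.maximum(proj, eps)       # measured / estimated
        x *= (A.T @ ratio) / sens               # multiplicative EM update
    return x
```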

    Stationary, MR-compatible brain SPECT imaging based on multi-pinhole collimators

    Relevance of accurate Monte Carlo modeling in nuclear medical imaging

    Monte Carlo techniques have become popular in different areas of medical physics with the advent of powerful computing systems. In particular, they have been extensively applied to simulate processes involving random behavior and to quantify physical parameters that are difficult or even impossible to calculate by experimental measurement. Recent nuclear medical imaging innovations such as single-photon emission computed tomography (SPECT), positron emission tomography (PET), and multiple emission tomography (MET) are ideal candidates for Monte Carlo modeling because of the stochastic nature of the radiation emission, transport, and detection processes. Factors that have contributed to their wider use include improved models of radiation transport processes, the practicality of application with the development of acceleration schemes, and the improved speed of computers. This paper presents the derivation and methodological basis of this approach and critically reviews its areas of application in nuclear imaging. An overview of existing simulation programs is provided and illustrated with examples of some useful features of such sophisticated tools in connection with common computing facilities and more powerful multiple-processor parallel-processing systems. Current and future trends in the field are also discussed.
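
    As a reminder of how such stochastic modeling works in its simplest form, the sketch below estimates photon transmission through a uniform absorber by sampling free-path lengths from the exponential attenuation law and compares the estimate with the analytic value. It is a toy illustration, not taken from any of the reviewed simulation codes; the attenuation coefficient is an assumed, approximate value for 140 keV photons in water.

```python
import numpy as np

rng = np.random.default_rng(0)

def free_path(mu):
    """Sample a photon free-path length (cm) in a uniform medium with linear
    attenuation coefficient mu (1/cm): inverse-CDF sampling of exp(-mu*d)."""
    return -np.log(1.0 - rng.random()) / mu

# Toy estimate of the fraction of 140 keV photons crossing 10 cm of water
# without interacting (mu ~ 0.15 /cm is an assumed, approximate value).
mu, thickness, n = 0.15, 10.0, 100_000
transmitted = sum(free_path(mu) > thickness for _ in range(n)) / n
print(transmitted, np.exp(-mu * thickness))   # Monte Carlo estimate vs analytic
```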

    Development and implementation of efficient noise suppression methods for emission computed tomography

    In PET and SPECT imaging, iterative reconstruction is now widely used because it can incorporate into the reconstruction process a physics model and the Bayesian statistics of photon detection. Iterative reconstruction methods rely on regularization terms to suppress image noise and render the radiotracer distribution with good image quality. The choice of regularization method substantially affects the appearance of reconstructed images and is thus a critical aspect of the reconstruction process. Major contributions of this work include the implementation and evaluation of several new regularization methods. Previously, our group developed a preconditioned alternating projection algorithm (PAPA) to optimize the emission computed tomography (ECT) objective function with the non-differentiable total variation (TV) regularizer; this algorithm was modified to optimize the proposed reconstruction objective functions. First, two novel TV-based regularizers, high-order total variation (HOTV) and infimal convolution total variation (ICTV), were proposed as alternatives to the customary TV regularizer in SPECT reconstruction, to reduce the “staircase” artifacts produced by TV. We evaluated both proposed reconstruction methods (HOTV-PAPA and ICTV-PAPA) and compared them with TV-regularized reconstruction (TV-PAPA) and the clinical standard, Gaussian post-filtered expectation-maximization reconstruction method (GPF-EM), using both Monte Carlo-simulated data and anonymized clinical data. Model-observer studies using Monte Carlo-simulated data indicate that ICTV-PAPA reconstructs images with similar or better lesion detectability than the clinical-standard GPF-EM method, but at lower detected count levels. This implies that switching from GPF-EM to ICTV-PAPA can reduce patient dose while maintaining image quality for diagnostic use. Second, the ℓ1 norm of a discrete cosine transform (DCT)-induced framelet regularization was studied. We decomposed the image into high and low spatial-frequency components and then preferentially penalized the high spatial-frequency components. The DCT-induced framelet transform of a natural radiotracer distribution image is sparse; by using this property, we were able to suppress image noise effectively without overly compromising spatial resolution or image contrast. Finally, the fractional norm of the first-order spatial gradient was introduced as a regularizer. We implemented ℓ2/3 and ℓ1/2 norms to suppress image spatial variability. Because they strongly penalize small differences between neighboring pixels, fractional-norm regularizers suffer from cartoon-like artifacts similar to those of the TV regularizer. However, when penalty weights are properly selected, fractional-norm regularizers outperform TV in terms of noise suppression and contrast recovery.
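
    For orientation, the sketch below evaluates a first-order spatial-gradient penalty of the kind discussed above: p = 1 corresponds to anisotropic TV, while p = 2/3 or 1/2 gives the fractional-norm regularizers. It shows only the penalty term; the PAPA solver and the HOTV, ICTV, and DCT-framelet variants are not reproduced, and the function name is ours.

```python
import numpy as np

def gradient_penalty(img, p=1.0, eps=1e-8):
    """Regularizer: sum over pixels of |horizontal diff|^p + |vertical diff|^p.
    p = 1 is anisotropic total variation (TV); p = 2/3 or 1/2 gives the
    fractional-norm regularizers discussed above."""
    dx = np.abs(np.diff(img, axis=0)) + eps   # vertical first differences
    dy = np.abs(np.diff(img, axis=1)) + eps   # horizontal first differences
    return (dx ** p).sum() + (dy ** p).sum()

# Relative to TV (p = 1), p < 1 penalizes small neighboring differences more
# heavily, which suppresses noise but can produce cartoon-like patches.
```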

    Acceleration of GATE Monte Carlo simulations

    Positron Emission Tomography (PET) and Single Photon Emission Computed Tomography (SPECT) are forms of medical imaging that produce functional images reflecting biological processes. They are based on the tracer principle: a biologically active substance, a pharmaceutical, is selected so that its spatial and temporal distribution in the body reflects a certain body function or metabolism. In order to form images of the distribution, the pharmaceutical is labeled with gamma-ray-emitting or positron-emitting radionuclides (radiopharmaceuticals or tracers). After administration of the tracer to a patient, an external position-sensitive gamma-ray camera can detect the emitted radiation and, after a reconstruction process, form a stack of images of the radionuclide distribution. Monte Carlo methods are numerical methods that use random numbers to compute quantities of interest. This is normally done by creating a random variable whose expected value is the desired quantity, then simulating and tabulating the random variable and using its sample mean and variance to construct probabilistic estimates. It represents an attempt to model nature through direct simulation of the essential dynamics of the system in question. Monte Carlo modeling is the method of choice for all applications where measurements are not feasible or where analytic models are not available due to the complex nature of the problem. In addition, such modeling is a practical approach in several important application fields of nuclear medical imaging: detector design, quantification, correction methods for image degradations, detection tasks, etc. Several powerful dedicated Monte Carlo simulators for PET and/or SPECT are available. However, they are often neither detailed nor flexible enough to enable realistic simulations of emission tomography detector geometries while also modeling time-dependent processes such as decay, tracer kinetics, patient and bed motion, dead time, or detector orbits. Our Monte Carlo simulator of choice, GEANT4 Application for Tomographic Emission (GATE), was specifically designed to address all these issues. The flexibility of GATE comes at a price, however. The simulation of a simple prototype SPECT detector may be feasible within hours in GATE, but an acquisition with a realistic phantom may take years to complete on a single CPU. In this dissertation we therefore focus on the Achilles’ heel of GATE: efficiency. Acceleration of GATE simulations can only be achieved through a combination of efficient data analysis, dedicated variance reduction techniques, fast navigation algorithms, and parallelization. In the first part of this dissertation we consider the improvement of the analysis capabilities of GATE. The static analysis module in GATE is both inflexible and incapable of storing more detail without introducing a large computational overhead. However, the design and validation of the acceleration techniques in this dissertation require a flexible, detailed, and computationally efficient analysis module. To this end, we develop a new analysis framework capable of analyzing any process, from the decay of isotopes to particle interactions and detections, in any detector element and for any type of phantom. The evaluation of our framework consists of the assessment of spurious activity in 124I-Bexxar PET and of contamination in 131I-Bexxar SPECT. In the case of PET we describe how our framework can detect spurious coincidences generated by non-pure isotopes, even with realistic phantoms.
We show that optimized energy thresholds, which can readily be applied in the clinic, can now be derived in order to minimize the contamination. We also show that the spurious activity itself is not spatially uniform, so standard reconstruction and correction techniques are not adequate. In the case of SPECT we describe how it is now possible to classify detections into geometric detections, phantom scatter, penetration through the collimator, collimator scatter, and backscatter in the end parts. We show that standard correction algorithms, such as triple energy window correction, cannot correct for septal penetration. We demonstrate that 124I PET with optimized energy thresholds offers better image quality than 131I SPECT when standard reconstruction techniques are used. In the second part of this dissertation we focus on improving the efficiency of GATE with a variance reduction technique called Geometrical Importance Sampling (GIS). We describe how only 0.02% of all emitted photons can reach the crystal surface of a SPECT detector head with a low-energy high-resolution collimator; a lot of computing power is therefore wasted tracking photons that will not contribute to the result. A twofold strategy is used to solve this problem: GIS employs Russian Roulette to discard photons that are unlikely to contribute to the result, while photons in more important regions are split into several photons with reduced weight to increase their survival chance. We show that this technique introduces branches into the particle history, and we describe how this can be taken into account by a particle history tree used for the analysis of the results. The evaluation of GIS consists of energy spectra validation, spatial resolution, and sensitivity for low- and medium-energy isotopes. We show that GIS reaches acceleration factors between 5 and 13 over analog GATE simulations for the isotopes in the study. It is a general acceleration technique that can be used for any isotope, phantom, and detector combination. Although GIS is useful as a safe and accurate acceleration technique, it cannot deliver clinically acceptable simulation times; the main reason lies in its inability to force photons in a specific direction. In the third part of this dissertation we solve this problem for 99mTc SPECT simulations. Our approach is twofold. First, we introduce two variance reduction techniques: forced detection (FD) and convolution-based forced detection (CFD) with multiple projection sampling (MPS). FD and CFD force copies of photons, at decay and at every interaction point, to be transported through the phantom in a direction sampled within a solid angle toward the SPECT detector head, at all SPECT angles simultaneously. We describe how a weight must be assigned to each photon in order to compensate for the forced direction and for non-absorption at emission and scatter. We show how the weights are calculated from the total and differential Compton and Rayleigh cross sections per electron, with incorporation of Hubbell’s atomic form factor. In the case of FD all detector interactions are modeled by Monte Carlo, while in the case of CFD the detector is modeled analytically. Second, we describe the design of an FD- and CFD-specialized navigator to accelerate the slow tracking algorithms in GEANT4. The validation study shows that both FD and CFD closely match the analog GATE simulations, and that we obtain acceleration factors between 3 (FD) and 6 (CFD) orders of magnitude over analog simulations.
This allows the simulation of a realistic acquisition with a torso phantom within 130 seconds. In the fourth part of this dissertation we exploit the intrinsically parallel nature of Monte Carlo simulations. We show how Monte Carlo simulations should scale linearly with the number of processing nodes, but that this is usually not achieved due to job setup time, output handling, and cluster overhead. Our approach is based on two steps: job distribution and output data handling. The job distribution is based on a time-domain partitioning scheme that retains all experimental parameters and guarantees the statistical independence of each subsimulation. We also reduce the job setup time by introducing a parameterized collimator model for SPECT simulations, and we reduce the output data handling time with a chain-based output merger. The scalability study is based on a set of simulations on a 70-CPU cluster and shows an acceleration factor of approximately 66 on 70 CPUs for both PET and SPECT. We also show that our method of parallelization does not introduce any approximations and that it can readily be combined with any of the acceleration techniques described above.
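
    To illustrate the weight bookkeeping behind Geometrical Importance Sampling, the sketch below applies Russian Roulette and particle splitting to a list of photon weights: killing a photon with probability 1 - p while boosting survivors by 1/p, and splitting into n copies of weight w/n, keeps the estimator unbiased. This is a schematic illustration only, not GATE's implementation, and all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def russian_roulette(photons, survival_prob):
    """Kill unimportant photons with probability 1 - survival_prob and boost
    the weight of the survivors so the estimator stays unbiased."""
    return [w / survival_prob for w in photons if rng.random() < survival_prob]

def split(photons, n_split):
    """Split important photons into n_split copies with reduced weight."""
    return [w / n_split for w in photons for _ in range(n_split)]

# Expected total weight is preserved by both operations (unbiasedness check).
w0 = [1.0] * 10_000
print(np.mean([sum(russian_roulette(w0, 0.1)) for _ in range(20)]))  # ~10000
print(sum(split(w0, 3)))                                             # 10000
```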

    Doctor of Philosophy

    Single Photon Emission Computed Tomography (SPECT) myocardial perfusion imaging (MPI), a noninvasive and effective method for diagnosing coronary artery disease (CAD), is the most commonly performed SPECT procedure. Hence, it is not surprising that there is a tremendous market need for dedicated cardiac SPECT scanners. In this dissertation, a novel dedicated stationary cardiac SPECT system that uses a segmented-parallel-hole collimator is investigated in detail. This stationary SPECT system can acquire true dynamic SPECT images and is inexpensive to build. The segmented-parallel-hole collimator was designed to fit existing general-purpose SPECT cameras without any mechanical modification of the scanner while providing higher detection sensitivity. With a segmented-parallel-hole collimator, each detector is segmented into seven sub-detector regions, providing seven projections simultaneously. Fourteen view angles over 180 degrees were obtained in total, with the two detectors positioned 90 degrees apart. The whole system provides an approximate 34-fold gain in sensitivity over a conventional single-head SPECT system. The potential drawbacks of the stationary cardiac SPECT system are data truncation caused by the small field of view (FOV) and the limited number of view angles. A tailored maximum-likelihood expectation-maximization (ML-EM) algorithm was derived for reconstruction of truncated projections with few view angles. The artifacts caused by truncation and an insufficient number of views were suppressed by reducing the image-updating step sizes of the pixels outside the FOV. The performance of the tailored ML-EM algorithm was verified by computer simulations and phantom experiments. Compared with the conventional ML-EM algorithm, the tailored ML-EM algorithm successfully suppresses the streak artifacts outside the FOV and reduces the distortion inside the FOV. At 10 views, the tailored ML-EM algorithm has a much lower mean squared error (MSE) and higher relative contrast. In addition, special attention was given to handling zero-valued projections in the image reconstruction. There are two categories of zero values in the projection data: one lies outside the boundary of the object, and the other lies inside the object region and is caused by count starvation. A positive weighting factor c was introduced into the ML-EM algorithm. By setting c > 1 for zero values outside the object boundary, the boundary in the image is well preserved even at extremely low iteration numbers. The black lines caused by zero values inside the object region are completely removed by setting 0 < c < 1. Finally, the segmented-parallel-hole collimator was fabricated and calibrated using a point source. Closed-form explicit expressions for the slant angles and rotation radius were derived from the proposed system geometry, and the geometric parameters were estimated independently or jointly. Monte Carlo simulations and real emission data were used to evaluate the proposed calibration method and the stationary cardiac system. The simulation results show that the difference between the estimated and actual values is less than 0.1 degree for the slant angles and 5 mm for the rotation radius, which is well below the detector's intrinsic resolution.
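
    A hedged sketch of the tailored ML-EM idea is given below: the standard multiplicative EM update is applied at full strength inside the FOV and damped for voxels outside it. The particular damping rule (raising the EM multiplier to a power alpha < 1) and all names are our own illustration; the dissertation's exact scheme, and its handling of the weighting factor c for zero-valued projections, may differ.

```python
import numpy as np

def tailored_mlem(A, y, fov_mask, alpha=0.2, n_iter=50, eps=1e-12):
    """ML-EM with damped updates outside the field of view.
    A: (n_bins x n_voxels) system matrix; y: measured projections.
    fov_mask: boolean array over voxels, True inside the FOV.
    alpha: step-size factor (< 1) applied to voxels outside the FOV."""
    x = np.ones(A.shape[1])
    sens = np.maximum(A.T @ np.ones(A.shape[0]), eps)
    step = np.where(fov_mask, 1.0, alpha)          # per-voxel step size
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, eps)
        update = (A.T @ ratio) / sens              # standard EM multiplier
        x *= update ** step                        # damped outside the FOV
    return x
```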

    SPECT Imaging of Pulmonary Blood Flow in a Rat

    Small animal imaging is experiencing rapid development due to its importance in providing high-throughput phenotypic data for functional genomics studies. We have developed a single photon emission computed tomography (SPECT) system to image the pulmonary perfusion distribution in the rat. A standard gamma camera, equipped with a pinhole collimator, was used to acquire SPECT projection images of the rat thorax at 40 sec/view following injection of Tc-99m-labeled albumin that accumulated in the rat's lungs. A voxel-driven, ordered-subset expectation maximization reconstruction was implemented. Following SPECT imaging, the rat was imaged using micro-CT with Feldkamp cone-beam reconstruction. The two reconstructed image volumes were fused to provide a structure/function image of the rat thorax. Reconstruction accuracy and performance were evaluated using numerical simulations and imaging of an experimental phantom consisting of Tc-99m-filled chambers with known diameters and count rates. Full-width half-maximum diameter measurement errors decreased with increasing chamber diameter, ranging from < 6% down to 0.1%. Errors in the ratio of count-rate estimates between chambers were also diameter dependent but still relatively small. This preliminary study suggests that SPECT will be useful for imaging and quantifying the pulmonary blood flow distribution and the distribution of Tc-99m-labeled ligands in the lungs of small laboratory animals.
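
    The chamber diameters above are characterized by the full width at half maximum (FWHM) of intensity profiles through the reconstructed chambers. The helper below shows one common way to compute an FWHM, with linear interpolation at the half-maximum crossings; it is our own illustration and not the code used in the study.

```python
import numpy as np

def fwhm(profile, spacing=1.0):
    """Full width at half maximum of a 1-D intensity profile, with linear
    interpolation at the half-maximum crossings; spacing is the pixel size."""
    p = np.asarray(profile, float)
    half = p.max() / 2.0
    above = np.where(p >= half)[0]
    left, right = above[0], above[-1]
    # Interpolate fractional positions of the two half-maximum crossings.
    fl = left - (p[left] - half) / (p[left] - p[left - 1]) if left > 0 else left
    fr = right + (half - p[right]) / (p[right + 1] - p[right]) if right < len(p) - 1 else right
    return (fr - fl) * spacing

# Example: FWHM of a Gaussian sampled at 1 mm spacing (expected ~2.355 * sigma).
x = np.arange(-20, 21, 1.0)
print(fwhm(np.exp(-x**2 / (2 * 4.0**2)), spacing=1.0))
```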

    Scattered Radiation Emission Imaging: Principles and Applications

    Imaging processes built on the Compton scattering effect have been under continuing investigation since they were first suggested in the 1950s. However, despite many innovative contributions, there are still formidable theoretical and technical challenges to overcome. In this paper, we review the state-of-the-art principles of so-called scattered radiation emission imaging. Basically, it consists of using cleverly collected scattered radiation from a radiating object to reconstruct its inner structure. Image formation is based on the mathematical concept of compounded conical projection. It entails a Radon transform defined on circular cone surfaces in order to express the scattered radiation flux density recorded at a detecting pixel. We discuss in particular invertible cases of such conical Radon transforms, which form a mathematical basis for image reconstruction methods. Numerical simulations performed in two and three space dimensions speak in favor of the viability of this imaging principle and its potential applications in various fields.
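
    The cone surfaces underlying these projections come from Compton kinematics: a photon emitted with energy E0 and detected with energy E' must have scattered through the angle given by the Compton relation, which fixes the opening angle of the cone of possible emission sites. The snippet below evaluates that standard relation; the function name and the example energies are ours.

```python
import numpy as np

MEC2 = 511.0  # electron rest energy, keV

def compton_scatter_angle(e0, e_scattered):
    """Scattering angle (radians) from the Compton relation
    E' = E0 / (1 + (E0 / mec^2) * (1 - cos(theta))).
    Each detected energy E' fixes theta, and hence the opening angle of the
    cone on which the emission point must lie."""
    cos_theta = 1.0 - MEC2 * (e0 - e_scattered) / (e0 * e_scattered)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Example: a 140 keV photon detected at 120 keV scattered through ~67 degrees.
print(np.degrees(compton_scatter_angle(140.0, 120.0)))
```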