
    Recent developments in time-of-flight PET

    While the first time-of-flight (TOF) positron emission tomography (PET) systems were built as early as the 1980s, only limited clinical studies were acquired on these scanners. PET was still a research tool, and the available TOF-PET systems were experimental. Due to a combination of low stopping power and limited spatial resolution (caused by the limited light output of the scintillators), these systems could not compete with bismuth germanate (BGO)-based PET scanners. Development of TOF systems stalled for about a decade but resumed around 2000. The combination of fast photomultipliers, high-density scintillators, modern electronics, and faster computing power for image reconstruction has made it possible to introduce this principle into clinical TOF-PET systems. This paper reviews recent developments in system design, image reconstruction, corrections, and potential new applications for TOF-PET. After explaining the basic principles of time-of-flight, the difficulties in detector technology and electronics in obtaining good and stable timing resolution are briefly explained. The available clinical systems and the prototypes under development are described in detail. This type of PET scanner also requires modified image reconstruction with accurate modeling and correction methods. The additional dimension introduced by the time difference motivates a shift from sinogram-based to list-mode-based reconstruction. This reconstruction is, however, rather slow, and rebinning techniques specific to TOF data have therefore been proposed. The main motivation for TOF-PET remains its large potential for improved image quality and more accurate quantification for a given number of counts. The gain is related to the ratio of the object size to the spatial extent of the TOF kernel and is therefore particularly relevant for heavy patients, in whom image quality degrades significantly due to increased attenuation (low counts) and high scatter fractions.
The original calculations of the gain were based on analytical methods. Recent publications on iterative reconstruction have shown that it is difficult to quantify the TOF gain as a single factor: the gain depends on the measured distribution, the location within the object, and the count rate. In a clinical situation, the gain can be used either to increase the standardized uptake value (SUV) or to reduce the acquisition time or administered dose. The localized nature of the TOF kernel also makes it possible to use local tomography reconstruction or to separate emission from transmission data, and it improves the joint estimation of transmission and emission images from emission data alone. TOF is also interesting for new applications of PET, such as isotopes with a low positron branching ratio. The local nature of the kernel further reduces the need for fine angular sampling, which makes TOF attractive for limited-angle situations such as breast PET and online dose imaging in proton or hadron therapy. The aim of this review is to introduce the reader to TOF-PET in an educational way and to give an overview of the benefits and new opportunities offered by this additional information.
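The rule of thumb behind the gain discussed above can be sketched numerically. The following is a minimal illustration (not code from the paper) of the classic analytical estimate: the TOF kernel localizes an event to Δx = c·Δt/2 along the line of response, and the sensitivity gain scales roughly as the object diameter D divided by Δx.

```python
# Back-of-the-envelope TOF-PET sensitivity gain, using the analytical
# estimate gain ~ D / dx, where D is the object diameter and
# dx = c * dt / 2 is the spatial extent of the TOF kernel.

C_LIGHT_CM_PER_PS = 0.03  # speed of light: ~0.03 cm per picosecond


def tof_localization_cm(timing_resolution_ps: float) -> float:
    """Spatial uncertainty along the line of response: dx = c * dt / 2."""
    return C_LIGHT_CM_PER_PS * timing_resolution_ps / 2.0


def tof_gain(object_diameter_cm: float, timing_resolution_ps: float) -> float:
    """Approximate sensitivity gain D / dx (analytical estimate)."""
    return object_diameter_cm / tof_localization_cm(timing_resolution_ps)


# A 40 cm (heavy) patient imaged with 500 ps timing resolution:
dx = tof_localization_cm(500.0)   # 7.5 cm kernel extent
gain = tof_gain(40.0, 500.0)      # ~5.3x
```

This also makes the heavy-patient argument concrete: for a fixed timing resolution, a larger diameter D yields a larger gain, exactly where low counts and high scatter hurt most.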

    Incorporating accurate statistical modeling in PET: reconstruction for whole-body imaging

    Doctoral thesis in Biophysics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2007. The thesis is devoted to image reconstruction in 3D whole-body PET imaging. OSEM (Ordered Subsets Expectation Maximization) is a statistical algorithm that assumes Poisson data. However, corrections for physical effects (attenuation, scattered and random coincidences) and detector efficiency remove the Poisson character of these data. Fourier Rebinning (FORE), which combines 3D imaging with fast 2D reconstructions, requires corrected data. Thus, whenever FORE is used, or whenever data are corrected prior to OSEM, the Poisson-like characteristics need to be restored. Restoring Poisson-like data, i.e., making the variance equal to the mean, was achieved through the use of weighted OSEM algorithms. One of them is NECOSEM, which relies on the NEC weighting transformation. The distinctive feature of this algorithm is the NEC multiplicative factor, defined as the ratio between the mean and the variance. With real clinical data this is critical, since only one value is collected for each bin: the data value itself. For simulated data, if we keep track of these two statistical moments, the exact values of the NEC weights can be calculated. We compared the performance of five different weighted algorithms (FORE+AWOSEM, FORE+NECOSEM, ANWOSEM3D, SPOSEM3D and NECOSEM3D) on the basis of tumor detectability. The comparison was done for both simulated and clinical data. In the former case an analytical simulator was used; this is the ideal situation, since all the weighting factors can be determined exactly. To compare the performance of the algorithms, we used the Non-Prewhitening Matched Filter (NPWMF) numerical observer. With knowledge obtained from the simulation study we proceeded to the reconstruction of clinical data, for which it was necessary to devise a strategy for estimating the NEC weighting factors.
The comparison between reconstructed images was made by a physician highly familiar with whole-body PET imaging.
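The NEC weighting transformation described above can be illustrated with a small sketch (hypothetical, not the thesis code): for simulated data where the per-bin mean and variance are both tracked, the NEC factor is their ratio, and scaling a bin by it makes the variance equal to the mean.

```python
import numpy as np

# NEC weighting sketch: the multiplicative factor is mean / variance per bin.
# Scaling data by w gives new mean w*mean and new variance w**2*var, and with
# w = mean/var both equal mean**2/var, i.e. the data are Poisson-like again.


def nec_weights(mean: np.ndarray, variance: np.ndarray) -> np.ndarray:
    """NEC multiplicative factor: ratio of mean to variance per bin."""
    return np.where(variance > 0, mean / variance, 1.0)


def restore_poisson_like(data, mean, variance):
    """Rescale corrected data so that variance matches the mean."""
    return nec_weights(mean, variance) * data


# Corrected (non-Poisson) bins: variance inflated or deflated vs. the mean.
mean = np.array([10.0, 20.0, 5.0])
var = np.array([40.0, 20.0, 10.0])
w = nec_weights(mean, var)                 # [0.25, 1.0, 0.5]
assert np.allclose(w**2 * var, w * mean)   # variance == mean after weighting
```

As the abstract notes, with clinical data only one sample per bin exists, so these two moments must be estimated rather than tracked, which is precisely the strategy the thesis had to devise.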

    Investigation of accuracy in quantitation of 18F-FDG concentration of PET/CT

    The PET/CT scanner has been recognized as a powerful diagnostic imaging modality in oncology and radiation treatment planning. Traditionally, PET has been used for quantitative analysis, and diagnostic interpretation of PET images has relied greatly on a nuclear medicine physician’s experience and knowledge. The PET data set represents a positron emitter’s activity concentration as a gray scale in each pixel. Assuring the quantitative accuracy of the PET data is critical for the diagnosis and staging of disease and the evaluation of treatment. The standardized uptake value (SUV) is a widely employed parameter in clinical settings for distinguishing malignant lesions from others. SUV is a rough normalization of radioactive tracer uptake in which normal tissue uptake is unity. The PET scanner is sensitive enough to detect small lesions, such as lymph node metastases less than 1 cm in diameter, where the CT scanner may be limited. Accurate quantitation of small lesions is critical for predicting prognosis or planning a patient's treatment. PET/CT uses attenuation correction factors obtained from CT scanner data sets. Non-biological materials such as metals and contrast agents are recognized as factors that lead to an incorrect scaling factor in the PET image. We challenge the accuracy of the quantitative method that physicians routinely use to distinguish malignant lesions from others in clinical settings on commercially available PET/CT scanners. First, we verified whether we could recover a constant activity concentration throughout the field of view for small, identical activity concentration sources. Second, we tested how much the CT-based attenuation correction factor could be influenced by contrast agents. Third, we tested how much quantitation error could be introduced by object size.
Our data suggest that the routine normalization process of the PET scanner does not guarantee accurate quantitation of discrete uniform activity sources in the PET/CT scanner. Measured activity concentrations also depend strongly on the object's dimensions. A recovery correction factor is therefore necessary for these quantitative data in oncological evaluation, to ensure accurate interpretation of the activity concentration. Developing quantitation parameters other than SUV may overcome SUV's inherent limitations in reflecting patient-specific physiology and the imaging characteristics of individual scanners.
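The body-weight SUV normalization referred to above can be sketched as follows (this is the standard definition, not code from this study): tissue activity concentration is divided by the injected dose per unit body weight, so a uniformly distributed tracer in normal tissue gives SUV ≈ 1.

```python
# Body-weight SUV sketch:
#   SUV = tissue activity concentration / (injected dose / body weight)
# Assumes tissue density ~1 g/mL, so 1 kg of body weight ~ 1000 mL.


def suv_bw(activity_kbq_per_ml: float, injected_dose_mbq: float,
           body_weight_kg: float) -> float:
    """Body-weight standardized uptake value."""
    dose_kbq = injected_dose_mbq * 1000.0
    weight_ml_equivalent = body_weight_kg * 1000.0
    return activity_kbq_per_ml / (dose_kbq / weight_ml_equivalent)


# 370 MBq injected into a 74 kg patient, perfectly uniformly distributed:
uniform_conc = 370_000.0 / 74_000.0        # 5 kBq/mL everywhere
print(suv_bw(uniform_conc, 370.0, 74.0))   # -> 1.0
```

The study's point is that every input to this ratio inherits the scanner's biases: a CT-derived attenuation error or a size-dependent recovery loss in the measured concentration propagates directly into the SUV.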

    Assessment of the impact of modeling axial compression on PET image reconstruction

    The file contains the phantoms, sinograms, and system matrices used in the following work: Martin A. Belzunce and Andrew J. Reader, "Assessment of the Impact of Modelling Axial Compression on PET Image Reconstruction", Medical Physics, 2017. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) under grant EP/M020142/1.

    Investigations into a positron emission imaging algorithm

    Includes abstract. Includes bibliographical references. A positron emission imaging algorithm that makes use of the entire set of lines of response in list-mode form is presented. The algorithm parameterises the lines of response by a Cartesian mesh over the field of view of a Positron Emission Tomography (PET) scanner to find their density distribution throughout the mesh. The algorithm is applied to PET image reconstruction and Positron Emission Particle Tracking (PEPT). For PET image reconstruction, a redistribution of the lines of response is employed to remove the discreteness of the data caused by the finite size of the detector cells; once the density distribution has been determined, it is filtered and corrected for attenuation. The algorithm is applied to static and dynamic systems of hard phantoms, biological specimens, and fluid flow through a column. In the dynamic systems, time steps as short as 1 second are achieved. The results are compared with the standard Radon-transform reconstruction algorithm, and the presented algorithm is observed to produce images with superior edge contrast, smoothness, and representation of the physical system.
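The mesh parameterisation of list-mode lines of response described above might be sketched roughly as follows (an illustrative accumulation scheme, not the author's algorithm): each LOR is sampled along its length and its visits are tallied into a Cartesian grid, yielding a density distribution over the field of view.

```python
import numpy as np

# Sketch: accumulate list-mode line-of-response (LOR) traversals into a
# Cartesian mesh. Each LOR is sampled at `samples` points between its two
# detector endpoints, and each sample increments the pixel it falls in.


def lor_density(lors, grid_size=64, fov=1.0, samples=256):
    """Accumulate LOR traversals into a grid_size x grid_size mesh.

    lors: iterable of ((x1, y1), (x2, y2)) endpoint pairs in [-fov/2, fov/2].
    """
    image = np.zeros((grid_size, grid_size))
    t = np.linspace(0.0, 1.0, samples)
    for (x1, y1), (x2, y2) in lors:
        xs = x1 + t * (x2 - x1)
        ys = y1 + t * (y2 - y1)
        ix = ((xs / fov + 0.5) * grid_size).astype(int).clip(0, grid_size - 1)
        iy = ((ys / fov + 0.5) * grid_size).astype(int).clip(0, grid_size - 1)
        np.add.at(image, (iy, ix), 1.0)  # handles repeated indices correctly
    return image


# Two LORs crossing at the origin concentrate density at the centre pixel.
img = lor_density([((-0.5, 0.0), (0.5, 0.0)), ((0.0, -0.5), (0.0, 0.5))])
assert img.sum() == 2 * 256
```

In the paper's setting this raw density would then be redistributed to undo detector-cell discreteness, filtered, and attenuation-corrected; this sketch shows only the accumulation step.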

    Simulation of Clinical PET Studies for the Assessment of Quantification Methods

    In this PhD thesis we developed a methodology for evaluating the robustness of SUV measurements, based on Monte Carlo (MC) simulations and the generation of novel databases of simulated studies built on digital anthropomorphic phantoms. This methodology was applied to several quantification problems not previously addressed. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using MC simulations. We studied the impact of noise and low counts on the accuracy and repeatability of three commonly used SUV metrics (SUVmax, SUVmean and SUV50). The same model was used to study the effect of physiological variations in muscular uptake on the quantification of FDG-PET studies. Finally, our MC models were applied to simulate 18F-fluorocholine (FCH) studies, with the aim of studying the effect of spill-in counts from neighbouring regions on the quantification of small regions close to extended high-activity sources.
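The three SUV metrics compared above can be sketched over a region of interest as follows. This is a generic illustration, not the thesis code; in particular, SUV50 is assumed here to mean the mean of voxels above 50% of SUVmax, which is a common definition.

```python
import numpy as np

# Three common SUV summary metrics over a region of interest (ROI):
#   SUVmax  - hottest voxel (very sensitive to noise / low counts)
#   SUVmean - average over the whole ROI (sensitive to ROI delineation)
#   SUV50   - mean of voxels above 50% of SUVmax (a compromise)


def suv_metrics(roi: np.ndarray) -> dict:
    """Compute SUVmax, SUVmean and SUV50 from ROI voxel SUVs."""
    suv_max = float(roi.max())
    threshold = 0.5 * suv_max
    return {
        "SUVmax": suv_max,
        "SUVmean": float(roi.mean()),
        "SUV50": float(roi[roi >= threshold].mean()),
    }


roi = np.array([1.0, 2.0, 4.0, 8.0])
m = suv_metrics(roi)
# SUVmax = 8.0, SUVmean = 3.75, SUV50 = mean of {4.0, 8.0} = 6.0
```

Because SUVmax depends on a single voxel, noise and low counts bias it upward, which is one reason repeatability studies like the one described above compare it against region-based metrics.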

    Topics in image reconstruction for high resolution positron emission tomography

    Ill-posed problems are a topic of interdisciplinary interest arising in remote sensing and non-invasive imaging. However, there are issues crucial for the successful application of the theory to a given imaging modality. Positron emission tomography (PET) is a non-invasive imaging technique that allows assessing biochemical processes taking place in an organism in vivo. PET is a valuable tool in investigating normal human or animal physiology and in diagnosing and staging cancer and heart and brain disorders. PET is similar to other tomographic imaging techniques in many ways, but to reach its full potential and to extract maximum information from projection data, PET has to use accurate, yet practical, image reconstruction algorithms. Several topics related to PET image reconstruction have been explored in the present dissertation.
The following contributions have been made: (1) A system matrix model has been developed using an analytic detector response function based on the linear attenuation of gamma-rays in a detector array. It has been demonstrated that the use of an oversimplified system model for the computation of the system matrix results in image artefacts. (IEEE Trans. Nucl. Sci., 2000); (2) An analytical model of the dependence of image statistics on total counts was used to simplify the cross-validation (CV) stopping rule and to accelerate statistical iterative reconstruction. It can be used instead of the original CV procedure for high-count projection data, where CV yields reasonably accurate images. (IEEE Trans. Nucl. Sci., 2001); (3) A regularisation methodology employing singular value decomposition (SVD) of the system matrix was proposed, based on spatial resolution analysis. A characteristic property of the singular value spectrum was found that revealed a relationship between the optimal truncation level for truncated-SVD reconstruction and the optimal reconstructed image resolution. (IEEE Trans. Nucl. Sci., 2001); (4) A novel event-by-event linear image reconstruction technique based on a regularised pseudo-inverse of the system matrix was proposed. The algorithm provides a fast way to update an image, potentially in real time, and allows, in principle, instant visualisation of the radioactivity distribution while the object is still being scanned. The computed image estimate is the minimum-norm least-squares solution of the regularised inverse problem.
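Contributions (3) and (4) can be illustrated with a toy truncated-SVD pseudo-inverse (a generic sketch under simplified assumptions, not the dissertation's implementation): keeping only the largest singular components regularises the inversion, and applying the resulting pseudo-inverse to the projection data yields the minimum-norm least-squares image.

```python
import numpy as np

# Truncated-SVD regularised pseudo-inverse of a toy system matrix A.
# Keeping the k largest singular components discards the noise-amplifying
# small singular values; A_pinv @ y is then the minimum-norm least-squares
# solution of the truncated problem.


def truncated_pinv(A: np.ndarray, k: int) -> np.ndarray:
    """Pseudo-inverse of A keeping only the k largest singular components."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:k] = 1.0 / s[:k]
    return Vt.T @ np.diag(s_inv) @ U.T


# Toy system matrix and noiseless projections of a known image:
rng = np.random.default_rng(0)
A = rng.random((20, 10))       # 20 projection bins, 10 image pixels
x_true = rng.random(10)
y = A @ x_true

A_pinv = truncated_pinv(A, k=10)      # keep full rank: exact recovery
assert np.allclose(A_pinv @ y, x_true)
```

Precomputing `A_pinv` is also what makes the event-by-event idea in (4) plausible: each detected event contributes one column-weighted update, so the image can in principle be refreshed while data are still being acquired.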

    A nonparametric procedure for comparing the areas under correlated LROC curves

    In contrast to the receiver operating characteristic (ROC) assessment paradigm, localization ROC (LROC) analysis provides a means to jointly assess the accuracy of localization and detection in an observer study. In a typical multireader, multicase (MRMC) evaluation, the data sets are paired, so correlations arise in observer performance both between readers and across the imaging conditions (e.g., reconstruction methods or scanning parameters) being compared. MRMC evaluations therefore motivate the need for a statistical methodology to compare correlated LROC curves. In this paper, we suggest a nonparametric strategy for this purpose. Specifically, we find that the seminal work of Sen on U-statistics can be applied to estimate the covariance matrix for a vector of LROC area estimates. The resulting covariance estimator is the LROC analog of the covariance estimator given by DeLong et al. for ROC analysis. Once the covariance matrix is estimated, it can be used to construct confidence intervals and/or confidence regions for comparing observer performance across imaging conditions. In addition, given the results of a small-scale pilot study, the covariance estimator may be used to estimate the number of images and observers needed to achieve a desired confidence interval size in a full-scale observer study. The utility of our methodology is illustrated with a human-observer LROC evaluation of three image reconstruction strategies for fan-beam X-ray computed tomography.
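The LROC area that the covariance estimator above targets can itself be estimated nonparametrically as the probability that a correctly localised signal case outscores a noise case. A minimal sketch of that pairwise estimator (illustrative, not the paper's code):

```python
# Empirical LROC area: average over all (signal, noise) case pairs of
#   I(localisation correct) * [ I(s > n) + 0.5 * I(s == n) ]
# An incorrectly localised signal case contributes nothing, which is what
# distinguishes the LROC area from the ordinary ROC area (AUC).


def lroc_area(signal_ratings, correct_loc, noise_ratings):
    """Nonparametric LROC area from per-case ratings and localisation flags."""
    total = 0.0
    for s, ok in zip(signal_ratings, correct_loc):
        if not ok:
            continue  # a mislocalised case never counts as a success
        for n in noise_ratings:
            total += 1.0 if s > n else (0.5 if s == n else 0.0)
    return total / (len(signal_ratings) * len(noise_ratings))


# Perfect detection, but one localisation miss caps the area below 1.
area = lroc_area([0.9, 0.8], [True, False], [0.1, 0.2])
print(area)  # -> 0.5: only the correctly localised case contributes
```

Because this estimator is a two-sample U-statistic over case pairs, Sen's theory applies directly, which is what yields the DeLong-style covariance estimator for a vector of such areas across imaging conditions.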
