133 research outputs found

    Minimax Emission Computed Tomography using High-Resolution Anatomical Side Information and B-Spline Models

    In this paper, a minimax methodology is presented for combining information from two imaging modalities having different intrinsic spatial resolutions. The focus application is emission computed tomography (ECT), a low-resolution modality for reconstruction of radionuclide tracer density, when supplemented by high-resolution anatomical boundary information extracted from a magnetic resonance image (MRI) of the same imaging volume. The MRI boundary within the two-dimensional (2-D) slice of interest is parameterized by a closed planar curve. The Cramér-Rao (CR) lower bound is used to analyze estimation errors for different boundary shapes. Under a spatially inhomogeneous Gibbs field model for the tracer density, a representation for the minimax MRI-enhanced tracer density estimator is obtained. It is shown that the estimator is asymptotically equivalent to a penalized maximum likelihood (PML) estimator with a resolution-selective Gibbs penalty. Quantitative comparisons are presented using the iterative space alternating generalized expectation maximization (SAGE-FM) algorithm to implement the PML estimator with and without minimax weight averaging.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85822/1/Fessler86.pd
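For orientation, a penalized maximum likelihood estimator of the kind referenced above typically has the following form (a generic sketch in our own notation; the paper's exact objective may differ):

```latex
\hat{\lambda} = \arg\max_{\lambda \ge 0} \; L(y;\lambda)
  \;-\; \beta \sum_{\{j,k\} \in \mathcal{N}} w_{jk} \, \psi(\lambda_j - \lambda_k)
```

Here L(y; λ) is the Poisson log-likelihood of the ECT counts, ψ is a Gibbs potential on neighboring pixel pairs in the neighborhood system N, and the weights w_jk are where high-resolution side information can enter, e.g. by reducing the smoothing penalty across the MRI-derived anatomical boundary; the minimax weight averaging mentioned in the abstract would operate on such weights.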

    Quantitative Image Reconstruction Methods for Low Signal-To-Noise Ratio Emission Tomography

    Novel internal radionuclide therapies, such as radioembolization (RE) with Y-90-loaded microspheres and targeted therapies labeled with Lu-177, hold unique promise for personalized cancer treatment because imaging-based pre-treatment dosimetry can be used to determine administered activities that deliver tumoricidal absorbed doses to lesions while sparing critical organs. At present, however, such therapies are administered with fixed or empiric activities, with little or no dosimetry planning. The main reasons for the lack of dosimetry-guided personalized treatment in radionuclide therapies are the challenges and impracticality of quantitative emission tomography imaging and the lack of well-established dose-effect relationships, the latter potentially due to inaccuracies in quantitative imaging. While therapy radionuclides have been chosen for their attractive characteristics for cancer treatment, their suitability for emission tomography imaging is less than ideal. For example, imaging of the almost pure beta emitter Y-90 involves either SPECT via bremsstrahlung photons, which have a low and tissue-dependent yield, or PET via a very low-abundance positron emission (32 out of 1 million decays), which leads to a very low true-coincidence rate in the presence of a high singles rate from bremsstrahlung photons. Lu-177 emits gamma-rays suitable for SPECT, but they are low in intensity (113 keV: 6%; 208 keV: 10%), and only the higher-energy emission is generally used because of the large downscatter component associated with the lower-energy gamma-ray.

The main aim of the research in this thesis is to improve the accuracy of quantitative PET and SPECT imaging of therapy radionuclides for dosimetry applications. Although PET is generally considered superior to SPECT for quantitative imaging, PET imaging of 'non-pure' positron emitters can be complex. We focus on quantitative SPECT and PET imaging of two widely used therapy radionuclides, Lu-177 and Y-90, both of which pose challenges associated with low count rates. The long-term goal of our work is to apply the methods we develop to patient imaging for dosimetry-based planning, to optimize the treatment either before therapy or after each cycle of therapy.

For Y-90 PET/CT, we developed an image reconstruction formulation that relaxes the conventional image-domain nonnegativity constraint by instead imposing a positivity constraint on the predicted measurement mean; this demonstrated improved quantification in simulated patient studies (a toy sketch of this formulation appears below). For Y-90 SPECT/CT, we propose a new reconstruction formulation that includes tissue-dependent probabilities for bremsstrahlung generation in the system matrix. In addition to these modality-specific quantitative reconstruction methods for Y-90 imaging, we propose a general image reconstruction method using a trained regularizer for low-count PET and SPECT, which we test on Y-90 and Lu-177 imaging. Our approach starts from the raw projection data and uses trained networks in the iterative image formation process. Specifically, we take a mathematics-based approach in which convolutional neural networks are embedded within the iterative reconstruction process arising from an optimization problem. We further extend the trained regularization method with anatomical side information: the trained regularizer incorporates the anatomy through a segmentation mask generated by a trained segmentation network whose input is the co-registered CT image.
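Below is a minimal sketch of the relaxed-constraint idea, with toy dimensions, a random system matrix, and a generic off-the-shelf solver; it is an illustration under our own assumptions, not the thesis implementation. The Poisson negative log-likelihood is minimized over an image x that is allowed to go negative, while the predicted measurement mean A x + r is constrained to stay positive.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

rng = np.random.default_rng(0)
n_pix, n_det = 16, 32
A = rng.uniform(0.0, 1.0, size=(n_det, n_pix))  # toy system matrix (assumed)
x_true = rng.uniform(0.0, 5.0, size=n_pix)      # toy tracer density
r = 0.1 * np.ones(n_det)                        # scatter/randoms background
y = rng.poisson(A @ x_true + r).astype(float)   # simulated measured counts

eps = 1e-3  # the predicted measurement mean must stay >= eps

def neg_loglik(x):
    ybar = A @ x + r                 # predicted measurement mean
    return np.sum(ybar - y * np.log(ybar))

def grad(x):
    ybar = A @ x + r
    return A.T @ (1.0 - y / ybar)

# A @ x >= eps - r replaces the usual x >= 0: voxel estimates may go
# negative, but the modeled mean remains positive so the likelihood
# stays well defined.
cons = LinearConstraint(A, eps - r, np.inf, keep_feasible=True)
res = minimize(neg_loglik, x0=np.ones(n_pix), jac=grad,
               method="trust-constr", constraints=[cons])
x_hat = res.x
```

Allowing individual voxels to go negative avoids the positive bias that a hard nonnegativity constraint induces in very low-count regions, which is the kind of quantification benefit described above.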
Overall, the emission tomography methods proposed in this work are expected to enhance low-count PET and SPECT imaging of therapy radionuclides in patient studies, which will be valuable for establishing dose-response relationships and for developing imaging-based, dosimetry-guided treatment planning strategies in the future.
    PhD, Electrical and Computer Engineering
    University of Michigan, Horace H. Rackham School of Graduate Studies
    https://deepblue.lib.umich.edu/bitstream/2027.42/155171/1/hongki_1.pd

    Data harmonization in PET imaging

    Medical imaging physics has advanced considerably in recent years, providing clinicians and researchers with increasingly detailed images that are well suited to quantitative analysis in the manner of the hard sciences, based on measuring and analyzing quantities of clinical interest extracted from the images themselves. This approach is known as quantitative imaging. The possibility of sharing data quickly, the development of machine learning and data mining techniques, and the increasing availability of computational power and digital data storage together constitute a great opportunity for quantitative imaging studies. Interest in large multicentric databases that gather images from individual research centers is growing year after year. Big datasets offer very interesting research perspectives, primarily because they increase the statistical power of studies. At the same time, they raise a compatibility issue between the data themselves: images acquired with different scanners and protocols can differ greatly in quality, and measures extracted from images of different quality may not be compatible with each other.

Harmonization techniques have been developed to circumvent this problem. Harmonization refers to all efforts to combine data from different sources and provide users with a comparable view of data from different studies. Harmonization can be done before acquiring data, by choosing appropriate acquisition protocols a priori through a preliminary joint effort between research centers, or a posteriori, i.e., images are grouped into a single dataset and any effects on measures caused by technical acquisition factors are then removed. Although a-priori harmonization guarantees the best results, it is often not used for practical and/or technical reasons. In this thesis I focus on a-posteriori harmonization.

It is important to note that in multicentric studies, in addition to the technical variability related to scanners and acquisition protocols, there may be demographic variability that makes the samples from single centers not statistically equivalent to each other. The wide inter-individual variability that characterizes human beings, even more pronounced when patients are enrolled from very different geographical areas, can certainly exacerbate this issue. In addition, biological processes are complex phenomena: quantitative imaging measures can be affected by numerous confounding demographic variables, even ones apparently unrelated to the measures themselves. A good harmonization method should preserve inter-individual variability while removing all the effects due to technical acquisition factors. Heterogeneity in acquisition, together with great inter-individual variability, makes harmonization very hard to achieve. The harmonization methods currently used in the literature preserve only the inter-subject variability described by a set of known confounding variables, while all the unknown confounding variables are wrongly removed. This can lead to incorrect harmonization, especially if the unknown confounders play an important role. The issue is emphasized in practice, as it sometimes happens that demographic variables known to play a major role are not recorded.
The final goal of my thesis is to propose a harmonization method, developed in the context of amyloid Positron Emission Tomography (PET), that removes the effects of variability induced by technical factors while keeping all the inter-individual differences. Since knowing all the demographic confounders is almost impossible, both practically and theoretically, my proposal does not require knowledge of these variables. The main idea is to characterize image quality through a set of quality measures evaluated in regions of interest (ROIs) chosen to be as independent as possible from anatomical and clinical variability, so that they highlight exclusively the effect of technical factors on image texture. Ideally, this decouples the between-subject variability from the technical variability: the latter can be removed directly while the former is automatically preserved. Specifically, I defined and validated three quality measures based on image texture properties. In addition, I used an existing quality metric, and I considered the reconstruction matrix dimension to take image resolution into account. My work was performed on a multicentric dataset of 1001 amyloid PET images. Before dealing specifically with harmonization, I handled some important preliminary issues: I built a relational database to organize and manage the data, and I developed an automated image pre-processing algorithm for registration and quantification. This work might also be applied in other imaging contexts: in particular, I believe it could be applied to fluorodeoxyglucose (FDG) PET and tau PET. The consequences of the harmonization method have been explored at a preliminary level. My proposal should be considered a starting point: I dealt mainly with the design of the quality measures, while the harmonization itself was done with a linear regression model. Although harmonization through linear models is common, more sophisticated techniques exist in the literature, and it would be interesting to combine them with my work. Further investigation would be desirable in the future.
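As a minimal sketch of the final regression step described above (toy data and variable names are assumptions, not the thesis code): an image-derived measure is regressed on the ROI quality measures, which are designed to reflect technical factors only, and the fitted technical component, centered at the dataset mean, is subtracted so that the residual inter-subject variability is preserved.

```python
import numpy as np

rng = np.random.default_rng(1)
n_images, n_quality = 200, 5   # e.g. 3 texture measures + an existing
                               # quality metric + matrix dimension
Q = rng.normal(size=(n_images, n_quality))       # quality measures (toy)
tech_effect = Q @ np.array([0.30, -0.20, 0.10, 0.05, 0.15])
measure = 1.2 + tech_effect + rng.normal(0.0, 0.2, size=n_images)

# Ordinary least squares: measure ~ intercept + quality measures
X = np.column_stack([np.ones(n_images), Q])
beta, *_ = np.linalg.lstsq(X, measure, rcond=None)

# Remove the estimated technical component; centering at the mean keeps
# the harmonized values on the original scale, and the residual
# (inter-subject) variability is left untouched.
technical = (Q - Q.mean(axis=0)) @ beta[1:]
measure_harmonized = measure - technical
```

Because the quality measures are built to be independent of anatomy and clinical status, no demographic covariates need to enter the model, which is the point of the proposal.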