
    Regularization parameter selection methods for ill-posed Poisson imaging problems

    A common problem in imaging science is to estimate some underlying true image given noisy measurements of image intensity. When image intensity is measured by counting incident photons emitted by the object of interest, the data noise is accurately modeled by a Poisson distribution, which motivates the use of Poisson maximum likelihood estimation. When the underlying model equation is ill-posed, regularization must be employed. I will present a computational framework for solving such problems, including statistically motivated methods for choosing the regularization parameter. Numerical examples will be included.
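
    The abstract leaves the specific parameter-choice rules open; as a hedged illustration only, the sketch below pairs a Poisson negative log-likelihood with a Tikhonov-type smoothness penalty and picks the regularization parameter by a discrepancy-style (Pearson chi-square) criterion. The forward operator K, the counts b, and the parameter grid are toy placeholders, not the talk's setup.

        # A hedged sketch, not the talk's code: Poisson maximum likelihood
        # estimation with a Tikhonov-type smoothness penalty, with the
        # regularization parameter chosen by a discrepancy-style rule.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 64
        K = np.eye(n)                                  # toy forward operator
        u_true = 10.0 + 5.0 * np.sin(np.linspace(0, 4 * np.pi, n))
        b = rng.poisson(K @ u_true).astype(float)      # Poisson-distributed counts
        D = np.diff(np.eye(n), axis=0)                 # first-order differences

        def solve(alpha, iters=500, step=1e-2):
            """Projected gradient descent on the penalized Poisson negative
            log-likelihood sum(Ku - b*log(Ku)) + (alpha/2)*||D u||^2."""
            u = np.full(n, b.mean())
            for _ in range(iters):
                Ku = K @ u
                grad = K.T @ (1.0 - b / Ku) + alpha * (D.T @ (D @ u))
                u = np.maximum(u - step * grad, 1e-8)  # keep intensities positive
            return u

        # Discrepancy-style selection: at the right noise level the Pearson
        # statistic sum((Ku - b)^2 / Ku) is approximately n, the number of bins.
        alphas = np.logspace(-3, 1, 9)
        stats = []
        for a in alphas:
            Ku = K @ solve(a)
            stats.append(np.sum((Ku - b) ** 2 / Ku))
        best = alphas[np.argmin(np.abs(np.array(stats) - n))]
        print("selected alpha:", best)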

    ํ•ด๋ถ€ํ•™์  ์œ ๋„ PET ์žฌ๊ตฌ์„ฑ: ๋งค๋„๋Ÿฝ์ง€ ์•Š์€ ์‚ฌ์ „ ํ•จ์ˆ˜๋ถ€ํ„ฐ ๋”ฅ๋Ÿฌ๋‹ ์ ‘๊ทผ๊นŒ์ง€

    Doctoral dissertation, Seoul National University Graduate School, Department of Medical Science, College of Medicine, February 2021 (advisor: Jae Sung Lee). Advances in simultaneous positron emission tomography/magnetic resonance imaging (PET/MRI) technology have led to active investigation of anatomy-guided regularized PET image reconstruction algorithms based on MR images. Among the various priors proposed for anatomy-guided regularized PET image reconstruction, Bowsher's method, based on second-order smoothing priors, sometimes suffers from over-smoothing of detailed structures. Therefore, in this study, we propose a Bowsher prior based on the l1 norm, together with an iterative reweighting scheme, to overcome the limitations of the original Bowsher method. In addition, we derive a closed-form solution for iterative image reconstruction based on this non-smooth prior. A comparison study between the original l2 and the proposed l1 Bowsher priors was conducted using computer simulation and real human data. In both the simulation and the real data application, small lesions with abnormal PET uptake were better detected by the proposed l1 Bowsher prior methods than by the original Bowsher prior. The original l2 Bowsher prior leads to decreased PET intensity in small lesions when there is no clear separation between the lesions and the surrounding tissue in the anatomical prior. The proposed l1 Bowsher prior methods, however, showed better contrast between the tumors and surrounding tissues owing to the intrinsic edge-preserving property of the prior, attributable to the sparseness induced by the l1 norm, especially in the iterative reweighting scheme. The proposed methods also demonstrated lower bias and less hyper-parameter dependency in PET intensity estimation in regions with matched anatomical boundaries in PET and MRI. Moreover, based on the formulation of the l1 Bowsher prior, an unrolled network containing conventional maximum-likelihood expectation-maximization (ML-EM) modules was also proposed. The convolutional layers successfully learned the distribution of anatomically guided PET images, and the EM modules corrected the intermediate outputs by comparing them with the sinograms. The proposed unrolled network showed better performance than an ordinary U-Net, with less biased and less variable regional uptake. These methods will therefore help improve PET image quality based on anatomical side information.
    Table of contents: Chapter 1. Introduction (Backgrounds: Positron Emission Tomography; Maximum a Posteriori Reconstruction; Anatomical Prior; Proposed l1 Bowsher Prior; Deep Learning for MR-less Application; Purpose of the Research). Chapter 2. Anatomically-guided PET Reconstruction Using Bowsher Prior (PET Data Model; Original Bowsher Prior; Proposed l1 Bowsher Prior; Iterative Reweighting; Computer Simulations; Human Data; Image Analysis; Results; Discussions). Chapter 3. Deep Learning Approach for Anatomically-guided PET Reconstruction (Douglas-Rachford Splitting; Network Architecture; Dataset and Training Details; Image Analysis; Results; Discussions). Chapter 4. Conclusions.

    First order algorithms in variational image processing

    Variational methods in imaging have developed into a quite universal and flexible tool, allowing for highly successful approaches to tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation. The overall structure of such approaches is of the form ${\cal D}(Ku) + \alpha {\cal R}(u) \rightarrow \min_u$, where the functional ${\cal D}$ is a data fidelity term that depends on some input data $f$ and measures the deviation of $Ku$ from it, and ${\cal R}$ is a regularization functional. Moreover, $K$ is an (often linear) forward operator modeling the dependence of the data on an underlying image, and $\alpha$ is a positive regularization parameter. While ${\cal D}$ is often smooth and (strictly) convex, current practice almost exclusively uses nonsmooth regularization functionals. The majority of successful techniques use nonsmooth convex functionals like the total variation and generalizations thereof, or $\ell_1$-norms of coefficients arising from scalar products with some frame system. The efficient solution of such variational problems in imaging demands appropriate algorithms. Taking into account the specific structure as a sum of two very different terms to be minimized, splitting algorithms are a quite canonical choice. Consequently, this field has revived interest in techniques like operator splittings and augmented Lagrangians. Here we provide an overview of currently developed methods and recent results, as well as some computational studies comparing different methods and illustrating their success in applications.
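
    To make the splitting idea concrete, here is a minimal sketch (under illustrative assumptions, not drawn from the paper) of forward-backward splitting for ${\cal D}(Ku) + \alpha {\cal R}(u)$: a smooth quadratic data term handled by a gradient step, and ${\cal R} = \ell_1$ handled by its proximal operator, soft-thresholding. K, f, and alpha are toy placeholders.

        # A minimal forward-backward (proximal gradient) sketch for
        # D(Ku) + alpha*R(u) with D(v) = 0.5*||v - f||^2 and R = l1.
        import numpy as np

        def soft_threshold(v, t):
            """Proximal operator of t*||.||_1."""
            return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

        def forward_backward(K, f, alpha, iters=300):
            tau = 1.0 / np.linalg.norm(K, 2) ** 2  # step 1/L, L = Lipschitz const.
            u = np.zeros(K.shape[1])
            for _ in range(iters):
                grad = K.T @ (K @ u - f)                         # forward step
                u = soft_threshold(u - tau * grad, tau * alpha)  # backward step
            return u

        # toy usage: recover a sparse signal from noisy linear measurements
        rng = np.random.default_rng(1)
        K = rng.standard_normal((30, 50))
        u_true = np.zeros(50)
        u_true[[5, 20, 40]] = [3.0, -2.0, 1.5]
        f = K @ u_true + 0.05 * rng.standard_normal(30)
        u_hat = forward_backward(K, f, alpha=0.1)
        print("recovered support:", np.flatnonzero(np.abs(u_hat) > 0.5))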

    Faster PET reconstruction with non-smooth priors by randomization and preconditioning

    Uncompressed clinical data from modern positron emission tomography (PET) scanners are very large, exceeding 350 million data points (projection bins). The last decades have seen tremendous advancements in mathematical imaging tools, many of which lead to non-smooth (i.e. non-differentiable) optimization problems that are much harder to solve than smooth ones. Most of these tools have not been translated to clinical PET data, as the state-of-the-art algorithms for non-smooth problems do not scale well to large data. In this work, inspired by big-data machine learning applications, we use advanced randomized optimization algorithms to solve the PET reconstruction problem for a very large class of non-smooth priors, which includes, for example, total variation, total generalized variation, directional total variation, and various physical constraints. The proposed algorithm randomly selects subsets of the data and only updates the variables associated with them. While this idea often leads to divergent algorithms, we show that the proposed algorithm does indeed converge for any proper subset selection. Numerically, we show on real PET data (FDG and florbetapir) from a Siemens Biograph mMR that about ten projections and backprojections are sufficient to solve the MAP optimisation problem related to many popular non-smooth priors, showing that the proposed algorithm is fast enough to bring these models into routine clinical practice.
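
    The following toy sketch (an illustration in the spirit of stochastic primal-dual methods such as SPDHG, not the paper's implementation) shows the core mechanism: data split into subsets, one randomly chosen dual block updated per iteration, and a running aggregate keeping the primal update cheap. Per-subset least squares and a nonnegativity constraint stand in for the Poisson likelihood and the non-smooth priors of the paper; A, b, and the subset split are placeholders.

        # Toy stochastic primal-dual sketch: update one random dual block per
        # iteration and maintain z = A^T y so the primal step stays cheap.
        import numpy as np

        rng = np.random.default_rng(2)
        m, d, n_sub = 120, 60, 4
        A = rng.standard_normal((m, d))
        x_true = np.maximum(rng.standard_normal(d), 0.0)
        b = A @ x_true
        subsets = np.array_split(np.arange(m), n_sub)

        norms = [np.linalg.norm(A[s], 2) for s in subsets]
        sigma = [1.0 / nu for nu in norms]        # dual step sizes
        tau = 0.9 / (n_sub * max(norms))          # primal step size

        x = np.zeros(d)
        y = [np.zeros(len(s)) for s in subsets]   # one dual block per subset
        z = np.zeros(d)                           # z = A^T y
        zbar = z.copy()

        for it in range(3000):
            x = np.maximum(x - tau * zbar, 0.0)   # prox of nonnegativity prior
            i = rng.integers(n_sub)               # random subset selection
            s = subsets[i]
            # prox of the conjugate of the subset fidelity 0.5*||A_i x - b_i||^2
            w = y[i] + sigma[i] * (A[s] @ x)
            y_new = (w - sigma[i] * b[s]) / (1.0 + sigma[i])
            dz = A[s].T @ (y_new - y[i])
            y[i] = y_new
            z += dz
            zbar = z + n_sub * dz                 # extrapolated dual aggregate
        print("residual:", np.linalg.norm(A @ x - b))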

    Development and implementation of efficient noise suppression methods for emission computed tomography

    Get PDF
    In PET and SPECT imaging, iterative reconstruction is now widely used due to its capability of incorporating into the reconstruction process a physics model and the Bayesian statistics involved in photon detection. Iterative reconstruction methods rely on regularization terms to suppress image noise and render the radiotracer distribution with good image quality. The choice of regularization method substantially affects the appearance of reconstructed images and is thus a critical aspect of the reconstruction process. Major contributions of this work include the implementation and evaluation of various new regularization methods. Previously, our group developed a preconditioned alternating projection algorithm (PAPA) to optimize the emission computed tomography (ECT) objective function with the non-differentiable total variation (TV) regularizer. The algorithm was modified to optimize the proposed reconstruction objective functions. First, two novel TV-based regularizers, high-order total variation (HOTV) and infimal convolution total variation (ICTV), were proposed as alternatives to the customary TV regularizer in SPECT reconstruction, to reduce the "staircase" artifacts produced by TV. We evaluated both proposed reconstruction methods (HOTV-PAPA and ICTV-PAPA) and compared them with TV-regularized reconstruction (TV-PAPA) and the clinical standard, Gaussian post-filtered expectation-maximization reconstruction method (GPF-EM), using both Monte Carlo-simulated data and anonymized clinical data. Model-observer studies using Monte Carlo-simulated data indicate that ICTV-PAPA reconstructs images with similar or better lesion detectability than the clinical standard GPF-EM method, but at lower detected count levels. This implies that switching from GPF-EM to ICTV-PAPA can reduce patient dose while maintaining image quality for diagnostic use. Second, the l1 norm of a discrete cosine transform (DCT)-induced framelet transform was studied as a regularizer. We decomposed the image into high and low spatial-frequency components and then preferentially penalized the high spatial-frequency components. The DCT-induced framelet transform of the natural radiotracer distribution image is sparse. By using this property, we were able to effectively suppress image noise without overly compromising spatial resolution or image contrast. Finally, the fractional norm of the first-order spatial gradient was introduced as a regularizer. We implemented the 2/3 and 1/2 norms to suppress image spatial variability. Due to the strong penalty on small differences between neighboring pixels, fractional-norm regularizers suffer from cartoon-like artifacts similar to those of the TV regularizer. However, when penalty weights are properly selected, fractional-norm regularizers outperform TV in terms of noise suppression and contrast recovery.
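
    A small sketch of the idea behind the DCT-induced penalty described above, under illustrative assumptions (the cutoff and threshold values are placeholders, not the dissertation's settings): transform the image, leave the low spatial-frequency block unpenalized, and apply an l1 penalty (soft-thresholding) to the sparse high-frequency coefficients.

        # Penalize only high spatial-frequency DCT coefficients.
        import numpy as np
        from scipy.fft import dctn, idctn

        def dct_highfreq_penalty(img, cutoff=8):
            """l1 norm of DCT coefficients outside the low-frequency block."""
            c = dctn(img, norm="ortho")
            mask = np.ones(c.shape, dtype=bool)
            mask[:cutoff, :cutoff] = False        # exempt low frequencies
            return np.abs(c[mask]).sum()

        def shrink_highfreq(img, t=0.5, cutoff=8):
            """One shrinkage step: soft-threshold only the high-frequency
            coefficients, suppressing noise while keeping coarse structure."""
            c = dctn(img, norm="ortho")
            hi = c.copy()
            hi[:cutoff, :cutoff] = 0.0
            hi_shrunk = np.sign(hi) * np.maximum(np.abs(hi) - t, 0.0)
            return idctn(c - hi + hi_shrunk, norm="ortho")

        # toy usage on a noisy Poisson image
        rng = np.random.default_rng(4)
        img = rng.poisson(10.0, size=(32, 32)).astype(float)
        print(dct_highfreq_penalty(img), "->",
              dct_highfreq_penalty(shrink_highfreq(img)))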

    Some proximal methods for Poisson intensity CBCT and PET

    Cone-beam computerized tomography (CBCT) and positron emission tomography (PET) are two complementary medical imaging modalities providing, respectively, anatomic and metabolic information on a patient. In the context of public health, one must address the problem of reducing the doses of the potentially harmful quantities involved in each exam protocol: X-rays for CBCT and radiotracer for PET. Two demonstrators based on a technological breakthrough (acquisition devices working in photon-counting mode) have been developed. It turns out that in this low-dose context, i.e. for low-intensity signals acquired by photon-counting devices, the noise should no longer be approximated by a Gaussian distribution but follows a Poisson distribution. We investigate in this paper the two related tomographic reconstruction problems. We formulate the CBCT and PET problems separately in two general frameworks that encompass the physics of the acquisition devices and the specific discretization of the object to reconstruct. We propose various fast numerical schemes based on proximal methods to compute the solution of each problem. In particular, we show that primal-dual approaches are well suited in the PET case when considering non-differentiable regularizations such as total variation. Experiments on numerical simulations and real data favor the proposed algorithms when compared with well-established methods.
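
    One building block such proximal methods rely on for Poisson data has a closed form; the sketch below derives and checks it (a generic illustration, not the paper's code; b, v, and tau are placeholders). Primal-dual schemes apply this prox, or the prox of its conjugate via Moreau's identity, to the projection-domain variable.

        # Closed-form proximal operator of the per-bin Poisson negative
        # log-likelihood f(x) = x - b*log(x).
        import numpy as np

        def prox_poisson_nll(v, b, tau):
            """Solve min_x (x - b*log x) + (1/(2*tau))*(x - v)^2 elementwise.
            Setting the derivative to zero gives x^2 + (tau - v)x - tau*b = 0,
            whose positive root is returned."""
            return 0.5 * ((v - tau) + np.sqrt((v - tau) ** 2 + 4.0 * tau * b))

        # sanity check on random inputs: the optimality condition holds
        rng = np.random.default_rng(3)
        b = rng.poisson(5.0, size=10).astype(float)
        v = rng.uniform(0.1, 10.0, size=10)
        x = prox_poisson_nll(v, b, tau=0.5)
        assert np.allclose(1.0 - b / x + (x - v) / 0.5, 0.0, atol=1e-10)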