734 research outputs found
Task adapted reconstruction for inverse problems
The paper considers the problem of performing a task defined on a model
parameter that is only observed indirectly through noisy data in an ill-posed
inverse problem. A key aspect is to formalize the steps of reconstruction and
task as appropriate estimators (non-randomized decision rules) in statistical
estimation problems. The implementation makes use of (deep) neural networks to
provide a differentiable parametrization of the family of estimators for both
steps. These networks are combined and jointly trained against suitable
supervised training data in order to minimize a joint differentiable loss
function, resulting in an end-to-end task adapted reconstruction method. The
suggested framework is generic, yet adaptable, with a plug-and-play structure
for adjusting both the inverse problem and the task at hand. More precisely,
the data model (forward operator and statistical model of the noise) associated
with the inverse problem is exchangeable, e.g., by using a neural network
architecture given by a learned iterative method. Furthermore, any task that is
encodable as a trainable neural network can be used. The approach is
demonstrated on joint tomographic image reconstruction and classification, and
on joint tomographic image reconstruction and segmentation.
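The joint end-to-end training described above can be sketched with linear stand-ins for the two estimator networks. Everything here (the operator A, the dimensions, and the scalar regression task) is an illustrative assumption, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of the two-step setup: x is the model parameter, y = A x + noise,
# and the "task" is a scalar regression on x.
n, m, N = 8, 12, 100
A = rng.normal(size=(m, n))                   # forward operator of the inverse problem
X = rng.normal(size=(N, n))                   # true model parameters
Y = X @ A.T + 0.05 * rng.normal(size=(N, m))  # indirect, noisy observations
T = X @ rng.normal(size=(n, 1))               # task targets

# Two differentiable estimators (here linear), trained jointly end to end
W_rec = 0.01 * rng.normal(size=(m, n))        # reconstruction step: y -> x_hat
W_task = 0.01 * rng.normal(size=(n, 1))       # task step: x_hat -> t_hat
alpha, lr = 0.5, 1e-3
mse_init = np.mean((Y @ W_rec - X) ** 2)

for _ in range(3000):
    X_hat = Y @ W_rec
    T_hat = X_hat @ W_task
    # joint loss: alpha * reconstruction MSE + (1 - alpha) * task MSE;
    # the task error back-propagates through the reconstruction step
    g_rec = alpha * 2 * Y.T @ (X_hat - X) / N \
        + (1 - alpha) * 2 * Y.T @ ((T_hat - T) @ W_task.T) / N
    g_task = (1 - alpha) * 2 * X_hat.T @ (T_hat - T) / N
    W_rec -= lr * g_rec
    W_task -= lr * g_task

print(mse_init, np.mean((Y @ W_rec - X) ** 2))  # reconstruction MSE before/after
```

The key point the sketch illustrates is the plug-and-play structure: the data model enters only through A and Y, and the task only through the second estimator and its term in the joint loss.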
Anatomically guided PET reconstruction: from non-smooth priors to deep learning approaches
Thesis (Ph.D.) -- Seoul National University Graduate School: College of Medicine, Department of Medicine, February 2021. Supervisor: Jae Sung Lee.
Advances in simultaneous positron emission tomography/magnetic resonance imaging (PET/MRI) technology have led to an active investigation of anatomy-guided regularized PET image reconstruction algorithms based on MR images. Among the various priors proposed for anatomy-guided regularized PET image reconstruction, Bowsher's method based on second-order smoothing priors sometimes suffers from over-smoothing of detailed structures. Therefore, in this study, we propose a Bowsher prior based on the l1 norm and an iterative reweighting scheme to overcome the limitation of the original Bowsher method. In addition, we derive a closed-form solution for iterative image reconstruction based on this non-smooth prior. A comparison study between the original l2 and the proposed l1 Bowsher priors was conducted using computer simulation and real human data. In both the simulation and the real-data application, small lesions with abnormal PET uptake were better detected by the proposed l1 Bowsher prior methods than by the original Bowsher prior. The original l2 Bowsher prior leads to decreased PET intensity in small lesions when there is no clear separation between the lesions and surrounding tissue in the anatomical prior. The proposed l1 Bowsher prior methods, however, showed better contrast between the tumors and surrounding tissues owing to the intrinsic edge-preserving property of the prior, which is attributed to the sparseness induced by the l1 norm, especially in the iterative reweighting scheme. In addition, the proposed methods demonstrated lower bias and less hyper-parameter dependency in PET intensity estimation in regions with matched anatomical boundaries in PET and MRI.
Moreover, based on the formulation of the l1 Bowsher prior, an unrolled network containing the conventional maximum-likelihood expectation-maximization (ML-EM) module is also proposed. The convolutional layers successfully learned the distribution of anatomically guided PET images, and the EM module corrected the intermediate outputs by comparing them with the sinograms. The proposed unrolled network showed better performance than an ordinary U-Net, with regional uptake that is less biased and less deviated. Therefore, these methods will help improve PET image quality based on anatomical side information.
Chapter 1. Introduction
1.1. Backgrounds
1.1.1. Positron Emission Tomography
1.1.2. Maximum a Posteriori Reconstruction
1.1.3. Anatomical Prior
1.1.4. Proposed l_1 Bowsher Prior
1.1.5. Deep Learning for MR-less Application
1.2. Purpose of the Research
Chapter 2. Anatomically-guided PET Reconstruction Using Bowsher Prior
2.1. Backgrounds
2.1.1. PET Data Model
2.1.2. Original Bowsher Prior
2.2. Methods and Materials
2.2.1. Proposed l_1 Bowsher Prior
2.2.2. Iterative Reweighting
2.2.3. Computer Simulations
2.2.4. Human Data
2.2.5. Image Analysis
2.3. Results
2.3.1. Simulation with Brain Phantom
2.3.2. Human Data
2.4. Discussions
Chapter 3. Deep Learning Approach for Anatomically-guided PET Reconstruction
3.1. Backgrounds
3.2. Methods and Materials
3.2.1. Douglas-Rachford Splitting
3.2.2. Network Architecture
3.2.3. Dataset and Training Details
3.2.4. Image Analysis
3.3. Results
3.4. Discussions
Chapter 4. Conclusions
Bibliography
Abstract in Korean
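The ML-EM module embedded in the unrolled network above is the standard Poisson EM update for emission tomography. A minimal sketch on a toy 1-D system (the sizes and the uniform system matrix A are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D system standing in for a PET scanner geometry.
n_pix, n_bins = 16, 32
A = rng.uniform(0.0, 1.0, size=(n_bins, n_pix))   # "projection" matrix
x_true = rng.uniform(0.5, 2.0, size=n_pix)
y = rng.poisson(A @ x_true).astype(float)         # Poisson count data (sinogram)

# Standard ML-EM update: x <- x / (A^T 1) * A^T (y / (A x))
sens = A.T @ np.ones(n_bins)                      # sensitivity image
x = np.ones(n_pix)                                # positive initialization
for _ in range(200):
    proj = np.maximum(A @ x, 1e-12)               # forward projection
    x = x / sens * (A.T @ (y / proj))             # multiplicative EM correction

print(x.round(2))
```

The multiplicative form keeps the image non-negative and monotonically increases the Poisson log-likelihood, which is why it is a natural data-consistency block to interleave with convolutional layers in an unrolled network.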
Convolutional Deblurring for Natural Imaging
In this paper, we propose a novel design of image deblurring in the form of
one-shot convolution filtering that can directly convolve with naturally
blurred images for restoration. The problem of optical blurring is a common
disadvantage to many imaging applications that suffer from optical
imperfections. Despite numerous deconvolution methods that blindly estimate
blurring in either inclusive or exclusive forms, they are practically
challenging due to high computational cost and low image reconstruction
quality. Both conditions of high accuracy and high speed are prerequisites for
high-throughput imaging platforms in digital archiving. In such platforms,
deblurring is required after image acquisition before being stored, previewed,
or processed for high-level interpretation. Therefore, on-the-fly correction of
such images is important to avoid possible time delays, mitigate computational
expenses, and increase image perception quality. We bridge this gap by
synthesizing a deconvolution kernel as a linear combination of Finite Impulse
Response (FIR) even-derivative filters that can be directly convolved with
blurry input images to boost the frequency fall-off of the Point Spread
Function (PSF) associated with the optical blur. We employ a Gaussian low-pass
filter to decouple the image denoising problem for image edge deblurring.
Furthermore, we propose a blind approach to estimate the PSF statistics for two
Gaussian and Laplacian models that are common in many imaging pipelines.
Thorough experiments are designed to test and validate the efficiency of the
proposed method using 2054 naturally blurred images across six imaging
applications and seven state-of-the-art deconvolution methods.
Comment: 15 pages, for publication in IEEE Transactions on Image Processing
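The core idea, a one-shot deconvolution kernel built from even-derivative FIR filters, can be illustrated in 1-D. In frequency, the inverse of a Gaussian blur is exp(s^2 w^2 / 2) ~ 1 + (s^2/2) w^2, and the second-difference filter [1, -2, 1] has response ~ -w^2, so a short deblurring kernel is delta - (s^2/2) * [1, -2, 1]. The signal, blur width, and truncation order below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

sigma = 1.0
t = np.arange(-4, 5)
g = np.exp(-t**2 / (2 * sigma**2))
g /= g.sum()                                     # Gaussian blur PSF

x = np.zeros(64)
x[20:40] = 1.0                                   # sharp test signal (a box)
y = np.convolve(x, g, mode="same")               # "naturally" blurred input

d2 = np.array([1.0, -2.0, 1.0])                  # even-derivative FIR filter
k = np.zeros(3)
k[1] = 1.0                                       # identity (delta) tap
k = k - (sigma**2 / 2) * d2                      # first-order inverse-blur kernel

x_hat = np.convolve(y, k, mode="same")           # direct convolution restores edges

print(np.abs(y - x).mean(), np.abs(x_hat - x).mean())
```

A single short convolution is why this style of correction suits on-the-fly deblurring in high-throughput pipelines: cost is one FIR pass rather than an iterative deconvolution.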
DEQ-MPI: A Deep Equilibrium Reconstruction with Learned Consistency for Magnetic Particle Imaging
Magnetic particle imaging (MPI) offers unparalleled contrast and resolution
for tracing magnetic nanoparticles. A common imaging procedure calibrates a
system matrix (SM) that is used to reconstruct data from subsequent scans. The
ill-posed reconstruction problem can be solved by simultaneously enforcing data
consistency based on the SM and regularizing the solution based on an image
prior. Traditional hand-crafted priors cannot capture the complex attributes of
MPI images, whereas recent MPI methods based on learned priors can suffer from
extensive inference times or limited generalization performance. Here, we
introduce a novel physics-driven method for MPI reconstruction based on a deep
equilibrium model with learned data consistency (DEQ-MPI). DEQ-MPI reconstructs
images by augmenting neural networks into an iterative optimization, as
inspired by unrolling methods in deep learning. Yet, conventional unrolling
methods are computationally restricted to few iterations resulting in
non-convergent solutions, and they use hand-crafted consistency measures that
can yield suboptimal capture of the data distribution. DEQ-MPI instead trains
an implicit mapping to maximize the quality of a convergent solution, and it
incorporates a learned consistency measure to better account for the data
distribution. Demonstrations on simulated and experimental data indicate that
DEQ-MPI achieves superior image quality and competitive inference time to
state-of-the-art MPI reconstruction methods.
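The deep-equilibrium idea of iterating one layer to convergence, rather than unrolling a fixed number of layers, can be sketched with a linear toy problem. A, the linear "denoiser" D, and all constants below are illustrative assumptions; in DEQ-MPI both the prior and the data-consistency measure are learned networks:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear analogue of an equilibrium reconstruction.
n, m = 10, 20
A = rng.normal(size=(m, n)) / np.sqrt(m)       # stand-in system matrix (SM)
x_true = rng.normal(size=n)
y = A @ x_true + 0.01 * rng.normal(size=m)

D = 0.9 * np.eye(n)                            # fixed linear stand-in for a learned prior
eta, lam = 0.5, 0.1

def f(z):
    # one "layer": gradient step on data consistency + pull toward the prior's output
    return z - eta * (A.T @ (A @ z - y) + lam * (z - D @ z))

z = np.zeros(n)
for _ in range(1000):
    z_next = f(z)
    if np.linalg.norm(z_next - z) < 1e-10:     # equilibrium reached: z* = f(z*)
        z = z_next
        break
    z = z_next

print(np.linalg.norm(f(z) - z))
```

Because the output is defined by the fixed point z* = f(z*) rather than by an iteration count, training can target the quality of the converged solution directly, which is the contrast the abstract draws against few-iteration unrolling.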
- …