    ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration

    We propose a method, called ACQUIRE, for the solution of constrained optimization problems modeling the restoration of images corrupted by Poisson noise. The objective function is the sum of a generalized Kullback-Leibler divergence term and a TV regularizer, subject to nonnegativity and possibly other constraints, such as flux conservation. ACQUIRE is a line-search method that considers a smoothed version of TV, based on a Huber-like function, and computes the search directions by minimizing quadratic approximations of the problem, built by exploiting some second-order information. A classical second-order Taylor approximation is used for the Kullback-Leibler term, and an iteratively reweighted norm approach for the smoothed TV term. We prove that the sequence generated by the method has a subsequence converging to a minimizer of the smoothed problem and that every limit point is a minimizer. Furthermore, if the problem is strictly convex, the whole sequence is convergent. Notably, convergence is achieved without requiring exact minimization of the quadratic subproblems; low accuracy in this minimization can be used in practice, as shown by numerical results. Experiments on reference test problems show that our method is competitive with well-established methods for TV-based Poisson image restoration, in terms of both computational efficiency and image quality. (Comment: 37 pages, 13 figures)
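    The core idea above, replacing a smoothed TV term by a weighted quadratic that is refreshed at each outer iteration, can be sketched in a few lines. This is a generic 1-D illustration under simplifying assumptions (least-squares data term instead of the paper's Kullback-Leibler term, no constraints, plain gradient steps on each subproblem); the function names and parameter values are hypothetical, not ACQUIRE's.

    ```python
    import numpy as np

    def huber_irls_weights(g, delta):
        """IRLS weights for a Huber-smoothed TV term: weight 1 in the
        quadratic region, delta/|g| in the linear region (a generic sketch;
        ACQUIRE's exact smoothing and weighting may differ)."""
        mag = np.abs(g)
        return np.where(mag <= delta, 1.0, delta / np.maximum(mag, 1e-12))

    def irls_tv_denoise_1d(y, delta=0.05, lam=0.5, outer=30, inner=20, step=0.2):
        """Minimize 0.5*||x - y||^2 + lam * HuberTV(x) on a 1-D signal by
        iteratively reweighted least squares: freeze the weights, then take
        a few gradient steps on the resulting weighted quadratic subproblem
        (inexact inner solves, as in the abstract's low-accuracy remark)."""
        x = y.astype(float).copy()
        for _ in range(outer):
            w = huber_irls_weights(np.diff(x), delta)   # reweight smoothed TV
            for _ in range(inner):
                g = np.diff(x)
                grad = x - y                            # data-term gradient
                grad[:-1] -= lam * w * g                # TV-term gradient
                grad[1:]  += lam * w * g
                x = x - step * grad
        return x
    ```

    Solving each weighted quadratic only approximately (a few inner steps) is exactly the "inexact" aspect the abstract emphasizes: the weights are stale anyway, so high inner accuracy is wasted effort.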

    Advanced Denoising for X-ray Ptychography

    The success of ptychographic imaging experiments strongly depends on achieving a high signal-to-noise ratio. This is particularly important in nanoscale imaging experiments where diffraction signals are very weak and the experiments are accompanied by significant parasitic scattering (background), outliers, or correlated noise sources. It is also critical when rare events take place, such as cosmic rays or bad frames caused by electronic glitches or shutter-timing malfunctions. In this paper, we propose a novel iterative algorithm with rigorous analysis that exploits the direct forward model for parasitic noise and sample smoothness to achieve a thorough characterization and removal of structured and random noise. We present a formal description of the proposed algorithm and prove its convergence under mild conditions. Numerical experiments on simulations and real data (from both soft and hard X-ray beamlines) demonstrate that the proposed algorithm produces better results than state-of-the-art methods. (Comment: 24 pages, 9 figures)
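    As a minimal illustration of one ingredient mentioned above, rejecting rare bad frames (cosmic rays, glitched exposures), here is a simple robust-statistics stand-in: flag frames whose total intensity is many robust standard deviations from the median. This is not the paper's algorithm (which uses a forward model for the parasitic noise); the function name and threshold are assumptions for the sketch.

    ```python
    import numpy as np

    def remove_outlier_frames(frames, z_thresh=5.0):
        """Drop frames whose total intensity deviates strongly from the
        robust median, using the MAD as a scale estimate (a simple stand-in
        for model-based bad-frame rejection)."""
        totals = frames.reshape(len(frames), -1).sum(axis=1)
        med = np.median(totals)
        mad = np.median(np.abs(totals - med)) + 1e-12   # avoid divide-by-zero
        z = np.abs(totals - med) / (1.4826 * mad)       # MAD -> sigma scaling
        keep = z < z_thresh
        return frames[keep], keep
    ```

    The MAD-based score is insensitive to the outliers themselves, which is why it is preferred over a plain mean/std test when a few frames are wildly corrupted.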

    Robust Non-Rigid Registration with Reweighted Position and Transformation Sparsity

    Non-rigid registration is challenging because it is ill-posed, with high degrees of freedom, and is thus sensitive to noise and outliers. We propose a robust non-rigid registration method using reweighted sparsities on position and transformation to estimate the deformations between 3-D shapes. We formulate the energy function with position and transformation sparsity on both the data term and the smoothness term, and define the smoothness constraint using local rigidity. The double-sparsity-based non-rigid registration model is enhanced with a reweighting scheme and solved by transforming the model into four alternately optimized subproblems, which have exact solutions and guaranteed convergence. Experimental results on both public datasets and real scanned datasets show that our method outperforms the state-of-the-art methods and is more robust to noise and outliers than conventional non-rigid registration methods. (Comment: IEEE Transactions on Visualization and Computer Graphics)
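    Two building blocks commonly behind "exact solutions" and "reweighting" in alternating schemes of this kind can be written down directly. The soft-threshold operator is the standard closed-form solution of an l1 subproblem, and the reweighting rule drives small residuals toward zero across iterations; the paper's actual four subproblems differ in detail, so treat this as a generic sketch.

    ```python
    import numpy as np

    def soft_threshold(v, tau):
        """Proximal operator of tau*||.||_1: the closed-form minimizer of
        0.5*(x - v)^2 + tau*|x|, applied elementwise."""
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def reweight(v, eps=1e-3):
        """Reweighting rule w = 1/(|v| + eps): small residuals get large
        weights, so the weighted l1 penalty behaves increasingly like l0."""
        return 1.0 / (np.abs(v) + eps)
    ```

    For example, `soft_threshold(np.array([3.0, -0.5, 1.0]), 1.0)` shrinks the large entry to 2.0 and zeros the small ones, which is exactly the sparsifying behavior the data and smoothness terms rely on.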

    Anatomy-guided PET reconstruction: from non-smooth priors to deep learning approaches

    Get PDF
    Doctoral dissertation -- Seoul National University Graduate School: College of Medicine, Department of Medicine, February 2021. Advisor: ์ด์žฌ์„ฑ. Advances in simultaneous positron emission tomography / magnetic resonance imaging (PET/MRI) technology have led to active investigation of anatomy-guided regularized PET image reconstruction algorithms based on MR images. Among the various priors proposed for anatomy-guided regularized PET image reconstruction, Bowsher's method, based on second-order smoothing priors, sometimes suffers from over-smoothing of detailed structures. Therefore, in this study, we propose a Bowsher prior based on the l1 norm together with an iterative reweighting scheme to overcome the limitations of the original Bowsher method. In addition, we derive a closed-form solution for iterative image reconstruction based on this non-smooth prior. A comparison study between the original l2 and the proposed l1 Bowsher priors was conducted using computer simulations and real human data. In both the simulations and the real-data application, small lesions with abnormal PET uptake were better detected by the proposed l1 Bowsher prior methods than by the original Bowsher prior. The original l2 Bowsher prior leads to decreased PET intensity in small lesions when there is no clear separation between the lesions and the surrounding tissue in the anatomical prior. In contrast, the proposed l1 Bowsher prior methods showed better contrast between the tumors and surrounding tissues owing to the intrinsic edge-preserving property of the prior, attributable to the sparseness induced by the l1 norm, especially with the iterative reweighting scheme. The proposed methods also demonstrated lower bias and weaker hyperparameter dependence in PET intensity estimation in regions with matched anatomical boundaries in PET and MRI. Moreover, based on the formulation of the l1 Bowsher prior, an unrolled network containing the conventional maximum-likelihood expectation-maximization (ML-EM) module was also proposed. The convolutional layers successfully learned the distribution of anatomy-guided PET images, and the EM module corrected the intermediate outputs by comparing them with the sinograms. The proposed unrolled network showed better performance than an ordinary U-Net, with regional uptake that is less biased and less deviated. Therefore, these methods will help improve PET image quality based on anatomical side information.
    Table of contents:
        Chapter 1. Introduction (Backgrounds: Positron Emission Tomography; Maximum a Posteriori Reconstruction; Anatomical Prior; Proposed l_1 Bowsher Prior; Deep Learning for MR-less Application. Purpose of the Research)
        Chapter 2. Anatomically-guided PET Reconstruction Using Bowsher Prior (Backgrounds: PET Data Model; Original Bowsher Prior. Methods and Materials: Proposed l_1 Bowsher Prior; Iterative Reweighting; Computer Simulations; Human Data; Image Analysis. Results: Simulation with Brain Phantom; Human Data. Discussions)
        Chapter 3. Deep Learning Approach for Anatomically-guided PET Reconstruction (Backgrounds. Methods and Materials: Douglas-Rachford Splitting; Network Architecture; Dataset and Training Details; Image Analysis. Results. Discussions)
        Chapter 4. Conclusions
        Bibliography
        Abstract in Korean

    Convolutional Deblurring for Natural Imaging

    In this paper, we propose a novel design for image deblurring in the form of one-shot convolution filtering that can be directly convolved with naturally blurred images for restoration. Optical blurring is a common disadvantage in many imaging applications that suffer from optical imperfections. Despite numerous deconvolution methods that blindly estimate blurring in either inclusive or exclusive forms, they are practically challenging due to high computational cost and low image reconstruction quality. Both high accuracy and high speed are prerequisites for high-throughput imaging platforms in digital archiving, where deblurring is required after image acquisition, before images are stored, previewed, or processed for high-level interpretation. Therefore, on-the-fly correction of such images is important to avoid time delays, mitigate computational expense, and increase image perception quality. We bridge this gap by synthesizing a deconvolution kernel as a linear combination of Finite Impulse Response (FIR) even-derivative filters that can be directly convolved with blurry input images to boost the frequency fall-off of the Point Spread Function (PSF) associated with the optical blur. We employ a Gaussian low-pass filter to decouple the image denoising problem from image edge deblurring. Furthermore, we propose a blind approach to estimate the PSF statistics for two PSF models, Gaussian and Laplacian, that are common in many imaging pipelines. Thorough experiments are designed to test and validate the efficiency of the proposed method using 2054 naturally blurred images across six imaging applications and seven state-of-the-art deconvolution methods. (Comment: 15 pages, for publication in IEEE Transactions on Image Processing)
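    A first-order version of the even-derivative idea can be derived from a truncated series for the inverse of a Gaussian blur: since blurring acts approximately as identity plus (sigma^2/2) times the second derivative, subtracting that term yields a short symmetric FIR sharpening kernel. This is only a sketch of the principle; the paper's kernel design, calibration, and PSF estimation are richer.

    ```python
    import numpy as np

    def inverse_gaussian_kernel_1d(sigma2):
        """Truncated-series approximation to an inverse Gaussian blur:
        G^{-1} ~= delta - (sigma^2/2) * d^2/dx^2, built from even-derivative
        FIR stencils so the kernel stays symmetric (zero phase)."""
        delta = np.array([0.0, 1.0, 0.0])
        d2 = np.array([1.0, -2.0, 1.0])        # second-derivative stencil
        return delta - 0.5 * sigma2 * d2       # sums to 1: preserves DC level
    ```

    Applying this kernel with a single `np.convolve` is the "one-shot" aspect: no iterative deconvolution is needed, at the cost of only partially inverting the blur.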
    • โ€ฆ
    corecore