
    Variational models for multiplicative noise removal

    Ph.D. dissertation -- Department of Mathematical Sciences, College of Natural Sciences, Seoul National University, August 2017. Advisor: Myungjoo Kang. This dissertation discusses variational partial differential equation (PDE) models for the restoration of images corrupted by multiplicative Gamma noise. The two proposed models are suitable for the heavy multiplicative noise often seen in applications. First, we propose a total variation (TV) based model with local constraints. The local constraint involves multiple local windows and is related to a spatially adaptive regularization parameter (SARP). Convergence analysis, including the existence and uniqueness of a solution, is also provided. The second model extends the first using a nonconvex version of the total generalized variation (TGV). The nonconvex TGV regularization efficiently denoises smooth regions, avoiding the staircasing artifacts that appear in total-variation-based models, while conserving edges and details.
    1. Introduction
    2. Previous works
        2.1 Variational models for image denoising
            2.1.1 Convex and nonconvex regularizers
            2.1.2 Variational models for multiplicative noise removal
        2.2 Proximal linearized alternating direction method of multipliers
    3. Proposed models
        3.1 Proposed model 1: exp TV model with SARP
            3.1.1 Derivation of our model
            3.1.2 Proposed TV model with local constraints
            3.1.3 A SARP algorithm for solving model (3.1.16)
            3.1.4 Numerical results
        3.2 Proposed model 2: exp NTGV model with SARP
            3.2.1 Proposed NTGV model
            3.2.2 Updating rule for λ(x) in (3.2.1)
            3.2.3 Algorithm for solving the proposed model (3.2.1)
            3.2.4 Numerical results
            3.2.5 Selection of parameters
            3.2.6 Image denoising
    4. Conclusion
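    The degradation model behind both proposed methods is multiplicative Gamma noise: an image u is observed as f = u·η, where η is Gamma-distributed with mean 1. A minimal NumPy sketch of simulating this degradation (the shape parameter L is an illustrative "number of looks", not a value from the dissertation):

```python
import numpy as np

rng = np.random.default_rng(0)

def add_multiplicative_gamma_noise(u, L=4):
    """Corrupt image u with multiplicative Gamma noise f = u * eta.

    eta ~ Gamma(shape=L, scale=1/L) has mean 1 and variance 1/L,
    so smaller L means heavier noise -- the regime both models target.
    """
    eta = rng.gamma(shape=L, scale=1.0 / L, size=u.shape)
    return u * eta

u = np.full((64, 64), 100.0)  # constant test image
f = add_multiplicative_gamma_noise(u, L=4)
print(f.mean())               # close to 100, since E[eta] = 1
```

    Because the noise multiplies the signal, its strength scales with local intensity, which is why additive-noise regularization parameters transfer poorly and a spatially adaptive λ(x) is attractive.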

    First order algorithms in variational image processing

    Variational methods in imaging are nowadays developing into a quite universal and flexible tool, allowing for highly successful approaches to tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation. The overall structure of such approaches is of the form D(Ku) + αR(u) → min_u, where the functional D is a data fidelity term, depending on some input data f and measuring the deviation of Ku from it, and R is a regularization functional. Moreover, K is an (often linear) forward operator modeling the dependence of the data on an underlying image, and α is a positive regularization parameter. While D is often smooth and (strictly) convex, current practice almost exclusively uses nonsmooth regularization functionals. The majority of successful techniques use nonsmooth, convex functionals like the total variation and generalizations thereof, or ℓ1-norms of coefficients arising from scalar products with some frame system. The efficient solution of such variational problems in imaging demands appropriate algorithms. Taking into account the specific structure as a sum of two very different terms to be minimized, splitting algorithms are a quite canonical choice. Consequently, this field has revived interest in techniques like operator splittings or augmented Lagrangians. Here we provide an overview of currently developed methods and recent results, as well as some computational studies comparing different methods and illustrating their success in applications. Comment: 60 pages, 33 figures
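    The splitting idea for D(Ku) + αR(u) can be sketched with the simplest such scheme, proximal gradient descent: a gradient step on the smooth fidelity D followed by a proximal step on the nonsmooth R. Below, as an illustrative toy (not any specific method from the survey), K is the identity and R is the ℓ1 norm, whose proximal operator is soft-thresholding:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(f, alpha, step=1.0, iters=100):
    """Minimize 0.5 * ||u - f||^2 + alpha * ||u||_1 (K = identity).

    Each iteration splits the sum of two terms: a gradient step on the
    smooth quadratic fidelity, then a prox step on the l1 regularizer.
    """
    u = np.zeros_like(f)
    for _ in range(iters):
        grad = u - f                                  # gradient of fidelity
        u = soft_threshold(u - step * grad, step * alpha)
    return u

f = np.array([3.0, 0.5, -2.0, 0.1])
u = proximal_gradient(f, alpha=1.0)
print(u)  # components of f shrunk toward zero by alpha
```

    The same gradient-then-prox pattern underlies ISTA/FISTA, while ADMM and augmented Lagrangian methods handle the two terms through auxiliary variables instead.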

    Multiplicative Noise Removal: Nonlocal Low-Rank Model and Its Proximal Alternating Reweighted Minimization Algorithm

    The goal of this paper is to develop a novel numerical method for efficient multiplicative noise removal. The nonlocal self-similarity of natural images implies that the matrices formed by their nonlocal similar patches are low-rank. By exploiting this low-rank prior for multiplicative noise removal, we propose a nonlocal low-rank model for this task and develop a proximal alternating reweighted minimization (PARM) algorithm to solve the optimization problem resulting from the model. Specifically, we utilize a generalized nonconvex surrogate of the rank function to regularize the patch matrices and develop a new nonlocal low-rank model, which is a nonconvex nonsmooth optimization problem having a patchwise data fidelity and a generalized nonlocal low-rank regularization term. To solve this optimization problem, we propose the PARM algorithm, which has a proximal alternating scheme with a reweighted approximation of its subproblem. A theoretical analysis of the proposed PARM algorithm is conducted to guarantee its global convergence to a critical point. Numerical experiments demonstrate that the proposed method for multiplicative noise removal significantly outperforms existing methods such as the benchmark SAR-BM3D method in terms of the visual quality of the denoised images and the PSNR (peak signal-to-noise ratio) and SSIM (structural similarity index measure) values.
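    The low-rank prior on patch matrices can be illustrated with the simplest baseline surrogate of rank, the nuclear norm, whose proximal operator is singular value soft-thresholding (the paper itself uses a generalized nonconvex surrogate with reweighting; this sketch only shows the underlying idea):

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * ||.||_* at M.

    Shrinks singular values toward zero, so nearly identical rows
    (stacked nonlocal similar patches) collapse onto a low-rank matrix.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

rng = np.random.default_rng(0)
patch = rng.standard_normal(8)
M = np.tile(patch, (5, 1))                    # rank-1: five identical "patches"
noisy = M + 0.1 * rng.standard_normal(M.shape)
denoised = svt(noisy, tau=1.0)
print(np.linalg.matrix_rank(denoised, tol=1e-6))
```

    Nonconvex surrogates penalize large singular values less than the nuclear norm does, reducing the bias that uniform thresholding introduces on strong image structures.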

    Accelerated algorithms for linearly constrained convex minimization

    Ph.D. dissertation -- Department of Mathematical Sciences, Seoul National University, February 2014. Advisor: Myungjoo Kang. Linearly constrained convex minimization serves as a model for a variety of image processing problems. This dissertation introduces fast algorithms for solving such linearly constrained minimization problems. The proposed methods share a common foundation: the extrapolation used in the accelerated proximal gradient method developed by Nesterov. Broadly, we propose two kinds of algorithms. The first is an accelerated Bregman method; applying it to compressed sensing problems, we confirm that the accelerated method is faster than the original Bregman method. The second extends the accelerated augmented Lagrangian method. The augmented Lagrangian method involves inner problems that in general cannot be solved exactly, so we present conditions under which solving the inner problems inexactly, to a suitable tolerance, retains the same convergence as the exactly solved accelerated augmented Lagrangian method. We also develop a similar result for an accelerated alternating direction method of multipliers.
    Abstract
    1 Introduction
    2 Previous Methods
        2.1 Mathematical Preliminary
        2.2 The algorithms for solving the linearly constrained convex minimization
            2.2.1 Augmented Lagrangian Method
            2.2.2 Bregman Methods
            2.2.3 Alternating direction method of multipliers
        2.3 The accelerating algorithms for unconstrained convex minimization problem
            2.3.1 Fast inexact iterative shrinkage thresholding algorithm
            2.3.2 Inexact accelerated proximal point method
    3 Proposed Algorithms
        3.1 Proposed Algorithm 1: Accelerated Bregman method
            3.1.1 Equivalence to the accelerated augmented Lagrangian method
            3.1.2 Complexity of the accelerated Bregman method
        3.2 Proposed Algorithm 2: I-AALM
        3.3 Proposed Algorithm 3: I-AADMM
        3.4 Numerical Results
            3.4.1 Comparison of the Bregman method with the accelerated Bregman method
            3.4.2 Numerical results of the inexact accelerated augmented Lagrangian method using various subproblem solvers
            3.4.3 Comparison of the inexact accelerated augmented Lagrangian method with other methods
            3.4.4 Inexact accelerated alternating direction method of multipliers for multiplicative noise removal
    4 Conclusion
    Abstract (in Korean)
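    The common ingredient of these methods is Nesterov-type extrapolation: each gradient step is taken at a momentum-corrected point combining the last two iterates, rather than at the previous iterate alone. A minimal sketch on a smooth quadratic (illustrative only; the dissertation applies the idea to Bregman and augmented Lagrangian iterations):

```python
import numpy as np

def accelerated_gradient(grad, x0, step, iters=500):
    """Nesterov's accelerated gradient method.

    y is an extrapolated point combining the last two iterates;
    the gradient step is taken at y, not at the previous x.
    """
    x_prev = x0.copy()
    y = x0.copy()
    t_prev = 1.0
    for _ in range(iters):
        x = y - step * grad(y)                       # gradient step at extrapolated point
        t = (1.0 + np.sqrt(1.0 + 4.0 * t_prev**2)) / 2.0
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)  # extrapolation (momentum)
        x_prev, t_prev = x, t
    return x_prev

# minimize 0.5 * ||Ax - b||^2 for a small fixed system
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)
x = accelerated_gradient(grad, np.zeros(2), step=1.0 / np.linalg.norm(A.T @ A, 2))
print(x)  # approaches the solution of Ax = b
```

    The same extrapolation, applied to the multiplier or Bregman variable instead of the primal iterate, yields the accelerated Bregman and augmented Lagrangian schemes the dissertation studies.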