2 research outputs found
On the Link between Gaussian Homotopy Continuation and Convex Envelopes
Abstract. The continuation method is a popular heuristic in computer vision for nonconvex optimization. The idea is to start from a simplified problem and gradually deform it to the actual task while tracking the solution. It was first used in computer vision under the name of graduated nonconvexity. Since then, it has been utilized explicitly or implicitly in various applications. In fact, state-of-the-art optical flow and shape estimation rely on a form of continuation. Despite its empirical success, there is little theoretical understanding of this method. This work provides some novel insights into this technique. Specifically, there are many ways to choose the initial problem and many ways to progressively deform it to the original task. However, here we show that when this process is constructed by Gaussian smoothing, it is optimal in a specific sense. In fact, we prove that Gaussian smoothing emerges from the best affine approximation to Vese's nonlinear PDE. The latter PDE evolves any function to its convex envelope, hence providing the optimal convexification.
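As a toy illustration of the continuation idea the abstract describes (not code from the paper), the sketch below minimizes a 1-D nonconvex function by Gaussian smoothing with a decreasing sigma schedule, tracking the minimizer from one smoothing level to the next. The objective, the sigma schedule, and the crude grid-based local refinement are all illustrative choices, not the paper's method.

```python
import numpy as np

def f(x):
    # Nonconvex toy objective: a wide quadratic basin plus a ripple.
    return 0.1 * x**2 + np.sin(3.0 * x)

def smoothed(x, sigma, nodes=201, span=5.0):
    # Gaussian smoothing f_sigma(x) = E_{z ~ N(0, sigma^2)}[f(x + z)],
    # approximated by quadrature on a truncated grid of +/- span*sigma.
    z = np.linspace(-span * sigma, span * sigma, nodes)
    w = np.exp(-z**2 / (2.0 * sigma**2))
    w /= w.sum()
    return np.sum(w * f(x + z))

def refine(g, x0, lo=-6.0, hi=6.0, grid=2001):
    # Crude local minimizer: evaluate g on a fine grid in a window
    # around the current iterate and keep the best point.
    xs = np.linspace(max(lo, x0 - 2.0), min(hi, x0 + 2.0), grid)
    vals = np.array([g(t) for t in xs])
    return xs[np.argmin(vals)]

x = 4.0  # deliberately poor start, far from the global minimum
for sigma in [3.0, 1.5, 0.7, 0.3, 0.1]:
    # Heavy smoothing first (nearly convex), then gradually sharpen,
    # warm-starting each level from the previous minimizer.
    x = refine(lambda t: smoothed(t, sigma), x)
x = refine(f, x)  # final polish on the original objective
print(f"{x:.2f}")
```

Starting a purely local search at x = 4 would get trapped in a nearby ripple; the continuation path instead slides down the smoothed basin and ends near the global minimum around x ≈ -0.5.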
MRF Optimization Using Stochastic Graduated Graph Approximation
Thesis (M.S.) -- Seoul National University Graduate School, Department of Electrical and Computer Engineering, February 2015. Advisor: Kyoung Mu Lee.

Markov random fields have been powerful models in computer vision, but tractable algorithms that obtain exact solutions of the corresponding energy functions are limited; in most cases, approximate solutions are used for efficiency. In this work, the graduated optimization technique is applied in a novel way to develop an efficient algorithm, called the Stochastic Graduated Graph Approximation (SGGA) algorithm, for solving general multi-label MRF optimization problems. The algorithm initially minimizes a simplified function and progressively transforms it until it is equivalent to the original function. However, it is hard to determine how to generate the sequence of intermediate functions and which parameters to use for the transition from one problem to the next. To this end, we propose a new iterative method for building the sequence of approximations to the original energy function. We exploit a stochastic method to generate the intermediate functions, which guides the intermediate solutions toward a near-optimal solution of the original problem. The transition from one intermediate problem to another is controlled by a schedule of gradual edge addition. In each iteration, a deterministic algorithm such as block ICM is applied to minimize the intermediate function and to generate the initial solution for the next problem. The proposed algorithm guarantees convergence to a local minimum. We test it on a synthetic image deconvolution problem and on a set of experiments from the OpenGM2 benchmark.

Abstract i
Contents iii
List of Figures iv
List of Tables viii
1 Introduction 2
1.1 Background of research . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Objective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3 Outline of thesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2 Related works 8
2.1 Graduated optimization . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.2 Sequential Monte Carlo . . . . . . . . . . . . . . . . . . . . . . . . . 11
3 Stochastic graduated graph approximation 13
3.1 Graph approximation by scanlines . . . . . . . . . . . . . . . . . . . 13
3.2 Graph approximation by trees . . . . . . . . . . . . . . . . . . . . . . 18
4 Minimization of intermediate energy functions 20
4.1 Block Iterated conditional modes . . . . . . . . . . . . . . . . . . . . 20
4.1.1 Block Iterated conditional modes: general idea . . . . . . . . 20
4.1.2 Block ICM for graduated graph approximation . . . . . . . . 21
4.2 Dynamic programming . . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.2.1 Dynamic programming: general idea . . . . . . . . . . . . . . 23
4.2.2 The DP algorithm on scanlines for graduated graph approximation . . . 25
4.2.3 The DP algorithm on trees . . . . . . . . . . . . . . . . . . . 27
5 Experiments 29
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.2 Image deconvolution . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.3 OpenGM2 benchmark . . . . . . . . . . . . . . . . . . . . . . . . . . 43
6 Conclusion 51
Bibliography 52
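The pipeline the thesis abstract describes, minimizing an edge-free energy first and then gradually adding pairwise edges while warm-starting a deterministic solver such as ICM, can be sketched as follows. This is a minimal illustration, not the thesis's SGGA: the edge schedule here is deterministic (horizontal scanline edges, then vertical) rather than stochastic, plain pixel-wise ICM stands in for block ICM, and the grid size, Potts weight LAM, and random unary costs are all made up.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W, L = 6, 6, 3              # grid height, width, number of labels
LAM = 0.8                      # Potts smoothness weight (illustrative)
unary = rng.random((H, W, L))  # random unary costs (illustrative data)

def energy(labels, edges):
    # E(x) = sum of unary costs + Potts penalty over the active edges.
    e = unary[np.arange(H)[:, None], np.arange(W)[None, :], labels].sum()
    for (p, q) in edges:
        if labels[p] != labels[q]:
            e += LAM
    return e

def icm(labels, edges, sweeps=20):
    # Iterated conditional modes: greedily re-label one pixel at a time,
    # holding its neighbors (w.r.t. the current edge set) fixed.
    nbrs = {}
    for (p, q) in edges:
        nbrs.setdefault(p, []).append(q)
        nbrs.setdefault(q, []).append(p)
    for _ in range(sweeps):
        changed = False
        for p in np.ndindex(H, W):
            costs = unary[p].copy()
            for q in nbrs.get(p, []):
                costs += LAM * (np.arange(L) != labels[q])
            best = int(np.argmin(costs))
            if best != labels[p]:
                labels[p] = best
                changed = True
        if not changed:
            break
    return labels

# The full 4-connected edge set, split into a simple two-stage schedule.
horiz = [((i, j), (i, j + 1)) for i in range(H) for j in range(W - 1)]
vert = [((i, j), (i + 1, j)) for i in range(H - 1) for j in range(W)]

labels = unary.argmin(axis=2)  # stage 0: no edges, independent minima
active = []
for batch in (horiz, vert):    # gradually add edges, warm-starting ICM
    active += batch
    labels = icm(labels, active)

print(energy(labels, horiz + vert))
```

Each ICM sweep only ever applies single-pixel moves that do not increase the current energy, so the energy with respect to the active edge set decreases monotonically within each stage; the edge schedule controls how the intermediate problems approach the full MRF.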