
    ๋ฆฌ๋งŒ ์ตœ์ ํ™”์™€ ๊ทธ๋ž˜ํ”„ ์‹ ๊ฒฝ๋ง์— ๊ธฐ๋ฐ˜ํ•œ ์ € ๋žญํฌ ํ–‰๋ ฌ์™„์„ฑ ์•Œ๊ณ ๋ฆฌ๋“ฌ์— ๊ด€ํ•œ ์—ฐ๊ตฌ

    ํ•™์œ„๋…ผ๋ฌธ(๋ฐ•์‚ฌ)--์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› :๊ณต๊ณผ๋Œ€ํ•™ ์ „๊ธฐยท์ •๋ณด๊ณตํ•™๋ถ€,2020. 2. ์‹ฌ๋ณ‘ํšจ.์ตœ๊ทผ, ์ผ๋ถ€์˜ ๊ด€์ธก์น˜๋กœ๋ถ€ํ„ฐ ํ–‰๋ ฌ์˜ ๋ชจ๋“  ์›์†Œ๋“ค์„ ๋ณต์›ํ•˜๋Š” ๋ฐฉ๋ฒ•์œผ๋กœ ์ € ๋žญํฌ ํ–‰๋ ฌ ์™„์„ฑ (LRMC)์ด ๋งŽ์€ ์ฃผ๋ชฉ์„ ๋ฐ›๊ณ  ์žˆ๋‹ค. LRMC๋Š” ์ถ”์ฒœ ์‹œ์Šคํ…œ, ์œ„์ƒ ๋ณต์›, ์‚ฌ๋ฌผ ์ธํ„ฐ๋„ท ์ง€์—ญํ™”, ์˜์ƒ ์žก์Œ ์ œ๊ฑฐ, ๋ฐ€๋ฆฌ๋ฏธํ„ฐ ์›จ์ด๋ธŒ ํ†ต ์‹ ๋“ฑ์„ ํฌํ•จํ•œ ๋‹ค์–‘ํ•œ ์‘์šฉ๋ถ„์•ผ์—์„œ ์‚ฌ์šฉ๋˜๊ณ  ์žˆ๋‹ค. ๋ณธ ๋…ผ๋ฌธ์—์„œ๋Š” LRMC์— ๋Œ€ํ•ด ์—ฐ๊ตฌํ•˜์—ฌ LRMC์˜ ๊ฐ€๋Šฅ์„ฑ๊ณผ ํ•œ๊ณ„์— ๋Œ€ํ•œ ๋” ๋‚˜์€ ์ดํ•ด๋ฅผ ํ•  ์ˆ˜ ์žˆ๋„๋ก ๊ธฐ์กด ๊ฒฐ๊ณผ๋“ค์„ ๊ตฌ์กฐ์ ์ด๊ณ  ์ ‘๊ทผ ๊ฐ€๋Šฅํ•œ ๋ฐฉ์‹์œผ๋กœ ๋ถ„๋ฅ˜ํ•œ๋‹ค. ๊ตฌ์ฒด์ ์œผ๋กœ, ์ตœ์‹  LRMC ๊ธฐ๋ฒ•๋“ค์„ ๋‘ ๊ฐ€์ง€ ๋ฒ”์ฃผ๋กœ ๋ถ„๋ฅ˜ํ•œ ๋‹ค์Œ ๊ฐ๊ฐ ์˜๋ฒ”์ฃผ๋ฅผ ๋ถ„์„ํ•œ๋‹ค. ํŠนํžˆ, ํ–‰๋ ฌ์˜ ๊ณ ์œ ํ•œ ์„ฑ์งˆ๊ณผ ๊ฐ™์€ LRMC ๊ธฐ๋ฒ•์„ ์‚ฌ์šฉ ํ• ๋•Œ ๊ณ ๋ คํ•ด์•ผ ํ•  ์‚ฌํ•ญ๋“ค์„ ๋ถ„์„ํ•œ๋‹ค. ๊ธฐ์กด์˜ LRMC ๊ธฐ๋ฒ•์€ ๊ฐ€์šฐ์‹œ์•ˆ ๋žœ ๋คํ–‰๋ ฌ๊ณผ ๊ฐ™์€ ์ผ๋ฐ˜์ ์ธ ์ƒํ™ฉ์—์„œ ์„ฑ๊ณต์ ์ด์—ˆ์œผ๋‚˜ ๋งŽ์€ ์‹ค์ œ ์ƒํ™ฉ์—์„œ ๋Š”๋ณต์›ํ•˜๊ณ ์ž ํ•˜๋Š” ์ € ๋žญํฌ ํ–‰๋ ฌ์ด ๊ทธ๋ž˜ํ”„ ๊ตฌ์กฐ ๋˜๋Š” ๋‹ค์–‘์ฒด ๊ตฌ์กฐ์™€ ๊ฐ™์€ ๋น„์œ ํด๋ฆฌ๋“œ ๊ตฌ์กฐ๋ฅผ ๊ฐ€์งˆ ์ˆ˜ ์žˆ๋‹ค. ๋ณธ ๋…ผ๋ฌธ์—์„œ๋Š” ์‹ค์ œ ์‘์šฉ์—์„œ LRMC์˜ ์„ฑ๋Šฅ์„ ํ–ฅ์ƒ์‹œํ‚ค๊ธฐ ์œ„ํ•ด ์ด ๋Ÿฐ์ถ”๊ฐ€์ ์ธ ๊ตฌ์กฐ๊ฐ€ ํ™œ์šฉ๋  ์ˆ˜ ์žˆ์Œ์„ ๋ณด์ธ๋‹ค. ํŠนํžˆ, ์‚ฌ๋ฌผ ์ธํ„ฐ๋„ท ๋„คํŠธ์›Œ ํฌ์ง€์—ญํ™”๋ฅผ ์œ„ํ•œ ์œ ํด๋ฆฌ๋“œ ๊ฑฐ๋ฆฌ ํ–‰๋ ฌ ์™„์„ฑ ์•Œ๊ณ ๋ฆฌ๋“ฌ์„ ์ œ์•ˆํ•œ๋‹ค. ์œ ํด๋ฆฌ ๋“œ๊ฑฐ๋ฆฌ ํ–‰๋ ฌ์„ ๋‚ฎ์€ ๋žญํฌ๋ฅผ ๊ฐ–๋Š” ์–‘์˜ ์ค€์ •๋ถ€ํ˜ธ ํ–‰๋ ฌ์˜ ํ•จ์ˆ˜๋กœ ํ‘œํ˜„ํ•œ๋‹ค. ์ด๋Ÿฌํ•œ ์–‘์˜ ์ค€์ •๋ถ€ํ˜ธ ํ–‰๋ ฌ๋“ค์˜ ์ง‘ํ•ฉ์€ ๋ฏธ๋ถ„์ด ์ž˜ ์ •์˜๋˜์–ด ์žˆ๋Š” ๋ฆฌ ๋งŒ๋‹ค์–‘์ฒด๋ฅผ ํ˜•์„ฑํ•˜๋ฏ€๋กœ ์œ ํด๋ฆฌ๋“œ ๊ณต๊ฐ„์—์„œ์˜ ์•Œ๊ณ ๋ฆฌ๋“ฌ์„ ์ ๋‹นํžˆ ๋ณ€ํ˜•ํ•˜ ์—ฌLRMC์— ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ๋‹ค. LRMC๋ฅผ ์œ„ํ•ด ์šฐ๋ฆฌ๋Š” ์ผค๋ ˆ ๊ธฐ์šธ๊ธฐ๋ฅผ ํ™œ์šฉ ํ•œ๋ฆฌ๋งŒ ๋‹ค์–‘์ฒด์—์„œ์˜ ์ง€์—ญํ™” (LRM-CG)๋ผ ๋ถˆ๋ฆฌ๋Š” ๋ณ€๊ฒฝ๋œ ์ผค๋ ˆ ๊ธฐ์šธ๊ธฐ ๊ธฐ ๋ฐ˜์•Œ๊ณ ๋ฆฌ๋“ฌ์„ ์ œ์•ˆํ•œ๋‹ค. 
์ œ์•ˆํ•˜๋Š” LRM-CG ์•Œ๊ณ ๋ฆฌ๋“ฌ์€ ๊ด€์ธก๋œ ์Œ ๊ฑฐ๋ฆฌ ๊ฐ€ํŠน์ด๊ฐ’์— ์˜ํ•ด ์˜ค์—ผ๋˜๋Š” ์‹œ๋‚˜๋ฆฌ์˜ค๋กœ ์‰ฝ๊ฒŒ ํ™•์žฅ ๋  ์ˆ˜ ์žˆ์Œ์„ ๋ณด์ธ๋‹ค. ์‹ค์ œ๋กœ ํŠน์ด๊ฐ’์„ ํฌ์†Œ ํ–‰๋ ฌ๋กœ ๋ชจ๋ธ๋ง ํ•œ ๋‹ค์Œ ํŠน์ด๊ฐ’ ํ–‰๋ ฌ์„ ๊ทœ์ œ ํ•ญ์œผ ๋กœLRMC์— ์ถ”๊ฐ€ํ•จ์œผ๋กœ์จ ํŠน์ด๊ฐ’์„ ํšจ๊ณผ์ ์œผ๋กœ ์ œ์–ด ํ•  ์ˆ˜ ์žˆ๋‹ค. ๋ถ„์„์„ ํ†ต ํ•ดLRM-CG ์•Œ๊ณ ๋ฆฌ๋“ฌ์ด ํ™•์žฅ๋œ Wolfe ์กฐ๊ฑด ์•„๋ž˜ ์›๋ž˜ ์œ ํด๋ฆฌ๋“œ ๊ฑฐ๋ฆฌ ํ–‰๋ ฌ ์—์„ ํ˜•์ ์œผ๋กœ ์ˆ˜๋ ดํ•˜๋Š” ๊ฒƒ์„ ๋ณด์ธ๋‹ค. ๋ชจ์˜ ์‹คํ—˜์„ ํ†ตํ•ด LRM-CG์™€ ํ™• ์žฅ๋ฒ„์ „์ด ์œ ํด๋ฆฌ๋“œ ๊ฑฐ๋ฆฌ ํ–‰๋ ฌ์„ ๋ณต๊ตฌํ•˜๋Š” ๋ฐ ํšจ๊ณผ์ ์ž„์„ ๋ณด์ธ๋‹ค. ๋˜ํ•œ, ๊ทธ๋ž˜ํ”„ ๋ชจ๋ธ์„ ์‚ฌ์šฉํ•˜์—ฌ ํ‘œํ˜„๋  ์ˆ˜ ์žˆ๋Š” ์ € ๋žญํฌ ํ–‰๋ ฌ ๋ณต์›์„ ์œ„ ํ•œ๊ทธ๋ž˜ํ”„ ์‹ ๊ฒฝ๋ง (GNN) ๊ธฐ๋ฐ˜ ๊ธฐ๋ฒ•์„ ์ œ์•ˆํ•œ๋‹ค. ๊ทธ๋ž˜ํ”„ ์‹ ๊ฒฝ๋ง ๊ธฐ๋ฐ˜์˜ LRM C(GNN-LRMC)๋ผ ๋ถˆ๋ฆฌ๋Š” ๊ธฐ๋ฒ•์€ ๋ณต์›ํ•˜๊ณ ์ž ํ•˜๋Š” ํ–‰๋ ฌ์˜ ๊ทธ๋ž˜ํ”„ ์˜ ์—ญํŠน์ง•๋“ค์„ ์ถ”์ถœํ•˜๊ธฐ ์œ„ํ•ด ๋ณ€ํ˜•๋œ ํ•ฉ์„ฑ๊ณฑ ์—ฐ์‚ฐ์„ ์‚ฌ์šฉํ•œ๋‹ค. ์ด๋ ‡๊ฒŒ ์ถ”์ถœ ๋œํŠน์ง•๋“ค์„ GNN์˜ ํ•™์Šต ๊ณผ์ •์— ํ™œ์šฉํ•˜์—ฌ ํ–‰๋ ฌ์˜ ์›์†Œ๋“ค์„ ๋ณต์›ํ•  ์ˆ˜ ์žˆ๋‹ค. ํ•ฉ์„ฑ ๋ฐ ์‹ค์ œ ๋ฐ์ดํ„ฐ๋ฅผ ์‚ฌ์šฉํ•œ ๋ชจ์˜ ์‹คํ—˜์„ ํ†ตํ•˜์—ฌ ์ œ์•ˆํ•˜๋Š” GNN -LRMC์˜ ์šฐ์ˆ˜ํ•œ ๋ณต๊ตฌ ์„ฑ๋Šฅ์„ ๋ณด์˜€๋‹ค.In recent years, low-rank matrix completion (LRMC) has received much attention as a paradigm to recover the unknown entries of a matrix from partial observations. It has a wide range of applications in many areas, including recommendation system, phase retrieval, IoT localization, image denoising, milimeter wave (mmWave) communication, to name just a few. In this dissertation, we present a comprehensive overview of low-rank matrix completion. In order to have better view, insight, and understanding of potentials and limitations of LRMC, we present early scattered results in a structured and accessible way. To be specific, we classify the state-of-the-art LRMC techniques into two main categories and then explain each category in detail. 
We further discuss issues to be considered when using LRMC techniques, including the intrinsic properties required for matrix recovery. Conventional LRMC techniques have been most successful in generic settings of the low-rank matrix, say, a Gaussian random matrix. In many practical situations, however, the desired low-rank matrix may have an underlying non-Euclidean structure, such as a graph or manifold structure. In our work, we show that such additional data structures can be exploited to improve the recovery performance of LRMC in real-life applications. In particular, we propose a Euclidean distance matrix completion algorithm for Internet of Things (IoT) network localization. In our approach, we express the Euclidean distance matrix as a function of a low-rank positive semidefinite (PSD) matrix. Since the set of these PSD matrices forms a Riemannian manifold on which the notion of differentiability is well defined, we can recycle, after proper modification, algorithms designed for Euclidean space. To solve the resulting low-rank matrix completion problem, we propose a modified conjugate gradient algorithm, referred to as localization in Riemannian manifold using conjugate gradient (LRM-CG). We also show that the proposed LRM-CG algorithm can be easily extended to the scenario in which the observed pairwise distances are contaminated by outliers. In fact, by modeling the outliers as a sparse matrix and adding a regularization term for the outlier matrix to the completion problem, we can effectively control the outliers. Through convergence analysis, we show that LRM-CG converges linearly to the original Euclidean distance matrix under the extended Wolfe conditions. Numerical experiments demonstrate that LRM-CG and its extended version are effective in recovering the Euclidean distance matrix.
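The reformulation above, writing the (squared) Euclidean distance matrix as a function of a low-rank PSD Gram matrix, rests on the standard identity D_ij = G_ii + G_jj - 2 G_ij with G = X Xᵀ for node coordinates X. The sketch below only verifies this low-rank structure numerically; it is an illustration, not the LRM-CG algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 2                          # n sensor nodes in d-dimensional space
X = rng.standard_normal((n, d))       # node coordinates (unknown in a real deployment)
G = X @ X.T                           # PSD Gram matrix, rank <= d
g = np.diag(G)
D = g[:, None] + g[None, :] - 2 * G   # squared Euclidean distance matrix

# D inherits low rank: D = g 1^T + 1 g^T - 2G, so rank(D) <= d + 2,
# which is what makes distance-matrix completion a low-rank problem.
```

This is why the localization problem can be posed over the manifold of fixed-rank PSD matrices rather than over the distance matrix directly.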
To solve the LRMC problem in which the desired low-rank matrix can be expressed using a graph model, we also propose a graph neural network (GNN) scheme. Our approach, referred to as graph neural network-based low-rank matrix completion (GNN-LRMC), uses a modified convolution operation to extract features across the graph domain. The extracted features enable the training process of the proposed GNN to reconstruct the unknown entries and to optimize the graph model of the desired low-rank matrix. We demonstrate the reconstruction performance of the proposed GNN-LRMC using synthetic and real-life datasets.

Contents:
Abstract
List of Tables
List of Figures
1 Introduction
  1.1 Motivation
  1.2 Outline of the dissertation
2 Low-Rank Matrix Completion
  2.1 LRMC Applications
    2.1.1 Recommendation system
    2.1.2 Phase retrieval
    2.1.3 Localization in IoT networks
    2.1.4 Image compression and restoration
    2.1.5 Massive multiple-input multiple-output (MIMO)
    2.1.6 Millimeter wave (mmWave) communication
  2.2 Intrinsic Properties of LRMC
    2.2.1 Sparsity of Observed Entries
    2.2.2 Coherence
  2.3 Rank Minimization Problem
  2.4 LRMC Algorithms Without the Rank Information
    2.4.1 Nuclear Norm Minimization (NNM)
    2.4.2 Singular Value Thresholding (SVT)
    2.4.3 Iteratively Reweighted Least Squares (IRLS) Minimization
  2.5 LRMC Algorithms Using Rank Information
    2.5.1 Greedy Techniques
    2.5.2 Alternating Minimization Techniques
    2.5.3 Optimization over Smooth Riemannian Manifold
    2.5.4 Truncated NNM
  2.6 Performance Guarantee
  2.7 Empirical Performance Evaluation
  2.8 Choosing the Right Matrix Completion Algorithms
3 IoT Localization Via LRMC
  3.1 Problem Model
  3.2 Optimization over Riemannian Manifold
  3.3 Localization in Riemannian Manifold Using Conjugate Gradient (LRM-CG)
  3.4 Computational Complexity
  3.5 Recovery Condition Analysis
    3.5.1 Convergence of LRM-CG at Sampled Entries
    3.5.2 Exact Recovery of Euclidean Distance Matrices
    3.5.3 Discussion on A3
4 Extended LRM-CG for the Outlier Problem
  4.1 Problem Model
  4.2 Extended LRM-CG
  4.3 Numerical Evaluation
    4.3.1 Simulation Setting
    4.3.2 Convergence Efficiency
    4.3.3 Performance Evaluation
    4.3.4 Outlier Problem
    4.3.5 Real Data
5 LRMC Via Graph Neural Network
  5.1 Graph Model
  5.2 Proposed GNN-LRMC
    5.2.1 Adaptive Model
    5.2.2 Multilayer GNN
    5.2.3 Output Model
    5.2.4 Training Cost Function
  5.3 Numerical Evaluation
6 Conclusion
Appendices A–M: Proofs of Theorems 7 and 9 and of Lemmas 6, 8, 10, 12, 13, 14, 15, 17, 19, 20, and 21
Abstract (In Korean)
Acknowledgement
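The abstract does not specify the modified convolution operation at the heart of GNN-LRMC. As a generic illustration only, the sketch below shows what a single graph-convolution step typically looks like: row-normalized aggregation of neighbor features followed by a learned linear projection and a ReLU. This operator is an assumption for illustration, not the dissertation's exact scheme:

```python
import numpy as np

def graph_conv(A, H, W):
    """One generic graph-convolution step.

    A : (n, n) adjacency matrix with self-loops
    H : (n, f_in) node feature matrix
    W : (f_in, f_out) learnable projection
    """
    deg = A.sum(axis=1)
    A_norm = A / deg[:, None]               # row-normalized neighbor aggregation
    return np.maximum(A_norm @ H @ W, 0.0)  # linear projection + ReLU

# Illustrative forward pass on a small random graph.
rng = np.random.default_rng(1)
n, f_in, f_out = 6, 4, 3
A = np.eye(n) + (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T)                      # symmetric adjacency with self-loops
H = rng.standard_normal((n, f_in))          # initial node features
W = rng.standard_normal((f_in, f_out))
features = graph_conv(A, H, W)              # (n, f_out) extracted features
```

In a completion setting, rows and columns of the partially observed matrix act as graph nodes, and features extracted this way are fed to later layers that predict the missing entries.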