
    Convexity in source separation: Models, geometry, and algorithms

    Source separation or demixing is the process of extracting multiple components entangled within a signal. Contemporary signal processing presents a host of difficult source separation problems, from interference cancellation to background subtraction, blind deconvolution, and even dictionary learning. Despite the recent progress in each of these applications, advances in high-throughput sensor technology place demixing algorithms under pressure to accommodate extremely high-dimensional signals, separate an ever-larger number of sources, and cope with more sophisticated signal and mixing models. These difficulties are exacerbated by the need for real-time action in automated decision-making systems. Recent advances in convex optimization provide a simple framework for efficiently solving numerous difficult demixing problems. This article provides an overview of the emerging field, explains the theory that governs the underlying procedures, and surveys algorithms that solve them efficiently. We aim to equip practitioners with a toolkit for constructing their own demixing algorithms that work, as well as concrete intuition for why they work.
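    As a concrete illustration of the template the article surveys, the following is a minimal, hypothetical sketch of one classical convex demixing instance: separating a low-rank component from sparse corruptions. The cvxpy modeling package, the toy data, and the regularization weight are illustrative choices, not taken from the article:

        import cvxpy as cp
        import numpy as np

        # Toy observation: a low-rank "background" plus a few large sparse spikes.
        rng = np.random.default_rng(0)
        M = rng.standard_normal((20, 5)) @ rng.standard_normal((5, 20))  # low-rank part
        M[rng.random(M.shape) < 0.05] += 10.0                            # sparse corruptions

        L = cp.Variable(M.shape)              # low-rank component estimate
        S = cp.Variable(M.shape)              # sparse component estimate
        lam = 1.0 / np.sqrt(max(M.shape))     # a common heuristic weight

        # Convex demixing: the nuclear norm promotes low rank, the entrywise l1 norm
        # promotes sparsity, and the constraint forces the two parts to explain M.
        problem = cp.Problem(cp.Minimize(cp.normNuc(L) + lam * cp.sum(cp.abs(S))),
                             [L + S == M])
        problem.solve()

    Swapping these two norms for other convex penalties yields other demixing models within the same constrained-minimization template.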

    Data-Driven Shape Analysis and Processing

    Data-driven methods play an increasingly important role in discovering geometric, structural, and semantic relationships between 3D shapes in collections, and applying this analysis to support intelligent modeling, editing, and visualization of geometric data. In contrast to traditional approaches, a key feature of data-driven approaches is that they aggregate information from a collection of shapes to improve the analysis and processing of individual shapes. In addition, they are able to learn models that reason about properties and relationships of shapes without relying on hard-coded rules or explicitly programmed instructions. We provide an overview of the main concepts and components of these techniques, and discuss their application to shape classification, segmentation, matching, reconstruction, modeling and exploration, as well as scene analysis and synthesis, through reviewing the literature and relating the existing works with both qualitative and numerical comparisons. We conclude our report with ideas that can inspire future research in data-driven shape analysis and processing.

    Thick Cloud Removal of Remote Sensing Images Using Temporal Smoothness and Sparsity-Regularized Tensor Optimization

    In remote sensing images, thick cloud and the cloud shadow that accompanies it occur with high probability, degrading the quality of subsequent processing and limiting the range of applications. Removing thick cloud and cloud shadow, and recovering the cloud-contaminated pixels, is therefore indispensable for making good use of remote sensing images. In this paper, a novel thick cloud removal method for remote sensing images based on temporal smoothness and sparsity-regularized tensor optimization (TSSTO) is proposed. The basic idea of TSSTO is that thick cloud and cloud shadow are not only sparse but also smooth along the horizontal and vertical directions within an image, while the clean images are smooth along the temporal direction between images. Therefore, a sparsity norm is used to promote the sparsity of the cloud and cloud shadow, and unidirectional total variation (UTV) regularizers are applied to enforce the unidirectional smoothness. The presented model is solved with the alternating direction method of multipliers (ADMM), producing a cloud and cloud shadow component as well as a clean component. The cloud and cloud shadow component is purified to obtain the cloud and cloud shadow areas. The clean area of the original cloud-contaminated image then replaces the corresponding area of the clean component. Finally, a reference image is selected to reconstruct details of the cloud and cloud shadow areas using an information cloning method. A series of experiments is conducted on both simulated and real cloud-contaminated images from different sensors and with different resolutions, and the results demonstrate the potential of the proposed TSSTO method for removing cloud and cloud shadow from both qualitative and quantitative viewpoints.
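    Although the paper's exact objective is not reproduced here, the abstract suggests a decomposition of the observed image sequence Y into a clean part X and a cloud/shadow part C of roughly the following form; the weights and difference operators below are placeholders rather than the authors' actual formulation:

        \min_{\mathcal{X},\,\mathcal{C}} \;\; \|\mathcal{C}\|_{1}
            + \lambda_{h}\|D_{h}\mathcal{C}\|_{1}
            + \lambda_{v}\|D_{v}\mathcal{C}\|_{1}
            + \lambda_{t}\|D_{t}\mathcal{X}\|_{1}
        \quad \text{s.t.} \quad \mathcal{X} + \mathcal{C} = \mathcal{Y}

    Here the first term keeps the cloud/shadow component sparse, the horizontal and vertical UTV terms keep it spatially smooth, and the temporal UTV term keeps the clean sequence smooth across acquisition dates; ADMM then alternates between updates of the two components and their auxiliary variables.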

    A survey of face recognition techniques under occlusion

    The limited capacity to recognize faces under occlusion is a long-standing problem that presents a unique challenge for face recognition systems and even for humans. Occlusion has received less research attention than other challenges such as pose variation and expression changes. Nevertheless, handling occlusion is imperative to exploit the full potential of face recognition in real-world applications. In this paper, we restrict the scope to occluded face recognition. First, we explore what the occlusion problem is and what inherent difficulties can arise. As part of this review, we introduce face detection under occlusion, a preliminary step in face recognition. Second, we present how existing face recognition methods cope with the occlusion problem and classify them into three categories: 1) occlusion-robust feature extraction approaches, 2) occlusion-aware face recognition approaches, and 3) occlusion-recovery-based face recognition approaches. Furthermore, we analyze the motivations, innovations, pros and cons, and performance of representative approaches for comparison. Finally, future challenges and method trends of occluded face recognition are thoroughly discussed.

    Fast Computation Techniques for Personalized PageRank on Large Graphs

    Doctoral dissertation -- Seoul National University Graduate School: College of Engineering, Department of Electrical and Computer Engineering, 2020. 8. Sang-goo Lee.

    Computation of Personalized PageRank (PPR) in graphs is an important function that is widely utilized in myriad application domains such as search, recommendation, and knowledge discovery. Because the computation of PPR is an expensive process, a good number of innovative and efficient algorithms for computing PPR have been developed. However, efficient computation of PPR within very large graphs with over millions of nodes is still an open problem. Moreover, previously proposed algorithms cannot handle updates efficiently, thus severely limiting their capability of handling dynamic graphs. In this paper, we present a fast converging algorithm that guarantees high and controlled precision. We improve the convergence rate of the traditional Power Iteration method by adopting successive over-relaxation and initial guess revision, a vector reuse strategy. The proposed method vastly improves on the traditional Power Iteration in terms of convergence rate and computation time, while retaining its simplicity and strictness. Since it can reuse previously computed vectors for refreshing PPR vectors, its update performance is also greatly enhanced. Also, since the algorithm halts as soon as it reaches a given error threshold, we can flexibly control the trade-off between accuracy and time, a feature lacking in both sampling-based approximation methods and fully exact methods. Experiments show that the proposed algorithm is at least 20 times faster than the Power Iteration and outperforms other state-of-the-art algorithms.

    Contents:
    1 Introduction
    2 Preliminaries: Personalized PageRank
      2.1 Random Walk, PageRank, and Personalized PageRank
        2.1.1 Basics on Random Walk
        2.1.2 PageRank
        2.1.3 Personalized PageRank
      2.2 Characteristics of Personalized PageRank
      2.3 Applications of Personalized PageRank
      2.4 Previous Work on Personalized PageRank Computation
        2.4.1 Basic Algorithms
        2.4.2 Enhanced Power Iteration
        2.4.3 Bookmark Coloring Algorithm
        2.4.4 Dynamic Programming
        2.4.5 Monte-Carlo Sampling
        2.4.6 Enhanced Direct Solving
      2.5 Summary
    3 Personalized PageRank Computation with Initial Guess Revision
      3.1 Initial Guess Revision and Relaxation
      3.2 Finding Optimal Weight of Successive Over Relaxation for PPR
      3.3 Initial Guess Construction Algorithm for Personalized PageRank
    4 Fully Personalized PageRank Algorithm with Initial Guess Revision
      4.1 FPPR with IGR
      4.2 Optimization
      4.3 Experiments
    5 Personalized PageRank Query Processing with Initial Guess Revision
      5.1 PPR Query Processing with IGR
      5.2 Optimization
      5.3 Experiments
    6 Conclusion
    Bibliography
    Appendix
    Abstract (In Korean)
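    To make the core idea concrete, the following is a minimal dense-matrix sketch (an illustration under stated assumptions, not the thesis code) of solving the PPR linear system (I - cW)p = (1 - c)e_s with successive over-relaxation, warm-started from a previously computed vector. The column-stochastic transition matrix W, the relaxation weight omega, and the tolerance are assumed inputs:

        import numpy as np

        def ppr_sor(W, seed, c=0.85, omega=1.2, p0=None, tol=1e-8, max_iter=1000):
            # Solve (I - c*W) p = (1 - c) * e_seed with SOR; omega = 1.0 is Gauss-Seidel.
            # W is a column-stochastic transition matrix (dense here only for clarity).
            n = W.shape[0]
            A = np.eye(n) - c * W
            b = np.zeros(n)
            b[seed] = 1.0 - c
            # Reuse a previously computed PPR vector as the initial guess, if given.
            p = np.zeros(n) if p0 is None else p0.astype(float).copy()
            for _ in range(max_iter):
                p_prev = p.copy()
                for i in range(n):  # one SOR sweep over the coordinates
                    sigma = A[i, :i] @ p[:i] + A[i, i + 1:] @ p_prev[i + 1:]
                    p[i] = (1.0 - omega) * p_prev[i] + omega * (b[i] - sigma) / A[i, i]
                if np.abs(p - p_prev).sum() < tol:  # halt once the error bound is met
                    return p
            return p

    Refreshing a PPR vector after a small change to the graph then amounts to calling ppr_sor again with p0 set to the stale vector, which is the kind of vector reuse strategy the abstract describes.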

    Data-driven shape analysis and processing

    Data-driven methods serve an increasingly important role in discovering geometric, structural, and semantic relationships between shapes. In contrast to traditional approaches that process shapes in isolation from each other, data-driven methods aggregate information from 3D model collections to improve the analysis, modeling and editing of shapes. Through reviewing the literature, we provide an overview of the main concepts and components of these methods, and discuss their application to classification, segmentation, matching, reconstruction, modeling and exploration, as well as scene analysis and synthesis. We conclude our report with ideas that can inspire future research in data-driven shape analysis and processing.