54 research outputs found

    Sectorization and Configuration Transition in Airspace Design

    Get PDF

    Recent Advances in Graph Partitioning

    Full text link
    We survey recent trends in practical algorithms for balanced graph partitioning, together with applications and future research directions.

    High-Quality Hypergraph Partitioning

    Get PDF
    This dissertation focuses on computing high-quality solutions for the NP-hard balanced hypergraph partitioning problem: given a hypergraph and an integer k, partition its vertex set into k disjoint blocks of bounded size while minimizing an objective function over the hyperedges. Here, we consider the two most commonly used objectives: the cut-net metric and the connectivity metric. Since the problem is computationally intractable, heuristics are used in practice, the most prominent being the three-phase multi-level paradigm: during coarsening, the hypergraph is successively contracted to obtain a hierarchy of smaller instances. After applying an initial partitioning algorithm to the smallest hypergraph, contraction is undone and, at each level, refinement algorithms try to improve the current solution. With this work, we give a brief overview of the field and present several algorithmic improvements to the multi-level paradigm. Instead of using a logarithmic number of levels like traditional algorithms, we present two coarsening algorithms that create a hierarchy of (nearly) n levels, where n is the number of vertices. This makes consecutive levels as similar as possible and provides many opportunities for refinement algorithms to improve the partition. This approach is made feasible in practice by tailoring all algorithms and data structures to the n-level paradigm, and by developing lazy-evaluation techniques, caching mechanisms, and early stopping criteria to speed up the partitioning process. Furthermore, we propose a sparsification algorithm based on locality-sensitive hashing that improves the running time for hypergraphs with large hyperedges, and show that incorporating global information about the community structure into the coarsening process improves quality. Moreover, we present a portfolio-based initial partitioning approach and propose three refinement algorithms.
Two are based on the Fiduccia-Mattheyses (FM) heuristic, but perform a highly localized search at each level. While one is designed for two-way partitioning, the other is the first FM-style algorithm that can be efficiently employed in the multi-level setting to directly improve k-way partitions. The third algorithm uses max-flow computations on pairs of blocks to refine k-way partitions. Finally, we present the first memetic multi-level hypergraph partitioning algorithm for an extensive exploration of the global solution space. All contributions are made available through our open-source framework KaHyPar. In a comprehensive experimental study, we compare KaHyPar with hMETIS, PaToH, Mondriaan, Zoltan-AlgD, and HYPE on a wide range of hypergraphs from several application areas. Our results indicate that KaHyPar, already without the memetic component, computes better solutions than all competing algorithms for both the cut-net and the connectivity metric, while being faster than Zoltan-AlgD and as fast as hMETIS. Moreover, KaHyPar compares favorably with the current best graph partitioning system KaFFPa, both in terms of solution quality and running time.
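The move-based local search behind the FM heuristic mentioned in this abstract can be sketched in a few lines. The sketch below is a deliberately naive two-way version for illustration (gains recomputed from scratch, brute-force balance check), not KaHyPar's implementation; all names are illustrative.

```python
# Minimal sketch of one Fiduccia-Mattheyses (FM) pass for two-way
# hypergraph partitioning with the cut-net objective.
# NOT KaHyPar's optimized implementation; for illustration only.

def cut(hyperedges, part):
    """Number of hyperedges spanning both blocks."""
    return sum(1 for e in hyperedges if len({part[v] for v in e}) == 2)

def fm_pass(n, hyperedges, part, max_block_size):
    """One FM pass: tentatively move each vertex at most once, best gain
    first, then roll back to the best prefix of moves seen."""
    part = list(part)
    best_cut = cur_cut = cut(hyperedges, part)
    best_prefix, moves = 0, []
    locked = [False] * n
    for _ in range(n):
        best_v, best_gain = None, None
        for v in range(n):
            if locked[v]:
                continue
            target = 1 - part[v]
            # respect the balance constraint on the target block
            if sum(1 for u in range(n) if part[u] == target) + 1 > max_block_size:
                continue
            part[v] = target                      # tentative move
            gain = cur_cut - cut(hyperedges, part)
            part[v] = 1 - target                  # undo
            if best_gain is None or gain > best_gain:
                best_v, best_gain = v, gain
        if best_v is None:
            break
        part[best_v] = 1 - part[best_v]
        locked[best_v] = True
        cur_cut -= best_gain
        moves.append(best_v)
        if cur_cut < best_cut:
            best_cut, best_prefix = cur_cut, len(moves)
    # undo every move past the best prefix
    for v in moves[best_prefix:]:
        part[v] = 1 - part[v]
    return part, best_cut
```

Note that moves with negative gain are still performed and only rolled back at the end; this hill-climbing escape is what distinguishes FM from plain greedy refinement.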

    Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy

    Full text link
    We propose a balanced coarsening scheme for multilevel hypergraph partitioning, together with an initial partitioning algorithm designed to improve the quality of k-way hypergraph partitioning. By assigning vertex weights through the LPT algorithm, we generate a prior hypergraph under a relaxed balance constraint. With this prior hypergraph, we define a Wasserstein discrepancy to coordinate the optimal transport during coarsening; the optimal transport matrix is solved with the Sinkhorn algorithm. Our coarsening scheme fully takes into account the minimization of the connectivity metric (the objective function). For the initial partitioning stage, we define a normalized cut function induced by the Fiedler vector, which we prove to be concave, and design a three-point algorithm to find the best cut under the balance constraint.
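The Sinkhorn iteration this abstract relies on for solving the optimal transport matrix alternates row and column rescalings of an exponentiated cost kernel. The dense formulation and parameter values below are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of the Sinkhorn algorithm for entropy-regularized
# optimal transport. Illustrative only; parameter values are assumptions.
import math

def sinkhorn(cost, r, c, eps=0.1, iters=200):
    """Return a transport matrix P whose row sums approach r and whose
    column sums approach c, favoring low entries of `cost`."""
    n, m = len(r), len(c)
    # Gibbs kernel: small epsilon concentrates mass on cheap cells
    K = [[math.exp(-cost[i][j] / eps) for j in range(m)] for i in range(n)]
    u = [1.0] * n
    v = [1.0] * m
    for _ in range(iters):
        # alternately rescale rows and columns to match the marginals
        u = [r[i] / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [c[j] / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]
```

Each iteration is two matrix-vector products, which is what makes the scheme practical inside a coarsening loop compared with an exact linear-programming transport solver.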

    Adaptive Mating Schemes in Genetic Algorithms

    Get PDF
    Thesis (Ph.D.), Seoul National University Graduate School, Department of Electrical and Computer Engineering, February 2017. Advisor: Byung-Ro Moon.
    A mating scheme is the method by which two parents are selected to produce offspring, and it influences the behavior of the entire genetic algorithm. This dissertation studies mating schemes based on the Hungarian method: matching parents so as to minimize the sum of the corresponding distances, to maximize it, or, as a baseline, matching them randomly. These schemes are applied to two well-known problems, the traveling salesman problem and the graph bisection problem, and we analyze how the best solution changes from generation to generation. Based on this analysis, we propose a simple hybrid mating scheme, which outperforms the unhybridized schemes. We then propose the core method of this dissertation, an adaptive way of combining mating schemes: the adaptive method selects one of the three Hungarian schemes, and every mated pair casts a vote that decides the mating scheme for the next generation. Each pair's preference is determined by the ratio of the distance between the parents to the distance between parent and offspring. The proposed adaptive method outperformed every single Hungarian mating scheme, the non-adaptively combined method, traditional roulette-wheel selection, and existing distance-based methods, and it chose appropriate schemes even in environments combined with periodic injection of new individuals and with local optimization. Finally, we replaced the Hungarian method with a search for maximal or minimal local optima; this variant likewise outperformed the single local-optimum-based methods.
    Contents: I. Introduction (Motivation; Related Work; Contribution; Organization). II. Preliminary (Hungarian Method; Geometric Operators; Exploration versus Exploitation Trade-off; Test Problems and Distance Metric). III. Hungarian Mating Scheme (Proposed Scheme; Tested GA; Observations on the Traveling Salesman and Graph Bisection Problems). IV. Hybrid and Adaptive Scheme (Simple Hybrid Scheme; Adaptive Scheme: Significance, Proposed Method, Theoretical Support, Experiments, Comparisons with Traditional and Distance-based Methods). V. Tests in Various Environments (Hybrid GA; GA with New Individuals). VI. A Revised Version of the Adaptive Method. VII. Conclusion (Summary; Future Work).
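The distance-based matching at the core of the Hungarian mating schemes can be sketched as follows. For clarity this sketch brute-forces the optimal assignment over tiny parent pools instead of running the Hungarian method proper (which solves the same assignment problem in polynomial time); all names and the Hamming distance choice are illustrative.

```python
# Sketch of distance-based mating: pair two parent pools so the total
# pairwise distance is minimized (or maximized). Brute-force stand-in
# for the Hungarian method; illustrative only, feasible for tiny pools.
from itertools import permutations

def hamming(a, b):
    """Distance between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def mate_pairs(pool_a, pool_b, maximize=False):
    """Pair pool_a[i] with pool_b[perm[i]], optimizing total distance."""
    best_total, best_perm = None, None
    for perm in permutations(range(len(pool_b))):
        total = sum(hamming(pool_a[i], pool_b[j]) for i, j in enumerate(perm))
        better = best_total is None or (
            total > best_total if maximize else total < best_total
        )
        if better:
            best_total, best_perm = total, perm
    return best_perm, best_total
```

Minimizing the total distance mates similar parents (exploitation), maximizing it mates dissimilar ones (exploration); the adaptive scheme described above lets the population vote between these two regimes and random matching each generation.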