111 research outputs found

    High-Quality Hypergraph Partitioning

    This dissertation focuses on computing high-quality solutions for the NP-hard balanced hypergraph partitioning problem: given a hypergraph and an integer k, partition its vertex set into k disjoint blocks of bounded size while minimizing an objective function over the hyperedges. Here, we consider the two most commonly used objectives: the cut-net metric and the connectivity metric. Since the problem is computationally intractable, heuristics are used in practice, the most prominent being the three-phase multi-level paradigm: during coarsening, the hypergraph is successively contracted to obtain a hierarchy of smaller instances. After applying an initial partitioning algorithm to the smallest hypergraph, the contractions are undone and, at each level, refinement algorithms try to improve the current solution. With this work, we give a brief overview of the field and present several algorithmic improvements to the multi-level paradigm. Instead of using a logarithmic number of levels like traditional algorithms, we present two coarsening algorithms that create a hierarchy of (nearly) n levels, where n is the number of vertices. This makes consecutive levels as similar as possible and provides many opportunities for refinement algorithms to improve the partition. This approach is made feasible in practice by tailoring all algorithms and data structures to the n-level paradigm, and by developing lazy-evaluation techniques, caching mechanisms, and early stopping criteria to speed up the partitioning process. Furthermore, we propose a sparsification algorithm based on locality-sensitive hashing that improves the running time for hypergraphs with large hyperedges, and we show that incorporating global information about the community structure into the coarsening process improves quality. Moreover, we present a portfolio-based initial partitioning approach and propose three refinement algorithms. Two are based on the Fiduccia-Mattheyses (FM) heuristic but perform a highly localized search at each level. While one is designed for two-way partitioning, the other is the first FM-style algorithm that can be efficiently employed in the multi-level setting to directly improve k-way partitions. The third algorithm uses max-flow computations on pairs of blocks to refine k-way partitions. Finally, we present the first memetic multi-level hypergraph partitioning algorithm for an extensive exploration of the global solution space. All contributions are made available through our open-source framework KaHyPar. In a comprehensive experimental study, we compare KaHyPar with hMETIS, PaToH, Mondriaan, Zoltan-AlgD, and HYPE on a wide range of hypergraphs from several application areas. Our results indicate that KaHyPar, even without the memetic component, computes better solutions than all competing algorithms for both the cut-net and the connectivity metric, while being faster than Zoltan-AlgD and as fast as hMETIS. Moreover, KaHyPar compares favorably with KaFFPa, the current best graph partitioning system, both in terms of solution quality and running time.
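
    The two objectives named above have compact definitions: a net is cut if it spans more than one block, and its connectivity lambda is the number of blocks it touches; the connectivity metric sums (lambda - 1) over all nets. A minimal Python sketch of both metrics, assuming unit net weights and a hypothetical representation of hyperedges as vertex lists with a vertex-to-block mapping:

        def partition_objectives(hyperedges, block_of):
            """Cut-net and connectivity (lambda - 1) metrics, unit net weights."""
            cut_net, connectivity = 0, 0
            for net in hyperedges:
                lam = len({block_of[v] for v in net})  # blocks the net spans
                if lam > 1:                            # the net is cut
                    cut_net += 1
                    connectivity += lam - 1            # (lambda - 1) contribution
            return cut_net, connectivity

        # Toy example: 6 vertices in 2 blocks, three hyperedges.
        edges = [[0, 1, 2], [2, 3, 4], [4, 5]]
        blocks = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
        print(partition_objectives(edges, blocks))  # -> (1, 1)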

    Quantum and Classical Multilevel Algorithms for (Hyper)Graphs

    Combinatorial optimization problems on (hyper)graphs are ubiquitous in science and industry. Because many of these problems are NP-hard, the development of sophisticated heuristics is of utmost importance for practical problems. In recent years, the emergence of Noisy Intermediate-Scale Quantum (NISQ) computers has opened up the opportunity to dramatically speed up combinatorial optimization. However, the adoption of NISQ devices is impeded by their severe limitations, both in the number of qubits and in their quality. NISQ devices are widely expected to have no more than hundreds to thousands of qubits with very limited error correction, imposing a strict limit on the size and the structure of the problems that can be tackled directly. A natural solution to this issue is hybrid quantum-classical algorithms that combine a NISQ device with a classical machine, with the goal of capturing “the best of both worlds”. Motivated by the lack of high-quality optimization solvers for hypergraph partitioning, in this thesis we begin by discussing classical multilevel approaches for this problem. We present a novel relaxation-based vertex similarity measure termed algebraic distance for hypergraphs, along with coarsening schemes based on it. Extending the multilevel method to include quantum optimization routines, we present Quantum Local Search (QLS), a hybrid iterative improvement approach inspired by classical local search. Next, we introduce Multilevel Quantum Local Search (ML-QLS), which incorporates the quantum-enhanced iterative improvement scheme of QLS within the multilevel framework, as well as several techniques to further understand and improve the effectiveness of the Quantum Approximate Optimization Algorithm (QAOA) used throughout our work.
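
    The relaxation idea behind algebraic distance can be sketched as follows: smooth several random vertex vectors by repeatedly averaging over hyperedges, then take the largest per-coordinate gap between two vertices as their distance; vertices that stay close are good candidates for contraction during coarsening. The sketch below only illustrates this principle, with hypothetical parameters (sweeps, trials, damping alpha); the thesis's actual scheme may differ in its update rule and normalization:

        import random

        def algebraic_distances(hyperedges, n, sweeps=10, trials=5, alpha=0.5):
            """Relaxation-based vertex similarity, sketched for hypergraphs."""
            dist = [[0.0] * n for _ in range(n)]
            for _ in range(trials):
                x = [random.uniform(-1, 1) for _ in range(n)]
                for _ in range(sweeps):  # damped averaging over hyperedges
                    pull, deg = [0.0] * n, [0] * n
                    for net in hyperedges:
                        mean = sum(x[v] for v in net) / len(net)
                        for v in net:
                            pull[v] += mean
                            deg[v] += 1
                    x = [alpha * x[v] + (1 - alpha) * pull[v] / deg[v]
                         if deg[v] else x[v] for v in range(n)]
                for u in range(n):  # keep the worst-case gap over all trials
                    for v in range(n):
                        dist[u][v] = max(dist[u][v], abs(x[u] - x[v]))
            return dist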

    Combinatorial Optimization

    This report summarizes the meeting on Combinatorial Optimization, where new and promising developments in the field were discussed.

    Computational Optimization Techniques for Graph Partitioning

    Partitioning graphs into two or more subgraphs is a fundamental operation in computer science, with applications in large-scale graph analytics, distributed and parallel data processing, and fill-reducing orderings in sparse matrix algorithms. Computing balanced and minimally connected subgraphs is a common pre-processing step in these areas and must therefore be done quickly and efficiently. Since graph partitioning is NP-hard, heuristics must be used, and they must balance the need to produce high-quality partitions against practical performance. Traditional methods of partitioning graphs rely heavily on combinatorics, but recent developments in continuous optimization formulations have led to hybrid methods that combine the best of both approaches. This work describes numerical optimization formulations for two classes of graph partitioning problems: edge cuts and vertex separators. Optimization-based formulations for each of these problems are described, and hybrid algorithms combining these optimization-based approaches with traditional combinatoric methods are presented. Efficient implementations and computational results for these algorithms are provided in a C++ graph partitioning library that is competitive with the state of the art. Additionally, an optimization-based approach to hypergraph partitioning is proposed.
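
    The standard bridge from the combinatorial edge-cut problem to continuous optimization is the Laplacian quadratic form: for a +/-1 block indicator x, the cut equals x'Lx/4, and relaxing x to the box [-1, 1]^n (with a balance constraint) yields a continuous program that numerical solvers can attack. A minimal sketch of that identity (illustrative only; the formulations in this work for edge cuts and vertex separators are richer):

        import numpy as np

        def edge_cut_quadratic(adj, x):
            """Edge cut via the graph Laplacian quadratic form.

            For a +/-1 block indicator x, cut(x) = x^T L x / 4; relaxing x
            to [-1, 1]^n gives the continuous counterpart of the problem.
            """
            L = np.diag(adj.sum(axis=1)) - adj  # Laplacian L = D - A
            return float(x @ L @ x) / 4.0

        # Toy check: path 0-1-2 split as {0, 1} | {2} cuts exactly one edge.
        A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
        x = np.array([1.0, 1.0, -1.0])
        print(edge_cut_quadratic(A, x))  # -> 1.0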

    Data mining using the crossing minimization paradigm

    Our ability and capacity to generate, record, and store multi-dimensional, apparently unstructured data is increasing rapidly, while the cost of data storage is going down. The recorded data is not perfect, as noise is introduced into it from different sources; some of the basic forms of noise are incorrectly recorded values and missing values. The formal study of discovering useful hidden information in data is called data mining. Because of the size and complexity of the problem, practical data mining problems are best attempted using automatic means. Data mining can be categorized into two types: supervised learning (classification) and unsupervised learning (clustering). Clustering only the records in a database (or data matrix) gives a global view of the data and is called one-way clustering. For a detailed analysis or a local view, biclustering (also called co-clustering or two-way clustering) is required, involving the simultaneous clustering of the records and the attributes. In this dissertation, a novel, fast, and white-noise-tolerant data mining solution is proposed based on the Crossing Minimization (CM) paradigm; the solution works for one-way as well as two-way clustering and discovers overlapping biclusters. For decades, the CM paradigm has traditionally been used in graph drawing and VLSI (Very Large Scale Integration) circuit design to reduce wire length and congestion. The utility of the proposed technique is demonstrated by comparing it with other biclustering techniques on simulated noisy data as well as real data from agriculture, biology, and other domains. Two other interesting and hard problems also addressed in this dissertation are (i) the Minimum Attribute Subset Selection (MASS) problem and (ii) the Bandwidth Minimization (BWM) problem for sparse matrices. The proposed CM technique is demonstrated to provide very convincing results on these problems using real public-domain data. Pakistan is the fourth-largest supplier of cotton in the world. An apparent anomaly has been observed during 1989-97 between cotton yield and pesticide consumption in Pakistan, showing unexpected periods of negative correlation. By applying the proposed CM technique for one-way clustering to real Agro-Met data (2001-2002), a possible explanation of the anomaly is presented in this thesis.
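
    The CM paradigm in this setting can be illustrated with the classical one-sided barycenter heuristic: alternately reorder the rows and columns of the bipartite record-attribute graph by the mean position of their neighbors, so that densely interconnected rows and columns drift together and biclusters emerge as blocks. A minimal sketch under a hypothetical edge-list representation (the dissertation's CM scheme is more elaborate):

        def barycenter_order(rows, cols, edges, sweeps=3):
            """One-sided barycenter heuristic for crossing minimization."""
            col_pos = {c: i for i, c in enumerate(cols)}
            for _ in range(sweeps):
                # Reorder rows by the mean position of their column neighbors.
                rows.sort(key=lambda r: sum(col_pos[c] for rr, c in edges if rr == r)
                          / max(1, sum(1 for rr, _ in edges if rr == r)))
                row_pos = {r: i for i, r in enumerate(rows)}
                # Then reorder columns symmetrically by their row neighbors.
                cols.sort(key=lambda c: sum(row_pos[r] for r, cc in edges if cc == c)
                          / max(1, sum(1 for _, cc in edges if cc == c)))
                col_pos = {c: i for i, c in enumerate(cols)}
            return rows, cols

        # Rows/columns sharing many edges end up adjacent after a few sweeps.
        r, c = barycenter_order([0, 1, 2], ['a', 'b', 'c'],
                                [(0, 'a'), (0, 'b'), (1, 'c'), (2, 'a'), (2, 'b')])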