
    Memetic Multilevel Hypergraph Partitioning

    Hypergraph partitioning has a wide range of important applications such as VLSI design or scientific computing. With a focus on solution quality, we develop the first multilevel memetic algorithm to tackle the problem. Key components of our contribution are new, effective multilevel recombination and mutation operations that provide a large amount of diversity. We perform a wide range of experiments on a benchmark set containing instances from application areas such as VLSI design, SAT solving, social networks, and scientific computing. Compared to the state-of-the-art hypergraph partitioning tools hMetis, PaToH, and KaHyPar, our new algorithm computes the best result on almost all instances.
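
    A minimal sketch of the evolutionary loop behind such a memetic algorithm is given below: it evolves balanced bipartitions of a toy hypergraph under the cut-net objective. The recombine, mutate, and repair operators are simplified stand-ins for the paper's multilevel operators, not the authors' implementation.

        import random

        hyperedges = [(0, 1, 2), (2, 3), (3, 4, 5), (0, 5), (1, 4)]  # toy instance
        n = 6

        def cut(part):
            # Number of hyperedges with endpoints in both blocks (cut-net metric).
            return sum(len({part[v] for v in e}) > 1 for e in hyperedges)

        def repair(part):
            # Restore a perfect bisection after recombination randomizes vertices.
            while sum(part) > n - n // 2:
                part[random.choice([v for v in range(n) if part[v] == 1])] = 0
            while sum(part) < n - n // 2:
                part[random.choice([v for v in range(n) if part[v] == 0])] = 1
            return part

        def recombine(a, b):
            # Keep vertices the parents agree on, randomize the rest.
            return repair([a[v] if a[v] == b[v] else random.randint(0, 1)
                           for v in range(n)])

        def mutate(part):
            # Swap one vertex from each block so balance is preserved.
            part = part[:]
            i = random.choice([v for v in range(n) if part[v] == 0])
            j = random.choice([v for v in range(n) if part[v] == 1])
            part[i], part[j] = 1, 0
            return part

        population = [repair([random.randint(0, 1) for _ in range(n)]) for _ in range(8)]
        for _ in range(200):
            child = mutate(recombine(*random.sample(population, 2)))
            worst = max(range(len(population)), key=lambda i: cut(population[i]))
            if cut(child) <= cut(population[worst]):
                population[worst] = child  # steady-state replacement
        print(min(map(cut, population)))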

    High-Quality Hypergraph Partitioning

    This dissertation focuses on computing high-quality solutions for the NP-hard balanced hypergraph partitioning problem: Given a hypergraph and an integer k, partition its vertex set into k disjoint blocks of bounded size, while minimizing an objective function over the hyperedges. Here, we consider the two most commonly used objectives: the cut-net metric and the connectivity metric. Since the problem is computationally intractable, heuristics are used in practice - the most prominent being the three-phase multi-level paradigm: During coarsening, the hypergraph is successively contracted to obtain a hierarchy of smaller instances. After applying an initial partitioning algorithm to the smallest hypergraph, contraction is undone and, at each level, refinement algorithms try to improve the current solution. With this work, we give a brief overview of the field and present several algorithmic improvements to the multi-level paradigm. Instead of using a logarithmic number of levels like traditional algorithms, we present two coarsening algorithms that create a hierarchy of (nearly) n levels, where n is the number of vertices. This makes consecutive levels as similar as possible and provides many opportunities for refinement algorithms to improve the partition. This approach is made feasible in practice by tailoring all algorithms and data structures to the n-level paradigm, and by developing lazy-evaluation techniques, caching mechanisms, and early stopping criteria to speed up the partitioning process. Furthermore, we propose a sparsification algorithm based on locality-sensitive hashing that improves the running time for hypergraphs with large hyperedges, and show that incorporating global information about the community structure into the coarsening process improves quality. Moreover, we present a portfolio-based initial partitioning approach and propose three refinement algorithms. Two are based on the Fiduccia-Mattheyses (FM) heuristic, but perform a highly localized search at each level. While one is designed for two-way partitioning, the other is the first FM-style algorithm that can be efficiently employed in the multi-level setting to directly improve k-way partitions. The third algorithm uses max-flow computations on pairs of blocks to refine k-way partitions. Finally, we present the first memetic multi-level hypergraph partitioning algorithm for an extensive exploration of the global solution space. All contributions are made available through our open-source framework KaHyPar. In a comprehensive experimental study, we compare KaHyPar with hMETIS, PaToH, Mondriaan, Zoltan-AlgD, and HYPE on a wide range of hypergraphs from several application areas. Our results indicate that KaHyPar, already without the memetic component, computes better solutions than all competing algorithms for both the cut-net and the connectivity metric, while being faster than Zoltan-AlgD and about as fast as hMETIS. Moreover, KaHyPar compares favorably with the current best graph partitioning system KaFFPa - both in terms of solution quality and running time.
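
    To make the refinement step concrete, here is a deliberately naive, self-contained sketch of one FM pass for 2-way refinement: tentatively move the best unlocked vertex, lock it, and finally roll back to the best prefix of the move sequence. Gains are recomputed from scratch and the balance constraint is ignored, so this illustrates the FM scheme, not KaHyPar's highly localized variant.

        def fm_pass(hyperedges, part):
            # One Fiduccia-Mattheyses pass for 2-way refinement: every vertex
            # moves at most once, and the best prefix of the move sequence wins.
            n = len(part)
            part, locked = part[:], [False] * n

            def cut():
                return sum(len({part[v] for v in e}) > 1 for e in hyperedges)

            best_cut, best_part = cut(), part[:]
            for _ in range(n):
                # Evaluate every unlocked vertex move (naive O(n * m) gain
                # computation; real implementations maintain gain buckets).
                gains = []
                for v in range(n):
                    if locked[v]:
                        continue
                    part[v] ^= 1
                    gains.append((cut(), v))
                    part[v] ^= 1
                new_cut, v = min(gains)
                part[v] ^= 1       # apply even a worsening move ...
                locked[v] = True   # ... to escape local minima
                if new_cut < best_cut:
                    best_cut, best_part = new_cut, part[:]  # best prefix so far
            return best_part, best_cut

        print(fm_pass([(0, 1, 2), (2, 3), (3, 4, 5), (0, 5), (1, 4)],
                      [0, 0, 0, 1, 1, 1]))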

    Data Understanding Applied to Optimization

    The goal of this research is to explore and develop software for supporting visualization and data analysis of search and optimization. Optimization is an ever-present problem in science. The theory of NP-completeness implies that such problems can only be resolved by increasingly smart, problem-specific knowledge, possibly for use in general-purpose algorithms. Visualization and data analysis offer an opportunity to accelerate our understanding of key computational bottlenecks in optimization and to automatically tune aspects of the computation for specific problems. We will prototype systems to demonstrate how data understanding can be successfully applied to problems characteristic of NASA's key science optimization tasks, such as central tasks for parallel processing, spacecraft scheduling, and data transmission from a remote satellite.

    A nonmonotone GRASP

    A greedy randomized adaptive search procedure (GRASP) is an iterative multistart metaheuristic for difficult combinatorial optimization problems. Each GRASP iteration consists of two phases: a construction phase, in which a feasible solution is produced, and a local search phase, in which a local optimum in the neighborhood of the constructed solution is sought. Repeated applications of the construction procedure yield different starting solutions for the local search, and the best overall solution is kept as the result. The GRASP local search applies iterative improvement until a locally optimal solution is found: starting from the current solution, an improving neighbor solution is accepted and becomes the new current solution. In this paper, we propose a variant of the GRASP framework that uses a new “nonmonotone” strategy to explore the neighborhood of the current solution. We formally state the convergence of the nonmonotone local search to a locally optimal solution and illustrate the effectiveness of the resulting Nonmonotone GRASP on three classical hard combinatorial optimization problems: the maximum cut problem (MAX-CUT), the weighted maximum satisfiability problem (MAX-SAT), and the quadratic assignment problem (QAP).
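
    The sketch below makes the two phases concrete for MAX-CUT on a toy graph: a semi-greedy randomized construction followed by a local search with a nonmonotone acceptance rule (a move is kept if it is no worse than the worst of the last M accepted values). This acceptance rule is our illustrative assumption; the paper's exact nonmonotone strategy may differ.

        import random

        edges = [(0, 1, 3), (1, 2, 1), (2, 3, 2), (3, 0, 4), (0, 2, 5)]  # (u, v, w)
        n = 4

        def cut_value(side):
            return sum(w for u, v, w in edges if side[u] != side[v])

        def construct(alpha=0.3):
            # Semi-greedy construction: place each vertex on the side with the
            # larger cut gain, but choose randomly with probability alpha.
            side = [None] * n
            for v in random.sample(range(n), n):
                gain = [0, 0]
                for a, b, w in edges:
                    u = b if a == v else a if b == v else None
                    if u is not None and side[u] is not None:
                        gain[1 - side[u]] += w  # opposite side of u earns w
                side[v] = (random.randint(0, 1) if random.random() < alpha
                           else max((0, 1), key=lambda s: gain[s]))
            return side

        def nonmonotone_local_search(side, M=5, iters=200):
            history, best = [cut_value(side)], side[:]
            for _ in range(iters):
                v = random.randrange(n)
                side[v] ^= 1                  # flip one vertex
                val = cut_value(side)
                if val >= min(history[-M:]):  # nonmonotone acceptance
                    history.append(val)
                    if val > cut_value(best):
                        best = side[:]
                else:
                    side[v] ^= 1              # reject and undo
            return best

        best = max((nonmonotone_local_search(construct()) for _ in range(20)),
                   key=cut_value)
        print(best, cut_value(best))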

    A Multi-Exchange Neighborhood Search Heuristic for an Integrated Clustering and Machine Setup Model for PCB Manufacturing

    In the manufacture of printed circuit boards, electronic components are attached to a blank board by one or more pick-and-place machines. Frequent machine setups, though time consuming, can reduce overall processing time. We consider the Integrated Clustering and Machine Setup (ICMS) model, which incorporates this tradeoff between processing time and setup time and seeks to minimize the sum of the two. Solving this model to optimality is intractable for very large-scale instances. We show that ICMS is NP-hard and consequently propose and test a heuristic based on multi-exchange neighborhood search structures. Initial numerical results are very encouraging.

    Keywords: Printed circuit board assembly, feeder slot assignment, product clustering, integer programming, computational complexity, heuristics
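
    As a rough illustration of exchange-based neighborhood search for such clustering problems, the sketch below repeatedly shifts products between clusters while an improving move exists. The cost model (one setup per cluster plus one feeder slot per distinct component type) is a made-up stand-in for the ICMS objective, and a full multi-exchange search would also try swaps and longer cyclic exchanges.

        products = [frozenset("ab"), frozenset("bc"), frozenset("cd"),
                    frozenset("ad"), frozenset("ab")]    # component types per product
        clusters = [[0, 1], [2, 3, 4]]                   # product indices per cluster

        def cost(clusters, setup=5.0):
            # Toy objective: one setup per non-empty cluster plus one feeder
            # slot per distinct component type used within the cluster.
            return sum(setup + len(set().union(*(products[p] for p in c)))
                       for c in clusters if c)

        def shift_search(clusters):
            # Apply improving single-product shifts until none remains.
            improved = True
            while improved:
                improved = False
                for i in range(len(clusters)):
                    for j in range(len(clusters)):
                        if i == j:
                            continue
                        for p in list(clusters[i]):
                            before = cost(clusters)
                            clusters[i].remove(p)
                            clusters[j].append(p)
                            if cost(clusters) < before:
                                improved = True          # keep the improving shift
                            else:
                                clusters[j].remove(p)    # undo the move
                                clusters[i].append(p)
            return clusters

        print(shift_search(clusters), cost(clusters))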

    Reliability Evaluation Method for Electronic Device BGA Package Considering the Interaction Between Design Factors

    The development of electric and electronic devices has recently been remarkable: advances in mounting technology are driving miniaturization and ever higher integration. As a result, fatigue-life reliability has become an important concern, since the difference in thermal expansion between a package and the printed circuit board causes thermal fatigue. Long-life products with short development times are in demand, but designing them is difficult because of the interactions between design factors. The authors have investigated the influence of various design factors on the reliability of soldered joints in a BGA model using the response surface method and cluster analysis, clarifying the interactions among all design factors. Based upon the analytical results, design engineers can rate each factor's effect on reliability and assess the reliability of their basic design plan at the concept design stage.

    Comment: Submitted on behalf of TIMA Editions (http://irevues.inist.fr/tima-editions)
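
    The essence of the response surface approach is to fit a low-order polynomial, including interaction terms, to fatigue-life results and read the factor interactions off the coefficients. The sketch below does this by least squares for two synthetic, hypothetical design factors; it is not the authors' model or data.

        import numpy as np

        rng = np.random.default_rng(0)
        x1 = rng.uniform(-1, 1, 50)  # e.g. normalized solder ball pitch (hypothetical)
        x2 = rng.uniform(-1, 1, 50)  # e.g. normalized substrate thickness (hypothetical)
        # Synthetic "fatigue life" with a genuine x1*x2 interaction plus noise.
        y = 3.0 + 1.5 * x1 - 0.8 * x2 + 2.0 * x1 * x2 + rng.normal(0, 0.1, 50)

        # Design matrix for y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2.
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("interaction coefficient b5 ~", coef[5])  # large |b5| flags interaction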

    A methodological approach to consumer research on second-hand fashion platforms in Italy: online clothing reselling platforms: perceptions and preferences of Italian consumers

    The research focuses on second-hand fashion platforms (Vinted, Vestiaire Collective, Depop, Zalando Second-hand) in Italy from a consumer standpoint. The study assesses the platforms' positioning and most-preferred characteristics, as well as potential consumer segments in the Italian market. By surveying consumers and applying market research techniques such as perceptual maps, conjoint analysis, and k-means clustering, we learned consumers' perceptions and preferences and how these relate to the platforms. The main findings are then used to recommend ways for the companies to improve their market presence and competitive edge.
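
    As a minimal illustration of the k-means segmentation step, the sketch below clusters synthetic survey responses; the three attitude scales are hypothetical stand-ins for the questionnaire items actually used in the study.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(42)
        # 200 respondents rating price sensitivity, sustainability concern, and
        # brand orientation on a 1-7 scale (synthetic data).
        scores = rng.integers(1, 8, size=(200, 3)).astype(float)

        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores)
        for c in range(3):
            members = scores[km.labels_ == c]
            print(f"segment {c}: n={len(members)}, mean profile={members.mean(axis=0)}")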