
    Survey on Combinatorial Register Allocation and Instruction Scheduling

    Register allocation (mapping variables to processor registers or memory) and instruction scheduling (reordering instructions to increase instruction-level parallelism) are essential tasks for generating efficient assembly code in a compiler. In the last three decades, combinatorial optimization has emerged as an alternative to traditional heuristic algorithms for these two tasks. Combinatorial optimization approaches can deliver optimal solutions according to a model, can precisely capture trade-offs between conflicting decisions, and are more flexible, at the expense of increased compilation time. This paper provides an exhaustive literature review and a classification of combinatorial optimization approaches to register allocation and instruction scheduling, with a focus on the techniques most applied in this context: integer programming, constraint programming, partitioned Boolean quadratic programming, and enumeration. Researchers in compilers and combinatorial optimization can benefit from identifying developments, trends, and challenges in the area; compiler practitioners may discern opportunities and grasp the potential benefit of applying combinatorial optimization.
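    As a concrete illustration of the combinatorial flavor of these approaches (a toy sketch, not taken from the survey), the snippet below solves a tiny register allocation instance by exhaustive enumeration, one of the technique families the survey covers; the variable names, interference graph, and spill-count objective are illustrative assumptions.

```python
# Toy sketch (not from the survey): register allocation as combinatorial search.
# Variables that are live at the same time ("interfering") must get different
# registers; variables that do not fit are spilled to memory. Exhaustive
# enumeration is only practical for tiny inputs, but it mirrors the
# "optimal according to a model" guarantee discussed in the survey.
from itertools import product

def allocate(variables, interferences, registers):
    """Return (spill count, assignment var -> register/'spill') minimizing spills."""
    best = None
    locations = list(registers) + ["spill"]
    for choice in product(locations, repeat=len(variables)):
        assignment = dict(zip(variables, choice))
        # Interfering variables may not share a register.
        if any(assignment[a] == assignment[b] != "spill"
               for a, b in interferences):
            continue
        spills = sum(loc == "spill" for loc in choice)
        if best is None or spills < best[0]:
            best = (spills, assignment)
    return best

# Three mutually interfering variables, two registers: one spill is optimal.
print(allocate(["a", "b", "c"],
               [("a", "b"), ("b", "c"), ("a", "c")],
               ["r0", "r1"]))
```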

    Reducing combinatorial uncertainties: A new technique based on MT2 variables

    We propose a new method to resolve combinatorial ambiguities in hadron collider events involving two invisible particles in the final state. This method is based on the kinematic variable MT2 and on the MT2-assisted on-shell reconstruction of invisible momenta, which we reformulate as `test' variables Ti that discriminate the correct combination from the incorrect ones. We show how the efficiency of a single Ti in providing the correct answer can be systematically improved by combining the different Ti and/or by introducing cuts on suitable, combination-insensitive kinematic variables. We illustrate our whole approach in the specific example of top anti-top production, followed by a leptonic decay of the W on both sides. However, by construction, our method is also directly applicable to many topologies of interest for new physics, in particular events producing a pair of undetected particles that are potential dark-matter candidates. We finally emphasize that our method lends itself to several generalizations, which we outline in the last sections of the paper. Comment: 1+23 pages, 8 figures. Main changes in v3: (1) discussion at the end of sec. 2 improved; (2) added sec. 4.2 about the method's dependence on mass information. Matches journal version.
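    For readers unfamiliar with MT2, the sketch below (our illustration, not the paper's Ti construction) computes the stransverse mass numerically by a brute-force scan over the split of the missing transverse momentum between the two invisible particles; the event kinematics, grid range, and step count are made-up parameters.

```python
# Minimal numerical sketch of the stransverse mass MT2: scan all splittings of
# the missing transverse momentum between the two invisible particles and
# minimize the larger of the two transverse masses. A coarse grid is used
# purely for illustration; dedicated MT2 calculators are far more accurate.
import math

def transverse_mass(m_vis, px_vis, py_vis, m_inv, px_inv, py_inv):
    et_vis = math.sqrt(m_vis**2 + px_vis**2 + py_vis**2)
    et_inv = math.sqrt(m_inv**2 + px_inv**2 + py_inv**2)
    mt_sq = (m_vis**2 + m_inv**2
             + 2.0 * (et_vis * et_inv - px_vis * px_inv - py_vis * py_inv))
    return math.sqrt(max(mt_sq, 0.0))

def mt2(vis1, vis2, met, m_inv=0.0, scan=400.0, steps=81):
    """Brute-force MT2: vis = (m, px, py), met = (mex, mey), in GeV."""
    best = float("inf")
    grid = [-scan + 2.0 * scan * i / (steps - 1) for i in range(steps)]
    for px1 in grid:
        for py1 in grid:
            px2, py2 = met[0] - px1, met[1] - py1
            m_a = transverse_mass(*vis1, m_inv, px1, py1)
            m_b = transverse_mass(*vis2, m_inv, px2, py2)
            best = min(best, max(m_a, m_b))
    return best

# Illustrative event: two visible (lepton + b-jet) systems and the missing momentum.
print(mt2((30.0, 50.0, 10.0), (25.0, -40.0, -30.0), (-5.0, 15.0)))
```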

    Memory read faults: taxonomy and automatic test generation

    This paper presents an innovative algorithm for the automatic generation of March tests. The proposed approach is able to generate an optimal March test for an unconstrained set of memory faults in very low computation time. Moreover, we propose a new complete taxonomy for memory read faults, a class of faults never carefully addressed in the past.
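    As background for how March tests exercise a memory (an illustrative sketch, not the paper's generation algorithm), the snippet below runs the classic MATS+ March test against a simulated memory with an injected stuck-at-0 fault; the FaultyMemory class and the fault location are hypothetical.

```python
# Toy sketch: applying the classic MATS+ March test, {⇕(w0); ⇑(r0,w1); ⇓(r1,w0)},
# to a simulated one-bit-per-word memory with an injected stuck-at-0 fault.
class FaultyMemory:
    def __init__(self, size, stuck_at_zero=()):
        self.size = size
        self.cells = [0] * size
        self.stuck = set(stuck_at_zero)      # addresses forced to 0

    def write(self, addr, value):
        self.cells[addr] = 0 if addr in self.stuck else value

    def read(self, addr):
        return self.cells[addr]

def mats_plus(mem):
    """Run MATS+; return the list of (address, expected, read) mismatches."""
    errors = []
    for a in range(mem.size):                 # ⇕ (w0)
        mem.write(a, 0)
    for a in range(mem.size):                 # ⇑ (r0, w1)
        if mem.read(a) != 0:
            errors.append((a, 0, mem.read(a)))
        mem.write(a, 1)
    for a in reversed(range(mem.size)):       # ⇓ (r1, w0)
        if mem.read(a) != 1:
            errors.append((a, 1, mem.read(a)))
        mem.write(a, 0)
    return errors

print(mats_plus(FaultyMemory(8, stuck_at_zero=[3])))   # detects the fault at cell 3
```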

    Multiobjective strategies for New Product Development in the pharmaceutical industry

    New Product Development (NPD) constitutes a challenging problem in the pharmaceutical industry, due to the characteristics of the development pipeline. Formally, the NPD problem can be stated as follows: select a set of R&D projects from a pool of candidate projects in order to satisfy several criteria (economic profitability, time to market) while coping with the uncertain nature of the projects. More precisely, the recurrent key issues are to determine which projects to develop once target molecules have been identified, in which order, and with which level of resources. The proposed approach combines discrete-event stochastic simulation (a Monte Carlo approach) with multiobjective genetic algorithms (NSGA-II, Non-dominated Sorting Genetic Algorithm II) to optimize this highly combinatorial portfolio management problem. Genetic algorithms are particularly attractive for this kind of problem because they lead directly to the Pareto front and handle the combinatorial aspect. The work is illustrated with a case study involving nine interdependent new product candidates targeting three diseases. For this test bench, an analysis is performed on the different pairs of criteria for both bi- and tricriteria optimization: large portfolios cause resource queues, delay time to launch, and are eliminated by the bi- and tricriteria optimization strategies. The optimization strategy thus helps to identify promising candidate sequences. Time is an important criterion to consider simultaneously with the NPV and risk criteria, and the order in which drugs enter the pipeline is of great importance, as in scheduling problems.
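    The multiobjective selection idea can be illustrated with a minimal Pareto-front filter (a sketch under assumed data, not the paper's NSGA-II/simulation framework); the portfolio names, NPV figures, and risk values below are invented for illustration.

```python
# Minimal sketch: extract the Pareto front from Monte Carlo estimates of
# candidate portfolios, with two criteria: expected NPV (maximize) and
# risk (minimize). Portfolio data are made up for illustration.
def pareto_front(portfolios):
    """Keep portfolios not dominated by any other (higher NPV, lower risk)."""
    front = []
    for name, npv, risk in portfolios:
        dominated = any(o_npv >= npv and o_risk <= risk and
                        (o_npv > npv or o_risk < risk)
                        for _, o_npv, o_risk in portfolios)
        if not dominated:
            front.append((name, npv, risk))
    return front

candidates = [("A+B", 120.0, 0.30), ("A+C", 100.0, 0.20),
              ("A+B+C", 150.0, 0.55), ("B", 60.0, 0.10),
              ("B+C", 110.0, 0.35)]
print(pareto_front(candidates))     # "B+C" is dominated by "A+B" and dropped
```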

    There are 174 Subdivisions of the Hexahedron into Tetrahedra

    This article answers an important theoretical question: how many different subdivisions of the hexahedron into tetrahedra are there? It is well known that the cube has five subdivisions into 6 tetrahedra and one subdivision into 5 tetrahedra. However, not all hexahedra are cubes, and moving the vertex positions increases the number of subdivisions. Recent hexahedral-dominant meshing methods try to take these configurations into account when combining tetrahedra into hexahedra, but fail to enumerate them all: they use only a set of 10 subdivisions among the 174 found in this article. The enumeration of these 174 subdivisions of the hexahedron into tetrahedra is our combinatorial result. Each of the 174 subdivisions has between 5 and 15 tetrahedra and is in fact a class of 2 to 48 equivalent instances that are identical up to vertex relabeling. We further show that exactly 171 of these subdivisions have a geometrical realization, i.e. there exist coordinates of the eight hexahedron vertices in three-dimensional space such that the resulting tetrahedral mesh is valid. We exhibit tetrahedral meshes for these configurations and show, in particular, subdivisions of hexahedra with 15 tetrahedra that have a strictly positive Jacobian.
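    The geometrical-realization criterion can be illustrated with a small validity check (our sketch, not the article's enumeration procedure): every tetrahedron of a subdivision must have a strictly positive signed volume, shown below for the classic 5-tetrahedron split of the unit cube.

```python
# Illustrative check: verify that a tetrahedral subdivision of a hexahedron is
# geometrically valid by testing that each tetrahedron has a strictly positive
# signed volume (the "strictly positive Jacobian" condition), here for the
# classic 5-tetrahedron split of the unit cube.
def signed_volume(a, b, c, d):
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    w = [d[i] - a[i] for i in range(3)]
    det = (u[0] * (v[1] * w[2] - v[2] * w[1])
           - u[1] * (v[0] * w[2] - v[2] * w[0])
           + u[2] * (v[0] * w[1] - v[1] * w[0]))
    return det / 6.0

# Unit-cube corners indexed by their binary coordinates: index = x + 2y + 4z.
V = [(x, y, z) for z in (0, 1) for y in (0, 1) for x in (0, 1)]

# Four corner tetrahedra plus one central tetrahedron; vertex orderings are
# chosen so that each signed volume comes out positive.
FIVE_TETS = [(0, 1, 2, 4), (1, 3, 2, 7), (5, 1, 4, 7), (6, 4, 2, 7), (1, 2, 4, 7)]

volumes = [signed_volume(*(V[i] for i in tet)) for tet in FIVE_TETS]
print(volumes)                      # four corner tets of 1/6, one core tet of 1/3
print(all(v > 0 for v in volumes))  # True: the subdivision is geometrically valid
print(sum(volumes))                 # ~1.0: the tetrahedra fill the cube exactly
```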

    Submodular memetic approximation for multiobjective parallel test paper generation

    Parallel test paper generation is a biobjective distributed resource optimization problem, which aims to generate multiple similarly optimal test papers automatically according to multiple user-specified assessment criteria. Generating high-quality parallel test papers is challenging due to the NP-hardness of both collective objective functions. In this paper, we propose a submodular memetic approximation algorithm for solving this problem. The proposed algorithm is an adaptive memetic algorithm (MA) that exploits the submodular property of the collective objective functions to design greedy-based approximation algorithms for enhancing the steps of the multiobjective MA. By synergizing the intensification of the submodular local search mechanism with the diversification of the population-based submodular crossover operator, our algorithm can jointly optimize the total quality maximization objective and the fairness quality maximization objective. It can achieve provably near-optimal solutions in the huge search space of large datasets in efficient polynomial runtime. Performance results on various datasets show that our algorithm drastically outperforms current techniques in terms of paper quality and runtime efficiency.
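    The submodular ingredient can be illustrated with the standard greedy rule for monotone submodular maximization (a minimal sketch, not the paper's memetic algorithm); the question bank and skill sets below are hypothetical.

```python
# Minimal sketch of the greedy rule that submodular approximation builds on.
# For a monotone submodular objective, here a simple coverage function over
# assessment topics, repeatedly picking the item with the largest marginal
# gain achieves the classic (1 - 1/e) approximation guarantee.
def greedy_select(items, covers, k):
    """items: ids; covers[i]: set covered by item i; pick up to k items greedily."""
    chosen, covered = [], set()
    for _ in range(k):
        best = max((i for i in items if i not in chosen),
                   key=lambda i: len(covers[i] - covered),
                   default=None)
        if best is None or not covers[best] - covered:
            break                      # no remaining marginal gain
        chosen.append(best)
        covered |= covers[best]
    return chosen, covered

# Hypothetical question bank: each question covers a set of assessment topics.
bank = {"q1": {"algebra", "logic"}, "q2": {"logic"},
        "q3": {"geometry", "algebra"}, "q4": {"statistics"}}
print(greedy_select(list(bank), bank, k=3))
```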