Edge Elimination in TSP Instances
The Traveling Salesman Problem is one of the best-studied NP-hard problems in
combinatorial optimization. Powerful methods have been developed over the last
60 years to find optimum solutions to large TSP instances. The largest TSP
instance solved optimally so far has 85,900 vertices. Its solution required
more than 136 years of total CPU time using the branch-and-cut based Concorde
TSP code [1]. In this paper we present graph-theoretic results that allow us
to prove that some edges of a TSP instance cannot occur in any optimum TSP
tour. Based on these results we propose a combinatorial algorithm to identify
such edges. The runtime of the main part of our algorithm is for an n-vertex
TSP instance. By combining our approach with the Concorde TSP solver we are
able to solve a large TSPLIB instance more than 11 times faster than Concorde
alone.
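The paper's elimination criteria are graph-theoretic and efficient; for intuition only, the same notion can be demonstrated by brute force on a toy instance (a sketch, not the paper's method): enumerate all tours, collect the optimum ones, and flag every edge that appears in none of them.

```python
from itertools import permutations
from math import dist

# Toy instance: 6 points in convex position (coordinates chosen arbitrarily).
points = [(0, 0), (2, 0), (3, 2), (2, 4), (0, 4), (-1, 2)]
n = len(points)

def tour_length(tour):
    return sum(dist(points[tour[i]], points[tour[(i + 1) % n]]) for i in range(n))

# Enumerate all tours, fixing vertex 0 first to factor out rotations.
best = min(tour_length((0,) + p) for p in permutations(range(1, n)))
optimal_edges = set()
for p in permutations(range(1, n)):
    tour = (0,) + p
    if abs(tour_length(tour) - best) < 1e-9:
        for i in range(n):
            a, b = tour[i], tour[(i + 1) % n]
            optimal_edges.add((min(a, b), max(a, b)))

all_edges = {(i, j) for i in range(n) for j in range(i + 1, n)}
eliminated = all_edges - optimal_edges
print(f"{len(eliminated)} of {len(all_edges)} edges cannot occur in any optimum tour")
```

For points in convex position the unique optimum tour is the convex-hull cycle, so here 9 of the 15 edges are provably excluded; the point of the paper is to reach such conclusions without enumerating tours.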
Collaborative Delivery with Energy-Constrained Mobile Robots
We consider the problem of collectively delivering some message from a
specified source to a designated target location in a graph, using multiple
mobile agents. Each agent has a limited energy which constrains the distance it
can move. Hence multiple agents need to collaborate to move the message, each
agent handing over the message to the next agent to carry it forward. Given the
positions of the agents in the graph and their respective budgets, the problem
of finding a feasible movement schedule for the agents can be challenging. We
consider two variants of the problem: in non-returning delivery, the agents can
stop anywhere; whereas in returning delivery, each agent needs to return to its
starting location, a variant which has not been studied before.
We first provide a polynomial-time algorithm for returning delivery on trees,
which is in contrast to the known (weak) NP-hardness of the non-returning
version. In addition, we give resource-augmented algorithms for returning
delivery in general graphs. Finally, we give tight lower bounds on the required
resource augmentation for both variants of the problem. In this sense, our
results close the gap left by previous research.
Comment: 19 pages. An extended abstract of this paper was published at the
23rd International Colloquium on Structural Information and Communication
Complexity (SIROCCO 2016).
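For intuition, the returning variant on a line can be checked by brute force. The sketch below is illustrative only, not the paper's algorithm; it assumes the message moves monotonically from source to target and each agent carries it over one contiguous stretch. An agent at position p with budget B that picks up at x and must return home can carry the message at most to p + B/2 (or x + B/2 when it approaches the pickup from ahead).

```python
from itertools import permutations

def max_reach(p, budget, x):
    """Furthest point y >= x that an agent at position p with the given budget
    can carry the message to, picking up at x and returning to p afterwards.
    Returns x (no progress) if the agent cannot afford the trip."""
    if p <= x:
        # Walk to x, carry to y, return: total cost 2*(y - p).
        y = p + budget / 2
    else:
        # Walk back to x; up to point p the round trip costs 2*(p - x),
        # beyond p the total becomes 2*(y - x).
        if 2 * (p - x) > budget:
            return x
        y = x + budget / 2
    return max(x, y)

def returning_delivery_feasible(agents, source, target):
    """agents: list of (position, budget) on a line; message goes source -> target."""
    for order in permutations(agents):
        x = source
        for p, b in order:
            x = max(x, max_reach(p, b, x))
            if x >= target:
                return True
    return False
```

For example, a single agent at 0 with budget 10 can deliver from 0 to 5 (walk out 5, walk back 5) but no further, exactly the factor-2 penalty that distinguishes the returning variant.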
Integration of Structural Constraints into TSP Models
Several models based on constraint programming have been proposed to solve the traveling salesman problem (TSP). The most efficient ones, such as the weighted circuit constraint (WCC), mainly rely on the Lagrangian relaxation of the TSP, based on the search for a spanning tree or, more precisely, a "1-tree". The weakness of these approaches is that they do not include enough structural constraints and are based almost exclusively on edge costs. The purpose of this paper is to correct this drawback by introducing the Hamiltonian cycle constraint associated with propagators. We propose some properties preventing the existence of a Hamiltonian cycle in a graph or, conversely, properties requiring that certain edges be in the TSP solution set. Notably, we design a propagator based on the search for k-cutsets. The combination of this constraint with the WCC constraint allows us to obtain, for the resolution of the TSP, gains of an order of magnitude in the number of backtracks as well as a strong reduction of the computation time.
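One standard structural property of this kind (a textbook necessary condition, shown here for intuition rather than as the paper's propagator): a Hamiltonian cycle must pass through distinct vertices of any separator S to link the pieces it leaves behind, so if removing S disconnects the graph into more than |S| components, no Hamiltonian cycle exists.

```python
from itertools import combinations

def component_count(vertices, edges, removed):
    """Number of connected components of the graph with `removed` vertices deleted."""
    remaining = set(vertices) - set(removed)
    adj = {v: set() for v in remaining}
    for a, b in edges:
        if a in remaining and b in remaining:
            adj[a].add(b)
            adj[b].add(a)
    seen, count = set(), 0
    for v in remaining:
        if v not in seen:
            count += 1
            stack = [v]
            while stack:
                u = stack.pop()
                if u not in seen:
                    seen.add(u)
                    stack.extend(adj[u] - seen)
    return count

def certify_non_hamiltonian(vertices, edges, max_cut=3):
    """Search for a vertex set S whose removal leaves more than |S| components;
    any such S certifies that no Hamiltonian cycle exists. Returns S or None."""
    for k in range(1, max_cut + 1):
        for S in combinations(vertices, k):
            if component_count(vertices, edges, S) > k:
                return S
    return None
```

On the star K1,3 the single centre vertex is such a certificate; on the 4-cycle, which is Hamiltonian, the search correctly finds none.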
Branching on multi-aggregated variables
Gamrath, Gerald; Melchiori, Anna; Berthold, Timo; Gleixner, Ambros M.; Salvagnin, Domenico
Accounting for the mortality benefit of drug-eluting stents in percutaneous coronary intervention: a comparison of methods in a retrospective cohort study
Background: Drug-eluting stents (DES) reduce rates of restenosis compared with bare metal stents (BMS). A number of observational studies have also found lower rates of mortality and non-fatal myocardial infarction with DES compared with BMS, findings not observed in randomized clinical trials. In order to explore reasons for this discrepancy, we compared outcomes after percutaneous coronary intervention (PCI) with DES or BMS by multiple statistical methods.
Methods: We compared short-term rates of all-cause mortality and myocardial infarction for patients undergoing PCI with DES or BMS using propensity-score adjustment, propensity-score matching, and a stent-era comparison in a large, integrated health system between 1998 and 2007. For the propensity-score adjustment and stent-era comparisons, we used multivariable logistic regression to assess the association of stent type with outcomes. We used McNemar's Chi-square test to compare outcomes for propensity-score matching.
Results: Between 1998 and 2007, 35,438 PCIs with stenting were performed among health plan members (53.9% DES and 46.1% BMS). After propensity-score adjustment, DES was associated with significantly lower rates of death at 30 days (OR 0.49, 95% CI 0.39-0.63, P < 0.001) and one year (OR 0.58, 95% CI 0.49-0.68, P < 0.001), and a lower rate of myocardial infarction at one year (OR 0.72, 95% CI 0.59-0.87, P < 0.001). Thirty-day and one-year mortality were also lower with DES after propensity-score matching. However, a stent-era comparison, which eliminates potential confounding by indication, showed no difference in death or myocardial infarction for DES and BMS, similar to results from randomized trials.
Conclusions: Although propensity-score methods suggested a mortality benefit with DES, consistent with prior observational studies, a stent-era comparison failed to support this conclusion. Unobserved factors influencing stent selection in observational studies likely account for the observed mortality benefit of DES not seen in randomized clinical trials.
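The two pieces of matched-pair machinery used above can be sketched in a few lines. This is an illustrative toy with made-up scores, not the study's fitted logistic-regression propensities: greedy 1:1 nearest-neighbour matching within a caliper, then McNemar's statistic on the discordant counts of the matched pairs.

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on precomputed propensity scores.
    treated, control: lists of (id, score). Returns list of (treated_id, control_id)."""
    pairs, used = [], set()
    for tid, t_score in sorted(treated, key=lambda item: item[1]):
        best, best_gap = None, caliper
        for cid, c_score in control:
            gap = abs(t_score - c_score)
            if cid not in used and gap <= best_gap:
                best, best_gap = cid, gap
        if best is not None:
            used.add(best)
            pairs.append((tid, best))
    return pairs

def mcnemar_statistic(discordant_10, discordant_01):
    """McNemar chi-square with continuity correction for paired binary outcomes:
    only pairs where exactly one member had the event carry information."""
    b, c = discordant_10, discordant_01
    return (abs(b - c) - 1) ** 2 / (b + c) if b + c else 0.0
```

With 15 pairs discordant in one direction and 5 in the other, the statistic is 81/20 = 4.05, which exceeds the 3.84 threshold of a chi-square test with one degree of freedom at the 5% level.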
Verifying integer programming results
Software for mixed-integer linear programming (MIP) can return incorrect results for a number of reasons, one being the use of inexact floating-point arithmetic. Even solvers that employ exact arithmetic may suffer from programming or algorithmic errors, motivating the desire for a way to produce independently verifiable certificates of claimed results. Due to the complex nature of state-of-the-art MIP solution algorithms, the ideal form of such a certificate is not entirely clear. This paper proposes such a certificate format, designed with simplicity in mind, which is composed of a list of statements that can be sequentially verified using a limited number of inference rules. We present a supplementary verification tool for compressing and checking these certificates independently of how they were created. We report computational results on a selection of MIP instances from the literature. To this end, we have extended the exact rational version of the MIP solver SCIP to produce such certificates.
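The flavour of "sequentially verifiable statements" can be illustrated with a deliberately simplified stand-in for the certificate format described here: each derived inequality must be a nonnegative combination of earlier lines, checked coefficient by coefficient in exact rational arithmetic so no floating-point doubt remains.

```python
from fractions import Fraction

def check_certificate(constraints, steps, claimed_bound):
    """Toy sequential certificate checker for >= inequalities.

    constraints:   list of (coeffs, rhs), each meaning coeffs . x >= rhs
    steps:         list of multiplier lists; each step appends the nonnegative
                   combination of all lines derived so far
    claimed_bound: (coeffs, rhs) that the final derived line must equal exactly
    """
    lines = [([Fraction(c) for c in coeffs], Fraction(rhs))
             for coeffs, rhs in constraints]
    for mults in steps:
        assert len(mults) == len(lines), "step must reference every earlier line"
        assert all(Fraction(m) >= 0 for m in mults), "multipliers must be >= 0"
        n = len(lines[0][0])
        combined = [sum(Fraction(m) * lines[j][0][i] for j, m in enumerate(mults))
                    for i in range(n)]
        rhs = sum(Fraction(m) * lines[j][1] for j, m in enumerate(mults))
        lines.append((combined, rhs))
    final_coeffs, final_rhs = lines[-1]
    target_coeffs, target_rhs = claimed_bound
    return (final_coeffs == [Fraction(c) for c in target_coeffs]
            and final_rhs == Fraction(target_rhs))
```

From x + y >= 2 and x - y >= 0, taking half of each yields the valid bound x >= 1; the checker only ever replays the stated combinations, never re-solves the problem.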
Haiku - a Scala combinator toolkit for semi-automated composition of metaheuristics
There is an emerging trend towards the automated design of metaheuristics at the software component level. In principle, metaheuristics have a relatively clean decomposition, where well-known frameworks such as ILS and EA are parametrised by variant components for acceptance, perturbation etc. Automated generation of these frameworks is not so simple in practice, since the coupling between components may be implementation specific. Compositionality is the ability to freely express a space of designs 'bottom up' in terms of elementary components: previous work in this area has used combinators, a modular and functional approach to componentisation arising from foundational Computer Science. In this article, we describe Haiku, a combinator toolkit written in the Scala language, which builds upon previous work to further automate the process by automatically composing the external dependencies of components. We provide examples of use and give a case study in which a programmatically-generated heuristic is applied to the Travelling Salesman Problem within an Evolutionary Strategies framework.
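Haiku itself is Scala, but the compositional idea translates directly: a framework is a higher-order function and the variant components are plain values plugged in bottom-up. A rough Python analogue (illustrative only, not Haiku's API) instantiates ILS with 2-opt local search, double-bridge perturbation, and improving-only acceptance on a toy TSP:

```python
import random
from math import dist

points = [(0, 0), (2, 0), (3, 2), (2, 4), (0, 4), (-1, 2)]

def cost(tour):
    return sum(dist(points[tour[i]], points[tour[i - 1]]) for i in range(len(tour)))

def two_opt(tour):
    """Elementary local-search component: first-improvement 2-opt."""
    tour, improved = list(tour), True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if cost(new) < cost(tour) - 1e-12:
                    tour, improved = new, True
    return tour

def double_bridge(tour, rng):
    """Elementary perturbation component: classic 4-opt double bridge."""
    a, b, c = sorted(rng.sample(range(1, len(tour)), 3))
    return tour[:a] + tour[c:] + tour[b:c] + tour[a:b]

accept_improving = lambda cand, cur: cost(cand) < cost(cur)

def iterated_local_search(init, local_search, perturb, accept, steps, seed=0):
    """The ILS framework, parametrised by its components combinator-style."""
    rng = random.Random(seed)
    best = current = local_search(init)
    for _ in range(steps):
        candidate = local_search(perturb(current, rng))
        if accept(candidate, current):
            current = candidate
        if cost(candidate) < cost(best):
            best = candidate
    return best
```

Swapping in a different acceptance rule (say, simulated-annealing-style) touches only one argument, which is exactly the decomposition the article automates; Haiku additionally wires up the components' external dependencies, which this sketch leaves implicit.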
Coverage, Continuity and Visual Cortical Architecture
The primary visual cortex of many mammals contains a continuous
representation of visual space, with a roughly repetitive aperiodic map of
orientation preferences superimposed. It was recently found that orientation
preference maps (OPMs) obey statistical laws which are apparently invariant
among species widely separated in eutherian evolution. Here, we examine whether
one of the most prominent models for the optimization of cortical maps, the
elastic net (EN) model, can reproduce this common design. The EN model
generates representations which optimally trade off stimulus space coverage and
map continuity. While this model has been used in numerous studies, no
analytical results about the precise layout of the predicted OPMs have been
obtained so far. We present a mathematical approach to analytically calculate
the cortical representations predicted by the EN model for the joint mapping of
stimulus position and orientation. We find that in all previously studied
regimes, predicted OPM layouts are perfectly periodic. An unbiased search
through the EN parameter space identifies a novel regime of aperiodic OPMs with
pinwheel densities lower than found in experiments. In an extreme limit,
aperiodic OPMs quantitatively resembling experimental observations emerge.
Stabilization of these layouts results from strong nonlocal interactions rather
than from a coverage-continuity compromise. Our results demonstrate that
optimization models for stimulus representations dominated by nonlocal
suppressive interactions are in principle capable of correctly predicting the
common OPM design. They call into question that visual cortical feature
representations can be explained by a coverage-continuity compromise.
Comment: 100 pages, including an Appendix, 21 + 7 figures.
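The coverage-continuity trade-off becomes concrete in the classic Durbin-Willshaw elastic net update, a minimal fixed-scale sketch of which follows (the full model anneals the scale kappa, and the joint position-and-orientation mapping analysed here adds orientation dimensions on top of the 2D positions used below):

```python
from math import exp, cos, sin, pi, dist

def en_step(nodes, stimuli, alpha=0.2, beta=2.0, kappa=0.2):
    """One batch update of a Durbin-Willshaw-style elastic net on a ring:
    each node is pulled toward stimuli via soft assignments of scale kappa
    (coverage term) and toward its ring neighbours (continuity term)."""
    n = len(nodes)
    new = []
    for j, (yx, yy) in enumerate(nodes):
        pull = [0.0, 0.0]
        for sx, sy in stimuli:
            # Soft responsibility of node j for this stimulus.
            weights = [exp(-((sx - zx) ** 2 + (sy - zy) ** 2) / (2 * kappa ** 2))
                       for zx, zy in nodes]
            w = weights[j] / sum(weights)
            pull[0] += w * (sx - yx)
            pull[1] += w * (sy - yy)
        (lx, ly), (rx, ry) = nodes[j - 1], nodes[(j + 1) % n]
        new.append((yx + alpha * pull[0] + beta * kappa * (lx - 2 * yx + rx),
                    yy + alpha * pull[1] + beta * kappa * (ly - 2 * yy + ry)))
    return new

# Demo: a ring of 8 nodes far from a single stimulus drifts toward it.
nodes = [(0.1 * cos(2 * pi * j / 8), 0.1 * sin(2 * pi * j / 8)) for j in range(8)]
for _ in range(5):
    nodes = en_step(nodes, [(1.0, 1.0)])
centroid = (sum(x for x, _ in nodes) / 8, sum(y for _, y in nodes) / 8)
```

The elasticity term sums to zero around the ring, so the net's centroid moves purely under the coverage force; the paper's analytical point is what layouts this competition produces at convergence.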
Landscape Encodings Enhance Optimization
Hard combinatorial optimization problems deal with the search for the minimum-cost solutions (ground states) of discrete systems under strong constraints. A transformation of state variables may enhance computational tractability. It has been argued that these state encodings are to be chosen invertible to retain the original size of the state space. Here we show how redundant non-invertible encodings enhance optimization by enriching the density of low-energy states. In addition, smooth landscapes may be established on encoded state spaces to guide local search dynamics towards the ground state.
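The effect of the encoding is easy to demonstrate on a toy objective (a made-up example; the paper's encodings are problem-specific): bit-flip hill climbing under an invertible binary encoding gets trapped at a "Hamming cliff", while a redundant, non-invertible unary encoding of the same integers makes the landscape unimodal.

```python
def f(x):
    """Toy objective on integers with its optimum at 10."""
    return abs(x - 10)

def hill_climb(state, decode, steps=100):
    """Strict-improvement hill climbing over single bit flips of the encoding."""
    for _ in range(steps):
        neighbours = [state[:i] + (1 - state[i],) + state[i + 1:]
                      for i in range(len(state))]
        better = [s for s in neighbours if f(decode(s)) < f(decode(state))]
        if not better:
            break
        state = better[0]
    return decode(state)

def decode_binary(bits):
    """Invertible: each integer 0..15 has exactly one 4-bit code."""
    return int("".join(map(str, bits)), 2)

def decode_unary(bits):
    """Redundant, non-invertible: every 16-bit string with k ones encodes k."""
    return sum(bits)

# Binary: from 7 = 0111 every single flip worsens f, so the search is stuck.
stuck = hill_climb((0, 1, 1, 1), decode_binary)
# Unary: every flip changes the decoded value by exactly 1, so the search
# walks straight to the ground state.
found = hill_climb((0,) * 16, decode_unary)
print(stuck, found)
```

The unary code maps many strings to each value, so the state space grows, yet every local optimum of the encoded landscape is now a global one, which is the abstract's point about redundancy and smoothness.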