
    Edge Elimination in TSP Instances

    The Traveling Salesman Problem is one of the best-studied NP-hard problems in combinatorial optimization. Powerful methods have been developed over the last 60 years to find optimum solutions to large TSP instances. The largest TSP instance solved optimally so far has 85,900 vertices; its solution required more than 136 years of total CPU time using the branch-and-cut based Concorde TSP code [1]. In this paper we present graph-theoretic results that allow us to prove that some edges of a TSP instance cannot occur in any optimum TSP tour. Based on these results we propose a combinatorial algorithm to identify such edges. The runtime of the main part of our algorithm is O(n² log n) for an n-vertex TSP instance. By combining our approach with the Concorde TSP solver we are able to solve a large TSPLIB instance more than 11 times faster than Concorde alone.
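The central claim, that some edges provably occur in no optimum tour and can therefore be removed before calling a solver, can be demonstrated on a toy instance. The brute-force sketch below is purely illustrative and is not the paper's O(n² log n) elimination algorithm:

```python
# Illustrative brute force only -- NOT the paper's O(n^2 log n) algorithm --
# demonstrating that some edges of a TSP instance occur in no optimum tour
# and can therefore be eliminated before calling a solver such as Concorde.
from itertools import permutations

def optimal_tours(dist):
    """Return (optimum cost, list of optimum tours as edge sets)."""
    n = len(dist)
    best_cost, best = None, []
    for perm in permutations(range(1, n)):   # fix vertex 0 as the start
        tour = (0,) + perm
        cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if best_cost is None or cost < best_cost:
            best_cost, best = cost, []
        if cost == best_cost:
            best.append(frozenset(frozenset((tour[i], tour[(i + 1) % n]))
                                  for i in range(n)))
    return best_cost, best

def eliminable_edges(dist):
    """Edges that appear in no optimum tour of the instance."""
    n = len(dist)
    _, tours = optimal_tours(dist)
    all_edges = {frozenset((i, j)) for i in range(n) for j in range(i + 1, n)}
    return all_edges - set().union(*tours)
```

On a 4-vertex instance with a cheap outer cycle and two expensive diagonals, both diagonals are provably eliminable; the paper's contribution is reaching such conclusions combinatorially, without enumerating tours.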

    Coverage, Continuity and Visual Cortical Architecture

    The primary visual cortex of many mammals contains a continuous representation of visual space, with a roughly repetitive aperiodic map of orientation preferences superimposed. It was recently found that orientation preference maps (OPMs) obey statistical laws which are apparently invariant among species widely separated in eutherian evolution. Here, we examine whether one of the most prominent models for the optimization of cortical maps, the elastic net (EN) model, can reproduce this common design. The EN model generates representations which optimally trade off stimulus space coverage and map continuity. While this model has been used in numerous studies, no analytical results about the precise layout of the predicted OPMs have been obtained so far. We present a mathematical approach to analytically calculate the cortical representations predicted by the EN model for the joint mapping of stimulus position and orientation. We find that in all previously studied regimes, predicted OPM layouts are perfectly periodic. An unbiased search through the EN parameter space identifies a novel regime of aperiodic OPMs with pinwheel densities lower than found in experiments. In an extreme limit, aperiodic OPMs quantitatively resembling experimental observations emerge. Stabilization of these layouts results from strong nonlocal interactions rather than from a coverage-continuity compromise. Our results demonstrate that optimization models for stimulus representations dominated by nonlocal suppressive interactions are in principle capable of correctly predicting the common OPM design. They call into question whether visual cortical feature representations can be explained by a coverage-continuity compromise. (Comment: 100 pages, including an Appendix, 21 + 7 figures.)
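For orientation, the elastic net objective the authors analyse can be written schematically in its standard form as a coverage term plus a continuity term (the paper's exact parametrisation may differ from this sketch):

```latex
E[\mathbf{z}] \;=\; -\eta\,\sigma \sum_{\mathbf{s}} \ln \sum_{\mathbf{x}}
  \exp\!\left(-\frac{\lVert \mathbf{s} - \mathbf{z}(\mathbf{x}) \rVert^{2}}{2\sigma^{2}}\right)
  \;+\; \frac{\beta}{2} \sum_{\mathbf{x}} \lVert \nabla \mathbf{z}(\mathbf{x}) \rVert^{2}
```

Here s ranges over stimuli (retinal position and orientation), z(x) is the stimulus feature represented at cortical location x, σ sets the coverage resolution, and β weighs map continuity against coverage; the trade-off between the two terms is exactly the "coverage-continuity compromise" discussed above.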

    Integration of Structural Constraints into TSP Models

    Several models based on constraint programming have been proposed to solve the traveling salesman problem (TSP). The most efficient ones, such as the weighted circuit constraint (WCC), mainly rely on the Lagrangian relaxation of the TSP, based on the search for a spanning tree or, more precisely, a "1-tree". The weakness of these approaches is that they do not include enough structural constraints and are based almost exclusively on edge costs. The purpose of this paper is to correct this drawback by introducing the Hamiltonian cycle constraint associated with propagators. We propose some properties preventing the existence of a Hamiltonian cycle in a graph or, conversely, properties requiring that certain edges be in the TSP solution set. Notably, we design a propagator based on the search for k-cutsets. The combination of this constraint with the WCC constraint allows us to obtain, for the resolution of the TSP, gains of an order of magnitude in the number of backtracks as well as a strong reduction of the computation time.
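The k-cutset reasoning rests on a parity fact: a Hamiltonian cycle crosses every edge cut an even number (at least 2) of times. The sketch below is my own minimal illustration of two consequences, not the authors' propagator: a bridge (1-edge cutset) rules out any Hamiltonian cycle, and both edges of a 2-edge cutset are forced into every tour.

```python
# Toy filtering in the spirit of k-cutset propagation (my sketch, not the
# authors' implementation). A Hamiltonian cycle crosses every edge cut an even
# number (>= 2) of times, so: a bridge => no Hamiltonian cycle exists, and
# both edges of any 2-edge cutset are mandatory in every tour.
from itertools import combinations

def connected(n, edges):
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for w in adj[u] - seen:
            seen.add(w); stack.append(w)
    return len(seen) == n

def forced_edges(n, edges):
    """Return (feasible, mandatory): feasible is False if a bridge exists;
    mandatory collects edges belonging to some 2-edge cutset."""
    edges = [tuple(e) for e in edges]
    for e in edges:                      # size-1 cut: no Hamiltonian cycle
        if not connected(n, [f for f in edges if f != e]):
            return False, set()
    mandatory = set()
    for e, f in combinations(edges, 2):  # size-2 cut: both edges mandatory
        if not connected(n, [g for g in edges if g not in (e, f)]):
            mandatory |= {e, f}
    return True, mandatory
```

On two triangles joined by the edges (0,3) and (1,4), the forced edges turn out to be exactly the six edges of the instance's unique Hamiltonian cycle, so this filtering alone already solves that graph.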

    Mechanisms of Hearing Loss after Blast Injury to the Ear

    Given the frequent use of improvised explosive devices (IEDs) around the world, the study of traumatic blast injuries is of increasing interest. The ear is the organ most commonly affected by blast injury because it is the body's most sensitive pressure transducer. We fabricated a blast chamber to re-create blast profiles similar to those of IEDs and used it to develop a reproducible mouse model to study blast-induced hearing loss. The tympanic membrane was perforated in all mice after blast exposure and was found to heal spontaneously. Micro-computed tomography demonstrated no evidence of middle ear or otic capsule injuries; however, the healed tympanic membrane was thickened. Auditory brainstem response and distortion product otoacoustic emission threshold shifts were found to be correlated with blast intensity. Moreover, these threshold shifts were larger than those found in control mice that underwent surgical perforation of their tympanic membranes, indicating cochlear trauma. Histological studies one week and three months after the blast demonstrated no disruption or damage to the intra-cochlear membranes. However, there was loss of outer hair cells (OHCs) within the basal turn of the cochlea and decreased numbers of spiral ganglion neurons (SGNs) and afferent nerve synapses. Using our mouse model that recapitulates human IED exposure, our results show that the mechanisms underlying blast-induced hearing loss do not include gross membranous rupture, as is commonly believed. Instead, there is both OHC and SGN loss that produces auditory dysfunction.

    Verifying integer programming results

    Software for mixed-integer linear programming can return incorrect results for a number of reasons, one being the use of inexact floating-point arithmetic. Even solvers that employ exact arithmetic may suffer from programming or algorithmic errors, motivating the desire for a way to produce independently verifiable certificates of claimed results. Due to the complex nature of state-of-the-art MIP solution algorithms, the ideal form of such a certificate is not entirely clear. This paper proposes a certificate format designed with simplicity in mind, which is composed of a list of statements that can be sequentially verified using a limited number of inference rules. We present a supplementary verification tool for compressing and checking these certificates independently of how they were created. To this end, we have extended the exact rational version of the MIP solver SCIP to produce such certificates, and we report computational results on a selection of MIP instances from the literature.
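The flavour of such a certificate can be conveyed with a toy checker. This is a simplified illustration of the "list of statements verified by a few inference rules" idea, not the actual certificate format used with exact SCIP: statements are inequalities a·x >= b that are either given constraints, nonnegative combinations of earlier lines (valid for the LP relaxation), or Chvatal-Gomory roundings (valid when all variables are integer).

```python
# Toy certificate checker (illustrative; NOT the paper's certificate format).
# Each statement is an inequality (coefficients, rhs) meaning a.x >= b, and is
# justified by one of three rules:
#   "asm" -- a given problem constraint, accepted as-is;
#   "lin" -- a nonnegative combination of earlier statements;
#   "rnd" -- Chvatal-Gomory rounding of an earlier statement with integer
#            coefficients (sound only when all variables are integer).
from fractions import Fraction as F
from math import ceil

def verify(certificate):
    proved = []                                # inequalities proved so far
    for rule, data in certificate:
        if rule == "asm":                      # given constraint
            proved.append(data)
        elif rule == "lin":                    # data: list of (index, multiplier)
            coeffs, rhs = None, F(0)
            for idx, mult in data:
                assert mult >= 0, "multipliers must be nonnegative"
                a, b = proved[idx]
                coeffs = ([mult * c for c in a] if coeffs is None
                          else [x + mult * c for x, c in zip(coeffs, a)])
                rhs += mult * b
            proved.append((coeffs, rhs))
        elif rule == "rnd":                    # data: index of earlier statement
            a, b = proved[data]
            assert all(c.denominator == 1 for c in a), "needs integer coefficients"
            proved.append((a, F(ceil(b))))
        else:
            raise ValueError(f"unknown rule {rule}")
    return proved[-1]                          # the certified final inequality
```

A three-line certificate deriving x1 + x2 >= 1 from 2x1 + 2x2 >= 1 (scale by 1/2, then round) illustrates how a checker can replay a solver's reasoning with exact rational arithmetic.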

    Accounting for the mortality benefit of drug-eluting stents in percutaneous coronary intervention: a comparison of methods in a retrospective cohort study

    Background: Drug-eluting stents (DES) reduce rates of restenosis compared with bare metal stents (BMS). A number of observational studies have also found lower rates of mortality and non-fatal myocardial infarction with DES compared with BMS, findings not observed in randomized clinical trials. In order to explore reasons for this discrepancy, we compared outcomes after percutaneous coronary intervention (PCI) with DES or BMS by multiple statistical methods. Methods: We compared short-term rates of all-cause mortality and myocardial infarction for patients undergoing PCI with DES or BMS using propensity-score adjustment, propensity-score matching, and a stent-era comparison in a large, integrated health system between 1998 and 2007. For the propensity-score adjustment and stent-era comparisons, we used multivariable logistic regression to assess the association of stent type with outcomes. We used McNemar's chi-square test to compare outcomes for propensity-score matching. Results: Between 1998 and 2007, 35,438 PCIs with stenting were performed among health plan members (53.9% DES and 46.1% BMS). After propensity-score adjustment, DES was associated with significantly lower rates of death at 30 days (OR 0.49, 95% CI 0.39-0.63, P < 0.001) and one year (OR 0.58, 95% CI 0.49-0.68, P < 0.001), and a lower rate of myocardial infarction at one year (OR 0.72, 95% CI 0.59-0.87, P < 0.001). Thirty-day and one-year mortality were also lower with DES after propensity-score matching. However, a stent-era comparison, which eliminates potential confounding by indication, showed no difference in death or myocardial infarction for DES and BMS, similar to results from randomized trials. Conclusions: Although propensity-score methods suggested a mortality benefit with DES, consistent with prior observational studies, a stent-era comparison failed to support this conclusion. Unobserved factors influencing stent selection in observational studies likely account for the observed mortality benefit of DES not seen in randomized clinical trials.
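For the matched analysis, McNemar's test compares only the discordant matched pairs. A minimal sketch of the statistic (illustrative, not the study's code):

```python
# Minimal sketch of McNemar's chi-square test for paired binary outcomes, as
# used on propensity-score matched pairs (illustrative, not the study's code).
def mcnemar_chi2(b, c):
    """b, c: counts of discordant pairs (the outcome occurred in one member of
    the matched pair but not the other). Returns the chi-square statistic with
    1 d.f.; values above 3.84 are significant at the 0.05 level."""
    if b + c == 0:
        raise ValueError("no discordant pairs")
    return (b - c) ** 2 / (b + c)
```

Concordant pairs carry no information about the treatment effect in this design, which is why only b and c enter the statistic.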

    Haiku - a Scala combinator toolkit for semi-automated composition of metaheuristics

    There is an emerging trend towards the automated design of metaheuristics at the software component level. In principle, metaheuristics have a relatively clean decomposition, where well-known frameworks such as ILS and EA are parametrised by variant components for acceptance, perturbation, etc. Automated generation of these frameworks is not so simple in practice, since the coupling between components may be implementation specific. Compositionality is the ability to freely express a space of designs ‘bottom up’ in terms of elementary components: previous work in this area has used combinators, a modular and functional approach to componentisation arising from foundational Computer Science. In this article, we describe Haiku, a combinator toolkit written in the Scala language, which builds upon previous work to further automate the process by automatically composing the external dependencies of components. We provide examples of use and give a case study in which a programmatically generated heuristic is applied to the Travelling Salesman Problem within an Evolutionary Strategies framework.
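Haiku itself is a Scala toolkit; the following Python sketch only illustrates the underlying combinator idea under simplified assumptions: a framework such as ILS becomes a higher-order function parametrised by interchangeable components (local search, perturbation, acceptance), so a design space is expressed by swapping components rather than rewriting the framework.

```python
# Language-agnostic sketch of the combinator idea behind Haiku (which itself
# is written in Scala): a metaheuristic framework -- here iterated local
# search -- is a higher-order function assembled from plug-in components.
import random

def iterated_local_search(init, local_search, perturb, accept, steps, rng):
    """ILS parametrised by its components, each a plain function."""
    current = local_search(init)
    for _ in range(steps):
        candidate = local_search(perturb(current, rng))
        if accept(candidate, current):
            current = candidate
    return current

# Toy components for minimising f(x) = x^2 over the integers.
f = lambda x: x * x

def hill_climb(x):                       # descend to a local optimum of f
    while True:
        better = min((x - 1, x + 1), key=f)
        if f(better) >= f(x):
            return x
        x = better

perturb = lambda x, rng: x + rng.choice([-3, 3])   # random kick
accept = lambda cand, cur: f(cand) <= f(cur)       # keep non-worsening moves
```

Swapping `accept` for, say, a simulated-annealing rule yields a different metaheuristic from the same framework, which is the compositional point the article makes.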

    Landscape Encodings Enhance Optimization

    Hard combinatorial optimization problems deal with the search for minimum-cost solutions (ground states) of discrete systems under strong constraints. A transformation of state variables may enhance computational tractability. It has been argued that such state encodings should be chosen invertible so as to retain the original size of the state space. Here we show how redundant non-invertible encodings enhance optimization by enriching the density of low-energy states. In addition, smooth landscapes may be established on encoded state spaces to guide local search dynamics towards the ground state.
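A toy example of the central claim (my own illustration, not one of the paper's systems): take energy = number of ones in an n-bit string, and encode each bit redundantly by two bits decoded with AND. A 0-bit then has three preimages and a 1-bit only one, so low-energy states occupy a larger fraction of the encoded space than of the original space.

```python
# Toy redundant, non-invertible encoding that enriches low-energy states
# (my example, not the paper's systems). Energy = number of ones in an n-bit
# string; each bit is encoded by TWO bits decoded with AND, so a 0-bit has
# three preimages (00, 01, 10) and a 1-bit only one (11).
from itertools import product

def energy(bits):
    return sum(bits)

def decode(enc):                          # AND each pair: a many-to-one map
    return tuple(enc[2 * i] & enc[2 * i + 1] for i in range(len(enc) // 2))

def ground_state_fraction(n):
    """Fraction of states at energy 0 in the original vs. the encoded space."""
    original = list(product((0, 1), repeat=n))
    encoded = list(product((0, 1), repeat=2 * n))
    f_orig = sum(energy(s) == 0 for s in original) / len(original)
    f_enc = sum(energy(decode(s)) == 0 for s in encoded) / len(encoded)
    return f_orig, f_enc
```

For n = 4 the ground state occupies 1/16 of the original space but 81/256 (about 32%) of the encoded space, so a search operating on encodings encounters low-energy configurations far more often, despite the encoding being non-invertible.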

    Impact of impaired fractional flow reserve after coronary interventions on outcomes: a systematic review and meta-analysis

    BACKGROUND: Fractional flow reserve (FFR) is routinely used to guide percutaneous coronary interventions (PCI). Visual assessment of the angiographic result after PCI has limited efficacy. Even when the angiographic result seems satisfactory, FFR after PCI might be useful for identifying patients with a suboptimal interventional result and a higher risk of poor clinical outcome who might benefit from additional procedures. The aim of this meta-analysis was to investigate available data from studies that examined clinical outcomes of patients with impaired vs. satisfactory FFR after PCI. METHODS: This meta-analysis was carried out according to the Cochrane Handbook for Systematic Reviews. The Mantel-Haenszel method using the fixed-effect meta-analysis model was used for combining the results. Studies were identified by searching the literature through mid-January 2016, using the following search terms: fractional flow reserve, coronary circulation, after, percutaneous coronary intervention, balloon angioplasty, stent implantation, and stenting. The primary endpoint was the rate of major adverse cardiac events (MACE). Secondary endpoints included rates of death, myocardial infarction (MI), and repeated revascularisation. RESULTS: Eight relevant studies were found, including a total of 1337 patients. Of those, 492 (36.8%) had an impaired FFR after PCI, and 853 (63.2%) had a satisfactory FFR after PCI. Odds ratios indicated that a low FFR following PCI was associated with an impaired outcome: major adverse cardiac events (MACE, OR: 4.95, 95% confidence interval [CI]: 3.39–7.22, p < 0.001); death (OR: 3.23, 95% CI: 1.19–8.76, p = 0.022); myocardial infarction (OR: 13.83, 95% CI: 4.75–40.24, p < 0.0001); and repeated revascularisation (OR: 4.42, 95% CI: 2.73–7.15, p < 0.0001). CONCLUSIONS: Compared to a satisfactory FFR, a persistently low FFR following PCI is associated with a worse clinical outcome. Prospective studies are needed to identify underlying causes, determine an optimal threshold for post-PCI FFR, and clarify whether simple additional procedures can influence the post-PCI FFR and clinical outcome. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12872-016-0355-7) contains supplementary material, which is available to authorized users.
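The pooling step uses the fixed-effect Mantel-Haenszel estimator, which for a set of per-study 2x2 tables can be sketched as follows (illustrative implementation, not the authors' code):

```python
# Minimal sketch of the fixed-effect Mantel-Haenszel pooled odds ratio used in
# the meta-analysis (illustrative implementation, not the authors' code).
def mantel_haenszel_or(tables):
    """tables: one (a, b, c, d) 2x2 table per study, with a/b = events and
    non-events under low post-PCI FFR and c/d = events and non-events under
    satisfactory FFR. Returns the pooled odds ratio."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den
```

With a single study this reduces to the ordinary odds ratio ad/bc; across studies it weights each table by its size, which is what makes the model fixed-effect.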