353 research outputs found

    Worst case and probabilistic analysis of the 2-Opt algorithm for the TSP

    2-Opt is probably the most basic local search heuristic for the TSP. This heuristic achieves amazingly good results on “real world” Euclidean instances both with respect to running time and approximation ratio. There are numerous experimental studies on the performance of 2-Opt. However, the theoretical knowledge about this heuristic is still very limited. Not even its worst case running time on 2-dimensional Euclidean instances was known so far. We clarify this issue by presenting, for every p ∈ ℕ, a family of L_p instances on which 2-Opt can take an exponential number of steps. Previous probabilistic analyses were restricted to instances in which n points are placed uniformly at random in the unit square [0,1]^2, where it was shown that the expected number of steps is bounded by Õ(n^{10}) for Euclidean instances. We consider a more advanced model of probabilistic instances in which the points can be placed independently according to general distributions on [0,1]^d, for an arbitrary d ≥ 2. In particular, we allow different distributions for different points. We study the expected number of local improvements in terms of the number n of points and the maximal density φ of the probability distributions. We show an upper bound on the expected length of any 2-Opt improvement path of Õ(n^{4+1/3} · φ^{8/3}). When starting with an initial tour computed by an insertion heuristic, the upper bound on the expected number of steps improves even to Õ(n^{4+1/3−1/d} · φ^{8/3}). If the distances are measured according to the Manhattan metric, then the expected number of steps is bounded by Õ(n^{4−1/d} · φ). In addition, we prove an upper bound of O(φ^{1/d}) on the expected approximation factor with respect to all L_p metrics. Let us remark that our probabilistic analysis covers as special cases the uniform input model with φ = 1 and a smoothed analysis with Gaussian perturbations of standard deviation σ with φ ∼ 1/σ^d.
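
    For orientation, below is a minimal sketch of the plain 2-Opt improvement loop on a uniform random instance in the unit square, i.e. the φ = 1 special case of the input model above. It only illustrates what a "local improvement step" is; the exponential-step instance families, the insertion-heuristic start and the probabilistic analysis are of course not reproduced, and all function names are ours.

```python
import math
import random

def tour_length(points, tour):
    """Total Euclidean length of a closed tour (tour = list of point indices)."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    """Apply improving 2-exchanges until the tour is locally optimal.

    A 2-exchange removes edges (a, b) and (c, d) and reconnects them as
    (a, c) and (b, d), which amounts to reversing the segment between b and c.
    Returns the locally optimal tour and the number of improvement steps."""
    n = len(tour)
    steps = 0
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # the two edges share a vertex; exchange is a no-op
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                delta = (math.dist(points[a], points[c]) + math.dist(points[b], points[d])
                         - math.dist(points[a], points[b]) - math.dist(points[c], points[d]))
                if delta < -1e-12:          # strictly improving step
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    steps += 1
                    improved = True
    return tour, steps

# Uniform instance in the unit square (phi = 1).
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(200)]
t, steps = two_opt(pts, list(range(len(pts))))
print(steps, tour_length(pts, t))
```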

    Superstatistical distributions from a maximum entropy principle

    We deal with a generalized statistical description of nonequilibrium complex systems based on least biased distributions given some prior information. A maximum entropy principle is introduced that allows for the determination of the distribution of the fluctuating intensive parameter β of a superstatistical system, given certain constraints on the complex system under consideration. We apply the theory to three examples: the superstatistical quantum mechanical harmonic oscillator, the superstatistical classical ideal gas, and velocity time series as measured in a turbulent Taylor-Couette flow.
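
    For context, the standard Beck–Cohen superstatistical setup that this builds on is sketched below: the effective Boltzmann factor is the Laplace transform of the distribution f(β) of the fluctuating inverse temperature. The χ²-distributed example is only the textbook case, not the specific f(β) that the paper derives from its maximum entropy principle.

```latex
% Generalized Boltzmann factor of a superstatistical system:
\[
  B(E) \;=\; \int_0^\infty f(\beta)\, e^{-\beta E}\, \mathrm{d}\beta .
\]
% Textbook example: a chi-squared (Gamma) distributed beta with n degrees of
% freedom and mean beta_0 yields a q-exponential (Tsallis-type) factor,
\[
  f(\beta) \;\propto\; \beta^{\,n/2-1}\, e^{-n\beta/(2\beta_0)}
  \quad\Longrightarrow\quad
  B(E) \;\propto\; \bigl(1 + (q-1)\,\beta_0 E\bigr)^{-1/(q-1)},
  \qquad q = 1 + \tfrac{2}{n}.
\]
```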

    Portal protein diversity and phage ecology

    © 2008 The Authors. This article is distributed under the terms of the Creative Commons License, Attribution 2.5. The definitive version was published in Environmental Microbiology 10 (2008): 2810-2823, doi:10.1111/j.1462-2920.2008.01702.x.

    Oceanic phages are critical components of the global ecosystem, where they play a role in microbial mortality and evolution. Our understanding of phage diversity is greatly limited by the lack of useful genetic diversity measures. Previous studies, focusing on myophages that infect the marine cyanobacterium Synechococcus, have used the coliphage T4 portal-protein-encoding homologue, gene 20 (g20), as a diversity marker. These studies revealed 10 sequence clusters, 9 oceanic and 1 freshwater, where only 3 contained cultured representatives. We sequenced g20 from 38 marine myophages isolated using a diversity of Synechococcus and Prochlorococcus hosts to see if any would fall into the clusters that lacked cultured representatives. On the contrary, all fell into the three clusters that already contained sequences from cultured phages. Further, there was no obvious relationship between host of isolation, or host range, and g20 sequence similarity. We next expanded our analyses to all available g20 sequences (769 sequences), which include PCR amplicons from wild uncultured phages, non-PCR amplified sequences identified in the Global Ocean Survey (GOS) metagenomic database, as well as sequences from cultured phages, to evaluate the relationship between g20 sequence clusters and habitat features from which the phage sequences were isolated. Even in this meta-data set, very few sequences fell into the sequence clusters without cultured representatives, suggesting that the latter are very rare, or sequencing artefacts. In contrast, sequences most similar to the culture-containing clusters, the freshwater cluster and two novel clusters, were more highly represented, with one particular culture-containing cluster representing the dominant g20 genotype in the unamplified GOS sequence data. Finally, while some g20 sequences were non-randomly distributed with respect to habitat, there were always numerous exceptions to general patterns, indicating that phage portal proteins are not good predictors of a phage's host or the habitat in which a particular phage may thrive.

    This research was supported in part by funding from NSF (CMORE contribution #87), DOE, The Seaver Foundation and the Gordon and Betty Moore Foundation Marine Microbiology Program to S.W.C.; an NIH Bioinformatics Training Grant supported M.B.S.; MIT Undergraduate Research Opportunities Program supported V.Q., J.A.L., G.T., R.F. and J.E.R.; the Howard Hughes Medical Institute funded MIT Biology Department Undergraduate Research Opportunities Program supported A.S.D.; an NSERC (Canada) Discovery Grant (DG 298394) and a Grant from the Canadian Foundation for Innovation (NOF10394) to J.P.B.; NSF Graduate Fellowship funding supported M.L.C.

    Corrigendum "Portal protein diversity and phage ecology"

    Author Posting. © The Author(s), 2011. This is the author's version of the work. It is posted here by permission of John Wiley & Sons for personal use, not for redistribution. The definitive version was published in Environmental Microbiology 13 (2011): 2832, doi:10.1111/j.1462-2920.2011.02616.x

    Algebraic Systems and Pushdown Automata

    The theory of algebraic power series in noncommuting variables, as we understand it today, was initiated in [2] and developed in its early stages by the French school. The main motivation was the interconnection with context-free grammars: the defining equations were made to correspond to context-free grammars.
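
    A classical example of this grammar/equation correspondence is sketched below (our illustration, not taken from the chapter): the productions of a context-free grammar are read as a polynomial fixed-point equation over formal power series in noncommuting variables, and the least solution is the characteristic series of the generated language.

```latex
%   grammar:   S -> a S b | epsilon
%   equation:  y = a y b + 1          (1 stands for the empty word)
\[
  y \;=\; a\,y\,b \;+\; 1
  \qquad\Longrightarrow\qquad
  y \;=\; \sum_{n \ge 0} a^{n} b^{n},
\]
% i.e. the least solution is the characteristic series of \{ a^n b^n : n \ge 0 \},
% a language accepted by a pushdown automaton but by no finite automaton.
```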

    Multi-dimensional modeling and simulation of semiconductor nanophotonic devices

    Self-consistent modeling and multi-dimensional simulation of semiconductor nanophotonic devices is an important tool in the development of future integrated light sources and quantum devices. Simulations can guide important technological decisions by revealing performance bottlenecks in new device concepts, contribute to their understanding and help to theoretically explore their optimization potential. The efficient implementation of multi-dimensional numerical simulations for computer-aided design tasks requires sophisticated numerical methods and modeling techniques. We review recent advances in device-scale modeling of quantum dot based single-photon sources and laser diodes by self-consistently coupling the optical Maxwell equations with semiclassical carrier transport models using semiclassical and fully quantum mechanical descriptions of the optically active region, respectively. For the simulation of realistic devices with complex, multi-dimensional geometries, we have developed a novel hp-adaptive finite element approach for the optical Maxwell equations, using mixed meshes adapted to the multi-scale properties of the photonic structures. For electrically driven devices, we introduced novel discretization and parameter-embedding techniques to solve the drift-diffusion system for strongly degenerate semiconductors at cryogenic temperature. Our methodical advances are demonstrated on various applications, including vertical-cavity surface-emitting lasers, grating couplers and single-photon sources.
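
    As a rough illustration of the kind of carrier-transport building block mentioned above, the sketch below assembles a 1D steady-state electron continuity equation with the classical Scharfetter–Gummel exponential-fitting flux for a prescribed potential. It is a minimal example in normalized units under our own simplifying assumptions; it is not the paper's hp-adaptive finite element scheme or its cryogenic parameter-embedding technique, and all function names are ours.

```python
import numpy as np

def bernoulli(x):
    """B(x) = x / (exp(x) - 1), with the removable singularity at x = 0 handled."""
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-12
    safe = np.where(small, 1.0, x)
    return np.where(small, 1.0 - 0.5 * x, safe / np.expm1(safe))

def electron_continuity_1d(psi, h, n_left, n_right, diffusivity=1.0):
    """Solve d/dx J_n = 0 on a uniform 1D grid with Dirichlet (contact) ends.

    psi : nodal electrostatic potential in units of the thermal voltage.
    The flux between nodes i and i+1 is the Scharfetter-Gummel expression
        J = (D/h) * ( n[i+1]*B(psi[i+1]-psi[i]) - n[i]*B(psi[i]-psi[i+1]) ),
    whose zero-flux solution reproduces n ~ exp(psi), as expected for electrons.
    """
    N = len(psi)
    A = np.zeros((N, N))
    b = np.zeros(N)
    c = diffusivity / h
    # Dirichlet boundary conditions at the two contacts
    A[0, 0] = 1.0
    b[0] = n_left
    A[-1, -1] = 1.0
    b[-1] = n_right
    for i in range(1, N - 1):
        d_r = psi[i + 1] - psi[i]   # potential step to the right neighbour
        d_l = psi[i] - psi[i - 1]   # potential step to the left neighbour
        # discrete flux balance J_{i+1/2} - J_{i-1/2} = 0
        A[i, i + 1] = c * bernoulli(d_r)
        A[i, i - 1] = c * bernoulli(-d_l)
        A[i, i] = -c * (bernoulli(-d_r) + bernoulli(d_l))
    return np.linalg.solve(A, b)

# Linear potential drop across the device: with consistent contact values the
# solver recovers the quasi-equilibrium carrier profile n(x) = exp(psi(x)).
psi = np.linspace(0.0, 5.0, 101)
n = electron_continuity_1d(psi, h=1.0 / 100, n_left=1.0, n_right=np.exp(5.0))
print(np.max(np.abs(n - np.exp(psi))))
```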
