
    A Two-loop Test of Buscher's T-duality I

    We study the two-loop quantum equivalence of sigma models related by Buscher's T-duality transformation. The computation of the two-loop perturbative free energy density is performed in the case of a certain deformation of the SU(2) principal sigma model, and its T-dual, using dimensional regularization and the geometric sigma model perturbation theory. We obtain agreement between the free energy density expressions of the two models.

    Comment: 28 pages, LaTeX; references added
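For context, the classical Buscher rules invoked above act on a background $(g_{\mu\nu}, b_{\mu\nu}, \phi)$ with an isometry along the coordinate $x^0$; the form below is the standard one from the literature (not this paper's specific deformed model):

```latex
\tilde g_{00} = \frac{1}{g_{00}}, \qquad
\tilde g_{0i} = \frac{b_{0i}}{g_{00}}, \qquad
\tilde b_{0i} = \frac{g_{0i}}{g_{00}},
```
```latex
\tilde g_{ij} = g_{ij} - \frac{g_{0i}\,g_{0j} - b_{0i}\,b_{0j}}{g_{00}}, \qquad
\tilde b_{ij} = b_{ij} - \frac{g_{0i}\,b_{0j} - g_{0j}\,b_{0i}}{g_{00}},
```
together with the one-loop dilaton shift $\tilde\phi = \phi - \tfrac{1}{2}\ln g_{00}$, which is what makes quantum (two-loop) tests such as this one non-trivial.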

    Deterministic and Probabilistic Binary Search in Graphs

    We consider the following natural generalization of Binary Search: in a given undirected, positively weighted graph, one vertex is a target. The algorithm's task is to identify the target by adaptively querying vertices. In response to querying a node $q$, the algorithm learns either that $q$ is the target, or is given an edge out of $q$ that lies on a shortest path from $q$ to the target. We study this problem in a general noisy model in which each query independently receives a correct answer with probability $p > \frac{1}{2}$ (a known constant), and an (adversarial) incorrect one with probability $1-p$. Our main positive result is that when $p = 1$ (i.e., all answers are correct), $\log_2 n$ queries are always sufficient. For general $p$, we give an (almost information-theoretically optimal) algorithm that uses, in expectation, no more than $(1 - \delta)\frac{\log_2 n}{1 - H(p)} + o(\log n) + O(\log^2 (1/\delta))$ queries, and identifies the target correctly with probability at least $1-\delta$. Here, $H(p) = -(p \log p + (1-p) \log(1-p))$ denotes the entropy. The first bound is achieved by the algorithm that iteratively queries a 1-median of the nodes not ruled out yet; the second bound by careful repeated invocations of a multiplicative weights algorithm. Even for $p = 1$, we show several hardness results for the problem of determining whether a target can be found using $K$ queries. Our upper bound of $\log_2 n$ implies a quasipolynomial-time algorithm for undirected connected graphs; we show that this is best-possible under the Strong Exponential Time Hypothesis (SETH). Furthermore, for directed graphs, or for undirected graphs with non-uniform node querying costs, the problem is PSPACE-complete. For a semi-adaptive version, in which one may query $r$ nodes each in $k$ rounds, we show membership in $\Sigma_{2k-1}$ in the polynomial hierarchy, and hardness for $\Sigma_{2k-5}$.
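The noiseless ($p = 1$) strategy is concrete enough to sketch. Below is a minimal illustration of the 1-median querying idea on an unweighted graph, with a simulated oracle standing in for real query responses; the helper names and the BFS-based setup are ours, not the paper's:

```python
from collections import deque

def bfs_dist(adj, src):
    """Single-source shortest-path distances in an unweighted graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def graph_binary_search(adj, target):
    """Noiseless strategy: repeatedly query a 1-median of the remaining
    candidates; each answer rules out at least half of them."""
    nodes = list(adj)
    dist = {u: bfs_dist(adj, u) for u in nodes}  # all-pairs distances
    candidates = set(nodes)
    queries = 0
    while True:
        # 1-median: vertex minimizing total distance to the candidate set
        q = min(nodes, key=lambda u: sum(dist[u][v] for v in candidates))
        queries += 1
        if q == target:
            return q, queries
        # Simulated oracle: a neighbor of q on a shortest q-target path
        nxt = min(adj[q], key=lambda u: dist[u][target])
        # Keep only candidates consistent with that answer
        candidates = {v for v in candidates
                      if dist[q][v] == 1 + dist[nxt][v]}
```

On a path of 8 vertices this locates the target in at most 3 queries, matching the $\log_2 n$ bound.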

    Theoretically Efficient Parallel Graph Algorithms Can Be Fast and Scalable

    There has been significant recent interest in parallel graph processing due to the need to quickly analyze the large graphs available today. Many graph codes have been designed for distributed memory or external memory. However, today even the largest publicly-available real-world graph (the Hyperlink Web graph with over 3.5 billion vertices and 128 billion edges) can fit in the memory of a single commodity multicore server. Nevertheless, most experimental work in the literature reports results on much smaller graphs, and the work on the Hyperlink graph uses distributed or external memory. Therefore, it is natural to ask whether we can efficiently solve a broad class of graph problems on this graph in memory. This paper shows that theoretically-efficient parallel graph algorithms can scale to the largest publicly-available graphs using a single machine with a terabyte of RAM, processing them in minutes. We give implementations of theoretically-efficient parallel algorithms for 20 important graph problems. We also present the optimizations and techniques that we used in our implementations, which were crucial in enabling us to process these large graphs quickly. We show that the running times of our implementations outperform existing state-of-the-art implementations on the largest real-world graphs. For many of the problems that we consider, this is the first time they have been solved on graphs at this scale. We have made the implementations developed in this work publicly available as the Graph-Based Benchmark Suite (GBBS).

    Comment: This is the full version of the paper appearing in the ACM Symposium on Parallelism in Algorithms and Architectures (SPAA), 201

    MIMO free-space optical communication employing subcarrier intensity modulation in atmospheric turbulence channels

    In this paper, we analyse the error performance of a transmitter/receiver-array free-space optical (FSO) communication system employing binary phase shift keying (BPSK) subcarrier intensity modulation (SIM) in a clear but turbulent atmospheric channel. Subcarrier modulation is employed to eliminate the need for an adaptive threshold detector. Direct detection is employed at the receiver, and each subcarrier is subsequently demodulated coherently. The effect of irradiance fading is mitigated with an array of lasers and photodetectors. The received signals are linearly combined using optimal maximum ratio combining (MRC), equal gain combining (EGC) and selection combining (SelC). The bit error rate (BER) equations are derived considering additive white Gaussian noise and log-normal intensity fluctuations. This work is part of the EU COST actions and EU projects.
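As an illustration of the averaging step behind such BER derivations, the sketch below numerically averages the conditional BPSK error probability Q(·) over log-normal irradiance samples. The parameter names and the Monte Carlo approach are our assumptions for illustration; the paper derives analytical BER expressions:

```python
import math
import random

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_sim_ber_lognormal(snr_sqrt, sigma_l, n_samples=100_000, seed=1):
    """Average BER of BPSK subcarrier intensity modulation over a
    log-normal irradiance channel normalized so that E[I] = 1:
    a Monte Carlo average of the conditional BER Q(snr_sqrt * I)."""
    rng = random.Random(seed)
    mu = -0.5 * sigma_l ** 2  # log-mean chosen so the mean irradiance is 1
    total = 0.0
    for _ in range(n_samples):
        irradiance = math.exp(rng.gauss(mu, sigma_l))
        total += q_func(snr_sqrt * irradiance)
    return total / n_samples
```

Since Q is convex on positive arguments, the turbulence-averaged BER always exceeds the no-fading BER at the same mean irradiance, which is the degradation the array combining (MRC/EGC/SelC) is meant to mitigate.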

    Impact of processed earwigs and their faeces on the aroma and taste of 'Chasselas' and 'Pinot Noir' wines

    The abundance of the European earwig Forficula auricularia L. (Dermaptera, Forficulidae) in European vineyards has increased considerably over the last few years. Although earwigs are omnivorous predators that prey on viticultural pests such as grape moths, they are also known to erode berries and to transfer fungal spores. Moreover, they are suspected to affect the human perception of wines both directly, by being processed with the grapes, and indirectly, by contaminating grape clusters with their faeces. In this study we artificially contaminated grapes with F. auricularia adults and/or their faeces and determined the impact on the aroma and taste of white 'Chasselas' and red 'Pinot noir' wines. Whereas the addition of five living adults/kg grapes affected the olfactory sensation of 'Chasselas' wines only marginally, 0.6 g of earwig faeces/kg grapes had a strong effect on the colour, aroma and general appreciation of 'Chasselas' wines. Faeces-contaminated wines were less fruity and less floral, their aroma was described as faecal, and they were judged to be of lower quality. The contamination of 'Pinot noir' grapes with four different densities of living earwig adults (0, 5, 10 and 20 individuals/kg grapes) showed that only wines contaminated with more than 10 earwigs/kg grapes smelled and tasted significantly different from the uncontaminated control wine. Earwig-contaminated 'Pinot noir' wines were judged to be of lower quality. The descriptors "animal", "reductive", "vegetal", "acidic", "bitter" and "tannic" characterised their sensory perception. In conclusion, our results show that there is a real risk of wine contamination by F. auricularia. In particular, earwig faeces and earwig adults at densities above a threshold of 5 to 10 individuals/kg grapes have the potential to reduce the quality of wines. The evolution of earwig populations in vineyards should therefore be monitored carefully in order to anticipate problems during vinification.

    Optimal Vertex Cover for the Small-World Hanoi Networks

    The vertex-cover problem on the Hanoi networks HN3 and HN5 is analyzed with an exact renormalization group and parallel-tempering Monte Carlo simulations. The grand canonical partition function of the equivalent hard-core repulsive lattice-gas problem is recast first as an Ising-like canonical partition function, which allows for a closed set of renormalization group equations. The flow of these equations is analyzed in the limit of infinite chemical potential, at which the vertex-cover problem is attained. The relevant fixed point and its neighborhood are analyzed, and non-trivial results are obtained both for the coverage and for the ground-state entropy density, which indicates the complex structure of the solution space. Using special hierarchy-dependent operators in the renormalization group and Monte Carlo simulations, structural details of optimal configurations are revealed. These studies indicate that the optimal coverages (or packings) are not related by a simple symmetry. Using a clustering analysis of the solutions obtained in the Monte Carlo simulations, a complex solution space structure is revealed for each system size. Nevertheless, in the thermodynamic limit, the solution landscape is dominated by one huge set of very similar solutions.

    Comment: RevTeX, 24 pages; many corrections in text and figures; final version; for related information, see http://www.physics.emory.edu/faculty/boettcher
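For readers unfamiliar with the underlying combinatorial problem, here is a minimal exact vertex-cover solver for toy graphs. It only illustrates the optimization target (the smallest vertex set touching every edge); it has nothing to do with the paper's renormalization group machinery, which handles the Hanoi networks at scale:

```python
from itertools import combinations

def min_vertex_cover(n, edges):
    """Exhaustive-search minimum vertex cover (tiny graphs only).
    Tries covers of increasing size k and returns the first set of
    vertices that touches every edge."""
    for k in range(n + 1):
        for cover in combinations(range(n), k):
            s = set(cover)
            if all(u in s or v in s for u, v in edges):
                return s
```

On a 3-vertex path the optimum is the single middle vertex; on a triangle any two vertices are needed.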

    Approximate Deadline-Scheduling with Precedence Constraints

    We consider the classic problem of scheduling a set of n jobs non-preemptively on a single machine. Each job j has a non-negative processing time, weight, and deadline, and a feasible schedule needs to be consistent with chain-like precedence constraints. The goal is to compute a feasible schedule that minimizes the sum of penalties of late jobs. Lenstra and Rinnooy Kan [Annals of Disc. Math., 1977] in their seminal work introduced this problem and showed that it is strongly NP-hard, even when all processing times and weights are 1. We study the approximability of the problem, and our main result is an O(log k)-approximation algorithm for instances with k distinct job deadlines.
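The objective can be made concrete with a brute-force optimum for toy instances. This is our own illustration of the problem definition (weighted number of late jobs under chain precedence), not the paper's O(log k)-approximation algorithm:

```python
from itertools import permutations

def min_late_penalty(jobs, chains):
    """Brute-force optimum for tiny instances: jobs maps an id to
    (processing_time, weight, deadline); chains is a list of id
    sequences that must appear in that order in the schedule."""
    ids = list(jobs)
    best = float("inf")
    for order in permutations(ids):
        pos = {j: i for i, j in enumerate(order)}
        if any(pos[a] > pos[b] for c in chains for a, b in zip(c, c[1:])):
            continue  # violates a precedence chain
        t, penalty = 0, 0
        for j in order:
            p, w, d = jobs[j]
            t += p
            if t > d:
                penalty += w  # job j completes late; pay its weight
        best = min(best, penalty)
    return best
```

Even a 3-job instance shows the tension: when two jobs compete for the same deadline window, one of them must pay its penalty.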

    A high-throughput in vivo micronucleus assay for genome instability screening in mice.

    We describe a sensitive, robust, high-throughput method for quantifying the formation of micronuclei, markers of genome instability, in mouse erythrocytes. Micronuclei are whole chromosomes or chromosome segments that have been separated from the nucleus. Other methods of detection rely on labor-intensive, microscopy-based techniques. Here we describe a 2-d, 96-well plate-based flow cytometric method of micronucleus scoring that is simple enough for a research technician experienced in flow cytometry to perform. The assay detects low levels of genome instability that cannot be readily identified by classic phenotyping, using 25 μl of blood. By using this assay, we have screened >10,000 blood samples and discovered novel genes that contribute to vertebrate genome maintenance, as well as novel disease models and mechanisms of genome instability disorders. We discuss experimental design considerations, including statistical power calculation, provide troubleshooting tips, and discuss factors that contribute to a false-positive increase in the number of micronucleated red blood cells and to experimental variability.

    Acknowledgments: We thank M. Hitcham and N. Harman for assistance with blood collections, W. Cheng for assistance with flow cytometry during high-throughput screening and K. Dry for comments on the manuscript. R.E.M. is supported by Cancer Research UK (CRUK; project grant C20510/A12401). D.J.A. is supported by CRUK. D.J.A. and B.L.N. are supported by the Wellcome Trust. Research in the Jackson Laboratory is funded by CRUK program grant no. C6/A11224, the European Research Council and the European Community Seventh Framework Programme grant agreement no. HEALTH-F2-2010-259893 (DDResponse). Core funding is provided by CRUK (C6946/A14492) and the Wellcome Trust (WT092096). S.P.J. receives his salary from the University of Cambridge, UK, supplemented by CRUK. G.B. is funded by CRUK program grant no. C6/A11224.

    This is the accepted manuscript for a paper published in Nature Protocols 10, 205–215 (2015), doi:10.1038/nprot.2015.010. Published online 31 December 201
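The protocol mentions statistical power calculation as a design consideration. A rough planning sketch under a two-proportion normal approximation might look like the following; the approximation, function name, and example proportions are our assumptions, not the protocol's exact method:

```python
import math
from statistics import NormalDist

def samples_per_group(p_control, p_treated, alpha=0.05, power=0.8):
    """Per-group sample size to detect a shift in the proportion of
    micronucleated cells with a two-sided two-proportion z-test,
    using the standard normal-approximation formula."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)  # critical value for the test level
    z_b = z.inv_cdf(power)          # quantile for the desired power
    var = p_control * (1 - p_control) + p_treated * (1 - p_treated)
    return math.ceil((z_a + z_b) ** 2 * var / (p_control - p_treated) ** 2)
```

As expected, doubling a small baseline frequency requires counting many cells, while larger shifts need far fewer, which is why a flow cytometric readout of tens of thousands of cells per sample is well suited to detecting subtle instability.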

    Application of regulatory sequence analysis and metabolic network analysis to the interpretation of gene expression data

    We present two complementary approaches for the interpretation of clusters of co-regulated genes, such as those obtained from DNA chips and related methods. Starting from a cluster of genes with similar expression profiles, two basic questions can be asked: (1) Which mechanism is responsible for the coordinated transcriptional response of the genes? This question is approached by extracting motifs that are shared between the upstream sequences of these genes; the extracted motifs are putative cis-acting regulatory elements. (2) What does it mean physiologically for the cell to express these genes together? One way to answer this question is to search for potential metabolic pathways that could be catalyzed by the products of the genes: select the genes from the cluster that code for enzymes, and try to assemble the catalyzed reactions into metabolic pathways. We present tools to answer these two questions, and we illustrate their use with selected examples in the yeast Saccharomyces cerevisiae. The tools are available on the web (http://ucmb.ulb.ac.be/bioinformatics/rsa-tools/; http://www.ebi.ac.uk/research/pfbp/; http://www.soi.city.ac.uk/~msch/).
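The pathway-assembly idea in question (2) can be caricatured as path-finding in a compound graph. The toy sketch below chains reactions whose product feeds the next reaction's substrate; it is a deliberate simplification for illustration, not how the cited tools actually work:

```python
def assemble_pathway(reactions, start, goal):
    """Chain enzyme-catalyzed reactions, given as (substrate, product)
    pairs, from a start metabolite to a goal metabolite via iterative
    depth-first search. Returns one compound sequence, or None."""
    graph = {}
    for substrate, product in reactions:
        graph.setdefault(substrate, []).append(product)
    stack = [(start, [start])]
    seen = set()
    while stack:
        node, path = stack.pop()
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt in graph.get(node, []):
            stack.append((nxt, path + [nxt]))
    return None
```

For example, three glycolysis-like steps chain into a single pathway, while reversing start and goal yields no path because the reaction directions do not allow it.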