
    Concrete resource analysis of the quantum linear system algorithm used to compute the electromagnetic scattering cross section of a 2D target

    We provide a detailed estimate for the logical resource requirements of the quantum linear system algorithm (QLSA) [Phys. Rev. Lett. 103, 150502 (2009)], including the recently described elaborations [Phys. Rev. Lett. 110, 250504 (2013)]. Our resource estimates are based on the standard quantum-circuit model of quantum computation; they comprise circuit width, circuit depth, the number of qubits and ancilla qubits employed, and the overall number of elementary quantum gate operations, as well as more specific gate counts for each elementary fault-tolerant gate from the standard set {X, Y, Z, H, S, T, CNOT}. To perform these estimates, we used an approach that combines manual analysis with automated estimates generated via the Quipper quantum programming language and compiler. Our estimates pertain to the example problem size N=332,020,680, beyond which, according to a crude big-O complexity comparison, QLSA is expected to run faster than the best known classical linear-system solving algorithm. For this problem size, a desired calculation accuracy of 0.01 requires an approximate circuit width of 340 and a circuit depth of order 10^{25} if oracle costs are excluded, and a circuit width and depth of order 10^8 and 10^{29}, respectively, if oracle costs are included, indicating that the commonly ignored oracle resources are considerable. In addition to providing detailed logical resource estimates, it is also the purpose of this paper to demonstrate explicitly how these impressively large numbers arise with an actual circuit implementation of a quantum algorithm. While our estimates may prove to be conservative as more efficient advanced quantum-computation techniques are developed, they nevertheless provide a valid baseline for research targeting a reduction of the resource requirements, implying that a reduction by many orders of magnitude is necessary for the algorithm to become practical. Comment: 37 pages, 40 figures
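
    To make the kind of big-O crossover comparison mentioned above concrete, the following Python sketch estimates the problem size at which a solver with polylogarithmic dependence on N would overtake a classical sparse solver. The cost models, and the sparsity s, condition number kappa and accuracy eps used here, are illustrative assumptions for a minimal sketch, not the detailed models or parameters analyzed in the paper.

        # Illustrative big-O crossover between a classical sparse linear solver
        # and a quantum linear system solver. The scalings below are generic
        # textbook-style assumptions, not the paper's detailed cost models.
        import math

        def classical_cost(N, s, kappa):
            # assumed iterative sparse-solver scaling: O(N * s * kappa)
            return N * s * kappa

        def qlsa_cost(N, s, kappa, eps):
            # assumed coarse QLSA scaling: O(log2(N) * s^2 * kappa^2 / eps)
            return math.log2(N) * s**2 * kappa**2 / eps

        def crossover_size(s, kappa, eps):
            # smallest power of two N at which the assumed QLSA cost
            # drops below the assumed classical cost
            N = 2
            while qlsa_cost(N, s, kappa, eps) >= classical_cost(N, s, kappa):
                N *= 2
            return N

        # hypothetical parameter values, chosen only for illustration
        print(crossover_size(s=7, kappa=1e4, eps=0.01))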

    Hamilton decompositions of regular expanders: a proof of Kelly's conjecture for large tournaments

    A long-standing conjecture of Kelly states that every regular tournament on n vertices can be decomposed into (n-1)/2 edge-disjoint Hamilton cycles. We prove this conjecture for large n. In fact, we prove a far more general result, based on our recent concept of robust expansion and a new method for decomposing graphs. We show that every sufficiently large regular digraph G on n vertices whose degree is linear in n and which is a robust outexpander has a decomposition into edge-disjoint Hamilton cycles. This enables us to obtain numerous further results; e.g., as a special case we confirm a conjecture of Erdős on packing Hamilton cycles in random tournaments. As corollaries to the main result, we also obtain several results on packing Hamilton cycles in undirected graphs, giving e.g. the best known result on a conjecture of Nash-Williams. We also apply our result to solve a problem on the domination ratio of the Asymmetric Travelling Salesman Problem, which was raised e.g. by Glover and Punnen as well as by Alon, Gutin and Krivelevich. Comment: the new version includes a standalone version of the `robust decomposition lemma' for application in a subsequent paper
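
    The smallest nontrivial instance of Kelly's conjecture can be checked directly. The Python sketch below builds the rotational tournament on 5 vertices (arcs i -> i+1 and i -> i+2 mod 5) and verifies that its arcs split into (5-1)/2 = 2 edge-disjoint Hamilton cycles; this toy check only illustrates the statement of the conjecture, not the proof technique of the paper, which concerns large n.

        # Kelly's conjecture on the smallest nontrivial case: the rotational
        # tournament on 5 vertices decomposes into 2 edge-disjoint Hamilton
        # cycles, one per "step size".
        n = 5
        arcs = {(i, (i + d) % n) for i in range(n) for d in (1, 2)}

        def cycle_arcs(step):
            # arcs of the Hamilton cycle that repeatedly advances by `step` (mod n)
            v, out = 0, set()
            for _ in range(n):
                out.add((v, (v + step) % n))
                v = (v + step) % n
            return out

        c1, c2 = cycle_arcs(1), cycle_arcs(2)
        assert c1 | c2 == arcs and not (c1 & c2)  # edge-disjoint and covering
        print("Decomposition:", sorted(c1), sorted(c2))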

    Counting Hamilton decompositions of oriented graphs

    A Hamilton cycle in a directed graph G is a cycle that passes through every vertex of G. A Hamilton decomposition of G is a partition of its edge set into disjoint Hamilton cycles. In the late 1960s Kelly conjectured that every regular tournament has a Hamilton decomposition. This conjecture was recently settled for large tournaments by Kühn and Osthus [15], who proved more generally that every r-regular n-vertex oriented graph G (without antiparallel edges) with r = cn for some fixed c > 3/8 has a Hamilton decomposition, provided n = n(c) is sufficiently large. In this paper we address the natural question of estimating the number of such decompositions of G and show that this number is n^{(1-o(1))cn^2}. In addition, we also obtain a new and much simpler proof for the approximate version of Kelly's conjecture.
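
    To make the counted objects concrete, the Python sketch below enumerates by brute force all Hamilton decompositions of the same 5-vertex rotational tournament used above; the tiny instance is purely illustrative, whereas the n^{(1-o(1))cn^2} estimate above concerns large n.

        # Brute-force count of Hamilton decompositions of the 5-vertex
        # rotational tournament (arcs i -> i+1 and i -> i+2 mod 5).
        from itertools import combinations, permutations

        n = 5
        arcs = frozenset((i, (i + d) % n) for i in range(n) for d in (1, 2))

        def hamilton_cycles():
            # all directed Hamilton cycles of the tournament, as arc sets;
            # vertex 0 is fixed first so each cycle is generated once
            cycles = set()
            for perm in permutations(range(1, n)):
                order = (0,) + perm
                cyc = frozenset((order[i], order[(i + 1) % n]) for i in range(n))
                if cyc <= arcs:
                    cycles.add(cyc)
            return list(cycles)

        cycles = hamilton_cycles()
        # a decomposition here is an unordered pair of arc-disjoint Hamilton
        # cycles covering every arc, since (n-1)/2 = 2 for n = 5
        decompositions = [
            (c1, c2) for c1, c2 in combinations(cycles, 2)
            if not (c1 & c2) and (c1 | c2) == arcs
        ]
        print(f"{len(cycles)} Hamilton cycles, {len(decompositions)} decomposition(s)")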

    Mining and modeling graphs using patterns and priors


    Combinatorics and Probability

    For the past few decades, Combinatorics and Probability Theory have had a fruitful symbiosis, each benefitting from and influencing developments in the other. Thus, to prove the existence of designs, probabilistic methods are used; algorithms to factorize integers need combinatorics and probability theory (in addition to number theory); and the study of random matrices needs combinatorics. In the workshop a great variety of topics exemplifying this interaction were considered, including problems concerning designs, Cayley graphs, additive number theory, multiplicative number theory, noise sensitivity, random graphs, extremal graphs and random matrices.