
    Decomposing highly edge-connected graphs into homomorphic copies of a fixed tree

    The Tree Decomposition Conjecture by Barát and Thomassen states that for every tree $T$ there exists a natural number $k(T)$ such that the following holds: if $G$ is a $k(T)$-edge-connected simple graph whose size is divisible by the size of $T$, then $G$ can be edge-decomposed into subgraphs isomorphic to $T$. So far this conjecture has only been verified for paths, stars, and a family of bistars. We prove a weaker version of the Tree Decomposition Conjecture, where we require the subgraphs in the decomposition to be isomorphic to graphs that can be obtained from $T$ by vertex-identifications. We call such a subgraph a homomorphic copy of $T$. This implies the Tree Decomposition Conjecture under the additional constraint that the girth of $G$ is greater than the diameter of $T$. As an application, we verify the Tree Decomposition Conjecture for all trees of diameter at most 4. Comment: 18 pages.
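    As a reading aid, the conjecture summarised above can be written out formally. This is only a paraphrase of the abstract's statement; the notation $e(H)$ for the number of edges of a graph $H$ is chosen here for illustration and is not taken from the paper.

```latex
% Tree Decomposition Conjecture (Barát–Thomassen), paraphrased from the abstract.
% e(H) denotes the number of edges of a graph H (illustrative notation).
\[
  \forall\, T \ \text{tree} \;\; \exists\, k(T) \in \mathbb{N} :\quad
  \bigl( G \ \text{simple},\ G \ k(T)\text{-edge-connected},\ e(T) \mid e(G) \bigr)
  \;\Longrightarrow\;
  E(G) \ \text{admits a partition into edge sets of copies of } T.
\]
```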

    Graph Decompositions


    Graph Theory

    This workshop focused on recent developments in graph theory. These included, in particular, recent breakthroughs on nowhere-zero flows in graphs, width parameters, applications of graph sparsity in algorithms, and matroid structure results.

    Topologie

    The Oberwolfach conference “Topologie” is one of only a few opportunities for researchers from many different areas in algebraic and geometric topology to meet and exchange ideas. The program covered new developments in fields such as automorphisms of manifolds, applications of algebraic topology to differential geometry, quantum field theories, combinatorial methods in low-dimensional topology, abstract and applied homotopy theory, and applications of L2-cohomology. We heard about new results describing the cohomology of the automorphism spaces of some smooth manifolds, progress on spaces of positive scalar curvature metrics, a variant of the Segal conjecture without completion, advances in classifying topological quantum field theories, and a new undecidability result in combinatorial group theory, to mention some examples. As a special attraction, the conference featured a series of three talks by Dani Wise on the combinatorics of CAT(0)-cube complexes and applications to 3-manifold topology.

    LIPIcs, Volume 251, ITCS 2023, Complete Volume


    Algorithmic Meta-Theorems

    Algorithmic meta-theorems are general algorithmic results applying to a whole range of problems, rather than just to a single problem alone. They often have a "logical" and a "structural" component; that is, they are results of the form: every computational problem that can be formalised in a given logic L can be solved efficiently on every class C of structures satisfying certain conditions. This paper gives a survey of algorithmic meta-theorems obtained in recent years and the methods used to prove them. As many meta-theorems use results from graph minor theory, we give a brief introduction to the theory developed by Robertson and Seymour for their proof of the graph minor theorem and state the main algorithmic consequences of this theory as far as they are needed in the theory of algorithmic meta-theorems.
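    The best-known instance of this "logic plus structure" template is Courcelle's theorem, sketched below as a standard statement rather than as a claim about this particular survey. Here MSO2 denotes monadic second-order logic with quantification over edge sets, tw(G) denotes the treewidth of G, and f is some computable function of the formula and k alone.

```latex
% Courcelle's theorem as an instance of the meta-theorem schema
% (logic L = MSO_2, structure class C = graphs of treewidth at most k).
\[
  \text{For every } \mathrm{MSO}_2\text{-sentence } \varphi \text{ and every } k \in \mathbb{N}:\quad
  \text{given a graph } G \text{ with } \mathrm{tw}(G) \le k,\;
  \text{whether } G \models \varphi \text{ is decidable in time } f(|\varphi|, k)\cdot |G|.
\]
```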

    Contributions to the Theory of Finite-State Based Grammars

    This dissertation is a theoretical study of finite-state based grammars used in natural language processing. The study is concerned with certain varieties of finite-state intersection grammars (FSIG) whose parsers define regular relations between surface strings and annotated surface strings. The study focuses on the following three aspects of FSIGs.
    (i) Computational complexity of grammars under limiting parameters. In the study, the computational complexity in practical natural language processing is approached through performance-motivated parameters on structural complexity. Each parameter splits some grammars in the Chomsky hierarchy into an infinite set of subset approximations. When the approximations are regular, they seem to fall into the logarithmic-time hierarchy and the dot-depth hierarchy of star-free regular languages. This theoretical result is important and possibly relevant to grammar induction.
    (ii) Linguistically applicable structural representations. Related to linguistically applicable representations of syntactic entities, the study contains new bracketing schemes that cope with dependency links, left- and right-branching, crossing dependencies and spurious ambiguity. New grammar representations that resemble the Chomsky-Schützenberger representation of context-free languages are presented in the study; they include, in particular, representations for mildly context-sensitive non-projective dependency grammars whose performance-motivated approximations are parseable in linear time.
    (iii) Compilation and simplification of linguistic constraints. Efficient compilation methods for certain regular operations such as generalized restriction are presented. These include an elegant algorithm that has already been adopted as the approach in a proprietary finite-state tool. In addition to the compilation methods, an approach to on-the-fly simplification of finite-state representations for parse forests is sketched.
    These findings are tightly coupled with each other under the theme of locality. I argue that the findings help us to develop better, linguistically oriented formalisms for finite-state parsing and more efficient parsers for natural language processing. Keywords: syntactic parsing, finite-state automata, dependency grammar, first-order logic, linguistic performance, star-free regular approximations, mildly context-sensitive grammar.
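    The "intersection" in finite-state intersection grammars refers to intersecting the regular language of candidate analyses with regular constraint languages. As an illustrative sketch only, not code from the dissertation, the following Python shows the textbook product construction for intersecting two DFAs, which is the basic operation behind that idea; all class, field, and function names here are invented for the example.

```python
# Illustrative sketch (not from the dissertation): the textbook product construction
# for intersecting two DFAs, the basic operation behind "finite-state intersection".
from dataclasses import dataclass

@dataclass(frozen=True)
class DFA:
    states: frozenset
    alphabet: frozenset
    delta: dict            # maps (state, symbol) -> state; missing keys reject
    start: object
    accepting: frozenset

    def accepts(self, word) -> bool:
        q = self.start
        for sym in word:
            q = self.delta.get((q, sym))
            if q is None:
                return False
        return q in self.accepting

def intersect(a: DFA, b: DFA) -> DFA:
    """Build the product DFA accepting exactly the words accepted by both a and b."""
    alphabet = a.alphabet & b.alphabet
    start = (a.start, b.start)
    states, delta, frontier = set(), {}, [start]
    while frontier:
        p, q = frontier.pop()
        if (p, q) in states:
            continue
        states.add((p, q))
        for sym in alphabet:
            np, nq = a.delta.get((p, sym)), b.delta.get((q, sym))
            if np is not None and nq is not None:
                delta[((p, q), sym)] = (np, nq)
                frontier.append((np, nq))
    accepting = frozenset(s for s in states if s[0] in a.accepting and s[1] in b.accepting)
    return DFA(frozenset(states), alphabet, delta, start, accepting)
```

    In an FSIG-style setup, one automaton would encode the ambiguous analyses of a sentence and another a grammar constraint; repeated intersection prunes the analyses that violate constraints.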

    Sublinear Computation Paradigm

    This open access book gives an overview of cutting-edge work on a new paradigm called the “sublinear computation paradigm,” which was proposed in the large multiyear academic research project “Foundations of Innovative Algorithms for Big Data.” That project ran in Japan from October 2014 to March 2020. To handle the unprecedented explosion of big data sets in research, industry, and other areas of society, there is an urgent need to develop novel methods and approaches for big data analysis. To meet this need, innovative changes in algorithm theory for big data are being pursued. For example, polynomial-time algorithms have thus far been regarded as “fast,” but if a quadratic-time algorithm is applied to a petabyte-scale or larger big data set, problems are encountered in terms of computational resources or running time. To deal with this critical computational and algorithmic bottleneck, linear, sublinear, and constant time algorithms are required. The sublinear computation paradigm is proposed here in order to support innovation in the big data era. A foundation of innovative algorithms has been created by developing computational procedures, data structures, and modelling techniques for big data. The project is organized into three teams that focus on sublinear algorithms, sublinear data structures, and sublinear modelling. The work has provided high-level academic research results of strong computational and algorithmic interest, which are presented in this book. The book consists of five parts: Part I, which consists of a single chapter on the concept of the sublinear computation paradigm; Parts II, III, and IV, which review results on sublinear algorithms, sublinear data structures, and sublinear modelling, respectively; and Part V, which presents application results. The information presented here will inspire researchers who work in the field of modern algorithms.
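    To make the "sublinear" idea concrete, here is a small illustrative sketch, not taken from the book: a sampling estimator whose cost depends only on the accuracy parameters, not on the size of the data set, via the standard Hoeffding bound. The function and parameter names are invented for this example.

```python
# Illustrative sketch (not from the book): estimate the fraction of items satisfying
# a predicate by uniform random sampling. The sample size depends only on the
# accuracy parameters (eps, delta), not on len(data), so the number of data accesses
# is constant in the size of the data set -- i.e. sublinear.
import math
import random

def estimate_fraction(data, predicate, eps=0.05, delta=0.01, rng=random):
    """Return an estimate within +/- eps of the true fraction with probability >= 1 - delta."""
    # Hoeffding bound: m >= ln(2/delta) / (2 * eps^2) samples suffice.
    m = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    hits = sum(predicate(data[rng.randrange(len(data))]) for _ in range(m))
    return hits / m

if __name__ == "__main__":
    data = range(10_000_000)                              # ten million items, never fully scanned
    print(estimate_fraction(data, lambda x: x % 3 == 0))  # true fraction is about 0.333
```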