
    Strategy Derivation for Small Progress Measures

    Small Progress Measures is one of the most efficient parity game solving algorithms. The original algorithm provides the full solution (winning regions and strategies) in $O(dm \cdot (n/\lceil d/2 \rceil)^{\lceil d/2 \rceil})$ time, and requires a re-run of the algorithm on one of the winning regions. We provide a novel operational interpretation of progress measures, and modify the algorithm so that it derives the winning strategies for both players in one pass. This reduces the upper bound on strategy derivation for SPM to $O(dm \cdot (n/\lfloor d/2 \rfloor)^{\lfloor d/2 \rfloor})$.
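    To make the abstract's terminology concrete, here is a minimal sketch of the baseline Small Progress Measures fixed-point computation (progress measures plus the lifting operator), written under the min-parity convention (player Even wins iff the least priority seen infinitely often is even) and assuming every vertex has a successor. The function name `spm`, the input encoding, and the naive lifting order are illustrative assumptions; this is the classical algorithm, not the authors' modified one-pass strategy-derivation variant.

```python
TOP = "TOP"  # absorbing top element of the measure lattice

def spm(priority, owner, succ):
    """priority: v -> 0..d-1, owner: v -> 0 (Even) / 1 (Odd), succ: v -> successors.
    Returns the set of vertices won by player Even."""
    d = max(priority.values()) + 1
    # Each odd component i of a measure ranges over 0..bound[i],
    # where bound[i] is the number of vertices with priority i.
    bound = [0] * d
    for p in priority.values():
        if p % 2 == 1:
            bound[p] += 1

    rho = {v: (0,) * d for v in priority}  # all measures start at the bottom element

    def key(m):
        # Total order on measures: TOP is largest, otherwise lexicographic,
        # with index 0 (the smallest priority) most significant.
        return (1,) if m == TOP else (0,) + m

    def prog(v, w):
        # Least measure whose prefix up to priority(v) is >= that of rho(w),
        # strictly greater when priority(v) is odd.
        p = priority[v]
        if rho[w] == TOP:
            return TOP
        m = [rho[w][i] if i <= p else 0 for i in range(d)]
        if p % 2 == 1:
            # Strict increase: bump the least significant odd component <= p, with carry.
            for i in range(p, 0, -2):       # odd indices p, p-2, ..., 1
                if m[i] < bound[i]:
                    m[i] += 1
                    return tuple(m)
                m[i] = 0
            return TOP                      # overflow: no strictly larger measure exists
        return tuple(m)

    # Naive lifting loop: lift vertices until a fixed point is reached.
    changed = True
    while changed:
        changed = False
        for v in priority:
            vals = [prog(v, w) for w in succ[v]]
            best = min(vals, key=key) if owner[v] == 0 else max(vals, key=key)
            if key(best) > key(rho[v]):
                rho[v] = best
                changed = True

    return {v for v in priority if rho[v] != TOP}
```

    At the fixed point, vertices whose measure is below TOP form Even's winning region; the paper's contribution concerns how winning strategies for both players can be read off from such measures without re-running the algorithm.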

    A Theory of Formal Synthesis via Inductive Learning

    Formal synthesis is the process of generating a program that satisfies a high-level formal specification. In recent years, effective formal synthesis methods based on inductive learning have been proposed. We refer to this class of methods, which learn programs from examples, as formal inductive synthesis. In this paper, we present a theoretical framework for formal inductive synthesis. We discuss how formal inductive synthesis differs from traditional machine learning. We then describe oracle-guided inductive synthesis (OGIS), a framework that captures a family of synthesizers that operate by iteratively querying an oracle. An instance of OGIS that has had much practical impact is counterexample-guided inductive synthesis (CEGIS). We present a theoretical characterization of CEGIS for learning any program that computes a recursive language. In particular, we analyze the relative power of CEGIS variants in which the type of counterexample generated by the oracle varies. We also consider the impact of bounded versus unbounded memory available to the learning algorithm. In the special case where the universe of candidate programs is finite, we relate the speed of convergence to the notion of teaching dimension studied in machine learning theory. Altogether, the results of the paper take a first step towards a theoretical foundation for the emerging field of formal inductive synthesis.
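    To illustrate the CEGIS loop the abstract refers to, here is a minimal, self-contained sketch: a learner proposes a candidate consistent with the counterexamples seen so far, and a verification oracle either accepts it or returns a new counterexample. The toy candidate universe, the specification, and the names `candidates`, `spec`, `verify`, and `cegis` are hypothetical illustrations, not the paper's framework or API.

```python
from itertools import count

# Toy synthesis task: find c such that f(x) = x + c satisfies the spec on 0..99.
DOMAIN = range(100)
candidates = [lambda x, c=c: x + c for c in range(10)]   # finite candidate universe

def spec(x, y):
    # The high-level specification the synthesized program must satisfy.
    return y == x + 7

def verify(f):
    """Verification oracle: return a counterexample input, or None if f meets the spec."""
    for x in DOMAIN:
        if not spec(x, f(x)):
            return x
    return None

def cegis():
    examples = []                        # counterexample inputs accumulated so far
    for iteration in count(1):
        # Learner: pick any candidate consistent with all examples seen so far.
        consistent = [f for f in candidates if all(spec(x, f(x)) for x in examples)]
        if not consistent:
            return None, iteration       # no program in the universe satisfies the spec
        candidate = consistent[0]
        # Oracle query: ask the verifier for a counterexample to the candidate.
        cex = verify(candidate)
        if cex is None:
            return candidate, iteration  # verified on the whole domain: success
        examples.append(cex)             # refine the learner with the counterexample

f, iters = cegis()
print("synthesized f(3) =", f(3), "after", iters, "CEGIS iterations")
```

    The paper's questions map onto this loop directly: the kind of counterexample the oracle may return, and whether the learner may keep the full list of examples (unbounded memory) or only a bounded summary, determine which programs a CEGIS instance can converge to and how quickly.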