    Exact Algorithms for Solving Stochastic Games

    Shapley's discounted stochastic games, Everett's recursive games and Gillette's undiscounted stochastic games are classical models of game theory describing two-player zero-sum games of potentially infinite duration. We describe algorithms for exactly solving these games.
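    For intuition about the discounted model, the sketch below runs Shapley's classical value iteration on a made-up two-state game: each iteration solves one zero-sum matrix game per state, with the discounted continuation values folded into the payoffs. This is the standard approximation scheme, not the exact algorithms developed in the paper, and the reward/transition data are hypothetical.

```python
# Sketch: Shapley's value iteration for a discounted zero-sum stochastic game.
# Standard approximation scheme for the model, not the paper's exact algorithms;
# the two-state game at the bottom is made up.
import numpy as np
from scipy.optimize import linprog

def matrix_game_value(A):
    """Value of the zero-sum matrix game A (row player maximises) via an LP."""
    m, n = A.shape
    c = np.zeros(m + 1); c[-1] = -1.0             # maximise v  ==  minimise -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])     # for each column j: v <= sum_i x_i * A[i, j]
    b_ub = np.zeros(n)
    A_eq = np.zeros((1, m + 1)); A_eq[0, :m] = 1  # x is a probability vector
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    return res.x[-1]

def shapley_iteration(reward, trans, discount, iters=200):
    """reward[s]: payoff matrix at state s; trans[s][a][b]: next-state distribution."""
    S = len(reward)
    v = np.zeros(S)
    for _ in range(iters):
        new_v = np.zeros(S)
        for s in range(S):
            m, n = reward[s].shape
            # One-shot game whose entries fold in the discounted continuation value.
            Q = np.array([[reward[s][a, b] + discount * (trans[s][a][b] @ v)
                           for b in range(n)] for a in range(m)])
            new_v[s] = matrix_game_value(Q)
        v = new_v
    return v

# Toy two-state, two-action example (hypothetical data).
reward = [np.array([[1.0, 0.0], [0.0, 2.0]]), np.array([[0.0, 1.0], [3.0, 0.0]])]
trans = [[[np.array([0.5, 0.5]), np.array([1.0, 0.0])],
          [np.array([0.0, 1.0]), np.array([0.5, 0.5])]],
         [[np.array([1.0, 0.0]), np.array([0.5, 0.5])],
          [np.array([0.5, 0.5]), np.array([0.0, 1.0])]]]
print(shapley_iteration(reward, trans, discount=0.8))
```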

    Implications of quantum automata for contextuality

    We construct zero-error quantum finite automata (QFAs) for promise problems which cannot be solved by bounded-error probabilistic finite automata (PFAs). Here is a summary of our results:
    - There is a promise problem solvable by an exact two-way QFA in exponential expected time, but not by any bounded-error sublogarithmic-space probabilistic Turing machine (PTM).
    - There is a promise problem solvable by an exact two-way QFA in quadratic expected time, but not by any bounded-error $o(\log \log n)$-space PTM in polynomial expected time. The same problem can be solved by a one-way Las Vegas (or exact two-way) QFA with quantum head in linear (expected) time.
    - There is a promise problem solvable by a Las Vegas realtime QFA, but not by any bounded-error realtime PFA. The same problem can be solved by an exact two-way QFA in linear expected time, but not by any exact two-way PFA.
    - There is a family of promise problems such that each member can be solved by a two-state exact realtime QFA, but there is no bound on the number of states of realtime bounded-error PFAs solving the members of this family.
    Our results imply that there exist zero-error quantum computational devices with a \emph{single qubit} of memory that cannot be simulated by any finite-memory classical computational model. This provides a computational perspective on results regarding ontological theories of quantum mechanics \cite{Hardy04}, \cite{Montina08}. As a consequence, we find that classical automata-based simulation models \cite{Kleinmann11}, \cite{Blasiak13} are not sufficiently powerful to simulate quantum contextuality. We conclude by highlighting the interplay between results from automata models and their application to developing a general framework for quantum contextuality.
    Comment: 22 pages
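    To make the single-qubit claim concrete, the sketch below simulates the standard two-state rotation automaton for a promise problem of this flavour (the EVENODD family of Ambainis and Yakaryilmaz, which may differ in detail from the family used in the paper): each input symbol rotates the qubit by pi/2^(k+1), so inputs of the promised lengths are accepted or rejected with certainty.

```python
# Sketch: a two-state (single-qubit) exact realtime QFA for the promise problem
# EvenOdd_k: inputs a^n with n = i * 2^k, decide whether i is even or odd.
# Each symbol rotates the qubit by pi / 2^(k+1); standard construction, possibly
# differing in details from the family used in the paper.
import numpy as np

def qfa_accept_probability(n, k):
    theta = np.pi / 2 ** (k + 1)                      # rotation per input symbol
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    state = np.array([1.0, 0.0])                      # start in |0>
    for _ in range(n):
        state = rot @ state
    return state[0] ** 2                              # accept = measure |0>

k = 3
for i in range(1, 7):
    n = i * 2 ** k                                    # promised input lengths only
    p = qfa_accept_probability(n, k)
    print(f"n = {n:3d}  (i = {i}, {'even' if i % 2 == 0 else 'odd '})  accept prob = {p:.6f}")
# Up to floating-point rounding, the accept probability is 1 when i is even and
# 0 when i is odd, so a single qubit decides this promise problem with zero error.
```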

    Polynomial tuning of multiparametric combinatorial samplers

    Boltzmann samplers and the recursive method are prominent algorithmic frameworks for the approximate-size and exact-size random generation of large combinatorial structures, such as maps, tilings, RNA sequences or various tree-like structures. In their multiparametric variants, these samplers allow one to control the profile of expected values corresponding to multiple combinatorial parameters. One can control, for instance, the number of leaves, the profile of node degrees in trees, or the number of certain subpatterns in strings. However, such flexible control requires an additional non-trivial tuning procedure. In this paper, we propose an efficient tuning algorithm, polynomial in the number of tuned parameters, based on convex optimisation techniques. Finally, we illustrate the efficiency of our approach with several applications to rational, algebraic and P\'olya structures, including polyomino tilings with prescribed tile frequencies, planar trees with a given node degree distribution, and weighted partitions.
    Comment: Extended abstract, accepted to ANALCO2018. 20 pages, 6 figures, colours. Implementation and examples are available at [1] https://github.com/maciej-bendkowski/boltzmann-brain [2] https://github.com/maciej-bendkowski/multiparametric-combinatorial-sampler
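    As a single-parameter illustration of what "tuning" means here, the sketch below tunes the Boltzmann weight x for plane binary trees (counted by leaves) so that the expected size of a sampled tree hits a chosen target, then samples from the tuned model. The target value and the use of plain bisection are stand-ins; the paper's contribution is doing this simultaneously for many parameters via convex optimisation.

```python
# Sketch: tuning a single Boltzmann parameter.  For plane binary trees counted by
# leaves, T(x) = x + T(x)^2, so T(x) = (1 - sqrt(1 - 4x)) / 2 and the Boltzmann
# model's expected size is E(x) = x * T'(x) / T(x).  We tune x by bisection so
# that E(x) hits a target (hypothetical value below).
import math, random

def T(x):
    return (1.0 - math.sqrt(1.0 - 4.0 * x)) / 2.0

def expected_size(x):
    return x / (math.sqrt(1.0 - 4.0 * x) * T(x))      # x * T'(x) / T(x)

def tune(target, lo=1e-9, hi=0.25 - 1e-12):
    for _ in range(200):                              # bisection on the monotone map E
        mid = (lo + hi) / 2.0
        if expected_size(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def sample_size(x):
    """Draw one tree from the Boltzmann model; return its number of leaves."""
    p_leaf = x / T(x)                                 # probability a node is a leaf
    pending, leaves = 1, 0
    while pending:
        pending -= 1
        if random.random() < p_leaf:
            leaves += 1
        else:
            pending += 2                              # internal node: two subtrees
    return leaves

x = tune(target=50.0)
sizes = [sample_size(x) for _ in range(20000)]
print(f"tuned x = {x:.6f}, empirical mean size = {sum(sizes) / len(sizes):.1f}")
```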

    List Decoding Algorithm based on Voting in Groebner Bases for General One-Point AG Codes

    We generalize the unique decoding algorithm for one-point AG codes over the Miura-Kamiya $C_{ab}$ curves proposed by Lee, Bras-Amor\'os and O'Sullivan (2012) to general one-point AG codes, without any assumption. We also extend their unique decoding algorithm to list decoding, modify it so that it can be used with the Feng-Rao improved code construction, prove that its error-correcting capability equals half the minimum distance lower bound of Andersen and Geil (2008) (a fact shown in the original proposal only for one-point Hermitian codes), remove unnecessary computational steps so that it runs faster, and analyze its computational complexity in terms of multiplications and divisions in the finite field. As a unique decoding algorithm, the proposed one is empirically and theoretically as fast as the BMS algorithm for one-point Hermitian codes. As a list decoding algorithm, extensive experiments suggest that it can be much faster than the algorithm by Beelen and Brander (2010) on many moderate-size, typical inputs. It should be noted that, as a list decoding algorithm, the proposed method seems to have exponential worst-case computational complexity, while the previous proposals (Beelen and Brander, 2010; Guruswami and Sudan, 1999) have polynomial complexity, and that the proposed method is expected to be slower than the previous proposals on very large or special inputs.
    Comment: Accepted for publication in J. Symbolic Computation. LaTeX2e article.cls, 42 pages, 4 tables, no figures. Ver. 6 added an illustrative example of the algorithm execution

    On Buffon Machines and Numbers

    The well-known needle experiment of Buffon can be regarded as an analog (i.e., continuous) device that stochastically "computes" the number 2/pi ~ 0.63661, which is the experiment's probability of success. Generalizing the experiment and simplifying the computational framework, we consider probability distributions which can be produced perfectly from a discrete source of unbiased coin flips. We describe and analyse a few simple Buffon machines that generate geometric, Poisson, and logarithmic-series distributions. We provide human-accessible Buffon machines, which require a dozen coin flips or less on average, and produce experiments whose probabilities of success are expressible in terms of numbers such as exp(-1), log 2, sqrt(3), cos(1/4), zeta(5). Generally, we develop a collection of constructions based on simple probabilistic mechanisms that enable one to design Buffon experiments involving compositions of exponentials and logarithms, polylogarithms, direct and inverse trigonometric functions, algebraic and hypergeometric functions, as well as functions defined by integrals, such as the Gaussian error function.
    Comment: Largely revised version with references and figures added. 12 pages. In ACM-SIAM Symposium on Discrete Algorithms (SODA'2011)
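    A minimal sketch in this spirit, not code taken from the paper: the classical von Neumann construction turns unbiased coin flips into a Bernoulli variable with success probability exp(-1). Uniform variates are represented as lazily revealed bit streams, so each comparison consumes only finitely many flips with probability 1, and the event "the initial non-increasing run has even length" has probability exactly exp(-1).

```python
# Sketch: a Buffon machine for exp(-1), driven only by unbiased coin flips.
# If L is the length of the initial non-increasing run U1 >= U2 >= ... of i.i.d.
# uniforms, then P(L is even) = exp(-1); the uniforms are revealed bit by bit,
# so every comparison uses finitely many flips with probability 1.
import random

class LazyUniform:
    """A uniform [0,1) number revealed one random bit at a time."""
    def __init__(self):
        self.bits = []
    def bit(self, i):
        while len(self.bits) <= i:
            self.bits.append(random.getrandbits(1))   # one unbiased coin flip
        return self.bits[i]

def less_than(a, b):
    """Strict comparison of two lazy uniforms; terminates with probability 1."""
    i = 0
    while True:
        if a.bit(i) != b.bit(i):
            return a.bit(i) < b.bit(i)
        i += 1

def bernoulli_exp_minus_one():
    """Return 1 with probability exp(-1), else 0."""
    run_length = 1
    prev = LazyUniform()
    while True:
        cur = LazyUniform()
        if less_than(prev, cur):                      # the non-increasing run just ended
            return 1 if run_length % 2 == 0 else 0
        run_length += 1
        prev = cur

trials = 100_000
estimate = sum(bernoulli_exp_minus_one() for _ in range(trials)) / trials
print(f"empirical: {estimate:.4f}   exp(-1) = 0.3679")
```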

    The computational complexity of the Chow form

    We present a bounded probability algorithm for the computation of the Chow forms of the equidimensional components of an algebraic variety. Its complexity is polynomial in the length and in the geometric degree of the input equation system defining the variety. In particular, it provides an alternative algorithm for the equidimensional decomposition of a variety. As an application, we obtain an algorithm for the computation of a subclass of sparse resultants, whose complexity is polynomial in the dimension and the volume of the input set of exponents. As a further application, we derive an algorithm for the computation of the (unique) solution of a generic over-determined equation system.
    Comment: 60 pages, LaTeX2e
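    For intuition about the object being computed, the toy below writes down the Chow form of a zero-dimensional variety (a few made-up points in P^2) with sympy: the form vanishes exactly on the hyperplanes that meet the variety. The paper's algorithm handles equidimensional components of arbitrary dimension efficiently; this sketch does not attempt that.

```python
# Sketch: the Chow form in the simplest (zero-dimensional) case.  For a degree-d
# variety X = {p_1, ..., p_d} in P^2, the Chow form is the polynomial in the
# hyperplane coordinates (u0, u1, u2) vanishing exactly when the line
# u0*x0 + u1*x1 + u2*x2 = 0 meets X.  The points below are made up.
import sympy as sp

u0, u1, u2 = sp.symbols('u0 u1 u2')

# Projective points (x0 : x1 : x2) of a zero-dimensional variety.
points = [(1, 1, 2), (1, 3, 5), (1, -1, 1)]

chow_form = sp.expand(sp.Mul(*[u0 * x0 + u1 * x1 + u2 * x2 for (x0, x1, x2) in points]))
print(chow_form)

# Sanity check: a line through one of the points is a zero of the Chow form.
line_through_first_point = {u0: 0, u1: 2, u2: -1}     # 2*x1 - x2 = 0 passes through (1:1:2)
print(chow_form.subs(line_through_first_point))       # prints 0
```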

    Symbolic and analytic techniques for resource analysis of Java bytecode

    Recent work in resource analysis has translated the idea of amortised resource analysis to imperative languages using a program logic that allows mixing of assertions about heap shapes, in the tradition of separation logic, and assertions about consumable resources. Separately, polyhedral methods have been used to calculate bounds on the number of iterations in loop-based programs. We are attempting to combine these ideas to deal with Java programs involving both data structures and loops, focusing on the bytecode level rather than on source code.
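    As a toy illustration of the amortised-resource idea (a textbook potential-function argument, not the paper's program logic or anything specific to Java bytecode), the sketch below charges each push into a doubling dynamic array an amortised cost of 3 element writes, which pays for the occasional expensive resize.

```python
# Toy illustration of the amortised-potential idea behind amortised resource
# analysis: for a doubling dynamic array, the potential  phi = 2*size - capacity
# pays for the occasional expensive copy, so every push has amortised cost 3.
# Classic textbook example, not the paper's program logic for Java bytecode.

class DynArray:
    def __init__(self):
        self.size, self.capacity = 0, 1

    def potential(self):
        return 2 * self.size - self.capacity

    def push(self):
        """Return (actual, amortised) cost of one push: 1 write plus copies on resize."""
        phi_before = self.potential()
        cost = 1                                      # write the new element
        if self.size == self.capacity:
            cost += self.size                         # copy everything into a bigger array
            self.capacity *= 2
        self.size += 1
        return cost, cost + self.potential() - phi_before

arr = DynArray()
total_actual = 0
for i in range(1, 17):
    actual, amortised = arr.push()
    total_actual += actual
    print(f"push {i:2d}: actual cost = {actual:2d}, amortised cost = {amortised}")
print(f"total actual cost = {total_actual} <= 3 * 16 = {3 * 16}")
```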