
    Hyper-polynomial hierarchies and the polynomial jump

    Assuming that the polynomial hierarchy (PH) does not collapse, we show the existence of ascending sequences of ptime Turing degrees of length ω_1^CK in PSPACE such that successors are polynomial jumps of their predecessors. Moreover, these ptime degrees are all uniformly hard for PH. This is analogous to the hyperarithmetic hierarchy, which is defined similarly but with the (computable) Turing degrees. The lack of uniform least upper bounds for ascending sequences of ptime degrees causes the limit levels of our hyper-polynomial hierarchy to be inherently non-canonical. This problem is investigated in depth, and various possible structures for hyper-polynomial hierarchies are explicated, as are properties of the polynomial jump operator on languages that are in PSPACE but not in PH.
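
    For context, the polynomial jump is a resource-bounded analogue of the classical Turing jump. One common formulation, which may differ in detail from the one used in the paper, assigns to an oracle A a canonical NP^A-complete set, sketched here in LaTeX:

        % Illustrative definition only; the paper's precise jump operator may differ.
        % K^p(A) plays the role for NP^A that the classical jump A' plays for the
        % sets computably enumerable in A.
        K^{p}(A) \;=\; \bigl\{\, \langle i, x, 1^{t} \rangle \;:\;
            \text{the $i$-th nondeterministic oracle machine } N_i
            \text{ accepts } x \text{ with oracle } A \text{ within } t \text{ steps} \,\bigr\}

    Iterating such an operator through successor stages, and taking upper bounds at limit stages (which the abstract notes are inherently non-canonical), is the kind of construction the abstract describes.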

    The Power of Quantum Fourier Sampling

    A line of work initiated by Terhal and DiVincenzo, and by Bremner, Jozsa, and Shepherd, shows that quantum computers can efficiently sample from probability distributions that cannot be exactly sampled efficiently on a classical computer, unless the PH collapses. Aaronson and Arkhipov take this further by considering a distribution that can be sampled efficiently by linear optical quantum computation and that, under two feasible conjectures, cannot even be approximately sampled classically within bounded total variation distance, unless the PH collapses. In this work we use Quantum Fourier Sampling to construct a class of distributions that can be sampled by a quantum computer. We then argue that these distributions cannot be approximately sampled classically, unless the PH collapses, under variants of the Aaronson and Arkhipov conjectures. In particular, we show a general class of quantumly samplable distributions, each of which is based on an "Efficiently Specifiable" polynomial, for which a classical approximate sampler implies an average-case approximation. This class of polynomials contains the Permanent but also includes, for example, the Hamiltonian Cycle polynomial and many other familiar #P-hard polynomials. Although our construction, unlike that proposed by Aaronson and Arkhipov, likely requires a universal quantum computer, we are able to use this additional power to weaken the conjectures needed to prove approximate sampling hardness results.
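
    For reference, the Permanent named above is the #P-hard polynomial per(A) = sum over permutations sigma of prod_i A[i][sigma(i)]. A minimal brute-force Python sketch, exponential time and purely illustrative (the example matrix is made up), pins down the quantity whose average-case approximation the sampling argument relies on:

        # Brute-force permanent: per(A) = sum over permutations sigma of prod_i A[i][sigma(i)].
        # Exponential time; only meant to make the definition concrete.
        from itertools import permutations

        def permanent(A):
            n = len(A)
            total = 0
            for sigma in permutations(range(n)):
                prod = 1
                for i in range(n):
                    prod *= A[i][sigma[i]]
                total += prod
            return total

        A = [[1, 2], [3, 4]]   # made-up example; per(A) = 1*4 + 2*3 = 10
        print(permanent(A))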

    Study of fault tolerant software technology for dynamic systems

    The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation, and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated: possible system and version instabilities and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated, and a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
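
    The recovery block technique compared above can be illustrated with a short, generic Python sketch (not the study's flight-software code; the routines and acceptance test below are hypothetical): a primary routine runs first, its result is checked against an acceptance test, and on failure the state is rolled back and an alternate version is tried.

        # Generic recovery-block pattern: primary first, then alternates, each gated
        # by an acceptance test, with state rolled back between attempts.
        def recovery_block(state, routines, acceptance_test):
            checkpoint = dict(state)                  # save state before any attempt
            for routine in routines:                  # primary routine, then alternates
                try:
                    result = routine(dict(checkpoint))
                    if acceptance_test(result):       # consistency/acceptance check
                        return result
                except Exception:
                    pass                              # treat a crash like a failed test
                state.clear()
                state.update(checkpoint)              # roll back before the next alternate
            raise RuntimeError("all alternates failed the acceptance test")

        # Hypothetical usage: two versions of an estimate update, accepted if bounded.
        primary   = lambda s: {"estimate": s["measurement"] / s["gain"]}
        alternate = lambda s: {"estimate": s.get("previous_estimate", 0.0)}
        accept    = lambda r: abs(r["estimate"]) < 1e6
        print(recovery_block({"measurement": 4.0, "gain": 2.0, "previous_estimate": 1.9},
                             [primary, alternate], accept))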

    Speedup for Natural Problems and Noncomputability

    A resource-bounded version of the statement "no algorithm recognizes all non-halting Turing machines" is equivalent to an infinitely often (i.o.) superpolynomial speedup for the time required to accept any coNP-complete language, and also equivalent to a superpolynomial speedup in proof length in propositional proof systems for tautologies; each of these implies P != NP. This suggests a correspondence between the properties "has no algorithm at all" and "has no best algorithm", which seems relevant to open problems in computational and proof complexity.
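
    As a rough formalization (the paper's exact statement may differ), an i.o. superpolynomial speedup for a language L can be read as:

        \text{for every machine } M \text{ accepting } L \text{ there is a machine } M' \text{ accepting } L
        \text{ such that, for every constant } k,\quad
        t_{M}(x) \;>\; \bigl(t_{M'}(x)\bigr)^{k} \text{ for infinitely many inputs } x,

    i.e., on infinitely many inputs, no fixed polynomial in the running time of M' bounds the running time of M.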

    Minimal Coverage for Ontology Signatures


    Faster Algorithms for the Geometric Transportation Problem

    Let R and B be two sets of points in R^d, of total size n, for constant d, where the points of R have integer supplies, the points of B have integer demands, and the total supply equals the total demand. Let d(.,.) be a suitable distance function such as the L_p distance. The transportation problem asks to find a map tau : R x B --> N such that sum_{b in B} tau(r,b) = supply(r) for every r in R, sum_{r in R} tau(r,b) = demand(b) for every b in B, and the total cost sum_{r in R, b in B} tau(r,b) d(r,b) is minimized. We present three new results for the transportation problem when d(.,.) is any L_p metric:
    * For any constant epsilon > 0, an O(n^{1+epsilon}) expected-time randomized algorithm that returns a transportation map with expected cost O(log^2(1/epsilon)) times the optimal cost.
    * For any epsilon > 0, a (1+epsilon)-approximation in O(n^{3/2} epsilon^{-d} polylog(U) polylog(n)) time, where U is the maximum supply or demand of any point.
    * An exact, strongly polynomial O(n^2 polylog n)-time algorithm for d = 2.
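
    As a concrete illustration of the problem statement (not of the paper's algorithms), the tiny instance below is solved as a linear program in Python; the points, supplies, demands, and the use of SciPy's LP solver are all choices made for this sketch. With integral supplies and demands an integral optimal plan exists, though the LP solver is not forced to return one.

        # Minimal sketch: a tiny geometric transportation instance as a linear program.
        import numpy as np
        from scipy.optimize import linprog

        R = np.array([[0, 0], [3, 1]])   # red points (suppliers)
        B = np.array([[1, 1], [4, 0]])   # blue points (consumers)
        supply = np.array([2, 3])        # integer supplies, total 5
        demand = np.array([4, 1])        # integer demands, total 5

        # Unit shipping cost d(r, b): here the L_1 distance between the integer points.
        cost = np.abs(R[:, None, :] - B[None, :, :]).sum(axis=2).ravel()

        n_r, n_b = len(R), len(B)
        A_eq = np.zeros((n_r + n_b, n_r * n_b))
        for i in range(n_r):
            A_eq[i, i * n_b:(i + 1) * n_b] = 1    # sum_b tau(r_i, b) = supply(r_i)
        for j in range(n_b):
            A_eq[n_r + j, j::n_b] = 1             # sum_r tau(r, b_j) = demand(b_j)
        b_eq = np.concatenate([supply, demand])

        res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
        tau = res.x.reshape(n_r, n_b)
        print("transport plan:\n", tau)
        print("total cost:", res.fun)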

    Low-Complexity Cryptographic Hash Functions

    Cryptographic hash functions are efficiently computable functions that shrink a long input into a shorter output while achieving some of the useful security properties of a random function. The most common type of such hash function is the collision-resistant hash function (CRH), which prevents an efficient attacker from finding a pair of distinct inputs on which the function has the same output.
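
    A minimal Python sketch of the interface such a function exposes (SHA-256 stands in here purely for illustration; the paper concerns constructing low-complexity CRHs, not SHA-256):

        # A hash function H maps arbitrarily long inputs to short, fixed-length digests.
        # Collision resistance asks that no efficient attacker find distinct x != y
        # with H(x) == H(y).
        import hashlib

        def H(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()   # 32-byte digest regardless of input length

        msg = b"a very long input ..." * 1000
        print(len(msg), "->", len(H(msg)))         # shrinking: long input, short output

        x, y = b"alice", b"bob"
        assert x != y and H(x) != H(y)             # a collision would be x != y with H(x) == H(y)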