
    Some Applications of Coding Theory in Computational Complexity

    Error-correcting codes and related combinatorial constructs play an important role in several recent (and old) results in computational complexity theory. In this paper we survey results on locally testable and locally decodable error-correcting codes, and their applications to complexity theory and to cryptography. Locally decodable codes are error-correcting codes with sub-linear time error-correcting algorithms. They are related to private information retrieval (a type of cryptographic protocol), and they are used in average-case complexity and to construct "hard-core predicates" for one-way permutations. Locally testable codes are error-correcting codes with sub-linear time error-detection algorithms, and they are the combinatorial core of probabilistically checkable proofs.
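
    As a concrete illustration of local decodability (a minimal sketch of the textbook Hadamard-code decoder, not a construction taken from this survey): the Hadamard code encodes an n-bit message x as the list of all parities ⟨x, r⟩, and any single message bit can be recovered with just two queries to a mildly corrupted codeword.

        import random

        def inner(x, r):
            """<x, r> mod 2, reading the integers x and r as bit vectors."""
            return bin(x & r).count("1") % 2

        def hadamard_codeword(x, n):
            """Hadamard encoding of an n-bit message x: the bit <x, r> for every r."""
            return [inner(x, r) for r in range(2 ** n)]

        def local_decode_bit(word, n, i, trials=51):
            """2-query local decoder: for random r, word[r] XOR word[r ^ e_i]
            equals x_i whenever both queried positions are uncorrupted, so a
            majority vote recovers x_i w.h.p. under a small corruption rate."""
            e_i = 1 << i
            votes = 0
            for _ in range(trials):
                r = random.randrange(2 ** n)
                votes += word[r] ^ word[r ^ e_i]
            return int(2 * votes > trials)

        # Demo: encode, corrupt ~5% of positions, then locally decode every bit.
        n, x = 10, 0b1011001110
        word = hadamard_codeword(x, n)
        for pos in random.sample(range(2 ** n), k=2 ** n // 20):
            word[pos] ^= 1
        decoded = sum(local_decode_bit(word, n, i) << i for i in range(n))
        assert decoded == x

    The same two-query structure underlies the Goldreich-Levin construction of hard-core predicates mentioned above.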

    Quantum Algorithms for Learning and Testing Juntas

    In this article we develop quantum algorithms for learning and testing juntas, i.e. Boolean functions which depend only on an unknown set of k out of n input variables. Our aim is to develop efficient algorithms:
    - whose sample complexity has no dependence on n, the dimension of the domain the Boolean functions are defined over;
    - with no access to any classical or quantum membership ("black-box") queries; instead, our algorithms use only classical examples generated uniformly at random and fixed quantum superpositions of such classical examples;
    - which require only a few quantum examples but possibly many classical random examples (which are considered quite "cheap" relative to quantum examples).
    Our quantum algorithms are based on a subroutine FS which enables sampling according to the Fourier spectrum of f; the FS subroutine was used in earlier work of Bshouty and Jackson on quantum learning. Our results are as follows:
    - We give an algorithm for testing k-juntas to accuracy ε that uses O(k/ε) quantum examples. This improves on the number of examples used by the best known classical algorithm.
    - We establish the following lower bound: any FS-based k-junta testing algorithm requires Ω(√k) queries.
    - We give an algorithm for learning k-juntas to accuracy ε that uses O(ε⁻¹ k log k) quantum examples and O(2^k log(1/ε)) random examples. We show that this learning algorithm is close to optimal by giving a related lower bound.
    Comment: 15 pages, 1 figure. Uses synttree package. To appear in Quantum Information Processing.
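
    To make the role of FS concrete (a classical simulation sketch, not the authors' quantum procedure): measuring the state prepared by FS yields a set S of variables with probability equal to the squared Fourier coefficient f̂(S)². For a k-junta, every sampled S lies inside the k relevant variables, so the union of a handful of samples exposes them.

        import random

        def fourier_distribution(f, n):
            """{S: fhat(S)**2} for f: {0,1}^n -> {-1,+1}; sets S are bitmasks.
            By Parseval the squared coefficients sum to 1, so this is exactly
            the distribution that Fourier sampling (FS) draws from."""
            N = 2 ** n
            dist = {}
            for S in range(N):
                coeff = sum(f(x) * (-1) ** bin(S & x).count("1") for x in range(N)) / N
                if coeff:
                    dist[S] = coeff ** 2
            return dist

        def fourier_sample(dist):
            """One draw from the Fourier spectrum -- what measuring FS's state gives."""
            r, acc = random.random(), 0.0
            for S, p in dist.items():
                acc += p
                if r <= acc:
                    return S
            return S  # floating-point guard

        # Demo: a 3-junta on n = 8 variables (depends only on bits 1, 4 and 6).
        n = 8
        f = lambda x: 1 - 2 * ((((x >> 1) ^ (x >> 4)) & (x >> 6)) & 1)
        relevant, dist = 0, fourier_distribution(f, n)
        for _ in range(20):                 # a few Fourier samples suffice
            relevant |= fourier_sample(dist)
        print(sorted(i for i in range(n) if (relevant >> i) & 1))  # [1, 4, 6] w.h.p.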

    Inferring Rankings Using Constrained Sensing

    We consider the problem of recovering a function over the space of permutations (the symmetric group) on n elements from given partial information; the partial information we consider is related to the group-theoretic Fourier transform of the function. This problem naturally arises in several settings such as ranked elections, multi-object tracking, ranking systems, and recommendation systems. Inspired by the work of Donoho and Stark in the context of discrete-time functions, we focus on non-negative functions with a sparse support (support size ≪ domain size). Our recovery method is based on finding the sparsest solution (through ℓ₀ optimization) that is consistent with the available information. As the main result, we derive sufficient conditions under which a function can be recovered exactly from partial information through ℓ₀ optimization. Under a natural random model for the generation of functions, we quantify the recoverability conditions by deriving bounds on the sparsity (support size) for which the function satisfies the sufficient conditions with high probability as n → ∞. ℓ₀ optimization is computationally hard, so the compressive sensing literature typically solves its convex relaxation, ℓ₁ optimization, to find the sparsest solution. However, we show that ℓ₁ optimization fails to recover a function (even with constant sparsity) generated using the random model, with high probability as n → ∞. To overcome this problem, we propose a novel iterative algorithm for the recovery of functions that satisfy the sufficient conditions. Finally, using an information-theoretic framework, we study necessary conditions for exact recovery to be possible.
    Comment: 19 pages.
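
    For intuition about the relaxation discussed above (a generic compressive-sensing sketch with random Gaussian measurements, not the paper's permutation-specific partial-Fourier setup): when the unknown x is non-negative, ℓ₁ minimization subject to the measurements Ax = b is simply the linear program min Σᵢ xᵢ subject to Ax = b, x ≥ 0.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        d, m, k = 50, 25, 3            # ambient dimension, measurements, sparsity

        # Sparse non-negative ground truth and random linear measurements.
        x_true = np.zeros(d)
        x_true[rng.choice(d, size=k, replace=False)] = rng.uniform(1, 5, size=k)
        A = rng.standard_normal((m, d))
        b = A @ x_true

        # l1 relaxation of the (NP-hard) l0 problem: since x >= 0 forces
        # ||x||_1 = sum(x), the sparsest-solution heuristic is a plain LP.
        res = linprog(c=np.ones(d), A_eq=A, b_eq=b, bounds=(0, None))
        print(np.allclose(res.x, x_true, atol=1e-4))  # True for generic Gaussian A

    The paper's negative result is that, for its structured measurements over the symmetric group, this relaxation fails with high probability even at constant sparsity, which is what motivates the iterative algorithm proposed instead.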