
    Queries revisited

    We begin with a brief tutorial on the problem of learning a finite concept class over a finite domain using membership queries and/or equivalence queries. We then sketch general results on the number of queries needed to learn a class of concepts, focusing on the various notions of combinatorial dimension that have been employed, including the teaching dimension, the exclusion dimension, the extended teaching dimension, the fingerprint dimension, the sample exclusion dimension, the Vapnik–Chervonenkis dimension, the abstract identification dimension, and the general dimension.
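
    As a concrete point of reference for the query bounds surveyed in this article, the sketch below (not taken from the article; all names are illustrative) shows the classical halving strategy for exactly learning a finite concept class with equivalence queries: the hypothesis is the pointwise majority vote of the surviving concepts, so every counterexample eliminates at least half of them, and at most ceil(log2 |C|) equivalence queries are needed. The combinatorial dimensions listed above give finer bounds of this kind for particular classes and query types.

from typing import Callable, Dict, List, Optional

# A concept over a finite domain is represented extensionally: point -> label.
Concept = Dict[str, bool]

def halving_learner(
    concepts: List[Concept],
    domain: List[str],
    equivalence_query: Callable[[Concept], Optional[str]],
) -> Concept:
    """Exactly identify the target using equivalence queries only.

    Assumes the target function is (extensionally) one of `concepts`.
    equivalence_query(h) returns None if h agrees with the target on every
    domain point, and otherwise returns some point where they differ.
    """
    version_space = list(concepts)
    while True:
        # Hypothesis: pointwise majority vote over the surviving concepts.
        hypothesis = {
            x: 2 * sum(c[x] for c in version_space) >= len(version_space)
            for x in domain
        }
        counterexample = equivalence_query(hypothesis)
        if counterexample is None:
            return hypothesis
        # The target disagrees with the majority at this point, so at least
        # half of the surviving concepts are eliminated.
        target_label = not hypothesis[counterexample]
        version_space = [c for c in version_space if c[counterexample] == target_label]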

    Improved Bounds on Quantum Learning Algorithms

    In this article we give several new results on the complexity of algorithms that learn Boolean functions from quantum queries and quantum examples. Hunziker et al. conjectured that for any class C of Boolean functions, the number of quantum black-box queries which are required to exactly identify an unknown function from C is $O(\frac{\log |C|}{\sqrt{\hat{\gamma}^{C}}})$, where $\hat{\gamma}^{C}$ is a combinatorial parameter of the class C. We essentially resolve this conjecture in the affirmative by giving a quantum algorithm that, for any class C, identifies any unknown function from C using $O(\frac{\log |C| \log \log |C|}{\sqrt{\hat{\gamma}^{C}}})$ quantum black-box queries. We consider a range of natural problems intermediate between the exact learning problem (in which the learner must obtain all bits of information about the black-box function) and the usual problem of computing a predicate (in which the learner must obtain only one bit of information about the black-box function). We give positive and negative results on when the quantum and classical query complexities of these intermediate problems are polynomially related to each other. Finally, we improve the known lower bound on the number of quantum examples (as opposed to quantum black-box queries) required for $(\epsilon,\delta)$-PAC learning any concept class of Vapnik–Chervonenkis dimension d over the domain $\{0,1\}^n$ from $\Omega(\frac{d}{n})$ to $\Omega(\frac{1}{\epsilon}\log \frac{1}{\delta}+d+\frac{\sqrt{d}}{\epsilon})$. This new lower bound comes closer to matching known upper bounds for classical PAC learning. Comment: Minor corrections. 18 pages. To appear in Quantum Information Processing. Requires algorithm.sty, algorithmic.sty to build.
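
    For readability, the bounds stated above in display form (symbols exactly as in the abstract; this is only a restatement, not an additional result):

\[
  \text{conjectured: } O\!\left(\frac{\log|C|}{\sqrt{\hat{\gamma}^{C}}}\right),
  \qquad
  \text{proved: } O\!\left(\frac{\log|C|\,\log\log|C|}{\sqrt{\hat{\gamma}^{C}}}\right),
\]
\[
  \text{quantum PAC lower bound: } \Omega\!\left(\frac{d}{n}\right)
  \;\text{improved to}\;
  \Omega\!\left(\frac{1}{\epsilon}\log\frac{1}{\delta} + d + \frac{\sqrt{d}}{\epsilon}\right).
\]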

    Tight Bounds on Proper Equivalence Query Learning of DNF

    We prove a new structural lemma for partial Boolean functions $f$, which we call the seed lemma for DNF. Using the lemma, we give the first subexponential algorithm for proper learning of DNF in Angluin's Equivalence Query (EQ) model. The algorithm has time and query complexity $2^{\tilde{O}(\sqrt{n})}$, which is optimal. We also give a new result on certificates for DNF-size, a simple algorithm for properly PAC-learning DNF, and new results on EQ-learning $\log n$-term DNF and decision trees.
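
    As a notational reminder (standard soft-O convention, not specific to this paper), the tilde in the bound above hides polylogarithmic factors:

\[
  2^{\tilde{O}(\sqrt{n})} = 2^{O\left(\sqrt{n}\,\log^{c} n\right)} \quad \text{for some constant } c \ge 0,
\]

so the query and time complexity are subexponential in n, though still superpolynomial.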

    Conjunctions of Unate DNF Formulas: Learning and Structure

    A central topic in query learning is to determine which classes of Boolean formulas are efficiently learnable with membership and equivalence queries. We consider the class R_k consisting of conjunctions of k unate DNF formulas. This class generalizes the class of k-clause CNF formulas and the class of unate DNF formulas, both of which are known to be learnable in polynomial time with membership and equivalence queries. We prove that R_2 can be properly learned with a polynomial number of polynomial-size membership and equivalence queries, but can be properly learned in polynomial time with such queries if and only if P = NP. Thus the barrier to properly learning R_2 with membership and equivalence queries is computational rather than informational. Few results of this type are known. In our proofs, we use recent results of Hellerstein et al. (1997, J. Assoc. Comput. Mach. 43(5), 840–862), characterizing the classes that are polynomial-query learnable, together with work of Bshouty on the monotone dimension of Boolean functions. We extend some of our results to R_k and pose open questions on learning DNF formulas of small monotone dimension. We also prove structural results for R_k. We construct, for any fixed k ⩾ 2, a class of functions f that cannot be represented by any formula in R_k, but which cannot be “easily” shown to have this property. More precisely, for any function f on n variables in the class, the value of f on any polynomial-size set of points in its domain is not a witness that f cannot be represented by a formula in R_k. Our construction is based on BCH codes.
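
    To make the class R_k concrete, the following is a minimal sketch (illustrative only; the data representation and function names are assumptions, not taken from the paper) of evaluating a conjunction of k unate DNF formulas, where a DNF is unate if every variable occurs with only one polarity across all of its terms. Note that a k-clause CNF fits this form: each clause, written as a DNF of single-literal terms, is unate provided it does not contain a variable together with its negation.

from typing import Dict, List, Sequence, Tuple

# A literal is (variable_index, polarity): True means x_i, False means NOT x_i.
Literal = Tuple[int, bool]
# A DNF formula is a list of terms; each term (a list of literals) is their AND.
DNF = List[List[Literal]]

def is_unate(formula: DNF) -> bool:
    """Every variable must occur with a single polarity across the whole formula."""
    seen: Dict[int, bool] = {}
    for term in formula:
        for var, pol in term:
            if seen.setdefault(var, pol) != pol:
                return False
    return True

def eval_dnf(formula: DNF, x: Sequence[bool]) -> bool:
    """A DNF is true iff some term has all of its literals satisfied."""
    return any(all(x[var] == pol for var, pol in term) for term in formula)

def eval_rk(formulas: List[DNF], x: Sequence[bool]) -> bool:
    """A function in R_k: the conjunction of k unate DNF formulas."""
    assert all(is_unate(f) for f in formulas), "each conjunct must be unate"
    return all(eval_dnf(f, x) for f in formulas)

if __name__ == "__main__":
    f1: DNF = [[(0, True)], [(1, False), (2, True)]]   # x0 OR (NOT x1 AND x2)
    f2: DNF = [[(1, False)], [(3, True)]]              # NOT x1 OR x3
    print(eval_rk([f1, f2], [True, False, False, True]))  # -> True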