
    Tight Bounds on Proper Equivalence Query Learning of DNF

    We prove a new structural lemma for partial Boolean functions $f$, which we call the seed lemma for DNF. Using the lemma, we give the first subexponential algorithm for proper learning of DNF in Angluin's Equivalence Query (EQ) model. The algorithm has time and query complexity $2^{\tilde{O}(\sqrt{n})}$, which is optimal. We also give a new result on certificates for DNF-size, a simple algorithm for properly PAC-learning DNF, and new results on EQ-learning $\log n$-term DNF and decision trees.
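    To make the query model concrete, the following is a minimal sketch of the Equivalence Query interaction over $\{0,1\}^n$; the oracle, the `propose_next` callback, the brute-force counterexample search, and the toy `memorizing_dnf` learner are illustrative assumptions, not the paper's subexponential algorithm.

```python
from itertools import product

def equivalence_query(hypothesis, target, n):
    """Return None if hypothesis agrees with target on all of {0,1}^n, else a counterexample."""
    for x in product([0, 1], repeat=n):
        if hypothesis(x) != target(x):
            return x                              # counterexample handed back to the learner
    return None

def eq_learn(target, n, propose_next):
    """Generic EQ loop: keep proposing hypotheses until no counterexample remains."""
    counterexamples = []
    while True:
        h = propose_next(counterexamples)         # learner builds its next hypothesis
        cx = equivalence_query(h, target, n)
        if cx is None:
            return h                              # hypothesis is exactly the target
        counterexamples.append((cx, target(cx)))

# Toy usage: propose the DNF whose terms are exactly the positive counterexamples seen so far.
def memorizing_dnf(counterexamples):
    positives = {x for x, label in counterexamples if label == 1}
    return lambda x: int(x in positives)

target = lambda x: int(x[0] == 1 and x[2] == 1)   # the DNF "x1 AND x3" over {0,1}^3
learned = eq_learn(target, 3, memorizing_dnf)
print([x for x in product([0, 1], repeat=3) if learned(x) == 1])   # the two satisfying points
```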

    A Survey of Quantum Learning Theory

    This paper surveys quantum learning theory: the theoretical aspects of machine learning using quantum computers. We describe the main results known for three models of learning: exact learning from membership queries, and Probably Approximately Correct (PAC) and agnostic learning from classical or quantum examples. (Comment: 26 pages LaTeX. v2: many small changes to improve the presentation; this version will appear as the Complexity Theory Column in SIGACT News in June 2017. v3: fixed a small ambiguity in the definition of gamma(C) and updated a reference.)

    Queries revisited

    We begin with a brief tutorial on the problem of learning a finite concept class over a finite domain using membership queries and/or equivalence queries. We then sketch general results on the number of queries needed to learn a class of concepts, focusing on the various notions of combinatorial dimension that have been employed, including the teaching dimension, the exclusion dimension, the extended teaching dimension, the fingerprint dimension, the sample exclusion dimension, the Vapnik–Chervonenkis dimension, the abstract identification dimension, and the general dimension.
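    As a companion to the tutorial setting, here is a minimal sketch of exact learning of a finite concept class with membership queries alone, where concepts and the domain are given explicitly as dictionaries; this is the elementary version-space strategy, not any dimension-optimal algorithm, and the names are illustrative.

```python
def membership_query(target, x):
    return target[x]                      # oracle returns the target's label on x

def exact_learn(concepts, domain, target):
    """Shrink the version space with membership queries until one concept remains."""
    version_space = list(concepts)
    while len(version_space) > 1:
        # pick a point on which the surviving concepts still disagree
        x = next(x for x in domain
                 if len({c[x] for c in version_space}) > 1)
        label = membership_query(target, x)
        version_space = [c for c in version_space if c[x] == label]
    return version_space[0]

# Example: three concepts over a two-point domain
domain = ["a", "b"]
c1, c2, c3 = {"a": 0, "b": 0}, {"a": 0, "b": 1}, {"a": 1, "b": 1}
print(exact_learn([c1, c2, c3], domain, target=c2))   # {'a': 0, 'b': 1}
```

    The combinatorial dimensions listed above are, roughly, the quantities that bound how many such queries a learner needs in the worst case.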

    Learning Boolean Halfspaces with Small Weights from Membership Queries

    We consider the problem of properly learning a Boolean halfspace with integer weights in $\{0,1,\ldots,t\}$ from membership queries only. The best known algorithm for this problem is an adaptive algorithm that asks $n^{O(t^5)}$ membership queries, whereas the best lower bound on the number of membership queries is $n^t$ [Learning Threshold Functions with Small Weights Using Membership Queries, COLT 1999]. In this paper we close this gap and give an adaptive proper learning algorithm with two rounds that asks $n^{O(t)}$ membership queries. We also give a non-adaptive proper learning algorithm that asks $n^{O(t^3)}$ membership queries.
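    For intuition about the setting, here is a minimal sketch assuming the halfspace form $f(x)=1$ iff $\sum_i w_i x_i \ge \theta$ with weights in $\{0,\ldots,t\}$; the brute-force learner below only makes the membership-query model concrete and is nowhere near the paper's $n^{O(t)}$ algorithm.

```python
from itertools import product

def halfspace(weights, theta):
    return lambda x: int(sum(w * xi for w, xi in zip(weights, x)) >= theta)

def learn_by_membership_queries(oracle, n, t, theta):
    """Return some weight vector in {0,...,t}^n consistent with the oracle on all of {0,1}^n."""
    points = list(product([0, 1], repeat=n))
    labels = {x: oracle(x) for x in points}          # one membership query per point
    for w in product(range(t + 1), repeat=n):        # enumerate candidate small-weight vectors
        h = halfspace(w, theta)
        if all(h(x) == labels[x] for x in points):
            return w

# Example: target weights (2, 0, 1) with threshold 2 over {0,1}^3
target = halfspace((2, 0, 1), theta=2)
print(learn_by_membership_queries(target, n=3, t=2, theta=2))
```

    The sketch returns some consistent small-weight halfspace (it may differ from the target's weights while agreeing on every point of the cube), which is exactly the proper-learning guarantee the paper's algorithms achieve with far fewer queries.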