
    Learning using Local Membership Queries under Smooth Distributions

    We introduce a new model of membership query (MQ) learning, where the learning algorithm is restricted to query points that are close to random examples drawn from the underlying distribution. The learning model is intermediate between the PAC model (Valiant, 1984) and the PAC+MQ model (where the queries are allowed to be arbitrary points). Membership query algorithms are not popular among machine learning practitioners. Apart from the obvious difficulty of adaptively querying labellers, it has also been observed that querying unnatural points leads to increased noise from human labellers (Lang and Baum, 1992). This motivates our study of learning algorithms that make queries that are close to examples generated from the data distribution. We restrict our attention to functions defined on the n-dimensional Boolean hypercube and say that a membership query is local if its Hamming distance from some example in the (random) training data is at most O(log(n)). We show three positive learning results in this model: (i) The class of O(log(n))-depth decision trees is learnable under a large class of smooth distributions using O(log(n))-local queries. (ii) The class of polynomial-sized decision trees is learnable under product distributions using O(log(n))-local queries. (iii) The class of sparse polynomials (with coefficients in R) over {0,1}^n is learnable under smooth distributions using O(log(n))-local queries.
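    The locality constraint above can be illustrated with a short sketch: given a random example from the training data, a local query may differ from it in at most O(log(n)) of the n Boolean coordinates. The helper `local_query` below is a hypothetical illustration of that constraint, not an algorithm from the paper.

    ```python
    import math
    import random

    def hamming(x, y):
        """Hamming distance between two Boolean vectors."""
        return sum(a != b for a, b in zip(x, y))

    def local_query(example, n, c=1):
        """Produce a query point within Hamming distance c*log2(n) of `example`.

        Flips at most c*log2(n) randomly chosen coordinates. This is a
        hypothetical helper showing what "O(log(n))-local" means; the paper's
        learning algorithms choose which coordinates to flip deliberately.
        """
        budget = max(1, int(c * math.log2(n)))
        flips = random.sample(range(n), random.randint(0, budget))
        q = list(example)
        for i in flips:
            q[i] = 1 - q[i]  # flip the chosen Boolean coordinate
        return q

    n = 64
    random.seed(0)
    # A random example drawn from the uniform distribution over {0,1}^n.
    x = [random.randint(0, 1) for _ in range(n)]
    q = local_query(x, n)
    # The query stays within the local budget: here log2(64) = 6 flips.
    assert hamming(x, q) <= math.log2(n)
    ```

    In contrast, the unrestricted PAC+MQ model would allow querying any point in {0,1}^n, including points of Hamming distance n from every training example.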