
On the Proper Learning of Axis-Parallel Concepts

By Nader H. Bshouty, Lynn Burroughs and Dana Ron

Abstract

We study the proper learnability of axis-parallel concept classes in the PAC-learning and exact-learning models. These classes include unions of boxes, DNF, decision trees and multivariate polynomials. For constant-dimensional axis-parallel concept classes C we show that the following problems have time complexities within a polynomial factor of each other.

1. C is α-properly exactly learnable (with hypotheses of size at most α times the target size) from membership and equivalence queries.
2. C is α-properly PAC learnable (without membership queries) under any product distribution.
3. There is an α-approximation algorithm for the MinEqui_C problem (given g ∈ C, find a minimum-size f ∈ C that is logically equivalent to g).

In particular, if one has polynomial time complexity, they all do. Using this we give the first proper-learning algorithm for constant-dimensional decision trees and the first negative results on proper learning from membership and equivalence queries for many classes. For axis-parallel concepts over a non-constant dimension we show that, with the equivalence oracle, (1) ⇒ (3). We use this to show that (binary) decision trees are not properly learnable in polynomial time (assuming P ≠ NP) and that DNF is not s^ε-properly learnable (ε < 1) in polynomial time even with an NP-oracle (assuming Σ^p_2 ≠ P^NP).
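
As a minimal illustration of the setting (not taken from the paper), the sketch below shows the standard textbook proper PAC learner for a single axis-parallel box: the tightest-fitting box around the positive examples is itself an axis-parallel box, so the hypothesis stays in the target class. Function and variable names here are illustrative only.

import random

def tightest_fit_box(samples):
    """Proper PAC learner for a single axis-parallel box.

    samples: list of (point, label) pairs, where point is a tuple of d
    floats and label is True for positive examples.  Returns the smallest
    box (one (lo, hi) interval per axis) containing all positive points;
    the hypothesis is itself an axis-parallel box, hence "proper".
    """
    positives = [p for p, label in samples if label]
    if not positives:
        return None  # empty hypothesis: classify everything as negative
    d = len(positives[0])
    return [(min(p[i] for p in positives), max(p[i] for p in positives))
            for i in range(d)]

def classify(box, point):
    """Evaluate the box hypothesis on a point."""
    if box is None:
        return False
    return all(lo <= x <= hi for (lo, hi), x in zip(box, point))

# Toy usage: learn a 2-dimensional box from labelled random samples.
target = [(0.2, 0.7), (0.3, 0.9)]          # hidden target box (illustrative)
data = []
for _ in range(200):
    pt = (random.random(), random.random())
    data.append((pt, classify(target, pt)))
print(tightest_fit_box(data))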

Topics: PAC learning, exact learning, axis-parallel objects, minimizing formula size, Boolean
Year: 2003
OAI identifier: oai:CiteSeerX.psu:10.1.1.134.6903
Provided by: CiteSeerX