4 research outputs found

    Efficient, noise-tolerant, and private learning via boosting

    Full text link
    We introduce a simple framework for designing private boosting algorithms. We give natural conditions under which these algorithms are differentially private, efficient, and noise-tolerant PAC learners. To demonstrate our framework, we use it to construct noise-tolerant and private PAC learners for large-margin halfspaces whose sample complexity does not depend on the dimension. We give two sample complexity bounds for our large-margin halfspace learner. One bound is based only on differential privacy, and uses this guarantee as an asset for ensuring generalization. This first bound illustrates a general methodology for obtaining PAC learners from privacy, which may be of independent interest. The second bound uses standard techniques from the theory of large-margin classification (the fat-shattering dimension) to match the best known sample complexity for differentially private learning of large-margin halfspaces, while additionally tolerating random label noise.
    https://arxiv.org/pdf/2002.01100.pdf
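    As background for the abstract above, the sketch below spells out the target concept class: a gamma-margin halfspace is a linear classifier that labels every example correctly with normalized margin at least gamma, and it is this margin parameter, rather than the dimension, that drives the sample complexity bounds mentioned in the abstract. This is an illustrative definition only, not the paper's learner; the function names and conventions are ours.

        import numpy as np

        def normalized_margin(w, X, y):
            # Smallest normalized margin  y_i * <w, x_i> / (||w|| * ||x_i||)  over the sample.
            # Assumes labels y in {-1, +1} and nonzero rows of X.
            w = w / np.linalg.norm(w)
            norms = np.linalg.norm(X, axis=1)
            return float(np.min(y * (X @ w) / norms))

        def is_gamma_margin_halfspace(w, X, y, gamma):
            # A gamma-margin halfspace separates every labeled example with margin >= gamma;
            # learners for this class can have sample complexity depending on gamma, not on dimension.
            return normalized_margin(w, X, y) >= gamma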

    PAC Analogues of Perceptron and Winnow via Boosting the Margin

    No full text
    We describe a novel family of PAC model algorithms for learning linear threshold functions. The new algorithms work by boosting a simple weak learner and exhibit complexity bounds remarkably similar to those of known online algorithms such as Perceptron and Winnow, thus suggesting that these well-studied online algorithms in some sense correspond to instances of boosting. We show that the new algorithms can be viewed as natural PAC analogues of the online p-norm algorithms which have recently been studied by Grove, Littlestone, and Schuurmans [16] and Gentile and Littlestone [15]. As special cases of the algorithm, by taking p = 2 and p = ∞ we obtain natural boosting-based PAC analogues of Perceptron and Winnow respectively. The p = ∞ case of our algorithm can also be viewed as a generalization (with an improved sample complexity bound) of Jackson and Craven's PAC-model boosting-based algorithm for learning "sparse perceptrons" [20]. The analysis of the generalization error of the new algorithms relies on techniques from the theory of large margin classification.
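    For readers unfamiliar with the online algorithms referenced above, the sketch below shows the standard textbook mistake-driven updates for Perceptron (the p = 2 case) and classic Winnow (the p → ∞ end of the p-norm family). These are the well-known online algorithms whose PAC analogues the paper constructs by boosting, not the paper's boosting-based algorithms themselves; the step size, promotion factor, and label conventions are illustrative.

        import numpy as np

        def perceptron_update(w, x, y, eta=1.0):
            # Online Perceptron (p = 2): additive update when the example is misclassified.
            # Assumes labels y in {-1, +1}.
            if y * np.dot(w, x) <= 0:
                w = w + eta * y * x
            return w

        def winnow_update(w, x, y, theta, alpha=2.0):
            # Classic Winnow (p -> infinity): multiplicative update on a mistake.
            # Assumes x in {0, 1}^n, labels y in {0, 1}, and positive weights w.
            y_hat = 1 if np.dot(w, x) >= theta else 0
            if y_hat == 0 and y == 1:      # false negative: promote the active features
                w = np.where(x == 1, alpha * w, w)
            elif y_hat == 1 and y == 0:    # false positive: demote the active features
                w = np.where(x == 1, w / alpha, w)
            return w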