
    Information Inequalities for the Bayes Risk

    This paper presents lower bounds on the Bayes risk under scaled quadratic loss, derived from the information inequality. Numerical results are also presented that indicate how tight these bounds are. An appendix contains a proof of the information inequality that imposes no conditions on the estimator; this result is a direct extension of an earlier result of Fabian and Hannan.
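
    For orientation, a canonical instance of such an information-inequality bound is the van Trees inequality (recalled here as a standard example, not necessarily the exact form derived in the paper): for a scalar parameter $\theta$ with a smooth prior density $\pi$ and any estimator $\hat\theta$,

    $$ E\big[(\hat\theta - \theta)^2\big] \;\ge\; \frac{1}{E_\pi[I(\theta)] + I(\pi)}, \qquad I(\pi) = \int \frac{(\pi'(\theta))^2}{\pi(\theta)}\, d\theta, $$

    where $I(\theta)$ is the Fisher information of the model and $I(\pi)$ that of the prior. Like the result proved in the appendix, this bound requires no regularity conditions on the estimator itself.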

    Fast learning rates for plug-in classifiers under the margin condition

    It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, i.e., rates faster than $n^{-1/2}$. Work on this subject suggested two conjectures: (i) the best achievable fast rate is of order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that neither conjecture is correct. In particular, we construct plug-in classifiers that achieve not only fast but also super-fast rates, i.e., rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
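
    For context, the margin (low noise) assumption referred to above is usually stated in terms of the regression function $\eta(x) = P(Y=1 \mid X=x)$: there exist $C > 0$ and $\alpha \ge 0$ such that

    $$ P\big(0 < |\eta(X) - 1/2| \le t\big) \;\le\; C\, t^{\alpha} \qquad \text{for all } t > 0. $$

    The mechanism behind the super-fast rates can be sketched as follows (a sketch under an added $\beta$-H\"older smoothness assumption on $\eta$, included here for illustration): the excess risk of the plug-in rule $\hat f(x) = \mathbf{1}\{\hat\eta(x) \ge 1/2\}$ is controlled by $|\hat\eta - \eta|$ only on the region where $\eta$ is close to $1/2$, so an estimation rate $a_n$ for $\eta$ converts into an excess-risk rate of order $a_n^{1+\alpha}$; with $a_n \asymp n^{-\beta/(2\beta+d)}$ this gives $n^{-\beta(1+\alpha)/(2\beta+d)}$, which is faster than $n^{-1}$ once $\alpha > 1 + d/\beta$.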