Classification with Minimax Fast Rates for Classes of Bayes Rules with Sparse Representation

Abstract

We construct a classifier which attains the rate of convergence $\log n/n$ under sparsity and margin assumptions. An approach close to the one met in approximation theory for function estimation is used to obtain this result. The idea is to develop the Bayes rule in a fundamental system of $L^2([0,1]^d)$ made of indicators of dyadic sets, and to assume that the coefficients, equal to $-1$, $0$ or $1$, belong to a kind of $L^1$-ball. This assumption can be seen as a sparsity assumption, in the sense that the proportion of coefficients not equal to zero decreases as the "frequency" grows. Finally, rates of convergence are obtained by using the usual trade-off between a bias term and a variance term.
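As a hedged illustration of the expansion described above (the notation $f^*$, $a_{j,k}$ and $I_{j,k}$ is illustrative and not taken from the paper), the Bayes rule could be written as

\[
  f^*(x) \;=\; \sum_{j \ge 0} \sum_{k} a_{j,k}\, \mathbf{1}_{I_{j,k}}(x),
  \qquad a_{j,k} \in \{-1, 0, 1\},
\]

where the $I_{j,k}$ are dyadic subsets of $[0,1]^d$ at resolution level $j$. The sparsity assumption then requires that the proportion of indices $k$ with $a_{j,k} \neq 0$ decreases as the level $j$ grows, playing the role of a kind of $L^1$-ball constraint on the coefficient sequence.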