
    PAC-Bayesian High Dimensional Bipartite Ranking

    This paper is devoted to the bipartite ranking problem, a classical statistical learning task, in a high dimensional setting. We propose a scoring and ranking strategy based on the PAC-Bayesian approach. We consider nonlinear additive scoring functions, and we derive non-asymptotic risk bounds under a sparsity assumption. In particular, oracle inequalities in probability holding under a margin condition assess the performance of our procedure and prove its minimax optimality. An MCMC-flavored algorithm is proposed to implement our method, and its behavior is illustrated on synthetic and real-life datasets.
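    For intuition, here is a minimal sketch of the general idea of sampling an additive scoring function from a PAC-Bayesian (Gibbs) posterior with a random-walk Metropolis-Hastings step. The additive basis, the Laplace (sparsity-inducing) prior, the temperature lambda_, and the synthetic data are illustrative assumptions, not the authors' exact construction or algorithm.

    ```python
    # Illustrative sketch: Gibbs-posterior sampling for an additive scoring function
    # in bipartite ranking (assumed details, not the paper's algorithm).
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: labels Y in {0, 1}; only the first two coordinates matter.
    n, d = 200, 10
    X = rng.normal(size=(n, d))
    eta = 1 / (1 + np.exp(-(np.sin(X[:, 0]) + X[:, 1] ** 2 - 1)))
    y = rng.binomial(1, eta)

    def basis(X):
        """Simple nonlinear additive basis: (x_j, x_j^2, sin x_j) per coordinate."""
        return np.hstack([X, X ** 2, np.sin(X)])

    Phi = basis(X)

    def ranking_risk(theta):
        """Empirical pairwise ranking risk: fraction of (pos, neg) pairs mis-ordered."""
        s = Phi @ theta
        pos, neg = s[y == 1], s[y == 0]
        return np.mean(pos[:, None] <= neg[None, :])

    def log_target(theta, lam=50.0, tau=1.0):
        """Log Gibbs posterior: -lam * empirical risk + Laplace (sparsity) log-prior."""
        return -lam * ranking_risk(theta) - np.abs(theta).sum() / tau

    # Random-walk Metropolis-Hastings over the coefficients of the additive score.
    theta = np.zeros(Phi.shape[1])
    lp = log_target(theta)
    samples = []
    for t in range(5000):
        prop = theta + 0.05 * rng.normal(size=theta.shape)
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        if t > 1000 and t % 10 == 0:
            samples.append(theta.copy())

    theta_hat = np.mean(samples, axis=0)  # posterior-mean scoring coefficients
    print("in-sample ranking risk:", ranking_risk(theta_hat))
    ```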

    Minimax learning rates for bipartite ranking and plug-in rules

    While it is now well known in the standard binary classification setup that, under suitable margin assumptions and complexity conditions on the regression function, fast or even super-fast rates (i.e. rates faster than n^{-1/2}, or even faster than n^{-1}) can be achieved by plug-in classifiers, no result of this nature has yet been proved in the context of bipartite ranking, a problem closely related to classification. The main purpose of the present paper is to investigate this issue by considering bipartite ranking as a nested continuous collection of cost-sensitive classification problems. A global low-noise condition is exhibited under which certain (plug-in) ranking rules are proved to achieve fast (but not super-fast) rates over a wide nonparametric class of models. A lower bound result is also stated in a specific situation, establishing that such rates are optimal from a minimax perspective.
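    As a rough illustration of what a plug-in ranking rule does, the sketch below estimates the regression function eta(x) = P(Y = 1 | X = x) nonparametrically and ranks test points by the estimate. The k-NN estimator, the choice of k, and the synthetic data are illustrative assumptions and not the specific rules or conditions analyzed in the paper.

    ```python
    # Illustrative sketch of a plug-in ranking rule: score by an estimate of
    # eta(x) = P(Y = 1 | X = x) and rank accordingly (assumed details).
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    def sample(n):
        """Synthetic data with a smooth regression function eta."""
        X = rng.uniform(-2, 2, size=(n, 2))
        eta = 1 / (1 + np.exp(-(X[:, 0] + np.sin(2 * X[:, 1]))))
        return X, rng.binomial(1, eta)

    X_train, y_train = sample(2000)
    X_test, y_test = sample(1000)

    # Plug-in step: estimate eta with k-NN, then use the estimate as the score.
    eta_hat = KNeighborsClassifier(n_neighbors=50).fit(X_train, y_train)
    scores = eta_hat.predict_proba(X_test)[:, 1]

    # AUC measures how often positives are ranked above negatives by the plug-in score.
    print("test AUC of the plug-in ranking rule:", roc_auc_score(y_test, scores))
    ```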