This is an electronic version of a paper presented at Learning 2004, held in Spain in 2004.

Parallel perceptrons are a novel approach to the study of committee machines
that allows, among other things, for a fast training with minimal communications
between outputs and hidden units. Moreover, their training naturally defines
margins for hidden unit activations. In this work we shall show how to
use those margins to perform subsample selections over a given training set that
reduce training complexity while enhancing classification accuracy and allowing
for balanced classifier performance when class sizes are greatly different.

With partial support of Spain's CICyT, TIC 01-57.
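As a rough illustration of the idea sketched in the abstract, the following is a minimal, hypothetical sketch (not the authors' actual algorithm) of a parallel perceptron as a committee of sign units, with a hidden-unit activation margin `|w_h . x|` used to select a subsample of training points lying close to some unit's decision boundary; the committee size `H`, threshold `tau`, and random data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parallel perceptron: a committee of H perceptrons whose +/-1
# votes are summed; the committee output is the sign of the vote sum.
H, d = 3, 2                          # committee size and input dimension (assumed)
W = rng.normal(size=(H, d))          # hidden-unit weight vectors

def committee_output(X):
    votes = np.sign(X @ W.T)         # (n, H) individual +/-1 votes
    return np.sign(votes.sum(axis=1))

def hidden_margins(X):
    # Activation margin of each hidden unit: |w_h . x|, the distance
    # of the unit's activation from its sign-flip point.
    return np.abs(X @ W.T)

# Margin-based subsample selection (illustrative threshold `tau`):
# keep only examples where some hidden unit activates within the
# margin, i.e. the points most likely to flip a vote during training.
X = rng.normal(size=(200, d))
tau = 0.5
mask = (hidden_margins(X) < tau).any(axis=1)
X_sub = X[mask]
```

Training on `X_sub` instead of `X` is what would reduce training complexity, since only near-boundary examples are retained.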