
Memory-efficient large-scale linear support vector machine

Abstract

Stochastic gradient descent has been advanced as a computationally efficient method for large-scale problems. In classification, linear support vector machines have been proposed as highly effective classifiers. However, most formulations assume that the data already resides in memory, which may not always be the case. Recent work revisits a classical strategy that divides such a problem into smaller blocks and solves the sub-problems iteratively. We show that a simple modification, shrinking the dataset early, yields significant savings in both computation and memory. We further find that, on problems larger than previously considered, our approach reaches solutions on high-end desktop machines while competing methods cannot.
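To make the block-wise strategy concrete, here is a minimal Python sketch of training a linear SVM one block at a time while shrinking away examples that are already classified well beyond the margin. This is an illustration of the general idea, not the paper's implementation: the shrinking threshold `margin`, the fixed learning rate `lr`, and the in-memory block representation are all assumptions made for the example (in the memory-constrained setting each block would be loaded from disk on demand).

```python
import numpy as np

def block_svm_shrink(blocks, dim, C=1.0, lr=1e-3, epochs=5, margin=2.0):
    """Sketch of block-wise linear SVM training with early shrinking.

    blocks : list of (X, y) pairs, each small enough to fit in memory;
             in practice each block would be read from disk when needed.
    dim    : feature dimension.
    """
    w = np.zeros(dim)
    # Indices of examples in each block that are still "active"
    # (not yet shrunk away).
    active = [np.arange(len(y)) for _, y in blocks]
    for _ in range(epochs):
        for b, (X, y) in enumerate(blocks):
            idx = active[b]
            Xb, yb = X[idx], y[idx]
            # One SGD pass over the active examples of this block,
            # using the regularized hinge-loss gradient.
            for i in np.random.permutation(len(yb)):
                if yb[i] * Xb[i].dot(w) < 1.0:   # margin violation
                    w = (1 - lr) * w + lr * C * yb[i] * Xb[i]
                else:
                    w = (1 - lr) * w
            # Shrink early: drop examples far outside the margin, since
            # they are unlikely to become support vectors later.
            scores = yb * Xb.dot(w)
            active[b] = idx[scores < margin]
    return w
```

A quick synthetic usage, purely for illustration:

```python
rng = np.random.default_rng(0)
blocks = [(rng.normal(size=(200, 10)),
           rng.choice([-1.0, 1.0], size=200)) for _ in range(4)]
w = block_svm_shrink(blocks, dim=10)
```

The saving comes from the shrinking step: as training progresses, each pass touches fewer examples per block, reducing both arithmetic and the amount of data that must be held or reloaded.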
