    A Confident Majority Voting Strategy for Parallel and Modular Support Vector Machines

    Abstract. Support vector machines (SVMs) have become a popular method in the machine learning community, but they do not scale easily to large problems because their time and space complexity is roughly quadratic in the number of training samples. In this paper, confident majority voting (CMV) is proposed to scale SVMs (CMV-SVMs). CMV-SVMs decomposes a large-scale task into many smaller and simpler sub-problems in the training phase and chooses a set of confident classifiers to vote on the final outcome in the test phase. CMV-SVMs is compared with standard SVMs and with parallel SVMs combined by plain majority voting. Experiments on three problems show that the proposed algorithm significantly reduces the overall time spent in training and testing. More importantly, it achieves classification accuracy almost identical to that of standard SVMs.
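    Since the full text is not available here, the following is only a minimal sketch of the general idea described in the abstract: partition the training data into disjoint subsets, train one SVM per subset in the training phase, and at test time let only the "confident" sub-classifiers (those with a large decision-function magnitude) vote. The function names, the confidence threshold, and the fallback-to-all-votes rule are assumptions for illustration, not details from the paper.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def train_modular_svms(X, y, n_modules=5, seed=0):
        """Split the training set into disjoint subsets and train one SVM per subset.
        This is the decomposition step: each sub-problem is smaller, so each
        sub-SVM trains far faster than one SVM on the full data."""
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(X))
        models = []
        for part in np.array_split(idx, n_modules):
            clf = SVC(kernel="rbf", gamma="scale")
            clf.fit(X[part], y[part])
            models.append(clf)
        return models

    def cmv_predict(models, X, confidence_threshold=0.5):
        """Confident majority voting (sketch): only sub-classifiers whose
        decision-function magnitude exceeds the threshold cast a vote.
        The fallback to all votes when no model is confident is an assumption."""
        margins = np.stack([m.decision_function(X) for m in models])  # (n_models, n_samples)
        votes = np.sign(margins)                     # +1 / -1 per model and sample
        confident = np.abs(margins) >= confidence_threshold
        masked = np.where(confident, votes, 0.0)     # silence unconfident voters
        sums = masked.sum(axis=0)
        fallback = votes.sum(axis=0)                 # used when no voter is confident
        total = np.where(sums != 0, sums, fallback)
        return (total >= 0).astype(int)              # labels assumed to be {0, 1}
    ```

    A plain parallel-SVM baseline (as compared against in the paper) corresponds to setting `confidence_threshold=0`, i.e. every sub-classifier always votes.
    
    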