
    Learning curves for Soft Margin Classifiers

    Typical learning curves for Soft Margin Classifiers (SMCs) learning both realizable and unrealizable tasks are determined using the tools of Statistical Mechanics. We derive the analytical behaviour of the learning curves in the regimes of small and large training sets. The generalization errors exhibit different decay laws towards their asymptotic values as a function of the training set size, depending on general geometrical characteristics of the rule to be learned. Optimal generalization curves are deduced through fine tuning of the hyperparameter controlling the trade-off between the error and the regularization terms in the cost function. Even if the task is realizable, the optimal performance of the SMC is better than that of a hard margin Support Vector Machine (SVM) learning the same rule, and is very close to that of the Bayesian classifier. Comment: 26 pages, 10 figures
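The trade-off this abstract describes can be illustrated with a minimal sketch: a generic soft-margin cost (squared-norm regularizer plus hinge errors weighted by a coefficient C) minimised by subgradient descent. This is a toy illustration under assumed standard definitions, not the paper's actual SMC formulation or its Statistical Mechanics analysis.

```python
import numpy as np

def train_soft_margin(X, y, C=1.0, lr=0.01, epochs=200, seed=0):
    """Minimise 0.5*||w||^2 + C * sum_i max(0, 1 - y_i (w.x_i + b))
    by per-sample subgradient descent. C controls the trade-off between
    the error term and the regularization term (assumed standard form)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:
                # subgradient of (0.5||w||^2)/n + C * hinge_i
                w -= lr * (w / n - C * y[i] * X[i])
                b += lr * C * y[i]
            else:
                w -= lr * (w / n)
    return w, b

# toy realizable task: the rule is the sign of the first coordinate
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0])
w, b = train_soft_margin(X, y, C=1.0)
acc = float(np.mean(np.sign(X @ w + b) == y))
```

Sweeping C in this sketch and measuring held-out error is the finite-sample analogue of the hyperparameter fine tuning the abstract refers to.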

    Hybrid computational model to assist in the location of victims buried in the tragedy of Brumadinho

    The rupture of dams in Brazil has caused great concern due to the environmental disasters and loss of lives involved. Algorithms and computational models to assist search teams in locating victims buried by tailings are essential but scarce, and those that exist are mostly slow, as they involve high computational costs. In this sense, in the context of the Brumadinho tragedy of 2019, this study aimed to develop a hybrid computational model to assist search teams in locating victims buried by the tailings. The methodology for designing this model was based on regression techniques, machine learning, and physicomathematical algorithms. The study first produced a physicomathematical model, based on integral and vector calculus and concepts of fluid mechanics, which provided results to assist in locating bodies buried by the tailings. More recently, based on data provided by the physicomathematical algorithm, two hybrid models have been developed: one uses statistical regression, and the other uses support vector regression, a type of machine learning. It is expected that a more accurate model, applicable to other possible dam-break situations, can be developed in future studies. Moreover, the model can be applied to situations involving computational fluid dynamics in general.
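The support vector regression component mentioned above can be sketched generically with the epsilon-insensitive loss fitted by subgradient descent. The hybrid model's actual features, kernel, and training data are not given in the abstract, so this is only a minimal illustration of the SVR technique on synthetic data.

```python
import numpy as np

def train_svr(X, y, C=10.0, eps=0.1, lr=0.005, epochs=300, seed=0):
    """Fit a linear model with the epsilon-insensitive (SVR) loss:
    0.5*||w||^2 + C * sum_i max(0, |y_i - (w.x_i + b)| - eps).
    Residuals smaller than eps incur no loss (the 'insensitive tube')."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            r = y[i] - (X[i] @ w + b)
            if abs(r) > eps:
                # subgradient pushes the prediction towards the target
                w -= lr * (w / n - C * np.sign(r) * X[i])
                b += lr * C * np.sign(r)
            else:
                w -= lr * (w / n)
    return w, b

# synthetic 1-D data: y = 2x + 1 plus small noise
rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, size=(150, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.05 * rng.normal(size=150)
w, b = train_svr(X, y)
```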

    Statistical Mechanics of Soft Margin Classifiers

    We study the typical learning properties of the recently introduced Soft Margin Classifiers (SMCs), learning realizable and unrealizable tasks, with the tools of Statistical Mechanics. We derive analytically the behaviour of the learning curves in the regime of very large training sets. We obtain exponential and power laws for the decay of the generalization error towards its asymptotic value, depending on the task and on general characteristics of the distribution of stabilities of the patterns to be learned. The optimal learning curves of the SMCs, which give the minimal generalization error, are obtained by tuning the coefficient controlling the trade-off between the error and the regularization terms in the cost function. If the task is realizable by the SMC, the optimal performance is better than that of a hard margin Support Vector Machine and is very close to that of a Bayesian classifier. Comment: 26 pages, 12 figures; submitted to Physical Review
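The cost function with a trade-off coefficient referred to above corresponds, in the standard soft-margin primal formulation (the paper's exact variant may differ, e.g. in the margin constant or the power of the slacks), to

```latex
\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}} \;
  \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2} \;+\; C \sum_{i=1}^{p} \xi_{i}
\quad \text{s.t.} \quad
  y_{i}\,(\mathbf{w}\cdot\mathbf{x}_{i} + b) \;\ge\; 1 - \xi_{i},
\qquad \xi_{i} \ge 0,
```

where the slacks $\xi_i$ measure margin violations (the error term) and $C$ is the tuned trade-off coefficient; $C \to \infty$ recovers the hard margin SVM.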

    Statistical Mechanics of Support Vector Networks

    Using methods of Statistical Physics, we investigate the generalization performance of support vector machines (SVMs), which have recently been introduced as a general alternative to neural networks. For nonlinear classification rules, the generalization error saturates on a plateau when the number of examples is too small to properly estimate the coefficients of the nonlinear part. When trained on simple rules, we find that SVMs overfit only weakly. The performance of SVMs is strongly enhanced when the distribution of the inputs has a gap in feature space. Comment: REVTeX, 4 pages, 2 figures, accepted by Phys. Rev. Lett. (typos corrected)
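The plateau effect can be illustrated with a toy numpy sketch (not the paper's Statistical Physics setup): a classifier restricted to linear features, trained on the nonlinear rule y = sign(x1 x2), stays near chance-level error no matter how many examples it sees, while a model that also receives the quadratic feature x1 x2 can estimate the nonlinear part and drive the error down. The least-squares fit stands in for a trained classifier here purely for simplicity.

```python
import numpy as np

def gen_error(n_train, use_quadratic, trials=20, seed=0):
    """Average test error of a least-squares classifier on the rule
    y = sign(x1 * x2), with or without the quadratic feature x1*x2."""
    rng = np.random.default_rng(seed)

    def feats(X):
        if use_quadratic:
            return np.column_stack([X, X[:, 0] * X[:, 1]])
        return X  # linear features only: cannot represent the rule

    errs = []
    for _ in range(trials):
        Xtr = rng.normal(size=(n_train, 2))
        Xte = rng.normal(size=(2000, 2))
        ytr = np.sign(Xtr[:, 0] * Xtr[:, 1])
        yte = np.sign(Xte[:, 0] * Xte[:, 1])
        w, *_ = np.linalg.lstsq(feats(Xtr), ytr, rcond=None)
        errs.append(np.mean(np.sign(feats(Xte) @ w) != yte))
    return float(np.mean(errs))

err_linear = gen_error(1000, use_quadratic=False)     # stays near chance (~0.5)
err_quadratic = gen_error(1000, use_quadratic=True)   # well below chance
```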