106 research outputs found

    STOCK MARKET TREND PREDICTION USING SUPPORT VECTOR MACHINES

    The aim of the paper was to outline a trend-prediction model for the BELEX15 stock market index of the Belgrade Stock Exchange based on Support Vector Machines (SVMs). Feature selection was carried out through the analysis of technical and macroeconomic indicators. In addition, the SVM method was compared with a closely related one, least squares support vector machines (LS-SVMs), to analyze their classification precision and complexity. The test results indicate that SVMs outperform the benchmark models and are suitable for short-term stock market trend prediction.
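    The setup described above can be sketched as follows. This is a minimal, hypothetical illustration of the general approach (an SVM classifier mapping indicator features to an up/down label), not the BELEX15 study itself: the feature names, data, and hyperparameters are invented stand-ins.

```python
# Hypothetical sketch: an RBF-kernel SVM predicting next-day trend direction
# from indicator-style features. Synthetic data, illustrative only.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))        # e.g. momentum, RSI, MA ratio, volume change
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=300) > 0).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X[:250], y[:250])          # train on the earlier portion
acc = model.score(X[250:], y[250:])  # evaluate out of sample
print(f"out-of-sample accuracy: {acc:.2f}")
```

    Scaling the features before fitting matters in practice, since technical and macroeconomic indicators live on very different numeric ranges.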

    SMO-based pruning methods for sparse least squares support vector machines

    Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is imposed by subsequently omitting the data that introduce the smallest training errors and retraining on the remaining data. This iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process, and, instead of determining the pruning points by their errors, we omit the data points that introduce the smallest change to the dual objective function. This new criterion is computationally efficient. The effectiveness of the proposed method in terms of computational cost and classification accuracy is demonstrated by numerical experiments.
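    As background for the abstract above, here is a minimal sketch of the classic error-based pruning it improves on: in an LS-SVM, each |alpha_i| is proportional to the training error e_i = alpha_i / gamma, so dropping the point with the smallest |alpha_i| and retraining yields a sparse model. The paper replaces this criterion with the change in the dual objective and retrains via SMO; the sketch below simply retrains by a direct linear solve, and all data and settings are illustrative.

```python
# Minimal LS-SVM classifier with classic error-based pruning (smallest |alpha|).
# Illustrative baseline only; the paper's criterion and SMO solver differ.
import numpy as np

def rbf(X, Z, s=1.0):
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d / (2 * s * s))

def train_lssvm(X, y, gamma=10.0):
    # Solve the LS-SVM dual system [[0, y^T], [y, Omega + I/gamma]] [b; a] = [0; 1].
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = np.outer(y, y) * rbf(X, X) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.r_[0.0, np.ones(n)])
    return sol[0], sol[1:]           # bias b, support values alpha

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 2))
y = np.sign(X[:, 0] + X[:, 1])
idx = np.arange(len(y))
for _ in range(40):                  # prune half the training set, one point at a time
    b, alpha = train_lssvm(X[idx], y[idx])
    idx = np.delete(idx, np.argmin(np.abs(alpha)))
b, alpha = train_lssvm(X[idx], y[idx])
pred = np.sign(rbf(X, X[idx]) @ (alpha * y[idx]) + b)
print("sparse model accuracy:", (pred == y).mean())
```

    The repeated full retrain in this loop is exactly the cost the paper's SMO-based approach is designed to avoid.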

    Improving the Solution of Least Squares Support Vector Machines with Application to a Blast Furnace System

    The solution of least squares support vector machines (LS-SVMs) is characterized by a specific linear system, namely a saddle point system. Approaches to its numerical solution, such as conjugate gradient methods (Suykens and Vandewalle, 1999) and null space methods (Chu et al., 2005), have been proposed. To speed up the solution of LS-SVMs, this paper employs the minimal residual (MINRES) method to solve the saddle point system directly. Theoretical analysis indicates that the MINRES method is more efficient than the conjugate gradient method and the null space method for solving this system. Experiments on benchmark data sets show that, compared with mainstream algorithms for LS-SVMs, the proposed approach significantly reduces training time while keeping comparable accuracy. Finally, the MINRES-based LS-SVM is applied to a practical problem originating in the blast furnace iron-making process: predicting the changing trend of the silicon content in hot metal. The MINRES-based LS-SVM can effectively perform feature reduction and model selection simultaneously, so it is a practical tool for the silicon trend prediction task.
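    The core idea can be shown in a few lines: the LS-SVM dual system is symmetric but indefinite (a saddle point system), which is exactly the class of problems MINRES handles directly, while plain conjugate gradients formally requires a positive definite matrix. The data and kernel settings below are illustrative stand-ins, not the paper's experiments.

```python
# Sketch: solving the LS-SVM saddle point system with scipy's MINRES.
import numpy as np
from scipy.sparse.linalg import minres

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
y = np.sign(X[:, 0])
K = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / 2)   # RBF kernel matrix

# Assemble the symmetric indefinite system [[0, y^T], [y, Omega + I/gamma]].
n = len(y)
A = np.zeros((n + 1, n + 1))
A[0, 1:] = y
A[1:, 0] = y
A[1:, 1:] = np.outer(y, y) * K + np.eye(n) / 10.0
rhs = np.r_[0.0, np.ones(n)]

sol, info = minres(A, rhs)           # info == 0 means convergence
b, alpha = sol[0], sol[1:]
print("converged:", info == 0,
      "residual:", np.linalg.norm(A @ sol - rhs))
```

    Because the matrix is indefinite, Cholesky-based direct solvers cannot be applied to it as-is either, which is why iterative Krylov methods like MINRES are a natural fit here.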

    Optimized parameter search for large datasets of the regularization parameter and feature selection for ridge regression

    In this paper we propose mathematical optimizations for selecting the optimal regularization parameter of ridge regression using cross-validation. The resulting algorithm is suited for large datasets, and its computational cost does not depend on the size of the training set. We extend this algorithm to forward and backward feature selection, in which the optimal regularization parameter is selected for each candidate feature set. These feature selection algorithms yield solutions with a sparse weight matrix using a quadratic cost on the norm of the weights. A naive approach to optimizing the ridge regression parameter has a computational complexity that grows with the number of applied regularization parameters, the number of folds in the validation, the number of input features, and the number of data samples in the training set. The computational cost of our implementation is smaller than that of regression without regularization for large datasets, and is independent of the number of applied regularization parameters and the size of the training set. Combined with feature selection, the algorithm remains an order of magnitude faster than the naive implementation on large datasets, for both forward selection (whose cost grows with the number of selected features) and backward selection (whose cost grows with the number of removed features). To show the performance and reduction in computational cost, we apply this technique to train recurrent neural networks using the reservoir computing approach, windowed ridge regression, least-squares support vector machines (LS-SVMs) in primal space using the fixed-size LS-SVM approximation, and extreme learning machines.
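    A sketch of the kind of caching that makes sweeping many regularization values cheap, consistent with the abstract's claims but not taken from the paper: after one pass over the data to form the covariance matrix and one eigendecomposition of it, every additional lambda value costs only operations in the (small) feature dimension, independent of the training set size.

```python
# Sketch: ridge solutions for many lambdas from one cached eigendecomposition.
# The specific trick shown here is an illustrative assumption, not the paper's
# exact algorithm (which also handles cross-validation folds).
import numpy as np

rng = np.random.default_rng(3)
N, F = 5000, 20
X = rng.normal(size=(N, F))
w_true = rng.normal(size=F)
y = X @ w_true + 0.1 * rng.normal(size=N)

# One-time O(N F^2) pass over the data, plus an O(F^3) eigendecomposition.
C = X.T @ X
c = X.T @ y
vals, V = np.linalg.eigh(C)

def ridge(lam):
    # (C + lam I)^(-1) c via the cached eigenbasis: O(F^2) per lambda,
    # independent of N.
    return V @ ((V.T @ c) / (vals + lam))

for lam in [1e-3, 1e-1, 10.0]:
    w = ridge(lam)
    print(lam, np.linalg.norm(w - w_true))
```

    Extending this to leave-one-out or k-fold validation errors without refitting per fold is where the paper's contribution lies; the caching idea above is only the starting point.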

    Comparison of partial least squares regression, least squares support vector machines, and Gaussian process regression for a near infrared calibration

    This paper investigates the use of least squares support vector machines and Gaussian process regression for multivariate spectroscopic calibration. The performance of these two non-linear regression models is assessed and compared with that of the traditional linear model, partial least squares regression, on an agricultural example. The non-linear models, least squares support vector machines and Gaussian process regression, showed enhanced generalization ability, especially in maintaining homogeneous prediction accuracy over the range. The two non-linear models generally have similar prediction performance but behave differently in some situations, especially when the size of the training set varies; this is due to fundamental differences in the fitting criteria of the two models.

    Electricity load demand forecasting in Portugal using least-squares support vector machines

    Master's dissertation, Informatics Engineering, Faculdade de Ciências e Tecnologia, Univ. do Algarve, 2013. Electricity Load Demand (ELD) forecasting is of interest mainly to producers and distributors, and it has a great impact on the national economy. At the national scale it is not viable to store electricity, and it is difficult to estimate consumption accurately enough to achieve a close match between supply and demand and, consequently, less waste of energy. Thus, researchers from many areas have addressed this issue in order to facilitate the task of power grid companies in adjusting production levels to consumption demand. Over the years, many predictive algorithms have been tested, and the Radial Basis Function Artificial Neural Network (RBF ANN) has so far been one of the most tested approaches, with satisfactory results. The fact that on-line adaptation is not an easy task for this approach has led to the search for new prediction methods that promise better results, or at least results as good as those of the RBF ANN, together with the ability to overcome the difficulties the RBF ANN encounters in on-line adaptation. This work introduces an approach still little explored for electricity consumption prediction: Least-Squares Support Vector Machines (LS-SVMs). They are a good alternative to RBF ANNs and other approaches, since they have fewer parameters to adjust, which significantly decreases their sensitivity to the well-known problems associated with parameter adaptation and makes on-line model adaptation more stable over time.