327 research outputs found

    Financial time series forecasting using twin support vector regression

    © 2019 Gupta et al. Financial time series forecasting is crucial for making robust financial decisions worldwide. Noisy data and non-stationarity are the two key difficulties in financial time series prediction. This paper proposes twin support vector regression for financial time series prediction to deal with noisy, non-stationary data. Financial time series datasets from a wide range of industries, including information technology, the stock market, the banking sector, and the oil and petroleum sector, are used in the numerical experiments. To assess prediction accuracy, the root mean squared error and the standard deviation are computed, which clearly indicate the usefulness and applicability of the proposed method. Twin support vector regression is also computationally faster than standard support vector regression on the given 44 datasets.
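
    For readers unfamiliar with the technique, the sketch below states the standard twin support vector regression formulation in the sense of Peng (2010); that this paper follows this exact formulation is an assumption. TSVR replaces the single large SVR quadratic program with two smaller ones, each fitting one ε-insensitive bound function, and averages the two for the final prediction.

    % Standard TSVR (Peng, 2010): two smaller QPs in place of one large SVR QP.
    % A: training inputs, Y: targets, e: vector of ones, C_1, C_2, eps_1, eps_2 > 0.
    \begin{align*}
    \min_{w_1, b_1, \xi} \;& \tfrac{1}{2}\lVert Y - e\varepsilon_1 - (A w_1 + e b_1)\rVert^2 + C_1 e^\top \xi
      & \text{s.t. } Y - (A w_1 + e b_1) \ge e\varepsilon_1 - \xi,\; \xi \ge 0, \\
    \min_{w_2, b_2, \eta} \;& \tfrac{1}{2}\lVert Y + e\varepsilon_2 - (A w_2 + e b_2)\rVert^2 + C_2 e^\top \eta
      & \text{s.t. } (A w_2 + e b_2) - Y \ge e\varepsilon_2 - \eta,\; \eta \ge 0.
    \end{align*}
    % Final regressor: the mean of the two bound functions,
    % f(x) = (1/2) [ (w_1 + w_2)^T x + b_1 + b_2 ].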

    HawkEye: Advancing Robust Regression with Bounded, Smooth, and Insensitive Loss Function

    Support vector regression (SVR) has garnered significant popularity over the past two decades owing to its wide range of applications across various fields. Despite its versatility, SVR struggles with outliers and noise, primarily due to the use of the ε-insensitive loss function. To address this limitation, SVR with bounded loss functions has emerged as an appealing alternative, offering enhanced generalization performance and robustness. Notably, recent developments focus on designing bounded loss functions with smooth characteristics, which facilitates the adoption of gradient-based optimization algorithms. However, these bounded and smooth loss functions do not possess an insensitive zone. In this paper, we address the aforementioned constraints by introducing a novel symmetric loss function named the HawkEye loss function. It is worth noting that the HawkEye loss function stands out as the first loss function in the SVR literature to be bounded, smooth, and simultaneously in possession of an insensitive zone. Leveraging this breakthrough, we integrate the HawkEye loss function into the least squares framework of SVR and obtain a new fast and robust model termed HE-LSSVR. The optimization problem inherent to HE-LSSVR is addressed with the adaptive moment estimation (Adam) algorithm, known for its adaptive learning rate and efficacy on large-scale problems. To our knowledge, this is the first time Adam has been employed to solve an SVR problem. To empirically validate the proposed HE-LSSVR model, we evaluate it on UCI, synthetic, and time series datasets. The experimental outcomes reveal the superiority of the HE-LSSVR model both in its generalization performance and in its training-time efficiency.
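
    The paper defines the HawkEye loss precisely; the sketch below is not that definition, but a minimal Python illustration of how a loss can combine the three advertised properties (an insensitive zone of width eps, smoothness at the zone boundary, boundedness by lam) and of fitting a linear least-squares-style model with Adam. All names and parameter values here (illustrative_loss, lam, sigma) are assumptions for illustration, not the paper's.

    import numpy as np

    def illustrative_loss(r, eps=0.1, lam=1.0, sigma=1.0):
        """Illustrative bounded, smooth loss with an insensitive zone.

        NOT the HawkEye loss from the paper -- just one construction with
        the same three properties: zero for |r| <= eps (insensitive zone),
        C^1 at |r| = eps (smooth), and bounded above by lam.
        """
        t = np.maximum(np.abs(r) - eps, 0.0)   # excess beyond the zone
        return lam * (1.0 - np.exp(-(t / sigma) ** 2))

    def illustrative_loss_grad(r, eps=0.1, lam=1.0, sigma=1.0):
        """dL/dr for the loss above; exactly zero inside the insensitive zone."""
        t = np.maximum(np.abs(r) - eps, 0.0)
        return np.sign(r) * lam * (2.0 * t / sigma ** 2) * np.exp(-(t / sigma) ** 2)

    def adam_fit(X, y, loss_grad, lr=0.01, steps=2000,
                 beta1=0.9, beta2=0.999, adam_eps=1e-8):
        """Fit linear weights w (last entry = bias) by Adam on a custom loss."""
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
        w = np.zeros(Xb.shape[1])
        m = np.zeros_like(w)
        v = np.zeros_like(w)
        for t in range(1, steps + 1):
            r = Xb @ w - y                    # residuals
            g = Xb.T @ loss_grad(r) / len(y)  # gradient of mean loss w.r.t. w
            m = beta1 * m + (1 - beta1) * g
            v = beta2 * v + (1 - beta2) * g ** 2
            mhat = m / (1 - beta1 ** t)
            vhat = v / (1 - beta2 ** t)
            w -= lr * mhat / (np.sqrt(vhat) + adam_eps)
        return w

    # Toy usage: noisy linear data with a few large outliers.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + 0.05 * rng.normal(size=200)
    y[:5] += 10.0   # inject outliers; the bounded loss caps their pull
    w = adam_fit(X, y, illustrative_loss_grad)
    print(w)        # should be close to [1.5, -2.0, 0.5, 0.0]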

    Solution Path Algorithm for Twin Multi-class Support Vector Machine

    The twin support vector machine and its extensions have achieved a great deal in binary classification, but they still face difficulties such as model selection and solving multi-class problems quickly. This paper is devoted to a fast regularization-parameter tuning algorithm for the twin multi-class support vector machine. A new sample dataset division method is adopted, and the Lagrangian multipliers are proved to be piecewise linear with respect to the regularization parameters by combining linear equations and block matrix theory. Eight kinds of events are defined to locate the starting event, and the solution path algorithm is then designed, which greatly reduces the computational cost. In addition, only a few points are needed to complete the initialization, and the Lagrangian multipliers are proved to equal 1 as the regularization parameter tends to infinity. Simulation results on UCI datasets show that the proposed method achieves good classification performance while reducing the computational cost of the grid search method from exponential to constant level.
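
    The piecewise-linearity claim is the engine of the method; the generic property that solution-path algorithms of this kind exploit can be stated as follows (the notation is generic, not the paper's).

    % Between consecutive events C_k and C_{k+1}, the active-set KKT conditions
    % form a linear system in the regularization parameter C, so each Lagrangian
    % multiplier traces a straight line:
    \[
    \alpha(C) = \alpha^{(k)} + (C - C_k)\, d^{(k)}, \qquad C \in [C_k, C_{k+1}],
    \]
    % where the slope d^{(k)} solves a fixed linear system determined by the
    % current active set; an "event" is a change of active set, at which the
    % slope is recomputed.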

    Study on support vector machine as a classifier

    SVM [1], [2] is a learning method that treats data points as vectors in a feature space. We studied different types of support vector machine (SVM) and examined their classification processes. We conducted 10-fold testing experiments on LSSVM [7], [8] (least squares support vector machine) and PSVM [9] (proximal support vector machine) using standard datasets. Finally, we proposed a new algorithm, NPSVM (non-parallel support vector machine), reformulated from NPPC [12], [13] (non-parallel plane classifier). We observed that the cost function of NPPC is affected by the additional constraint used to enforce Euclidean distance classification, so we normalized the weight vectors implicitly instead of imposing that constraint, which yields a better-behaved cost function. The computational complexity of NPSVM is evaluated for both linear and non-linear kernels, and the 10-fold test results of NPSVM on standard datasets are compared with LSSVM and PSVM.
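
    The normalization step is easy to miss, so the standard geometric fact behind it is worth spelling out (generic notation, not the paper's exact formulation).

    % Euclidean distance from a point x to the hyperplane w^T x + b = 0:
    \[
    d(x) = \frac{\lvert w^\top x + b \rvert}{\lVert w \rVert}.
    \]
    % A classifier can enforce this distance measure either with an explicit
    % constraint ||w|| = 1, as NPPC does, or by dividing by ||w|| so that the
    % weight vector is normalized implicitly -- the route taken here, which
    % keeps the extra constraint out of the cost function.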