4,695 research outputs found

    Metaheuristic design of feedforward neural networks: a review of two decades of research

    Over the past two decades, feedforward neural network (FNN) optimization has been a key interest among researchers and practitioners of multiple disciplines. FNN optimization is viewed from various perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, learning environment, and so on. Researchers have adopted these different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems. However, due to the limitations of gradient-based optimization methods, metaheuristic algorithms, including evolutionary algorithms and swarm intelligence, are still being widely explored by researchers aiming to obtain a well-generalized FNN for a given problem. This article attempts to summarize a broad spectrum of FNN optimization methodologies, covering both conventional and metaheuristic approaches. It also tries to connect the various research directions that have emerged from FNN optimization practice, such as evolving neural networks (NN), cooperative coevolution NN, complex-valued NN, deep learning, extreme learning machines, quantum NN, etc. Additionally, it highlights interesting research challenges for future work to cope with the demands of the present information-processing era.
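
    As a rough illustration of the kind of metaheuristic weight optimization surveyed above, the sketch below replaces gradient descent with a simple (1+1) evolution strategy on a tiny single-hidden-layer network; the network size, synthetic data, and mutation settings are assumptions made purely for illustration and are not drawn from the review.

```python
# Minimal sketch: optimizing the weights of a small feedforward network with a
# (1+1) evolution strategy instead of backpropagation. Network size, data and
# mutation scale are illustrative assumptions, not taken from the review.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression task: y = sin(x) with noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

n_hidden = 10
n_weights = 1 * n_hidden + n_hidden + n_hidden + 1  # W1, b1, W2, b2

def unpack(theta):
    """Split the flat weight vector into layer matrices/vectors."""
    W1 = theta[:n_hidden].reshape(1, n_hidden)
    b1 = theta[n_hidden:2 * n_hidden]
    W2 = theta[2 * n_hidden:3 * n_hidden].reshape(n_hidden, 1)
    b2 = theta[-1]
    return W1, b1, W2, b2

def mse(theta):
    """Forward pass of the FNN followed by mean squared error."""
    W1, b1, W2, b2 = unpack(theta)
    hidden = np.tanh(X @ W1 + b1)
    pred = hidden @ W2 + b2
    return float(np.mean((pred[:, 0] - y) ** 2))

theta = rng.normal(scale=0.5, size=n_weights)
best = mse(theta)
sigma = 0.1  # mutation step size (assumed)

for _ in range(5000):
    candidate = theta + sigma * rng.normal(size=n_weights)
    loss = mse(candidate)
    if loss < best:          # greedy acceptance: keep only improvements
        theta, best = candidate, loss

print(f"final training MSE: {best:.4f}")
```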

    Evolutionary optimization of sparsely connected and time-lagged neural networks for time series forecasting

    Time Series Forecasting (TSF) is an important tool to support decision making (e.g., planning production resources). Artificial Neural Networks (ANN) are natural candidates for TSF due to advantages such as nonlinear learning and noise tolerance. However, the search for the best model is a complex task that strongly affects forecasting performance. In this work, we propose two novel Evolutionary Artificial Neural Network (EANN) approaches for TSF based on an Estimation Distribution Algorithm (EDA) search engine. The first approach, Sparsely connected Evolutionary ANN (SEANN), evolves more flexible ANN structures to perform multi-step-ahead forecasts. The second, an automatic time-lag feature selection EANN (TEANN), evolves not only the ANN parameters (e.g., input and hidden nodes, training parameters) but also which set of time lags is fed into the forecasting model. Several experiments were conducted using a set of six time series from different real-world domains, and two error metrics (Mean Squared Error and Symmetric Mean Absolute Percentage Error) were analyzed. The two EANN approaches were compared against a base EANN (with no ANN structure or time lag optimization) and four other methods (Autoregressive Integrated Moving Average, Random Forest, Echo State Network and Support Vector Machine). Overall, the proposed SEANN and TEANN methods obtained the best forecasting results. Moreover, they favor simpler neural network models and thus require less computational effort than the base EANN. The research reported here has been supported by the Spanish Ministry of Science and Innovation under project TRA2010-21371-C03-03 and FCT - Fundacao para a Ciencia e Tecnologia within the Project Scope PEst-OE/EEI/UI0319/2014. The authors want to especially thank Martin Stepnicka and Lenka Vavrickova for all their help. The authors also want to thank Ramon Sagarna for introducing the subject of EDA.
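
    The time-lag selection idea behind TEANN can be pictured with a small sketch: a binary mask over candidate lags is evolved by a univariate EDA, with fitness given by the validation error of a cheap surrogate model. The ridge-regression surrogate, the lag range, and the EDA settings below are illustrative assumptions, not the paper's actual EANN setup.

```python
# Sketch of EDA-driven time-lag selection in the spirit of TEANN: a binary mask
# over candidate lags is evolved with a univariate EDA (UMDA-style), using the
# validation MSE of a simple ridge-regression surrogate as fitness. The
# surrogate model, lag range and EDA settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic series where only lags 1 and 12 matter.
n, max_lag = 400, 16
series = np.zeros(n)
for t in range(max_lag, n):
    series[t] = 0.6 * series[t - 1] + 0.3 * series[t - 12] + 0.1 * rng.normal()

def lag_matrix(s, lags):
    """Build (X, y) where each row of X holds the selected lagged values."""
    rows = range(max_lag, len(s))
    X = np.array([[s[t - l] for l in lags] for t in rows])
    y = s[max_lag:]
    return X, y

def fitness(mask):
    """Validation MSE of a ridge regression using the lags selected by mask."""
    lags = [l + 1 for l in range(max_lag) if mask[l]]
    if not lags:
        return np.inf
    X, y = lag_matrix(series, lags)
    split = int(0.7 * len(y))
    Xtr, ytr, Xva, yva = X[:split], y[:split], X[split:], y[split:]
    w = np.linalg.solve(Xtr.T @ Xtr + 1e-3 * np.eye(len(lags)), Xtr.T @ ytr)
    return float(np.mean((Xva @ w - yva) ** 2))

pop_size, elite, generations = 40, 10, 30
p = np.full(max_lag, 0.5)            # marginal probability of selecting each lag

for _ in range(generations):
    pop = (rng.random((pop_size, max_lag)) < p).astype(int)
    scores = np.array([fitness(ind) for ind in pop])
    best = pop[np.argsort(scores)[:elite]]
    p = 0.7 * p + 0.3 * best.mean(axis=0)   # update marginals toward elites

chosen = [l + 1 for l in range(max_lag) if p[l] > 0.5]
print("selected lags:", chosen)
```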

    Ensemble Models in Forecasting Financial Markets

    Global and decomposition evolutionary support vector machine approaches for time series forecasting

    Multi-step ahead Time Series Forecasting (TSF) is a key tool for supporting tactical decisions (e.g., planning resources). Recently, the support vector machine emerged as a natural solution for TSF due to its nonlinear learning capabilities. This paper presents two novel Evolutionary Support Vector Machine (ESVM) methods for multi-step TSF. Both methods are based on an Estimation Distribution Algorithm (EDA) search engine that automatically performs simultaneous variable (number of inputs) and model (hyperparameter) selection. The Global ESVM (GESVM) uses all past patterns to fit the support vector machine, while the Decomposition ESVM (DESVM) separates the series into trended and stationary effects, using a distinct ESVM to forecast each effect and then summing both predictions into a single response. Several experiments were conducted using six time series. The proposed approaches were analyzed under two criteria and compared against a recent Evolutionary Artificial Neural Network (EANN) and two classical forecasting methods, Holt-Winters and ARIMA. Overall, DESVM and GESVM obtained competitive and high-quality results. Furthermore, both ESVM approaches consume much less computational effort than EANN. The authors wish to thank Ramon Sagarna for introducing the subject of EDA. The work of P. Cortez was supported by FEDER (program COMPETE and FCT) under project FCOMP-01-0124-FEDER-022674.
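
    The decomposition strategy of DESVM can be sketched as follows: split the series into a trend component and a roughly stationary residual, fit one support vector regressor per component, and sum the two forecasts. The moving-average trend extraction, lag features, and fixed SVR hyperparameters below are assumptions for illustration; the evolutionary variable and hyperparameter selection described in the abstract is not reproduced.

```python
# Minimal sketch of the decomposition idea behind DESVM: split the series into
# a trend component and a (roughly) stationary residual, fit one SVR per
# component, then sum the two forecasts. The moving-average trend extraction,
# lag features and SVR hyperparameters are illustrative assumptions; the
# evolutionary hyperparameter/variable selection is not reproduced here.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)

# Synthetic series: linear trend + seasonality + noise.
t = np.arange(300, dtype=float)
series = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.normal(size=t.size)

window = 12
trend = np.convolve(series, np.ones(window) / window, mode="same")
residual = series - trend

def supervised(s, n_lags=12):
    """Turn a series into (X, y) pairs of lagged inputs and next value."""
    X = np.array([s[i - n_lags:i] for i in range(n_lags, len(s))])
    return X, s[n_lags:]

def fit_and_forecast(s):
    """Fit an SVR on lagged values and return forecasts for the last 30 points."""
    X, y = supervised(s)
    model = SVR(kernel="rbf", C=10.0, gamma="scale")
    model.fit(X[:-30], y[:-30])          # hold out the last 30 points
    return model.predict(X[-30:]), y[-30:]

trend_pred, trend_true = fit_and_forecast(trend)
resid_pred, resid_true = fit_and_forecast(residual)

# Recompose: the final forecast is the sum of the two component forecasts.
forecast = trend_pred + resid_pred
actual = trend_true + resid_true
print("test MSE:", float(np.mean((forecast - actual) ** 2)))
```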

    Enhanced artificial bee colony-least squares support vector machines algorithm for time series prediction

    Over the past decades, the Least Squares Support Vector Machine (LSSVM) has been widely utilized in prediction tasks across various application domains. Nevertheless, the existing literature shows that the capability of LSSVM is highly dependent on the value of its hyper-parameters, namely the regularization parameter and the kernel parameter, which greatly affect the generalization of LSSVM in prediction tasks. This study proposes a hybrid algorithm, based on the Artificial Bee Colony (ABC) and LSSVM, that consists of three algorithms: ABC-LSSVM, lvABC-LSSVM and cmABC-LSSVM. The lvABC algorithm is introduced to overcome the local optima problem by enriching the search behaviour using Levy mutation. The cmABC algorithm, which incorporates conventional mutation, addresses the over-fitting or under-fitting problem. The combination of the lvABC and cmABC algorithms, introduced as the Enhanced Artificial Bee Colony–Least Squares Support Vector Machine (eABC-LSSVM), is applied to the prediction of non-renewable natural resource commodity prices. Upon the completion of data collection and data pre-processing, the eABC-LSSVM algorithm is designed and developed. The predictive capability of eABC-LSSVM is measured using five statistical metrics: Mean Absolute Percentage Error (MAPE), prediction accuracy, symmetric MAPE (sMAPE), Root Mean Square Percentage Error (RMSPE) and Theil's U. Results showed that eABC-LSSVM possesses a lower prediction error rate than eight hybridization models of LSSVM and Evolutionary Computation (EC) algorithms. In addition, the proposed algorithm is compared to single prediction techniques, namely Support Vector Machines (SVM) and the Back Propagation Neural Network (BPNN). In general, eABC-LSSVM produced more than 90% prediction accuracy, indicating that it is capable of solving the underlying optimization problem, specifically in the prediction task. The eABC-LSSVM is intended to be useful to investors and commodity traders in planning their investments and projecting their profits.
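
    For reference, the sketch below shows common textbook formulations of the evaluation metrics listed above (MAPE, sMAPE, RMSPE, and Theil's U, with prediction accuracy taken here as 100 minus MAPE, which is an assumption); the exact definitions used in the cited work may differ in detail.

```python
# Common formulations of the evaluation metrics mentioned above (MAPE, sMAPE,
# RMSPE, Theil's U, and accuracy taken as 100 - MAPE). The exact definitions
# used in the cited work may differ slightly; these are textbook versions.
import numpy as np

def mape(y, yhat):
    return 100.0 * np.mean(np.abs((y - yhat) / y))

def smape(y, yhat):
    return 100.0 * np.mean(2.0 * np.abs(y - yhat) / (np.abs(y) + np.abs(yhat)))

def rmspe(y, yhat):
    return 100.0 * np.sqrt(np.mean(((y - yhat) / y) ** 2))

def theils_u(y, yhat):
    # Theil's U1 statistic: 0 means a perfect forecast, values near 1 are poor.
    num = np.sqrt(np.mean((y - yhat) ** 2))
    den = np.sqrt(np.mean(y ** 2)) + np.sqrt(np.mean(yhat ** 2))
    return num / den

# Small worked example with made-up commodity prices.
y = np.array([105.0, 110.0, 98.0, 102.0])
yhat = np.array([103.0, 112.0, 99.5, 100.0])

print(f"MAPE      : {mape(y, yhat):.2f}%")
print(f"accuracy  : {100.0 - mape(y, yhat):.2f}%")
print(f"sMAPE     : {smape(y, yhat):.2f}%")
print(f"RMSPE     : {rmspe(y, yhat):.2f}%")
print(f"Theil's U : {theils_u(y, yhat):.4f}")
```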

    Forecasting currency exchange rate time series with fireworks-algorithm-based higher order neural network with special attention to training data enrichment

    Exchange rates are highly fluctuating by nature and are thus difficult to forecast. Artificial neural networks (ANN) have proved to be better than statistical methods for this task. However, because ANN-based forecasts are data driven, inadequate training data may lead the model to a suboptimal solution and poor accuracy. To enhance forecasting accuracy, we suggest a method of enriching the training dataset by exploring and incorporating virtual data points (VDPs) using an evolutionary method, the fireworks-algorithm-trained functional link artificial neural network (FWA-FLN). The model maintains the correlation between the current and past data, especially at the oscillation points of the time series. The exploration of a VDP and the forecast of the succeeding term are carried out consecutively by the FWA-FLN. Real exchange rate time series are used to train and validate the proposed model. The proposed technique is compared with other similarly trained models and produces far better prediction accuracy.
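
    The general notion of enriching training data with virtual data points can be illustrated minimally as below, where a virtual point is interpolated between each pair of consecutive observations. The midpoint rule is an assumption chosen only for illustration; in the cited work the VDPs are explored by the fireworks algorithm together with the FLN forecaster.

```python
# Illustration of the general "virtual data point" idea: enrich a sparse
# exchange-rate series by inserting interpolated points between consecutive
# observations before building training patterns. The midpoint interpolation
# used here is an assumption for illustration; in the cited work the VDPs are
# explored by the fireworks algorithm together with the FLN forecaster.
import numpy as np

def enrich_with_vdp(series):
    """Insert one virtual point between each pair of consecutive observations."""
    enriched = []
    for a, b in zip(series[:-1], series[1:]):
        enriched.append(a)
        enriched.append((a + b) / 2.0)   # virtual data point (simple midpoint)
    enriched.append(series[-1])
    return np.array(enriched)

rates = np.array([1.102, 1.108, 1.095, 1.101, 1.097])
print(enrich_with_vdp(rates))
# -> the original 5 observations are expanded to 9 training values
```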

    The Stock Exchange Prediction using Machine Learning Techniques: A Comprehensive and Systematic Literature Review

    This literature review identifies and analyzes research topic trends, types of data sets, learning algorithms, method improvements, and frameworks used in stock exchange prediction. A total of 81 studies published on stock prediction between January 2015 and June 2020 were investigated, taking into account the inclusion and exclusion criteria. The literature review methodology is carried out in three major phases: review planning, implementation, and report preparation, in nine steps from defining systematic review requirements to presentation of results. Estimation or regression, clustering, association, classification, and preprocessing analysis of data sets are the five main focuses revealed in the primary studies of stock prediction research. Classification methods account for 35.80% of the related studies, estimation methods for 56.79%, data analytics for 4.94%, and the remaining clustering and association methods for 1.23%. Furthermore, technical-indicator data sets are used in 74.07% of the studies, while the rest use combinations of data sets. To develop stock prediction models, 48 different methods have been applied, and the 9 most widely applied methods were identified. The best methods in terms of accuracy and low error rate include SVM, DNN, CNN, RNN, LSTM, bagging ensembles such as RF, boosting ensembles such as XGBoost, ensemble majority vote, and the meta-learner approach of ensemble stacking. Several techniques are proposed to improve prediction accuracy: combining several methods, using boosting algorithms, adding feature selection, and using parameter and hyper-parameter optimization.
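
    As a concrete example of the stacking meta-learner approach highlighted by the review, the sketch below combines two base learners through a logistic-regression meta-model using scikit-learn; the choice of learners, synthetic features, and settings is illustrative only.

```python
# Minimal sketch of the stacking meta-learner approach highlighted in the
# review: base learners (here RF and SVM) feed their predictions into a
# meta-model. The choice of learners, data and settings is illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for engineered stock features (e.g., technical indicators).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    final_estimator=LogisticRegression(),   # meta-learner combining base outputs
)
stack.fit(X_train, y_train)
print("test accuracy:", stack.score(X_test, y_test))
```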