36 research outputs found

    Forecasting the Behavior of Gas Furnace Multivariate Time Series Using Ridge Polynomial Based Neural Network Models

    In this paper, a new application of ridge polynomial based neural network models to multivariate time series forecasting is presented. The existing ridge polynomial based neural network models can be divided into two groups. Group A consists of models that use only autoregressive inputs, whereas Group B consists of models that use both autoregressive and moving-average (i.e., error feedback) inputs. The well-known Box-Jenkins gas furnace multivariate time series was used in the forecasting comparison between the two groups. Simulation results show that the models in Group B achieve significantly better forecasting performance than the models in Group A. Therefore, the Box-Jenkins gas furnace data can be modeled better using neural networks when error feedback is used.
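The difference between the two groups of inputs can be sketched in a few lines: Group A models see only lagged values, while Group B models additionally see the previous forecast error. This is a minimal, hypothetical illustration; the window size, values, and function names are invented, not taken from the paper.

```python
# Sketch: building autoregressive (AR) inputs vs. AR + error-feedback (MA)
# inputs for one-step-ahead forecasting. All values are illustrative.

def ar_window(series, t, p):
    """Return the last p lagged values ending at time t (Group A inputs)."""
    return series[t - p:t]

def ar_ma_window(series, t, p, last_error):
    """AR inputs plus the previous forecast error as an extra MA input
    (Group B inputs)."""
    return series[t - p:t] + [last_error]

series = [1.0, 1.2, 0.9, 1.1, 1.3]
p = 3

# Group A model input at t = 4: only lagged values.
group_a = ar_window(series, 4, p)

# Group B model input at t = 4: lagged values + last forecast error.
last_forecast, actual = 1.05, series[3]          # hypothetical previous step
group_b = ar_ma_window(series, 4, p, round(actual - last_forecast, 6))

print(group_a)   # [1.2, 0.9, 1.1]
print(group_b)   # [1.2, 0.9, 1.1, 0.05]
```

The extra error term is what lets a Group B model correct for its own recent mistakes at each step.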

    The performance of soft computing techniques on content-based SMS spam filtering

    Content-based filtering is one of the most widely used methods to combat SMS (Short Message Service) spam. This method represents SMS text messages by a set of selected features extracted from data sets. Most of the available data sets suffer from an imbalanced class distribution. However, little attention has been paid to handling this problem, which affects the characteristics and size of the selected features and causes undesired performance. Soft computing approaches have been applied successfully to content-based spam filtering. To enhance their performance, a suitable feature subset should be selected. Therefore, this research investigates how well suited three soft computing techniques (Fuzzy Similarity, Artificial Neural Networks and Support Vector Machines (SVM)) are for content-based SMS spam filtering, using an appropriately sized feature set selected with the Gini Index metric, which has the ability to extract suitable features from imbalanced data sets. The data sets used in this research were taken from three sources: the UCI repository, the Dublin Institute of Technology (DIT) and British English SMS. The performance of each technique was compared in terms of True Positive Rate against False Positive Rate, F1 score and Matthews Correlation Coefficient. The results showed that SVM with 150 features outperformed the other techniques on all comparison measures. The average time needed to classify an SMS text message is a fraction of a millisecond. A further test using the NUS SMS corpus was conducted to validate the SVM classifier with 150 features. The results again demonstrated the efficiency of the SVM classifier with 150 features for SMS spam filtering, with an accuracy of about 99.2%.
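The Gini Index feature-selection step described above can be sketched as follows. One common variant scores a term t by GI(t) = Σ_c P(c|t)², which is highest for terms concentrated in a single class; the paper's exact formulation may differ, and the tokens and labels below are invented.

```python
# Sketch: ranking message tokens by a Gini Index score and keeping the top-k.
# GI(t) = sum over classes of P(class | token)^2 is one common variant;
# this is an assumption, not necessarily the paper's exact definition.

from collections import Counter, defaultdict

def gini_scores(docs, labels):
    """Score each token by class purity: GI(t) = sum_c P(c | t)^2."""
    token_class = defaultdict(Counter)        # token -> class -> doc count
    for tokens, label in zip(docs, labels):
        for tok in set(tokens):
            token_class[tok][label] += 1
    scores = {}
    for tok, counts in token_class.items():
        total = sum(counts.values())
        scores[tok] = sum((c / total) ** 2 for c in counts.values())
    return scores

def select_features(docs, labels, k):
    """Keep the k tokens with the highest Gini score (most class-pure)."""
    scores = gini_scores(docs, labels)
    return sorted(scores, key=scores.get, reverse=True)[:k]

docs = [["win", "cash", "now"], ["meet", "now", "lunch"],
        ["cash", "prize"], ["lunch", "tomorrow"]]
labels = ["spam", "ham", "spam", "ham"]

print(select_features(docs, labels, 2))
```

Tokens that appear in only one class (e.g. "cash") score 1.0, while a token split evenly across classes (e.g. "now") scores 0.5 and is ranked lower, which is why this metric tolerates imbalanced data better than raw frequency.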

    Recurrent error-based ridge polynomial neural networks for time series forecasting

    Time series forecasting has attracted much attention due to its impact on many practical applications. Neural networks (NNs) have attracted widespread interest as a promising tool for time series forecasting. The majority of NNs employ only autoregressive (AR) inputs (i.e., lagged time series values) when forecasting time series. Moving-average (MA) inputs (i.e., errors), however, have not been adequately considered. The use of MA inputs, which can be achieved by feeding back forecasting errors as extra network inputs, alongside AR inputs helps to produce more accurate forecasts. Among the numerous existing NN architectures, higher order neural networks (HONNs), which have a single layer of learnable weights, were considered in this research work because they have demonstrated an ability to deal with time series forecasting and have a simple architecture. Based on two HONN models, namely the feedforward ridge polynomial neural network (RPNN) and the recurrent dynamic ridge polynomial neural network (DRPNN), two recurrent error-based models were proposed: the ridge polynomial neural network with error feedback (RPNN-EF) and the ridge polynomial neural network with error-output feedbacks (RPNN-EOF). Extensive simulations covering ten time series were performed. Besides RPNN and DRPNN, a pi-sigma neural network and a Jordan pi-sigma neural network were used for comparison. Simulation results showed that introducing error feedback to the models led to significant forecasting performance improvements. Furthermore, the proposed models were found to outperform many state-of-the-art models. It was concluded that the proposed models can efficiently forecast time series and that practitioners could benefit from using these forecasting models.
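The core mechanism described above (a pi-sigma style unit, the building block of ridge polynomial networks, fed with lagged values plus the previous forecast error) can be sketched as a simplified single-unit forward pass. The weights, window size, and data are illustrative, and a real RPNN-EF sums several pi-sigma blocks of increasing order; this is only a sketch of the idea.

```python
import math

# Sketch: a single pi-sigma unit (product of linear "sigma" units) whose
# input is augmented with the previous forecast error. Weights are
# illustrative, not trained values.

def pi_sigma(x, weight_sets, biases):
    """Product of k linear units, passed through a sigmoid output."""
    product = 1.0
    for w, b in zip(weight_sets, biases):
        product *= sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-product))

def forecast_with_error_feedback(series, weight_sets, biases, p):
    """One-step forecasts; each input = p lagged values + previous error."""
    forecasts, prev_error = [], 0.0
    for t in range(p, len(series)):
        x = series[t - p:t] + [prev_error]     # AR inputs + MA input
        y_hat = pi_sigma(x, weight_sets, biases)
        forecasts.append(y_hat)
        prev_error = series[t] - y_hat         # fed back at the next step
    return forecasts

series = [0.2, 0.4, 0.5, 0.45, 0.6, 0.55]
p = 2
# Two sigma units over (p + 1) inputs -> a second-order pi-sigma unit.
weight_sets = [[0.3, 0.5, 0.2], [0.1, -0.2, 0.4]]
biases = [0.1, 0.05]

print(forecast_with_error_feedback(series, weight_sets, biases, p))
```

Note that the error feedback makes the network recurrent even though the pi-sigma unit itself is feedforward, which is exactly the distinction the abstract draws between RPNN and the error-based variants.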

    Improving the Accuracy of COCOMO II Effort Estimation Based on Neural Network with Hyperbolic Tangent Activation Function

    Constructive Cost Model II (COCOMO II) is one of the best-known software cost estimation models. The estimation of effort in COCOMO II depends on several attributes, categorized into software size (SS), scale factors (SFs) and effort multipliers (EMs). However, providing accurate estimates remains unsatisfactory in software management. The Neural Network (NN) is one of several approaches developed to improve the accuracy of COCOMO II. The literature reports that learning with the sigmoid function is often mismatched and ill-behaved. Thus, this research proposes using the Hyperbolic Tangent (Tanh) activation function in the hidden layer of the NN. Two different NN architectures combined with COCOMO (the basic COCOMO-NN and the modified COCOMO-NN) are used. The back-propagation learning algorithm is applied to adjust the COCOMO II effort estimation parameters. The NASA93 dataset is used in the experiments. Magnitude of Relative Error (MRE) and Mean Magnitude of Relative Error (MMRE) are used as evaluation criteria. This research compares the performance of the Tanh activation function with several other activation functions, namely the uni-polar sigmoid, bi-polar sigmoid, Gaussian and Softsign activation functions. The experimental results indicate that Tanh with the modified COCOMO-NN architecture produces better results than the other activation functions.
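The evaluation criteria named above are standard and easy to state precisely: MRE = |actual − estimated| / actual for one project, and MMRE is its mean over all projects. A minimal sketch, with invented effort values rather than NASA93 data:

```python
import math

# Sketch: the tanh activation and the MRE / MMRE criteria used to compare
# the COCOMO-NN variants. Effort values below are hypothetical.

def tanh(x):
    return math.tanh(x)          # bipolar output in (-1, 1)

def mre(actual, estimated):
    """Magnitude of Relative Error for one project."""
    return abs(actual - estimated) / actual

def mmre(actuals, estimates):
    """Mean MRE over a set of projects (lower is better)."""
    return sum(mre(a, e) for a, e in zip(actuals, estimates)) / len(actuals)

actual_effort = [100.0, 250.0, 60.0]     # person-months (hypothetical)
estimated     = [ 90.0, 300.0, 66.0]

print(round(mmre(actual_effort, estimated), 4))   # mean of 0.1, 0.2, 0.1
```

Because MRE is relative, MMRE weights small and large projects equally, which is why it is a common yardstick in effort-estimation studies.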

    Mackey-Glass time series


    Ridge Polynomial Neural Network with Error Feedback for Time Series Forecasting.

    Time series forecasting has gained much attention due to its many practical applications. A higher-order neural network with recurrent feedback is a powerful technique that has been used successfully for time series forecasting; it maintains fast learning and the ability to learn the dynamics of the time series over time. Network output feedback is the most common recurrent feedback in recurrent neural network models. However, little attention has been paid to using network error feedback instead of network output feedback. In this study, we propose a novel model, called the Ridge Polynomial Neural Network with Error Feedback (RPNN-EF), that incorporates higher order terms, recurrence and error feedback. To evaluate the performance of RPNN-EF, we used four univariate time series with different forecasting horizons, namely star brightness, monthly smoothed sunspot numbers, the daily Euro/Dollar exchange rate, and the Mackey-Glass time-delay differential equation. We compared the forecasting performance of RPNN-EF with the ordinary Ridge Polynomial Neural Network (RPNN) and the Dynamic Ridge Polynomial Neural Network (DRPNN). Simulation results showed an average 23.34% improvement in Root Mean Square Error (RMSE) with respect to RPNN and an average 10.74% improvement with respect to DRPNN. This means that using network errors during training helps to enhance the overall forecasting performance of the network.
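The percentage-improvement figures quoted above follow from RMSE in a straightforward way: improvement = (RMSE_baseline − RMSE_model) / RMSE_baseline × 100. A minimal sketch with invented forecasts (not the study's data):

```python
import math

# Sketch: RMSE and the percentage-improvement measure implied by the
# abstract. All series values below are hypothetical.

def rmse(actual, predicted):
    """Root Mean Square Error over paired observations."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

def improvement_pct(rmse_baseline, rmse_model):
    """Positive when the model beats the baseline."""
    return (rmse_baseline - rmse_model) / rmse_baseline * 100.0

actual   = [1.0, 2.0, 3.0, 4.0]
baseline = [1.5, 2.5, 2.5, 4.5]   # hypothetical RPNN-style forecasts
model    = [1.2, 2.1, 2.8, 4.1]   # hypothetical RPNN-EF-style forecasts

r_base, r_model = rmse(actual, baseline), rmse(actual, model)
print(round(improvement_pct(r_base, r_model), 2))
```

Averaging this quantity across the four series is how a single headline figure such as "23.34% improvement" would be obtained.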

    Ridge Polynomial Neural Network with Error Feedback.

    No full text
    Figure caption: PSNN stands for Pi-Sigma Neural Network, d(t + 1) is the desired output at time t + 1, and Z⁻¹ denotes the time delay operator.