    An epidemic model with viral mutations and vaccine interventions


    Developing an optimized recurrent neural network model for air quality prediction using K-Means clustering and PCA dimension reduction

    Prediction is a means of forecasting a future value by analyzing historical or current data. A popular neural network architecture used as a prediction model is the Recurrent Neural Network (RNN), owing to its wide applicability and high generalization performance. This study aims to improve the RNN prediction model’s accuracy using k-means clustering and PCA dimension reduction, comparing five distance functions. Data were processed in Python, and the PCA calculation yielded three principal components out of the five variables examined. The optimized RNN prediction model used k-means clustering, comparing the Euclidean, Manhattan, Canberra, Average, and Chebyshev distance functions as similarity measures for data grouping, to avoid becoming trapped in a locally optimal solution. PCA dimension reduction was also used to facilitate the multivariate data analysis. The k-means clustering showed that the most suitable distance measure was the average distance function, producing a Davies-Bouldin Index (DBI) of 0.60855 and converging at the 9th iteration. The RNN prediction model was evaluated by its errors, yielding an RMSE of 0.83 and a MAPE of 8.62%. It was therefore concluded that the k-means and PCA methods produced a more optimal prediction model for the RNN method.
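The preprocessing pipeline this abstract describes (PCA to three components, then k-means under alternative distance functions, scored by the Davies-Bouldin Index) can be sketched roughly as below. The synthetic data, cluster count, and the coordinate-mean centroid update are assumptions; scipy's `cityblock` metric stands in for Manhattan distance, and the paper's "average" distance is omitted because it is not a standard scipy metric.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.decomposition import PCA
from sklearn.metrics import davies_bouldin_score

def kmeans(X, k, metric="euclidean", n_iter=100, seed=0):
    """Plain k-means with a pluggable distance metric for the assignment step.

    Centroids are still updated as coordinate means, which is an assumption;
    the paper does not specify its update rule for non-Euclidean metrics.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to the nearest center under the chosen metric.
        labels = cdist(X, centers, metric=metric).argmin(axis=1)
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break  # converged
        centers = new_centers
    return labels, centers

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))             # stand-in for the five air-quality features
Z = PCA(n_components=3).fit_transform(X)  # keep 3 principal components, as in the paper

for metric in ("euclidean", "cityblock", "canberra", "chebyshev"):
    labels, _ = kmeans(Z, k=3, metric=metric)
    print(metric, round(davies_bouldin_score(Z, labels), 3))
```

Lower DBI indicates better-separated, more compact clusters, which is how the abstract ranks the distance functions.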

    Representation of Functional Data in Neural Networks

    Functional Data Analysis (FDA) is an extension of traditional data analysis to functional data, for example spectra, temporal series, spatio-temporal images, gesture recognition data, etc. Functional data are rarely known in practice; usually only a regular or irregular sampling is known. For this reason, some processing is needed in order to benefit from the smooth character of functional data in the analysis methods. This paper shows how to extend the Radial-Basis Function Network (RBFN) and Multi-Layer Perceptron (MLP) models to functional data inputs, in particular when the latter are known through lists of input-output pairs. Various possibilities for functional processing are discussed, including projection on smooth bases, Functional Principal Component Analysis, functional centering and reduction, and the use of differential operators. It is shown how to incorporate this functional processing into the RBFN and MLP models. The functional approach is illustrated on a benchmark of spectrometric data analysis. Comment: Also available online from: http://www.sciencedirect.com/science/journal/0925231
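The functional preprocessing described above (projecting sampled curves onto a smooth basis and feeding the coefficients to a neural model) can be sketched as follows. A polynomial basis and an `MLPRegressor` are stand-ins for the paper's spline bases and models; the synthetic curves and scalar target are illustrative assumptions, not the paper's spectrometric benchmark.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_curves, n_samples, degree = 100, 50, 4

# Irregularly sampled curves y_i(t) = sin(a_i * t) + noise; the target is a_i.
t = np.sort(rng.uniform(0, np.pi, size=n_samples))
a = rng.uniform(0.5, 2.0, size=n_curves)
Y = np.sin(a[:, None] * t) + 0.05 * rng.normal(size=(n_curves, n_samples))

# Functional preprocessing: project each sampled curve onto a smooth basis
# by least squares; the MLP then sees a fixed-length coefficient vector
# instead of the raw (possibly irregular) samples.
coef = np.array([np.polynomial.polynomial.polyfit(t, y, degree) for y in Y])

mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(coef, a)
pred = mlp.predict(coef)
print("train R^2:", round(mlp.score(coef, a), 2))
```

Because the basis projection smooths the noise and fixes the input dimension, the same network can handle curves sampled at different points, which is the motivation the abstract gives.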

    Dimensionality Reduction Mappings

    A wealth of powerful dimensionality reduction methods has been established which can be used for data visualization and preprocessing. These are accompanied by formal evaluation schemes, which allow a quantitative evaluation along general principles and which even lead to further visualization schemes based on these objectives. Most methods, however, provide a mapping only for an a priori given finite set of points, requiring additional steps for out-of-sample extensions. We propose a general view on dimensionality reduction based on the concept of cost functions and, building on this general principle, extend dimensionality reduction to explicit mappings of the data manifold. This offers simple out-of-sample extensions. Further, it opens a way towards a theory of data visualization that takes the perspective of its generalization ability to new data points. We demonstrate the approach with a simple global linear mapping as well as with prototype-based local linear mappings.
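The distinction the abstract draws can be illustrated with the simplest case it mentions, a global linear mapping: because the method learns an explicit map rather than only coordinates for the training points, new points are embedded by applying the same map, so the out-of-sample extension is trivial. The sketch below uses PCA as the explicit linear map; the synthetic near-planar data are an assumption.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# 3-D data lying near a 2-D plane spanned by two random directions.
basis = rng.normal(size=(2, 3))
X_train = rng.normal(size=(300, 2)) @ basis + 0.01 * rng.normal(size=(300, 3))
X_new = rng.normal(size=(10, 2)) @ basis   # points unseen during fitting

pca = PCA(n_components=2).fit(X_train)     # learns an explicit linear map
Z_new = pca.transform(X_new)               # out-of-sample extension: just apply the map
X_back = pca.inverse_transform(Z_new)

# Points from the same manifold are reconstructed almost perfectly.
err = np.linalg.norm(X_back - X_new) / np.linalg.norm(X_new)
print("relative reconstruction error:", round(err, 4))
```

A method that only assigns low-dimensional coordinates to the training set (e.g. classical t-SNE) has no such map and needs an extra interpolation step for new points, which is the gap the paper's cost-function view closes.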