58,342 research outputs found

    Time Series Prediction with Recurrent Neural Networks using a Hybrid PSO-EA Algorithm

    To predict the 100 missing values of the 5,000-point time series given for the IJCNN 2004 time series prediction competition, we applied an architecture that automates the design of recurrent neural networks using a new evolutionary learning algorithm. This algorithm is based on a hybrid of particle swarm optimization (PSO) and an evolutionary algorithm (EA). By combining the search abilities of these two global optimization methods, the evolution of individuals is no longer restricted to a single generation, and better-performing individuals may produce offspring that replace those with poor performance. The hybrid algorithm is then applied to a recurrent neural network for time series prediction. Experimental results show that our approach performs well in predicting the missing values of the time series.
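
    The hybrid search described above can be illustrated with a small sketch: PSO velocity updates move candidate RNN weight vectors toward personal and global bests, while an EA-style step lets the best individuals produce offspring that replace the worst. The Elman-style network, swarm size, and coefficients below are illustrative assumptions, not the competition setup.

```python
# Minimal sketch of a hybrid PSO-EA weight search for a small recurrent network,
# assuming a plain NumPy Elman-style RNN and a synthetic series (names such as
# hybrid_pso_ea and elman_forecast are illustrative, not from the paper).
import numpy as np

rng = np.random.default_rng(0)

def elman_forecast(weights, series, hidden=8):
    """Run a tiny Elman RNN over the series and return one-step predictions."""
    w_in, w_rec, w_out = np.split(weights, [hidden, hidden + hidden * hidden])
    w_rec = w_rec.reshape(hidden, hidden)
    h = np.zeros(hidden)
    preds = []
    for x in series[:-1]:
        h = np.tanh(w_in * x + w_rec @ h)
        preds.append(w_out @ h)
    return np.array(preds)

def fitness(weights, series):
    return np.mean((elman_forecast(weights, series) - series[1:]) ** 2)

def hybrid_pso_ea(series, dim, n_particles=20, iters=100):
    pos = rng.normal(scale=0.3, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p, series) for p in pos])
    for _ in range(iters):
        gbest = pbest[pbest_f.argmin()]
        # PSO step: move every particle toward its own best and the global best.
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        # EA step: offspring of the best individuals replace the worst ones,
        # so good solutions reproduce across "generations".
        order = pbest_f.argsort()
        for bad, (pa, pb) in zip(order[-3:], zip(order[:3], order[1:4])):
            pos[bad] = 0.5 * (pbest[pa] + pbest[pb]) + rng.normal(scale=0.05, size=dim)
        f = np.array([fitness(p, series) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    return pbest[pbest_f.argmin()], pbest_f.min()

series = np.sin(np.linspace(0, 20, 200))   # stand-in for the competition data
hidden = 8
best_w, best_mse = hybrid_pso_ea(series, dim=hidden + hidden * hidden + hidden)
print(f"best one-step MSE: {best_mse:.4f}")
```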

    Predicting wind energy generation with recurrent neural networks

    Decarbonizing the energy supply requires extensive use of renewable generation. Its intermittent nature makes accurate forecasts of future generation necessary at short, mid, and long term. Wind energy generation prediction rests on the ability to forecast wind intensity. This problem has been approached with two families of methods: one based on weather forecasting input (Numerical Weather Prediction models) and the other based on past observations (time series forecasting). This work deals with the application of deep learning to wind time series. Wind time series are non-linear and non-stationary, which makes their forecasting very challenging. Deep neural networks have recently shown success on problems involving sequences with non-linear behavior. In this work, we perform experiments comparing the capability of different neural network architectures for multi-step forecasting with a 12 h ahead prediction horizon. As time series input we used the US National Renewable Energy Laboratory's WIND dataset [3], the largest available wind and energy dataset with over 120,000 physical wind sites, spread evenly across the North American geography, which allowed us to draw conclusions on the relationship between physical site complexity and forecast accuracy. Preliminary results show a relationship between the error (measured as R²) and the complexity of the terrain, and better accuracy scores for some recurrent neural network architectures.
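
    As a minimal illustration of the multi-step setting, the sketch below trains a GRU that maps a window of past wind speed to all 12 future steps at once. It assumes PyTorch and a synthetic hourly series in place of the WIND Toolkit data; the window length, hidden size, and direct multi-output head are illustrative choices rather than the architectures compared in the paper.

```python
# 12-step-ahead wind-speed forecasting sketch with a GRU (assumed setup).
import numpy as np
import torch
from torch import nn

def make_windows(series, past=48, horizon=12):
    """Slice a 1-D series into (past-window, next-horizon) training pairs."""
    X, Y = [], []
    for i in range(len(series) - past - horizon):
        X.append(series[i:i + past])
        Y.append(series[i + past:i + past + horizon])
    return (torch.tensor(np.array(X), dtype=torch.float32).unsqueeze(-1),
            torch.tensor(np.array(Y), dtype=torch.float32))

class GRUForecaster(nn.Module):
    def __init__(self, hidden=64, horizon=12):
        super().__init__()
        self.rnn = nn.GRU(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon)   # predict all 12 steps at once

    def forward(self, x):
        _, h = self.rnn(x)          # h: (1, batch, hidden)
        return self.head(h[-1])     # (batch, horizon)

# Synthetic hourly wind speed stands in for one WIND Toolkit site.
t = np.arange(5000)
wind = 8 + 3 * np.sin(2 * np.pi * t / 24) + np.random.default_rng(1).normal(0, 0.5, t.size)
X, Y = make_windows(wind)

model = GRUForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(5):                       # short full-batch demo training loop
    pred = model(X)
    loss = loss_fn(pred, Y)
    opt.zero_grad(); loss.backward(); opt.step()
    print(f"epoch {epoch}: MSE {loss.item():.3f}")
```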

    STING: Self-attention based Time-series Imputation Networks using GAN

    Time series data are ubiquitous in real-world applications. However, one of the most common problems is that time series data can have missing values owing to the inherent nature of the data collection process. Imputing missing values in multivariate (correlated) time series data is therefore imperative for improving prediction performance and making accurate data-driven decisions. Conventional imputation methods simply delete missing values or fill them with means or zeros. Although recent works based on deep neural networks have shown remarkable results, they are still limited in capturing the complex generative process of multivariate time series. In this paper, we propose a novel imputation method for multivariate time series data, called STING (Self-attention based Time-series Imputation Networks using GAN). We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series. In addition, we introduce a novel attention mechanism to capture the weighted correlations of the whole sequence and avoid potential bias introduced by unrelated ones. Experimental results on three real-world datasets demonstrate that STING outperforms existing state-of-the-art methods in terms of imputation accuracy as well as downstream tasks using the imputed values. Comment: 10 pages; accepted version of an ICDM'21 paper. The published version is https://ieeexplore.ieee.org/abstract/document/967918
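
    A rough sketch of the generator side of such an imputer is given below: a bidirectional GRU plus self-attention reconstructs masked entries, trained with a masked reconstruction loss. The adversarial discriminator and the paper's exact losses are omitted; all module names and sizes are assumptions.

```python
# Attention-based imputer sketch in the spirit of STING (generator side only).
import torch
from torch import nn

class AttnImputer(nn.Module):
    def __init__(self, n_features, hidden=32, heads=4):
        super().__init__()
        self.rnn = nn.GRU(n_features * 2, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.out = nn.Linear(2 * hidden, n_features)

    def forward(self, x, mask):
        # mask is 1 where a value is observed, 0 where it is missing.
        inp = torch.cat([x * mask, mask], dim=-1)   # feed values plus missingness pattern
        h, _ = self.rnn(inp)
        h, _ = self.attn(h, h, h)                   # self-attention over the whole sequence
        recon = self.out(h)
        # keep observed values, fill the gaps with the reconstruction
        return mask * x + (1 - mask) * recon, recon

# Training signal: reconstruct the observed entries (masked MSE).
torch.manual_seed(0)
x = torch.randn(16, 50, 3)                     # batch of multivariate series
mask = (torch.rand_like(x) > 0.2).float()      # ~20% of entries treated as missing
model = AttnImputer(n_features=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(3):
    imputed, recon = model(x, mask)
    loss = ((recon - x) ** 2 * mask).sum() / mask.sum()
    opt.zero_grad(); loss.backward(); opt.step()
print("masked reconstruction loss:", loss.item())
```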

    Generating Energy Data for Machine Learning with Recurrent Generative Adversarial Networks

    The smart grid employs computing and communication technologies to embed intelligence into the power grid and, consequently, make the grid more efficient. Machine learning (ML) has been applied to tasks that are important for smart grid operation, including energy consumption and generation forecasting, anomaly detection, and state estimation. These ML solutions commonly require sufficient historical data; however, such data are often not readily available because of data collection costs and concerns regarding security and privacy. This paper introduces a recurrent generative adversarial network (R-GAN) for generating realistic energy consumption data by learning from real data. Generative adversarial networks (GANs) have mostly been used for image tasks (e.g., image generation, super-resolution), but here they are used with time series data. The convolutional neural networks (CNNs) of image GANs are replaced with recurrent neural networks (RNNs) because of the RNN's ability to capture temporal dependencies. To improve training stability and increase the quality of the generated data, Wasserstein GAN (WGAN) and Metropolis-Hastings GAN (MH-GAN) approaches were applied. Accuracy is further improved by adding features created with ARIMA and the Fourier transform. Experiments demonstrate that data generated by R-GAN can be used to train energy forecasting models.
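
    The sketch below shows a compact recurrent GAN in this spirit: a GRU generator maps noise sequences to synthetic load profiles and a GRU critic scores them, trained with a WGAN-style loss and weight clipping. The ARIMA/Fourier features and the MH-GAN sampling step are omitted, and all sizes and data are illustrative assumptions, not the paper's configuration.

```python
# Recurrent GAN sketch: GRU generator and critic with a WGAN-style objective.
import torch
from torch import nn

class SeqGenerator(nn.Module):
    def __init__(self, noise_dim=8, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(noise_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)
    def forward(self, z):                 # z: (batch, steps, noise_dim)
        h, _ = self.rnn(z)
        return self.out(h)                # (batch, steps, 1) synthetic profile

class SeqCritic(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(1, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)
    def forward(self, x):
        _, h = self.rnn(x)
        return self.out(h[-1])            # one realism score per sequence

torch.manual_seed(0)
steps, batch = 24, 32                     # e.g. one day of hourly consumption
real = torch.rand(batch, steps, 1)        # stand-in for real load profiles
G, D = SeqGenerator(), SeqCritic()
g_opt = torch.optim.RMSprop(G.parameters(), lr=5e-5)
d_opt = torch.optim.RMSprop(D.parameters(), lr=5e-5)
for it in range(5):
    # critic step: maximize D(real) - D(fake), then clip weights (WGAN)
    z = torch.randn(batch, steps, 8)
    d_loss = -(D(real).mean() - D(G(z).detach()).mean())
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()
    for p in D.parameters():
        p.data.clamp_(-0.01, 0.01)
    # generator step: maximize D(fake)
    g_loss = -D(G(torch.randn(batch, steps, 8))).mean()
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
print("synthetic profile shape:", G(torch.randn(1, steps, 8)).squeeze().shape)
```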

    StyleTime: Style Transfer for Synthetic Time Series Generation

    Neural style transfer is a powerful computer vision technique that can incorporate the artistic "style" of one image into the "content" of another. The underlying theory relies on the assumption that the style of an image is represented by the Gram matrix of its features, which are typically extracted from pre-trained convolutional neural networks (e.g., VGG-19). This idea does not straightforwardly extend to time series stylization, since notions of style for two-dimensional images are not analogous to notions of style for one-dimensional time series. In this work, a novel formulation of time series style transfer is proposed for the purpose of synthetic data generation and enhancement. We introduce the concept of stylized features for time series, which is directly related to time series realism properties, and propose a novel stylization algorithm, called StyleTime, that uses explicit feature extraction techniques to combine the underlying content (trend) of one time series with the style (distributional properties) of another. Further, we discuss evaluation metrics and compare our work to existing state-of-the-art time series generation and augmentation schemes. To validate the effectiveness of our methods, we use stylized synthetic data for data augmentation to improve the performance of recurrent neural network models on several forecasting tasks.
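
    The idea can be illustrated, without reproducing StyleTime's actual features or losses, by optimizing a series to keep the moving-average trend ("content") of one series while matching simple increment statistics ("style") of another; everything below is an assumed, simplified stand-in for the algorithm.

```python
# Illustrative time-series "style transfer" by direct gradient descent on the series.
import torch

def trend(x, k=25):
    """Moving-average trend, used here as the 'content' feature."""
    pad = k // 2
    xp = torch.cat([x[:1].repeat(pad), x, x[-1:].repeat(pad)])
    return xp.unfold(0, k, 1).mean(dim=1)

def style_stats(x):
    """Crude 'style' features: variance and lag-1 autocovariance of increments."""
    d = x[1:] - x[:-1]
    return torch.stack([d.var(), (d[1:] * d[:-1]).mean()])

torch.manual_seed(0)
t = torch.linspace(0, 6.28, 300)
content = 0.5 * t + torch.sin(t)                     # smooth trend to preserve
style = torch.cumsum(torch.randn(300) * 0.3, dim=0)  # noisy series whose texture we want

x = content.clone().requires_grad_(True)             # start from the content series
opt = torch.optim.Adam([x], lr=0.01)
for step in range(500):
    loss = ((trend(x) - trend(content)) ** 2).mean() \
         + 10.0 * ((style_stats(x) - style_stats(style)) ** 2).sum()
    opt.zero_grad(); loss.backward(); opt.step()
print("final stylization loss:", loss.item())
```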

    Synaptic state matching: a dynamical architecture for predictive internal representation and feature perception

    Here we consider the possibility that a fundamental function of sensory cortex is the generation of an internal simulation of the sensory environment in real time. A logical elaboration of this idea leads to a dynamical neural architecture that oscillates between two fundamental network states, one driven by external input and the other by recurrent synaptic drive in the absence of sensory input. Synaptic strength is modified by a proposed synaptic state matching (SSM) process that ensures equivalence of spike statistics between the two network states. Remarkably, SSM, operating locally at individual synapses, generates accurate and stable network-level predictive internal representations, enabling pattern completion and unsupervised feature detection from noisy sensory input. SSM is a biologically plausible substrate for learning and memory because it brings together sequence learning, feature detection, synaptic homeostasis, and network oscillations under a single parsimonious computational framework. Beyond its utility as a potential model of cortical computation, artificial networks based on this principle have a remarkable capacity for internalizing dynamical systems, making them useful in a variety of application domains including time-series prediction and machine intelligence.
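
    A highly simplified, rate-based reading of this idea is sketched below: the network alternates between an input-driven state and a free-running recurrent state, and the recurrent weights are nudged to close the gap between the second-order activity statistics of the two states. This is an interpretation for illustration only; the paper's rule operates locally on spike statistics.

```python
# Rate-based toy version of "state matching" between driven and free-running modes.
import numpy as np

rng = np.random.default_rng(0)
n, T, eta = 50, 200, 0.01
W = rng.normal(scale=0.1, size=(n, n))
pattern = rng.normal(size=(T, n))                  # "sensory" input sequence

def run(W, driven):
    """Simulate the network; with driven=True the external pattern is injected."""
    r = np.zeros(n)
    rates = []
    for t in range(T):
        ext = pattern[t] if driven else 0.0
        r = np.tanh(W @ r + ext)
        rates.append(r)
    return np.array(rates)

for epoch in range(50):
    driven_rates = run(W, driven=True)     # state driven by external input
    free_rates = run(W, driven=False)      # state driven only by recurrence
    # match second-order statistics (activity correlations) of the two states
    dC = driven_rates.T @ driven_rates / T - free_rates.T @ free_rates / T
    W += eta * dC                          # move recurrent weights to close the gap
print(f"mean statistic mismatch after training: {np.abs(dC).mean():.4f}")
```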

    Innovative Second-Generation Wavelets Construction With Recurrent Neural Networks for Solar Radiation Forecasting

    Solar radiation prediction is an important challenge for electrical engineers because it is used to estimate the power produced by commercial photovoltaic modules. This paper deals with the problem of solar radiation prediction based on observed meteorological data. A two-day forecast is obtained using novel wavelet recurrent neural networks (WRNNs). These WRNNs are used to exploit the correlation between solar radiation and timescale-related variations of wind speed, humidity, and temperature. The input to the selected WRNN is provided by timescale-related bands of wavelet coefficients obtained from meteorological time series; this information was provided by the experimental setup available at the University of Catania, Italy. The novelty of this approach is that the proposed WRNN performs the prediction in the wavelet domain and, in addition, also performs the inverse wavelet transform, giving the predicted signal as output. The simulation results show a very low root-mean-square error compared to the solar radiation prediction approaches based on hybrid neural networks reported in the recent literature.
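
    The wavelet-domain pipeline can be sketched as follows, under assumed choices (db4 wavelet, three decomposition levels, one small GRU per coefficient band): decompose the series into timescale bands, forecast each band's next coefficient with an RNN, and map predictions back through the inverse transform. Hyperparameters and data here are illustrative, not the paper's configuration.

```python
# Wavelet-domain forecasting sketch: per-band GRU prediction plus inverse transform.
import numpy as np
import pywt
import torch
from torch import nn

rng = np.random.default_rng(0)
t = np.arange(1024)
radiation = np.clip(np.sin(2 * np.pi * t / 256), 0, None) + rng.normal(0, 0.05, t.size)

bands = pywt.wavedec(radiation, 'db4', level=3)    # [approx, detail3, detail2, detail1]

class BandRNN(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.GRU(1, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)
    def forward(self, x):
        _, h = self.rnn(x)
        return self.out(h[-1]).squeeze(-1)

def windows(band, past=16):
    """One-step-ahead training pairs within a single coefficient band."""
    X = np.stack([band[i:i + past] for i in range(len(band) - past)])
    return (torch.tensor(X, dtype=torch.float32).unsqueeze(-1),
            torch.tensor(band[past:], dtype=torch.float32))

next_coefs = []
for band in bands:                                  # train one small model per band
    model = BandRNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    X, y = windows(band)
    for _ in range(100):
        loss = ((model(X) - y) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    next_coefs.append(model(X[-1:]).item())         # predicted next coefficient

# In the full method the predicted coefficient bands are inverse-transformed;
# here we just confirm that the transform round-trips the original signal.
recon = pywt.waverec(bands, 'db4')[:len(radiation)]
print("round-trip max error:", float(np.abs(recon - radiation).max()))
print("predicted next coefficient per band:", [round(c, 3) for c in next_coefs])
```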