
    An Integrated Multi-Time-Scale Modeling for Solar Irradiance Forecasting Using Deep Learning

    For short-term solar irradiance forecasting, traditional point-forecasting methods are rendered less useful by the non-stationary characteristics of solar power. The amount of operating reserve required to maintain reliable operation of the electric grid rises with the variability of solar energy: the higher the uncertainty in generation, the greater the operating-reserve requirement, which translates into increased operating cost. In this research work, we propose a unified architecture for multi-time-scale predictions for intra-day solar irradiance forecasting using recurrent neural networks (RNNs) and long short-term memory (LSTM) networks. This paper also lays out a framework for extending this modeling approach to intra-hour forecasting horizons, thus making it a multi-time-horizon forecasting approach, capable of predicting intra-hour as well as intra-day solar irradiance. We develop an end-to-end pipeline to implement the proposed architecture, and the performance of the prediction model is tested and validated through this methodical implementation. The robustness of the approach is demonstrated with case studies conducted at geographically scattered sites across the United States. The predictions show that our proposed unified-architecture approach is effective for multi-time-scale solar forecasts and achieves a lower root-mean-square prediction error when benchmarked against the best-performing methods documented in the literature, which use a separate model for each time-scale during the day. Our proposed method yields a 71.5% reduction in mean RMSE, averaged across all test sites, compared to the best-performing ML-based method reported in the literature. Additionally, the proposed method enables multi-time-horizon forecasts with real-time inputs, which has significant potential for practical industry applications in the evolving grid.
    Comment: 19 pages, 12 figures, 3 tables, under review for journal submission
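    A unified multi-time-scale model of this kind can be pictured as a shared recurrent encoder feeding one output head per forecast horizon. The numpy sketch below illustrates that structural idea only; the layer sizes, feature count, and head names are illustrative assumptions, not the authors' trained RNN/LSTM architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

class MultiHorizonRNN:
    """Shared Elman-style recurrent encoder with one linear head per horizon."""

    def __init__(self, n_in, n_hidden, horizons=("intra_hour", "intra_day")):
        self.Wx = rng.normal(0.0, 0.1, (n_hidden, n_in))      # input weights
        self.Wh = rng.normal(0.0, 0.1, (n_hidden, n_hidden))  # recurrent weights
        # one output head per time-scale, all sharing the recurrent state
        self.heads = {name: rng.normal(0.0, 0.1, n_hidden) for name in horizons}

    def forecast(self, sequence):
        h = np.zeros(self.Wh.shape[0])
        for x in sequence:                        # roll the encoder over time
            h = np.tanh(self.Wx @ x + self.Wh @ h)
        return {name: float(w @ h) for name, w in self.heads.items()}

model = MultiHorizonRNN(n_in=3, n_hidden=8)
irradiance_seq = rng.random((24, 3))              # 24 time steps, 3 features
preds = model.forecast(irradiance_seq)            # one forecast per horizon
```

    Because the encoder is shared, a single model serves every horizon at once, which is what allows one architecture to replace the separate per-time-scale models used by the benchmarks.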

    A General Spatio-Temporal Clustering-Based Non-local Formulation for Multiscale Modeling of Compartmentalized Reservoirs

    Representing the reservoir as a network of discrete compartments with neighbor and non-neighbor connections is a fast yet accurate method for analyzing oil and gas reservoirs. Automatic and rapid detection of coarse-scale compartments with distinct static and dynamic properties is an integral part of such high-level reservoir analysis. In this work, we present a novel hybrid framework for reservoir analysis that couples a physics-based non-local multiscale modeling approach with data-driven clustering techniques, providing automatic detection of clusters from spatial and temporal field data together with fast and accurate multiscale modeling of compartmentalized reservoirs. This research also adds to the literature by presenting a comprehensive treatment of spatio-temporal clustering for reservoir-study applications that accounts for the complexities of clustering, the intrinsically sparse and noisy nature of the data, and the interpretability of the outcome. Keywords: Artificial Intelligence; Machine Learning; Spatio-Temporal Clustering; Physics-Based Data-Driven Formulation; Multiscale Modeling
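    The clustering step alone can be sketched very simply: concatenate scaled spatial coordinates with scaled temporal (rate) features and cluster the joint vectors. The k-means variant, weights, and synthetic two-compartment data below are illustrative assumptions, not the paper's full physics-coupled framework.

```python
import numpy as np

rng = np.random.default_rng(1)

def spatio_temporal_kmeans(coords, rates, k=2, w_space=1.0, w_time=1.0, iters=20):
    """Plain k-means on z-scored spatial + temporal features, with weights."""
    z = lambda a: (a - a.mean(0)) / (a.std(0) + 1e-9)   # per-feature scaling
    X = np.hstack([w_space * z(coords), w_time * z(rates)])
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]  # spread-out init
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels

# synthetic field: two spatial compartments with distinct rate behaviour
coords = np.vstack([rng.normal(0, 0.1, (10, 2)), rng.normal(5, 0.1, (10, 2))])
rates = np.vstack([rng.normal(1, 0.05, (10, 6)), rng.normal(3, 0.05, (10, 6))])
labels = spatio_temporal_kmeans(coords, rates, k=2)
```

    The `w_space`/`w_time` weights make explicit the trade-off the abstract alludes to: how much compartment membership should be driven by location versus by dynamic (production) behaviour.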

    Kernel methods for short-term spatio-temporal wind prediction

    Two nonlinear methods for producing short-term spatio-temporal wind speed forecasts are presented. From the relatively new class of kernel methods, a kernel least mean squares algorithm and a kernel recursive least squares algorithm are introduced and used to produce 1- to 6-hour-ahead predictions of wind speed at six locations in the Netherlands. The performance of the proposed methods is compared to that of their linear equivalents, as well as to autoregressive, vector autoregressive, and persistence time series models. The kernel recursive least squares algorithm is shown to offer significant improvement over all benchmarks, particularly at longer forecast horizons. Both proposed algorithms exhibit desirable numerical properties and are ripe for further development.
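    Of the two algorithms, kernel least mean squares is the simpler to sketch: each new sample becomes a kernel center, and its coefficient is the step size times the prediction error. The embedding length, kernel width, and step size below are illustrative, and the toy series is a sine wave rather than wind data.

```python
import numpy as np

def gauss_kernel(a, b, sigma=1.0):
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

class KLMS:
    """Online kernel least mean squares with a Gaussian kernel."""

    def __init__(self, eta=0.5, sigma=1.0):
        self.eta, self.sigma = eta, sigma
        self.centers, self.alphas = [], []

    def predict(self, x):
        return sum(a * gauss_kernel(c, x, self.sigma)
                   for c, a in zip(self.centers, self.alphas))

    def update(self, x, y):
        err = y - self.predict(x)      # prediction error drives the update
        self.centers.append(x)         # every sample becomes a kernel center
        self.alphas.append(self.eta * err)
        return err

# toy online run: one-step-ahead prediction of a smooth "wind-like" series
series = np.sin(np.arange(200) * 0.1)
model, errs = KLMS(), []
for i in range(3, len(series) - 1):
    x = series[i - 3:i]                # embed the last 3 samples
    errs.append(abs(model.update(x, series[i])))
```

    The unbounded growth of `self.centers` is the known cost of KLMS; the recursive least squares variant favoured in the paper typically adds a dictionary/sparsification rule to control it.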

    A Comprehensive Survey of Deep Learning in Remote Sensing: Theories, Tools and Challenges for the Community

    In recent years, deep learning (DL), a re-branding of neural networks (NNs), has risen to the top in numerous areas, namely computer vision (CV), speech recognition, natural language processing, etc. Whereas remote sensing (RS) possesses a number of unique challenges, primarily related to sensors and applications, RS inevitably draws on many of the same theories as CV, e.g., statistics, fusion, and machine learning, to name a few. This means that the RS community should be aware of, if not at the leading edge of, advancements like DL. Herein, we provide the most comprehensive survey of state-of-the-art RS DL research. We also review recent new developments in the DL field that can be used in DL for RS. Namely, we focus on theories, tools and challenges for the RS community. Specifically, we focus on unsolved challenges and opportunities as they relate to (i) inadequate data sets, (ii) human-understandable solutions for modelling physical phenomena, (iii) Big Data, (iv) non-traditional heterogeneous data sources, (v) DL architectures and learning algorithms for spectral, spatial and temporal data, (vi) transfer learning, (vii) an improved theoretical understanding of DL systems, (viii) high barriers to entry, and (ix) training and optimizing DL systems.
    Comment: 64 pages, 411 references. To appear in Journal of Applied Remote Sensing

    Bayesian Learning and Predictability in a Stochastic Nonlinear Dynamical Model

    Bayesian inference methods are applied within a Bayesian hierarchical modelling framework to the problems of joint state and parameter estimation, and of state forecasting. We explore and demonstrate these ideas in the context of a simple nonlinear marine biogeochemical model. A novel approach is proposed for the formulation of the stochastic process model, in which ecophysiological properties of plankton communities are represented by autoregressive stochastic processes. This approach captures the effects of changes in plankton communities over time, and it allows the incorporation of literature metadata on individual species into prior distributions for process model parameters. The approach is applied to a case study at Ocean Station Papa, using particle Markov chain Monte Carlo computational techniques. The results suggest that, by drawing on objective prior information, it is possible to extract useful information about the model state and a subset of parameters, and even to make useful long-term forecasts, based on sparse and noisy observations.
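    The core modelling idea, a biological property evolving as an autoregressive stochastic process rather than a fixed constant, is easy to sketch. Below, a hypothetical plankton growth rate `mu` follows a mean-reverting AR(1) process; the parameter name and all numeric values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1_parameter(mu_bar=0.6, phi=0.9, sigma=0.05, n=500):
    """Simulate mu_t = mu_bar + phi * (mu_{t-1} - mu_bar) + sigma * eps_t."""
    mu = np.empty(n)
    mu[0] = mu_bar                    # start at the long-run mean
    for t in range(1, n):
        mu[t] = mu_bar + phi * (mu[t - 1] - mu_bar) + sigma * rng.normal()
    return mu

mu = ar1_parameter()                  # one trajectory of the "parameter"
```

    In the hierarchical setting described above, `mu_bar` and the process variance would themselves get priors informed by species-level literature metadata, so the time variation of the parameter is learned jointly with the model state.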

    Data mining as a tool for environmental scientists

    Over recent years a huge library of data mining algorithms has been developed to tackle a variety of problems in fields such as medical imaging and network traffic analysis. Many of these techniques are far more flexible than classical modelling approaches and could be usefully applied to data-rich environmental problems. Certain techniques, such as artificial neural networks, clustering, case-based reasoning and, more recently, Bayesian decision networks, have found application in environmental modelling, while other methods, for example classification and association rule extraction, have not yet been taken up on any wide scale. We propose that these and other data mining techniques could be usefully applied to difficult problems in the field. This paper introduces several data mining concepts and briefly discusses their application to environmental modelling, where data may be sparse, incomplete, or heterogeneous.

    A Machine Learning Approach for Improving Near-Real-Time Satellite-Based Rainfall Estimates by Integrating Soil Moisture

    Near-real-time (NRT) satellite-based rainfall estimates (SREs) are a viable option for flood/drought monitoring. However, SREs are often associated with complex and nonlinear errors. One way to enhance the quality of SREs is to use soil moisture information, and a few studies have indicated that soil moisture information can indeed improve SRE quality. Nowadays, satellite-based soil moisture products are becoming available at the desired spatial and temporal resolutions on an NRT basis. Hence, this study proposes an integrated approach to improve NRT SRE accuracy by combining it with NRT soil moisture through a nonlinear support vector machine-based regression (SVR) model. To test this novel approach, the Ashti catchment, a sub-basin of the Godavari river basin, India, is chosen. Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA)-based NRT SRE 3B42RT and Advanced Scatterometer-derived NRT soil moisture are considered in the present study. The performance of 3B42RT and of the corrected product is assessed using different statistical measures, such as correlation coefficient (CC), bias, and root mean square error (RMSE), for the monsoon seasons of 2012–2015. A detailed spatial analysis of these measures and their variability across different rainfall intensity classes are also presented. Overall, the results revealed significant improvement in the corrected product compared to 3B42RT (except in CC) across the catchment. In particular, for the light and moderate rainfall classes, the corrected product showed the greatest improvement (except in CC). On the other hand, the corrected product showed limited performance for the heavy rainfall class. These results demonstrate that the proposed approach has the potential to enhance the quality of NRT SRE through the use of NRT satellite-based soil moisture estimates.