Modeling and simulating of reservoir operation using the artificial neural network, support vector regression, deep learning algorithm
Reservoirs and dams are vital human-built infrastructures that play essential roles in flood control, hydroelectric power generation, water supply, navigation, and other functions. Realizing those functions requires efficient reservoir operation and effective control of the outflow from a reservoir or dam. Over the last decade, artificial intelligence (AI) techniques have become increasingly popular in streamflow forecasting and in reservoir operation planning and scheduling. In this study, three AI models, namely the backpropagation (BP) neural network, the support vector regression (SVR) technique, and the long short-term memory (LSTM) model, are employed to simulate reservoir operation at monthly, daily, and hourly time scales, using approximately 30 years of historical reservoir operation records. The study aims to summarize the influence of parameter settings on model performance and to explore the applicability of the LSTM model to reservoir operation simulation. The results show the following: (1) for the BP neural network and the LSTM model, the effect of the maximum number of iterations on model performance should be prioritized; for the SVR model, simulation performance is directly related to the choice of kernel function, and the sigmoid and RBF kernels should be prioritized; (2) the BP neural network and SVR are well suited to learning the operation rules of a reservoir from a small amount of data; and (3) the LSTM model effectively reduces the time consumption and memory storage required by the other AI models and demonstrates good capability in simulating low-flow conditions and the outflow curve during the peak operation period.
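The gating structure that lets an LSTM retain long-range information can be sketched in a few lines. Below is a minimal, illustrative single-step LSTM cell in plain NumPy; the variable names and sizes are assumptions for illustration and do not reproduce the study's models, hyperparameters, or reservoir data:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step: four gates computed jointly from input x and h_prev.

    W: (4n, m) input weights, U: (4n, n) recurrent weights, b: (4n,) bias,
    where n = hidden size and m = input size (illustrative layout).
    """
    n = h_prev.size
    z = W @ x + U @ h_prev + b           # all gate pre-activations, shape (4n,)
    f = sigmoid(z[0:n])                  # forget gate: how much old memory to keep
    i = sigmoid(z[n:2 * n])              # input gate: how much new info to write
    o = sigmoid(z[2 * n:3 * n])          # output gate: how much memory to expose
    g = np.tanh(z[3 * n:4 * n])          # candidate cell state
    c = f * c_prev + i * g               # new cell state carries long-term memory
    h = o * np.tanh(c)                   # new hidden state
    return h, c
```

In a reservoir-simulation setting, such a cell would be unrolled over a sequence of inflow (and other forcing) values, with a linear readout mapping the hidden state to outflow; in practice a framework such as TensorFlow or PyTorch would be used rather than hand-written NumPy.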
Three decades of the Shuffled Complex Evolution (SCE-UA) optimization algorithm: Review and applications
An enhanced artificial neural network with a shuffled complex evolutionary global optimization with principal component analysis
The classical Back-Propagation (BP) scheme with gradient-based optimization for training Artificial Neural Networks (ANNs) suffers from drawbacks such as premature convergence and a tendency to become trapped in local optima. Therefore, as alternatives to BP and other gradient-based schemes, various Evolutionary Algorithms (EAs), e.g., Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), Simulated Annealing (SA), and Differential Evolution (DE), have gained popularity for ANN weight training. This study applies the efficient and effective Shuffled Complex Evolutionary Global Optimization Algorithm with Principal Component Analysis – University of California Irvine (SP-UCI) to the weight training of a three-layer feed-forward ANN. A large-scale numerical comparison is conducted among the SP-UCI-, PSO-, GA-, SA-, and DE-based ANNs on 17 benchmark, complex, and real-world datasets. Results show that the SP-UCI-based ANN outperforms the other EA-based ANNs in terms of convergence and generalization, suggesting that the SP-UCI algorithm has good potential for ANN weight training in real-world problems. In addition, the suitability of different kinds of EAs for training ANNs is discussed. The large-scale comparison experiments conducted in this paper provide a fundamental reference for selecting an appropriate ANN weight training algorithm in practice.
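Evolutionary weight training replaces gradient descent with population-based search over the network's flattened weight vector. A rough sketch of the idea follows, using a basic Differential Evolution variant (one of the EAs compared above, not the SP-UCI algorithm itself); the tiny 2-2-1 network, XOR task, and hyperparameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Tiny 2-2-1 feed-forward net; all weights flattened into a 9-element vector.
def forward(w, X):
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8:9]
    h = np.tanh(X @ W1 + b1)
    return np.tanh(h @ W2 + b2).ravel()

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])           # XOR targets

def mse(w):
    return np.mean((forward(w, X) - y) ** 2)

# Basic DE: mutate three random members, binomial crossover, greedy selection.
def de_train(pop_size=30, dim=9, F=0.8, CR=0.9, gens=300):
    pop = rng.uniform(-2, 2, (pop_size, dim))
    fit = np.array([mse(w) for w in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            trial = np.where(rng.random(dim) < CR, a + F * (b - c), pop[i])
            f = mse(trial)
            if f < fit[i]:               # keep the trial only if it improves
                pop[i], fit[i] = trial, f
    best = fit.argmin()
    return pop[best], fit[best]
```

Because selection is greedy, the population's best loss is monotonically non-increasing; no gradient of the loss with respect to the weights is ever computed, which is what makes EA-based training applicable to non-differentiable or highly multimodal error surfaces.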
Power System Parameters Forecasting Using Hilbert-Huang Transform and Machine Learning
A novel hybrid data-driven approach is developed for forecasting power system parameters, with the goal of increasing the efficiency of short-term forecasting studies for non-stationary time series. The proposed approach is based on mode decomposition and a feature analysis of the initial retrospective data using the Hilbert-Huang transform and machine learning algorithms. The random forests and gradient boosting trees learning techniques were examined, and these decision tree techniques were used to rank the importance of the variables employed in the forecasting models, with the Mean Decrease Gini index as the impurity function. The resulting hybrid forecasting models employ the radial basis function neural network and support vector regression. Apart from the introduction and references, the paper is organized as follows. Section 2 presents the background and a review of several approaches to short-term forecasting of power system parameters. In the third section, a hybrid machine-learning-based algorithm using the Hilbert-Huang transform is developed for short-term forecasting of power system parameters. The fourth section describes the decision tree learning algorithms used for ranking variable importance. Finally, section six presents experimental results for the following electric power problems: active power flow forecasting, electricity price forecasting, and wind speed and direction forecasting.
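The Mean Decrease Gini importance mentioned above averages, over all trees in a forest, the impurity reduction contributed by splits on a given variable. A minimal sketch of the underlying quantity, the Gini impurity decrease of a single split, in plain NumPy (function and variable names are illustrative):

```python
import numpy as np

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def gini_decrease(feature, labels, threshold):
    """Weighted impurity decrease from splitting on feature <= threshold."""
    left = labels[feature <= threshold]
    right = labels[feature > threshold]
    n = labels.size
    child_impurity = (left.size / n) * gini(left) + (right.size / n) * gini(right)
    return gini(labels) - child_impurity
```

A random forest accumulates these (node-weighted) decreases for every split made on each variable and averages them across trees; variables with a larger mean decrease are ranked as more important for the forecast.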