320 research outputs found

    Stochastic Optimization in Econometric Models – A Comparison of GA, SA and RSG

    This paper shows that, for an econometric model that is highly sensitive to the data, stochastic optimization algorithms perform better than classical gradient techniques. In addition, we show that the Repetitive Stochastic Guesstimation (RSG) algorithm, invented by Charemza, is closer to Simulated Annealing (SA) than to Genetic Algorithms (GAs), so we produced hybrids of RSG and SA to study their joint behavior. All algorithms were evaluated on a short form of the Romanian macro model derived from Dobrescu (1996). The subject of optimization was the model's solution, as a function of the initial values (in the first stage) and of the objective functions (in the second stage). We found that a priori information helps "elitist" algorithms (such as RSG and SA) obtain the best results; on the other hand, when one has equal belief concerning the choice among different objective functions, GA gives a straight answer. Analysis of the average relative bias of the model's solution demonstrated the efficiency of the stochastic optimization methods presented.
    Keywords: underground economy, Laffer curve, informal activity, fiscal policy, transition, macroeconomic model, stochastic optimization, evolutionary algorithms, Repetitive Stochastic Guesstimation
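
    As an illustration only (the Romanian macro model, the RSG update rules, and the paper's objective functions are not reproduced in this abstract), the sketch below applies a plain simulated-annealing loop, the SA side of the RSG/SA comparison above, to a stand-in quadratic objective; every numeric setting is an assumption.

        import math
        import random

        def objective(x):
            # Stand-in for the model-solution bias criterion; the paper's
            # actual macro-model objective is not reproduced here.
            return sum((xi - 1.0) ** 2 for xi in x)

        def simulated_annealing(x0, n_iter=5000, t0=1.0, cooling=0.999, step=0.1):
            x, fx = list(x0), objective(x0)
            best, fbest = list(x), fx
            t = t0
            for _ in range(n_iter):
                # Propose a random perturbation of one coordinate.
                cand = list(x)
                i = random.randrange(len(cand))
                cand[i] += random.uniform(-step, step)
                fc = objective(cand)
                # Always accept improvements; accept worse moves with Boltzmann probability.
                if fc < fx or random.random() < math.exp((fx - fc) / t):
                    x, fx = cand, fc
                    if fx < fbest:
                        best, fbest = list(x), fx
                t *= cooling  # geometric cooling schedule
            return best, fbest

        print(simulated_annealing([5.0, -3.0, 2.0]))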

    Advanced Methods of Power Load Forecasting

    This reprint introduces advanced prediction models for power load forecasting. Models based on artificial intelligence as well as more traditional approaches are shown, demonstrating how each can be used to improve prediction in this field. LSTM neural networks, LSTM networks with a SESDA architecture, and even LSTM-CNN models are used. In addition, multiple seasonal Holt-Winters models with discrete seasonality and the application of the Prophet method to demand forecasting are presented. These models are applied in different circumstances and show highly positive results. This reprint is intended both for researchers working on energy management and for those working on forecasting, especially power load forecasting.
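
    The reprint's multiple seasonal Holt-Winters models with discrete seasonality and the SESDA LSTM architecture are not specified in this summary; as a minimal sketch of the traditional end of that toolbox, the code below fits a single-seasonality additive Holt-Winters model to a synthetic hourly load series with statsmodels. The data, horizon, and settings are assumptions, not the reprint's.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        # Synthetic hourly load with a daily cycle; a real study would use
        # measured demand data instead.
        hours = pd.date_range("2023-01-01", periods=24 * 28, freq="H")
        load = 1000 + 200 * np.sin(2 * np.pi * np.asarray(hours.hour) / 24) + np.random.normal(0, 20, len(hours))
        series = pd.Series(load, index=hours)

        # Additive trend and a single 24-hour additive seasonality; the reprint's
        # models also handle multiple and discrete seasonalities.
        model = ExponentialSmoothing(series, trend="add", seasonal="add",
                                     seasonal_periods=24).fit()
        forecast = model.forecast(24)  # next day's hourly load
        print(forecast.head())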

    Optimal Participation of Power Generating Companies in a Deregulated Electricity Market

    The function of an electric utility is to make stable electric power available to consumers in an efficient manner. This includes power generation, transmission, distribution and retail sales. Since the early nineties, however, many utilities have had to change from a vertically integrated structure to a deregulated system in which services were unbundled, driven by rapid demand growth and the need for better economic benefits. With the unbundling of services came competition, which pushed innovation and improved efficiency. In a deregulated power system, power generators submit offers to sell energy and operating reserve in the electricity market. The market is best described as oligopolistic, with a System Operator in charge of the power grid who matches supply offers with bid-in demand to determine the market clearing price for each interval. This price is what is paid to all generators. Energy is sold in the day-ahead market, where offers are submitted hours before the energy is needed. The spot energy market caters to unforeseen rises in load demand and thus commands a higher price for electrical energy than the day-ahead market. A generating company can improve its profit by using an appropriate bidding strategy. This improvement is affected by the nature of competitors' bids and by uncertainty in demand. In a sealed-bid auction, bids are submitted simultaneously within a timeframe and are confidential, so a generator has no information on rivals' bids. Methods by which generators build optimal offers under competition have been studied. However, many of these studies estimate rivals' behaviour from analyses that assume sufficient bidding-history data from the market; such data may not be readily available in practical systems. The work reported in this thesis explores ways a generator can make security-constrained offers in different markets given incomplete market information. It also accounts for possible uncertainty in load forecasts. The research methodology is based on forecasting and optimization. Forecasts of the market clearing price for each market interval are calculated and used in the profit-maximization objective function to obtain the maximum benefit for that interval. Making these forecasts brings competition into the bid process. Results show that, with historical data available, a generator can make an adequate short-term analysis of market behaviour and thus optimize its benefits for the period. This thesis provides new insights into power generators' approach to making optimal bids to maximize market benefits.
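
    The thesis's security-constrained offer construction and its price-forecasting models are not detailed in the abstract above; purely as a toy illustration of the single-interval profit-maximization step it describes, the sketch below picks a price-taking generator's quantity for one interval given a forecast clearing price and an assumed quadratic cost curve. All names and figures are hypothetical.

        # Toy single-interval decision for a price-taking generator:
        # profit(q) = price_forecast * q - (a*q^2 + b*q + c), with 0 <= q <= q_max.
        def optimal_quantity(price_forecast, a, b, q_max):
            # Quadratic cost gives marginal cost 2*a*q + b; set it equal to the
            # forecast price and clip to the unit's capacity limits.
            q_star = (price_forecast - b) / (2 * a)
            return min(max(q_star, 0.0), q_max)

        def profit(q, price_forecast, a, b, c):
            return price_forecast * q - (a * q * q + b * q + c)

        price = 42.0                   # assumed forecast market clearing price ($/MWh)
        a, b, c = 0.05, 10.0, 200.0    # assumed cost curve coefficients
        q_max = 300.0                  # assumed unit capacity (MW)

        q = optimal_quantity(price, a, b, q_max)
        print(f"offer {q:.1f} MW, expected profit {profit(q, price, a, b, c):.2f} $")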

    Forecasting monthly airline passenger numbers with small datasets using feature engineering and a modified principal component analysis

    In this study, a machine learning approach based on time series models, feature engineering, feature extraction, and feature derivation is proposed to improve air passenger forecasting. Different types of datasets were created to extract new features from the core data. An experiment was undertaken with artificial neural networks to test the performance of neurons in the hidden layer, to optimise the dimensions of all layers, and to obtain an optimal choice of connection weights, so that the nonlinear optimisation problem could be solved directly. A method of tuning deep learning models using H2O (a feature-rich, open-source machine learning platform known for its R and Spark integration and its ease of use) is also proposed, in which the network model is trained on samples of selected features from the dataset to ensure sample diversity and improve training. A successful application of deep learning requires setting numerous parameters to achieve good model accuracy; the number of hidden layers and the number of neurons in each layer are key parameters of such a network. Grid search and random hyper-parameter search approaches aid in setting these important parameters. Moreover, a new ensemble strategy is suggested that shows potential to optimise parameter settings and hence save computational resources throughout the model tuning process. The main objective, besides improving the performance metric, is to obtain a distribution on some hold-out datasets that resembles the original distribution of the training data. Particular attention is focused on creating a modified version of Principal Component Analysis (PCA) using a different correlation matrix, obtained from a correlation coefficient based on kinetic energy, to derive new features. The data were collected from several airline datasets to build a deep prediction model for forecasting airline passenger numbers. Preliminary experiments show that fine-tuning provides an efficient approach for tuning the ultimate number of hidden layers and the number of neurons in each layer when compared with the grid search method. Similarly, the results show that the modified version of PCA is more effective than traditional PCA in data dimensionality reduction, class separability, and classification accuracy.
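
    The kinetic-energy-based correlation coefficient behind the modified PCA is not defined in this abstract, so the sketch below only illustrates the general mechanism: swap the Pearson correlation matrix for an alternative one (Spearman is used here purely as a placeholder) and project the standardised data onto its leading eigenvectors. The data and function names are assumptions.

        import numpy as np
        from scipy.stats import spearmanr

        def pca_from_correlation(X, corr_fn):
            # Standardise features, build the chosen correlation matrix, and
            # project onto its eigenvectors sorted by descending eigenvalue.
            Z = (X - X.mean(axis=0)) / X.std(axis=0)
            C = corr_fn(Z)
            eigvals, eigvecs = np.linalg.eigh(C)
            order = np.argsort(eigvals)[::-1]
            return Z @ eigvecs[:, order], eigvals[order]

        def pearson_corr(Z):
            return np.corrcoef(Z, rowvar=False)

        def spearman_corr(Z):
            # Placeholder for the paper's kinetic-energy-based coefficient,
            # which the abstract does not specify.
            rho, _ = spearmanr(Z)
            return rho

        X = np.random.rand(200, 6)     # stand-in for engineered airline features
        scores_classic, var_classic = pca_from_correlation(X, pearson_corr)
        scores_modified, var_modified = pca_from_correlation(X, spearman_corr)
        print(var_classic[:3], var_modified[:3])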