
    Long-Term Load Forecasting Considering Volatility Using Multiplicative Error Model

    Long-term load forecasting plays a vital role for utilities and planners in grid development and expansion planning. Overestimating long-term electricity load results in substantial wasted investment in excess power facilities, while underestimating future load results in insufficient generation and unmet demand. This paper presents a first-of-its-kind approach that uses a multiplicative error model (MEM) to forecast load over a long-term horizon. The MEM originates from the structure of the autoregressive conditional heteroscedasticity (ARCH) model, in which the conditional variance is dynamically parameterized and interacts multiplicatively with the innovation term of the time series. Historical load data accessed from a U.S. regional transmission operator, together with recession data for the years 1993-2016, are used in this study. The benefit of accounting for volatility is demonstrated by out-of-sample forecast results as well as directional accuracy during the great economic recession of 2008. To incorporate future volatility, backtesting of the MEM is performed. Two performance indicators are used to assess the proposed model: mean absolute percentage error (for both in-sample model fit and out-of-sample forecasts) and directional accuracy.
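
    In its simplest textbook form, a multiplicative error model specifies the observed nonnegative series as the product of a recursively updated conditional mean and a unit-mean innovation, mirroring the ARCH recursion for the conditional variance. The MEM(1,1) below is only a minimal sketch of that structure; the paper's exact specification, covariates and volatility treatment may differ.

```latex
% MEM(1,1): the conditional mean \mu_t plays the role that the conditional
% variance plays in ARCH/GARCH, and the innovation enters multiplicatively.
\[
  y_t = \mu_t \, \varepsilon_t , \qquad
  \mu_t = \omega + \alpha \, y_{t-1} + \beta \, \mu_{t-1} , \qquad
  \varepsilon_t \overset{\mathrm{iid}}{\sim} F_{+}, \quad \mathbb{E}[\varepsilon_t] = 1 ,
\]
% where F_{+} is a distribution on the nonnegative reals (e.g. a gamma law).
```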

    Count data time series models and their applications

    Due to fast developments in advanced sensors, count data sets have become ubiquitous in many fields, and modeling and forecasting such time series have generated great interest. Modeling can shed light on the behavior of a count series and show how it is related to other factors, such as the environmental conditions under which the data are generated. In this research, three approaches to modeling such count data are proposed. First, a periodic autoregressive conditional Poisson (PACP) model is proposed as a natural generalization of the autoregressive conditional Poisson (ACP) model. By allowing for cyclical variations in the parameters of the model, it provides a way to explain the periodicity inherent in many count data series; in epidemiology, for example, the prevalence of a disease may depend on the season. Second, the autoregressive conditional Poisson hidden Markov model (ACP-HMM) is developed to deal with count data time series whose mean, conditional on the past, is a function of previous observations, with this relationship possibly determined by an unobserved process that switches its state or regime as time progresses. This model is, in a sense, the combination of the discrete version of the autoregressive conditional heteroscedastic (ARCH) formulation and the Poisson hidden Markov model. Both of the above models address the serial correlation and the clustering of high or low counts frequently observed in time series of count data, while allowing the underlying data-generating mechanism to change cyclically or according to a hidden Markov process. Applications to empirical data sets show that these models provide a better fit than the standard ACP models. Finally, a modification of a zero-inflated Poisson model is used to analyze activity counts of the fruit fly. The model captures the dynamic structure of activity patterns and the fly's propensity to sleep. The obtained results, when fed to a convolutional neural network, provide the possibility of building a predictive model to identify fruit flies with short and long lifespans.
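
    The ACP model referenced above is commonly written as a Poisson analogue of GARCH (an INGARCH-type recursion). The sketch below simulates such a series under an assumed ACP(1,1) specification; the PACP and ACP-HMM extensions described in the thesis would additionally let the parameters vary with the season or with a hidden regime, which is not shown here.

```python
import numpy as np

def simulate_acp(omega=0.5, alpha=0.3, beta=0.5, n=500, seed=0):
    """Simulate an ACP(1,1) count series:
    y_t | past ~ Poisson(lam_t), lam_t = omega + alpha * y_{t-1} + beta * lam_{t-1}."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n, dtype=int)
    lam = np.zeros(n)
    lam[0] = omega / (1.0 - alpha - beta)  # unconditional mean as the starting value
    y[0] = rng.poisson(lam[0])
    for t in range(1, n):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
        y[t] = rng.poisson(lam[t])
    return y, lam

counts, intensity = simulate_acp()
print(counts[:20])
print("mean count:", counts.mean())
```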

    Time Series Analysis

    We provide a concise overview of time series analysis in the time and frequency domains, with lots of references for further reading. Keywords: time series analysis, time domain, frequency domain.

    Sequential Predictive Conformal Inference for Time Series

    We present a new distribution-free conformal prediction algorithm for sequential data (e.g., time series), called the \textit{sequential predictive conformal inference} (\texttt{SPCI}). We specifically account for the fact that time series data are not exchangeable, so many existing conformal prediction algorithms are not applicable. The main idea is to adaptively re-estimate the conditional quantile of non-conformity scores (e.g., prediction residuals) by exploiting the temporal dependence among them. More precisely, we cast the construction of the conformal prediction interval as predicting the quantile of a future residual, given a user-specified point prediction algorithm. Theoretically, we establish asymptotically valid conditional coverage by extending consistency analyses in quantile regression. Using simulation and real-data experiments, we demonstrate a significant reduction in interval width of \texttt{SPCI} compared to other existing methods under the desired empirical coverage.
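
    At a high level, the interval construction replaces the exchangeability assumption with quantile estimates of recent non-conformity scores (here, plain prediction residuals). The sketch below uses a rolling empirical quantile instead of the paper's quantile-regression step, so it only illustrates the general idea rather than reproducing SPCI; the window length and the persistence point forecaster are placeholders.

```python
import numpy as np

def residual_quantile_interval(y, point_forecast, window=100, alpha=0.1):
    """Prediction interval for the next observation, built from empirical
    quantiles of the most recent one-step-ahead residuals.

    point_forecast(t) must return a prediction for index t using only data
    before t (a placeholder interface for any user-specified forecaster).
    """
    residuals = np.array([y[t] - point_forecast(t) for t in range(1, len(y))])
    lo, hi = np.quantile(residuals[-window:], [alpha / 2, 1 - alpha / 2])
    centre = point_forecast(len(y))
    return centre + lo, centre + hi

# Toy usage: a lag-1 "persistence" forecaster on a simulated AR(1) series.
rng = np.random.default_rng(1)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.8 * y[t - 1] + rng.normal()
persistence = lambda t: y[t - 1]
print(residual_quantile_interval(y, persistence))
```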

    The long memory behaviour of stock market volatility: evidence from the PIIGS countries

    JEL Classification System: G15; C13. In this study we examine the long memory behaviour of stock market volatility of the PIIGS major indices: PSI 20, FTSE MIB, ISEQ, FTSE/ATHEX and IBEX 35. To conduct our analyses we apply two FIGARCH-type models, one derived by Baillie, Bollerslev and Mikkelsen (1996) and another developed by Chung (1999). In addition, the Local Whittle estimator is also computed. A data set comprising the daily closing prices of the PIIGS' major stock market indices, spanning from 1 January 1998 to 8 March 2013, is used. The results suggest that, irrespective of the FIGARCH model adopted, there is evidence of long memory in stock market volatility. However, the Local Whittle estimator reveals that the data-generating process is a combination of long memory and jumps/structural breaks. This feature of the data therefore has to be taken into account when constructing models for volatility prediction.
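
    Readers who want to experiment with this type of model can use the Python arch package, which in recent versions includes a FIGARCH volatility specification. The sketch below fits a FIGARCH(1, d, 1) to simulated returns; the study itself uses the daily PIIGS index returns and additionally considers Chung's variant and the Local Whittle estimator, which are not reproduced here.

```python
import numpy as np
from arch import arch_model

# Placeholder data: in the study these would be daily log returns (in %) of the
# PSI 20, FTSE MIB, ISEQ, FTSE/ATHEX and IBEX 35 indices.
rng = np.random.default_rng(42)
returns = rng.standard_t(df=6, size=3000)

# FIGARCH(1, d, 1) with a constant mean; the fractional parameter d governs
# the long-memory decay of volatility shocks.
model = arch_model(returns, mean="Constant", vol="FIGARCH", p=1, q=1)
result = model.fit(disp="off")
print(result.summary())  # the estimated d is reported among the parameters
```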

    Development of Neurofuzzy Architectures for Electricity Price Forecasting

    In the 20th century, many countries liberalized their electricity markets. This liberalization has exposed generation companies as well as wholesale buyers to far greater risk than under the old centralized framework. In this setting, electricity price prediction has become crucial for every market player's decision-making and strategic planning. In this study, a prototype asymmetric-based neuro-fuzzy network (AGFINN) architecture has been implemented for short-term electricity price forecasting in the ISO New England market. The AGFINN framework has been designed with two different defuzzification schemes. Fuzzy clustering is explored as an initial step for defining the fuzzy rules, while an asymmetric Gaussian membership function is utilized in the fuzzification part of the model. Results for the minimum and maximum electricity prices in ISO New England emphasize the superiority of the proposed model over well-established learning-based models.
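
    The asymmetric Gaussian membership function referred to above uses different spreads on the two sides of its centre. A minimal sketch follows; the parameter values are illustrative, and AGFINN's full rule base, fuzzy clustering and defuzzification schemes are not reproduced.

```python
import numpy as np

def asymmetric_gaussian(x, center, sigma_left, sigma_right):
    """Membership value with separate spreads to the left and right of `center`."""
    x = np.asarray(x, dtype=float)
    sigma = np.where(x < center, sigma_left, sigma_right)
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

# Example: membership of normalized electricity prices in a "medium price" set.
prices = np.linspace(0.0, 1.0, 11)
print(asymmetric_gaussian(prices, center=0.5, sigma_left=0.1, sigma_right=0.25))
```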

    FORECASTING THE WORKLOAD WITH A HYBRID MODEL TO REDUCE THE INEFFICIENCY COST

    Time series forecasting and modeling have been challenging problems over the past decades because of the many properties and underlying correlated relationships of such data. As a result, researchers have proposed many models to deal with time series. However, models such as the autoregressive integrated moving average (ARIMA) and artificial neural networks (ANNs) each describe only part of the properties of a time series. In this thesis, we introduce a new hybrid model that integrates a filter structure to improve prediction accuracy. Case studies with real data from University of Kentucky HealthCare are carried out to examine the superiority of our model. We also apply our model to the operating room (OR) to reduce the inefficiency cost. The experimental results indicate that our model consistently outperforms the other models under different conditions.
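
    The abstract does not spell out the filter structure of the proposed hybrid, so the sketch below shows only the widely used ARIMA-plus-neural-network residual hybrid that this family of models builds on: ARIMA captures the linear structure and a small neural network models its residuals. The toy data, orders, lags and network size are placeholders, not the thesis' configuration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

# Toy workload series standing in for the hospital / operating-room data.
rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=400)) + 5 * np.sin(np.arange(400) / 10)

# 1) Linear part: ARIMA captures the linear autocorrelation structure.
arima = ARIMA(y, order=(2, 1, 1)).fit()
residuals = np.asarray(arima.resid)

# 2) Nonlinear part: a small neural network models the lagged residuals.
lags = 5
X = np.column_stack([residuals[i:len(residuals) - lags + i] for i in range(lags)])
target = residuals[lags:]
mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, target)

# Hybrid one-step forecast = ARIMA forecast + predicted residual correction.
arima_fc = float(arima.forecast(steps=1)[0])
resid_fc = float(mlp.predict(residuals[-lags:].reshape(1, -1))[0])
print("hybrid forecast:", arima_fc + resid_fc)
```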

    Data analytics enhanced component volatility model

    Volatility modelling and forecasting have attracted much attention in both finance and computing. Recent advances in machine learning allow us to construct complex models for volatility forecasting. However, machine learning algorithms have so far been used merely as additional tools alongside existing econometric models; hybrid models that specifically capture the characteristics of volatility data have not yet been developed. We propose a new hybrid model constructed from a low-pass filter, an autoregressive neural network and an autoregressive model. The volatility data are decomposed by the low-pass filter into long-term and short-term components, which are then modelled by the autoregressive neural network and the autoregressive model respectively. The final forecast is the aggregate of the two models' outputs. Experimental evaluations using one-hour and one-day realized volatility across four major foreign exchange rates showed that the proposed model significantly outperforms the component GARCH, EGARCH and neural-network-only models over all forecasting horizons.
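
    A minimal sketch of the decomposition idea described above, under assumed choices: a moving-average low-pass filter splits a toy realized-volatility series into long-term and short-term components, a small autoregressive neural network models the long-term part, an AR(5) fitted by least squares models the short-term part, and the two one-step forecasts are summed. The filter, lag orders and network size are illustrative rather than the paper's exact configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def lagged_matrix(x, lags):
    """Stack lagged values so row t holds x[t:t+lags], aligned with target x[t+lags]."""
    X = np.column_stack([x[i:len(x) - lags + i] for i in range(lags)])
    return X, x[lags:]

rng = np.random.default_rng(3)
rv = np.abs(np.sin(np.arange(600) / 30)) + 0.1 * rng.random(600)  # toy realized volatility

# Low-pass filter: centered moving average -> long-term component; the rest is short-term.
window = 21
long_term = np.convolve(rv, np.ones(window) / window, mode="same")
short_term = rv - long_term

# Long-term component: small autoregressive neural network on lagged values.
Xl, yl = lagged_matrix(long_term, lags=5)
nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0).fit(Xl, yl)

# Short-term component: AR(5) fitted by ordinary least squares.
Xs, ys = lagged_matrix(short_term, lags=5)
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(Xs)), Xs]), ys, rcond=None)

# Aggregate one-step-ahead forecast = sum of the two component forecasts.
f_long = float(nn.predict(long_term[-5:].reshape(1, -1))[0])
f_short = coef[0] + coef[1:] @ short_term[-5:]
print("hybrid volatility forecast:", f_long + f_short)
```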
    • 

    corecore