46 research outputs found

    Modeling electricity prices with regime switching models

    We address the issue of modeling spot electricity prices with regime switching models. After reviewing the stylized facts about power markets we propose and fit various models to spot prices from the Nordic power exchange. Afterwards we assess their performance by comparing simulated and market prices.
    Keywords: Power market, Electricity price modeling, Regime switching model
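
    A minimal sketch of the kind of regime switching dynamics described above: a two-state Markov chain alternates between a mean-reverting base regime and a spike regime for the log spot price. All parameter values below are illustrative assumptions, not estimates from the Nordic data used in the paper.

```python
# Minimal sketch of a two-regime (base / spike) Markov switching spot price
# simulation; parameter values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)

# Regime transition matrix: rows = current regime, columns = next regime
# regime 0 = mean-reverting base regime, regime 1 = spike regime
P = np.array([[0.98, 0.02],
              [0.60, 0.40]])

n_days = 365
mu, kappa, sigma = 3.3, 0.2, 0.05      # base regime: mean-reversion parameters (log price)
spike_mu, spike_sigma = 4.0, 0.4       # spike regime: level and volatility of log price

log_p = np.empty(n_days)
log_p[0] = mu
regime = 0
for t in range(1, n_days):
    regime = rng.choice(2, p=P[regime])
    if regime == 0:
        # mean-reverting base dynamics
        log_p[t] = log_p[t - 1] + kappa * (mu - log_p[t - 1]) + sigma * rng.standard_normal()
    else:
        # independent draw in the spike regime
        log_p[t] = rng.normal(spike_mu, spike_sigma)

prices = np.exp(log_p)
print(f"mean price: {prices.mean():.2f}, max price: {prices.max():.2f}")
```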

    Outlier Treatment and Robust Approaches for Modeling Electricity Spot Prices

    We investigate the effects of outlier treatment on the estimation of the seasonal component and stochastic models in electricity markets. Typically, electricity spot prices exhibit features like seasonality, mean-reverting behavior, extreme volatility and the occurrence of jumps and spikes. Hence, an important issue in the estimation of stochastic models for electricity spot prices is the estimation of a component to deal with trends and seasonality in the data. Unfortunately, in regression analysis, classical estimation routines like OLS are very sensitive to extreme observations and outliers. Improved robustness of the model can be achieved by (a) cleaning the data with some reasonable procedure for outlier rejection, and then (b) using classical estimation and testing procedures on the remainder of the data. We examine the effects of different treatments of extreme observations on model estimation, in particular on the number of identified outliers and on the descriptive statistics of the remaining series after the outliers have been replaced. Our findings point out the substantial impact the treatment of extreme observations may have on these issues.
    Keywords: Electricity; price modeling; seasonal decomposition; price spike
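
    The sensitivity of OLS to extreme observations can be illustrated with a small sketch: a sinusoidal seasonal component is fitted by least squares on a raw series with spikes, and again after a crude 3-sigma outlier replacement. The simulated prices, the annual sinusoid and the 3-sigma rule are assumptions for illustration only, not the procedures compared in the paper.

```python
# Minimal sketch contrasting a naive OLS seasonal fit with a fit obtained after
# a simple 3-sigma outlier replacement; data and rules are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(730.0)                                   # two years of daily prices
seasonal = 30 + 10 * np.sin(2 * np.pi * t / 365)
prices = seasonal + rng.normal(0, 3, t.size)
prices[rng.choice(t.size, 15, replace=False)] += rng.uniform(50, 150, 15)  # add spikes

# Design matrix for an annual sinusoid: constant, sine and cosine terms
X = np.column_stack([np.ones_like(t),
                     np.sin(2 * np.pi * t / 365),
                     np.cos(2 * np.pi * t / 365)])

def ols_fit(y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta

fit_raw = ols_fit(prices)

# Crude outlier treatment: replace observations more than 3 standard deviations
# away from the raw fit by the fitted value, then re-estimate on the cleaned series.
resid = prices - fit_raw
mask = np.abs(resid) > 3 * resid.std()
cleaned = np.where(mask, fit_raw, prices)
fit_clean = ols_fit(cleaned)

print(f"{mask.sum()} observations flagged as outliers")
print(f"seasonal amplitude, raw fit:     {np.ptp(fit_raw) / 2:.2f}")
print(f"seasonal amplitude, cleaned fit: {np.ptp(fit_clean) / 2:.2f}")
```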

    Modelling catastrophe claims with left-truncated severity distributions (extended version)

    In this paper, we present a procedure for consistent estimation of the severity and frequency distributions based on incomplete insurance data and demonstrate that ignoring the thresholds leads to a serious underestimation of the ruin probabilities. The event frequency is modelled with a non-homogeneous Poisson process with a sinusoidal intensity rate function. The choice of an adequate loss distribution is conducted via the in-sample goodness-of-fit procedures and forecasting, using classical and robust methodologies. This is an extended version of the article: Chernobai et al. (2006) Modelling catastrophe claims with left-truncated severity distributions, Computational Statistics 21(3-4): 537-555.
    Keywords: Natural Catastrophe, Property Insurance, Loss Distribution, Truncated Data, Ruin Probability
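
    The effect of ignoring a reporting threshold can be sketched as follows: losses below a threshold H are unobserved, and a severity distribution is fitted once naively and once with the likelihood conditioned on exceeding H. The lognormal choice, the threshold value and the simulated sample are illustrative assumptions, not the data or the model selection of the paper.

```python
# Minimal sketch of maximum likelihood estimation for a left-truncated lognormal
# severity distribution; threshold, sample and lognormal choice are assumptions.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
H = 25.0                                   # reporting threshold (left truncation point)
full_losses = stats.lognorm(s=1.0, scale=np.exp(3.0)).rvs(5000, random_state=rng)
observed = full_losses[full_losses >= H]   # only losses above the threshold are recorded

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    # conditional density of a loss given that it exceeds the threshold H
    return -(np.sum(dist.logpdf(observed)) - observed.size * np.log(dist.sf(H)))

naive = stats.lognorm.fit(observed, floc=0)            # ignores the truncation
res = optimize.minimize(neg_loglik, x0=[np.log(observed.mean()), 1.0],
                        method="Nelder-Mead")
print("naive fit (ignoring threshold): mu=%.2f sigma=%.2f" % (np.log(naive[2]), naive[0]))
print("truncation-adjusted MLE:        mu=%.2f sigma=%.2f" % tuple(res.x))
```

    Because all small losses fall below the threshold, the naive fit overstates the location and understates the dispersion of the severity distribution, which is the mechanism behind the underestimated ruin probabilities described in the abstract.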

    Modeling catastrophe claims with left-truncated severity distributions (extended version)

    In this paper, we present a procedure for consistent estimation of the severity and frequency distributions based on incomplete insurance data and demonstrate that ignoring the thresholds leads to a serious underestimation of the ruin probabilities. The event frequency is modelled with a non-homogeneous Poisson process with a sinusoidal intensity rate function. The choice of an adequate loss distribution is conducted via the in-sample goodness-of-fit procedures and forecasting, using classical and robust methodologies.
    Keywords: Natural catastrophe; Property insurance; Loss distribution; Truncated data; Ruin probability

    The Diversification Delta: A different perspective

    Vermorken et al. (2012) introduce a new measure of diversification, the Diversification Delta, based on the empirical entropy. The entropy as a measure of uncertainty has successfully been used in several frameworks and takes into account the uncertainty related to the entire statistical distribution and not just the first two moments of a distribution. However, the suggested Diversification Delta measure has a number of drawbacks that we highlight in this article. We also propose an alternative measure based on the exponential entropy which overcomes the identified shortcomings. We present the properties of this new measure and illustrate its usefulness in an empirical example of a portfolio of U.S. stocks and bonds.
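
    One way a diversification ratio built on the exponential entropy could look is sketched below, using a Gaussian plug-in entropy estimate and a simulated two-asset portfolio. The estimator, the simulated returns and the exact form of the ratio are assumptions for illustration and not necessarily the measure proposed in the article.

```python
# Minimal sketch of an exponential-entropy based diversification ratio; the
# Gaussian plug-in estimator and the simulated data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)

def exp_entropy(x):
    # Gaussian plug-in estimate of differential entropy, exponentiated:
    # H(X) = 0.5 * ln(2*pi*e*var(X)), so exp(H) = sqrt(2*pi*e*var(X))
    return np.sqrt(2 * np.pi * np.e * np.var(x, ddof=1))

# Two illustrative daily return series ("stocks" and "bonds"), mildly correlated
n = 2500
cov = np.array([[0.0004, 0.00003],
                [0.00003, 0.0001]])
returns = rng.multivariate_normal([0.0003, 0.0001], cov, size=n)

w = np.array([0.6, 0.4])                     # portfolio weights
portfolio = returns @ w

# Compare the weighted average exponential entropy of the constituents with the
# exponential entropy of the portfolio return
weighted_avg = np.sum(w * np.array([exp_entropy(returns[:, i]) for i in range(2)]))
dd = (weighted_avg - exp_entropy(portfolio)) / weighted_avg
print(f"exponential-entropy diversification ratio: {dd:.3f}")
```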

    Climate Adaptation decision support Tool for Local Governments: CATLoG

    The Intergovernmental Panel on Climate Change (IPCC), the globally recognised reference body for climate-related research, describes warming of the climate system as ‘unequivocal’. The changing climate is likely to result in the occurrence of more frequent and intense extreme weather events. This demands preventative and preparatory actions (mitigation and adaptation) from all levels of government, including local governments. No matter how robust the mitigation responses are, adaptation actions will still be required to prepare for the changes to the climate that are already committed.
    The study of climate extremes is particularly important because of their high-impact nature. Analysis of extreme events is challenging because of their rare occurrence, which leaves very few past observations to support statistical analysis or conclusions. Currently available climate projections, especially for extreme events at local scales, are associated with a wide range of uncertainties. In addition, analysis and damage assessment of extremes over a period of time present further uncertainties related to economic analysis (e.g. discount rate, growth rate) and the unknown future. Unfortunately, end users often do not understand the range of uncertainties surrounding the research outputs they use for extreme events.
    This research project was designed to develop a pilot tool to enable end users to analyse and prepare for extreme events in a less predictable, complex world. Due to the lack of historical data, the tool relies on expert judgements on the frequency and severity of such events. The results of the analysis are highly dependent on the quality of these judgements: the reliability of the results depends on finding appropriate experts in the field who can provide sound estimates of the frequency and impact of the events considered. The Tool uses a combination of quantitative (Cost-Benefit Analysis) and qualitative (Multi-Criteria Analysis) methods to frame the decision support. The current version of the Tool allows users to conduct sensitivity tests and examine the impact of uncertain parameters ranging from climate impacts to discount rates. The final product is a user-friendly decision tool in the form of an Excel add-in, together with a user manual booklet that demonstrates sample worked-out projects. The Tool is made flexible so that stakeholders can adopt, refine or upgrade it for their context-specific applications. A sketch of the kind of cost-benefit calculation involved is given below.
    Please cite this report as: Trueck, S, Mathew, S, Henderson-Sellers, A, Taplin, R, Keighley, T, Chin, W 2013, Climate Adaptation Decision Support Tool for Local Governments: CATLoG: Developing an Excel Spreadsheet Tool for Local Governments to compare and prioritise investment in climate change adaptation, National Climate Change Adaptation Research Facility, Gold Coast, pp. 39.
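
    A rough sketch of the kind of cost-benefit calculation such a tool performs: expected annual damages are derived from expert-elicited event frequency and severity, and the net present value of an adaptation option is compared across discount rates. All figures and parameter names below are illustrative assumptions, not values or outputs from CATLoG.

```python
# Minimal sketch of a cost-benefit calculation with expert-elicited frequency
# and severity and a discount-rate sensitivity test; all numbers are assumptions.
import numpy as np

horizon = 30                      # planning horizon in years
freq = 0.05                       # expert-elicited annual probability of the event
severity = 40e6                   # expert-elicited damage per event (currency units)
damage_reduction = 0.6            # fraction of damage avoided by the adaptation option
capital_cost = 15e6               # up-front cost of the adaptation option

years = np.arange(1, horizon + 1)
expected_benefit = freq * severity * damage_reduction        # avoided damages per year

for rate in (0.02, 0.04, 0.07):                              # discount-rate sensitivity
    discount = (1 + rate) ** -years
    npv = np.sum(expected_benefit * discount) - capital_cost
    print(f"discount rate {rate:.0%}: NPV = {npv / 1e6:6.1f} million")
```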

    Optimal dynamic climate adaptation pathways: a case study of New York City

    Assessing climate risk and its potential impacts on our cities and economies is of fundamental importance. Extreme weather events, such as hurricanes, floods, and storm surges, can lead to catastrophic damages. We propose a flexible approach based on real options analysis and extreme value theory, which enables the selection of optimal adaptation pathways for a portfolio of climate adaptation projects. We model the severity of extreme sea level events using the block maxima approach from extreme value theory, and then develop a real options framework, factoring in climate change, sea level rise uncertainty, and the growth in asset exposure. We then apply the proposed framework to a real-world problem, considering sea level data as well as different adaptation investment options for New York City. Our research can assist governments and policy makers in making informed decisions about optimal adaptation pathways and, more specifically, about reducing flood and storm surge risk in a dynamic setting.
    Comment: 29 pages, 5 figures, and 4 tables
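
    The block maxima step can be sketched as follows: a generalised extreme value (GEV) distribution is fitted to annual maximum sea levels and a return level is read off the fitted distribution. The simulated maxima and the 100-year return period are illustrative assumptions, not the New York City records analysed in the paper.

```python
# Minimal sketch of fitting a GEV distribution to block maxima and computing a
# return level; the simulated "annual maxima" are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Illustrative annual maximum sea levels (in metres) drawn from a known GEV
true_gev = stats.genextreme(c=-0.1, loc=1.5, scale=0.3)   # note: scipy's c = -xi
annual_maxima = true_gev.rvs(80, random_state=rng)

# Fit a GEV to the block maxima by maximum likelihood
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_maxima)
fitted = stats.genextreme(c=c_hat, loc=loc_hat, scale=scale_hat)

# 100-year return level: the level exceeded with probability 1/100 in any year
return_level = fitted.isf(1 / 100)
print(f"estimated 100-year return level: {return_level:.2f} m")
```

    In a real options framework of the kind described above, such return levels would feed the damage side of the valuation, while sea level rise and exposure growth shift the distribution over time; that dynamic layer is beyond this sketch.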

    Identifying spikes and seasonal components in electricity spot price data: A guide to robust modeling

    An important issue in fitting stochastic models to electricity spot prices is the estimation of a component to deal with trends and seasonality in the data. Unfortunately, estimation routines for the long-term and short-term seasonal pattern are usually quite sensitive to extreme observations, known as electricity price spikes. Improved robustness of the model can be achieved by (a) filtering the data with some reasonable procedure for outlier detection, and then (b) using estimation and testing procedures on the filtered data. In this paper we examine the effects of different treatment of extreme observations on model estimation and on determining the number of spikes (outliers). In particular we compare results for the estimation of the seasonal and stochastic components of electricity spot prices using either the original or filtered data. We find significant evidence for a superior estimation of both the seasonal short-term and long-term components when the data have been treated carefully for outliers. Overall, our findings point out the substantial impact the treatment of extreme observations may have on these issues and, therefore, also on the pricing of electricity derivatives like futures and option contracts. An added value of our study is the ranking of different filtering techniques used in the energy economics literature, suggesting which methods could be used and which should not be used for spike identification.
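
    One simple spike filter of the kind compared in such studies is sketched below: observations far from a rolling median are flagged and replaced, and the filter is iterated until no further spikes are found. The window size, threshold and replacement rule are assumptions for illustration, not the filtering techniques or the ranking produced by the paper.

```python
# Minimal sketch of an iterative rolling-median spike filter for spot prices;
# window, threshold and replacement rule are illustrative assumptions.
import numpy as np

def filter_spikes(prices, window=31, n_std=3.0, max_iter=10):
    filtered = prices.astype(float)
    total_flagged = 0
    for _ in range(max_iter):
        # centred rolling median as a crude estimate of the regular price level
        pad = window // 2
        padded = np.pad(filtered, pad, mode="edge")
        med = np.array([np.median(padded[i:i + window]) for i in range(filtered.size)])
        resid = filtered - med
        spikes = np.abs(resid) > n_std * resid.std()
        if not spikes.any():
            break
        filtered[spikes] = med[spikes]          # replace spikes by the local median
        total_flagged += spikes.sum()
    return filtered, total_flagged

rng = np.random.default_rng(5)
prices = 30 + 5 * rng.standard_normal(365)
prices[rng.choice(365, 8, replace=False)] += rng.uniform(60, 200, 8)   # add spikes

filtered, n_spikes = filter_spikes(prices)
print(f"{n_spikes} observations identified as spikes")
```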
