Testing the Assumptions Behind the Use of Importance Sampling
Importance sampling is used in many areas of modern econometrics to approximate intractable integrals. Its reliable use requires the sampler to possess a finite variance, for this guarantees a square-root speed of convergence and asymptotic normality of the estimator of the integral. However, this assumption is seldom checked. In this paper we propose to use extreme value theory to assess empirically the appropriateness of this assumption. We illustrate the method in the context of a maximum simulated likelihood analysis of the stochastic volatility model. Keywords: extreme value theory; importance sampling; simulation; stochastic volatility.
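The finite-variance condition can be probed numerically in the spirit of the paper's extreme-value approach: fit a tail index to the largest importance weights and check whether it exceeds two. The sketch below is illustrative, not the authors' procedure; the target/proposal pair, sample sizes, and function name are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical): target N(0, 2), proposal N(0, 1). The proposal is
# lighter-tailed than the target, so the weights w = p(x)/q(x) fail to have a
# finite second moment.
x = rng.normal(0.0, 1.0, size=100_000)
log_w = x**2 / 4.0                       # log(p/q) up to an additive constant
w = np.exp(log_w - log_w.max())          # rescale for numerical stability

def hill_tail_index(w, k=500):
    """Hill estimator of the tail index alpha from the k largest weights.
    A finite variance of the weights requires alpha > 2."""
    tail = np.sort(w)[-k:]
    return 1.0 / np.mean(np.log(tail / tail[0]))

alpha = hill_tail_index(w)   # estimated near the critical value of 2 here
```

A tail index estimate at or below two warns that the square-root convergence rate of the importance sampling estimator is in doubt.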
Diagnostic checking and intra-daily effects in time series models
A variety of topics in the statistical analysis of time series are addressed in this thesis. The main emphasis is on the state space methodology and, in particular, on structural time series (STS) models. There are now many applications of STS models in the literature and they have proved very successful. The keywords of this thesis range from Kalman filter, smoothing and diagnostic checking to time-varying cubic splines and intra-daily effects. Five separate studies are carried out for this research project and they are reflected in chapters 2 to 6. All studies concern time series models that are placed in the state space form (SSF) so that the Kalman filter (KF) can be applied for estimation. The SSF and the KF play a central role in time series analysis comparable to the role of the regression model and least squares estimation in econometrics. Chapter 2 gives an overview of the latest developments in the state space methodology, including diffuse likelihood evaluation and stable calculations.
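To make the role of the SSF and the KF concrete, here is a minimal Kalman filter for the local level model, the simplest STS model; the function name and the diffuse initialization via a large prior variance are illustrative choices, not the thesis's own algorithms.

```python
import numpy as np

def kalman_filter_local_level(y, sigma2_eps, sigma2_eta, a1=0.0, p1=1e7):
    """Kalman filter for the local level model
        y_t = alpha_t + eps_t,   alpha_{t+1} = alpha_t + eta_t,
    a textbook special case of the state space form (illustrative sketch).
    Returns the filtered state estimates and their variances."""
    n = len(y)
    a = np.empty(n + 1); p = np.empty(n + 1)
    a[0], p[0] = a1, p1                # diffuse-style initialization
    for t in range(n):
        f = p[t] + sigma2_eps          # prediction-error variance
        k = p[t] / f                   # Kalman gain
        v = y[t] - a[t]                # prediction error (innovation)
        a[t + 1] = a[t] + k * v        # state update
        p[t + 1] = p[t] * (1 - k) + sigma2_eta
    return a[1:], p[1:]
```

On a constant series the filtered state locks onto the observed level after the first update, reflecting the diffuse prior.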
Smoothing algorithms evaluate the full-sample estimates of unobserved components in time series models. New smoothing algorithms are developed for the state and the disturbance vector of the SSF; they are computationally efficient and outperform existing methods. Chapter 3 discusses the existing and the new smoothing algorithms with an emphasis on theory, algorithms and practical implications. The new smoothing results pave the way for using auxiliary residuals, that is, full-sample estimates of the disturbances, for diagnostic checking of unobserved components time series models. Chapter 4 develops test statistics for auxiliary residuals and presents applications showing how they can be used to detect and distinguish between outliers and structural change.
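The diagnostic idea of Chapter 4 can be sketched for the local level model: a disturbance smoother delivers full-sample estimates of the observation and state disturbances, and their standardized versions (the auxiliary residuals) separate outliers from level changes. The recursions below are the textbook filter and smoother for this model, not the thesis's new algorithms.

```python
import numpy as np

def auxiliary_residuals(y, s2_eps, s2_eta, a1=0.0, p1=1e7):
    """Standardized auxiliary residuals for the local level model
        y_t = alpha_t + eps_t,   alpha_{t+1} = alpha_t + eta_t.
    Returns (e, r_std): a large |e[t]| points to an outlier at time t,
    a large |r_std[t]| to a structural (level) change."""
    n = len(y)
    v = np.empty(n); F = np.empty(n); K = np.empty(n)
    a, p = a1, p1
    for t in range(n):                      # Kalman filter (forward pass)
        v[t] = y[t] - a
        F[t] = p + s2_eps
        K[t] = p / F[t]
        a = a + K[t] * v[t]
        p = p * (1 - K[t]) + s2_eta
    e = np.empty(n); r_std = np.empty(n)
    r, N = 0.0, 0.0
    for t in range(n - 1, -1, -1):          # disturbance smoother (backward)
        u = v[t] / F[t] - K[t] * r          # smoothed eps_t / s2_eps
        D = 1.0 / F[t] + K[t]**2 * N        # variance of u
        e[t] = u / np.sqrt(D)               # standardized observation residual
        r_std[t] = r / np.sqrt(N) if N > 0 else 0.0
        r = v[t] / F[t] + (1 - K[t]) * r    # backward recursions for r_t, N_t
        N = 1.0 / F[t] + (1 - K[t])**2 * N
    return e, r_std
```

Injecting a single large spike into an otherwise stable series makes the standardized observation residual at that date stand out, while leaving the state residuals quiet.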
A cubic spline is a polynomial function of order three that is regularly used for interpolation and curve fitting. It has also been applied to piecewise regressions, density approximations, etc. Chapter 5 develops the cubic spline further by allowing it to vary over time and by introducing it into time series models. These time-varying cubic splines are an efficient way of handling slowly changing periodic movements in time series.
This method for modelling a changing periodic pattern is applied in a structural time series model used to forecast hourly electricity load demand, with the periodic movements being intra-daily or intra-weekly. The full model contains other components, including a temperature response which is also modelled using cubic splines. A statistical computer package (SHELF) is developed to produce, at any time, hourly load forecasts three days ahead.
Maximum likelihood estimation of stochastic volatility models
This paper discusses the Monte Carlo maximum likelihood method of estimating stochastic volatility (SV) models. The basic SV model can be expressed as a linear state space model with log chi-square disturbances. The likelihood function can be approximated arbitrarily accurately by decomposing it into a Gaussian part, constructed by the Kalman filter, and a remainder function, whose expectation is evaluated by simulation. No modifications of this estimation procedure are required when the basic SV model is extended in a number of directions likely to arise in applied empirical research. This compares favorably with alternative approaches. The finite sample performance of the new estimator is shown to be comparable to that of the Markov chain Monte Carlo (MCMC) method.
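The linearization step mentioned above can be illustrated directly: squaring and taking logs of the returns turns the basic SV model into a linear state space model whose observation noise follows a log chi-square(1) distribution with mean -1.2704 and variance pi^2/2. The parameter values in the sketch below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate the basic SV model (parameter values are hypothetical):
#   y_t = exp(h_t / 2) * e_t,   h_{t+1} = phi * h_t + sigma_eta * n_t
phi, sigma_eta, n = 0.95, 0.2, 5000
h = np.zeros(n)
for t in range(n - 1):
    h[t + 1] = phi * h[t] + sigma_eta * rng.normal()
y = np.exp(h / 2) * rng.normal(size=n)

# Linearization: log y_t^2 = h_t + log e_t^2, where log e_t^2 is log
# chi-square(1) distributed with mean -1.2704 and variance pi^2 / 2 = 4.93.
z = np.log(y**2) + 1.2704   # observation series of the linear state space form
```

The residual z - h behaves like a mean-corrected log chi-square(1) variable; its non-Gaussianity is exactly what the paper's simulated remainder function accounts for.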
Systemic risk diagnostics: coincident indicators and early warning signals
We propose a novel framework to assess financial system risk. Using a dynamic factor framework based on state space methods, we construct coincident measures (‘thermometers’) and a forward-looking indicator for the likelihood of simultaneous failure of a large number of financial intermediaries. The indicators are based on latent macro-financial and credit risk components for a large data set comprising the U.S., the EU-27 area, and the respective rest of the world. Credit risk conditions can significantly and persistently de-couple from macro-financial fundamentals. Such decoupling can serve as an early warning signal for macro-prudential policy. JEL Classification: G21, C33. Keywords: credit portfolio models, financial crisis, frailty-correlated defaults, state space methods, systemic risk.
Extracting a Robust U.S. Business Cycle Using a Time-Varying Multivariate Model-Based Bandpass Filter
In this paper we investigate whether the dynamic properties of the U.S. business cycle have changed in the last fifty years. For this purpose we develop a flexible business cycle indicator that is constructed from a moderate set of macroeconomic time series. The coincident economic indicator is based on a multivariate trend-cycle decomposition model that accounts for time variation in macroeconomic volatility, known as the great moderation. In particular, we consider an unobserved components time series model with a common cycle that is shared across different time series but adjusted for phase shift and amplitude. The extracted cycle can be interpreted as the result of a model-based bandpass filter and is designed to emphasize the business cycle frequencies that are of interest to applied researchers and policymakers. Stochastic volatility processes and mixture distributions for the irregular components and the common cycle disturbances enable us to account for all the heteroskedasticity present in the data. The empirical results are based on a Bayesian analysis and show that time-varying volatility is only present in a selection of idiosyncratic components, while the coefficients driving the dynamic properties of the business cycle indicator have been stable over the last fifty years.
A General Framework for Observation Driven Time-Varying Parameter Models
We propose a new class of observation driven time series models that we refer to as Generalized Autoregressive Score (GAS) models. The driving mechanism of the GAS model is the scaled likelihood score. This provides a unified and consistent framework for introducing time-varying parameters in a wide class of non-linear models. The GAS model encompasses other well-known models such as the generalized autoregressive conditional heteroskedasticity, autoregressive conditional duration, autoregressive conditional intensity and single source of error models. In addition, the GAS specification gives rise to a wide range of new observation driven models. Examples include non-linear regression models with time-varying parameters, observation driven analogues of unobserved components time series models, multivariate point process models with time-varying parameters and pooling restrictions, new models for time-varying copula functions and models for time-varying higher order moments. We study the properties of GAS models and provide several non-trivial examples of their application. Keywords: dynamic models, time-varying parameters, non-linearity, exponential family, marked point processes, copulas.
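As a concrete instance of the driving mechanism: with a Gaussian density, a time-varying variance parameter f_t, and inverse Fisher information scaling, the scaled score reduces to y_t^2 - f_t, so the GAS(1,1) update becomes a GARCH-type recursion. The function name and parameter values below are illustrative.

```python
import numpy as np

def gas_gaussian_variance(y, omega, alpha, beta, f1):
    """GAS(1,1) recursion for a time-varying Gaussian variance f_t.
    With inverse Fisher information scaling, the scaled score is
    s_t = y_t^2 - f_t, so f_{t+1} = omega + alpha * s_t + beta * f_t
    is a score-driven analogue of a GARCH recursion (illustrative)."""
    f = np.empty(len(y))
    f[0] = f1
    for t in range(len(y) - 1):
        score = (y[t]**2 - f[t]) / (2 * f[t]**2)   # d log N(y; 0, f) / d f
        s = 2 * f[t]**2 * score                    # inverse-information scaling
        f[t + 1] = omega + alpha * s + beta * f[t]
    return f
```

For data with unit variance and the illustrative parameters omega = 0.15, alpha = 0.1, beta = 0.85, the filtered variance fluctuates around one, since these values make one the fixed point of the recursion.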
Periodic Heteroskedastic RegARFIMA models for daily electricity spot prices
In this paper we consider different periodic extensions of regression models with autoregressive fractionally integrated moving average disturbances for the analysis of daily spot prices of electricity. We show that day-of-the-week periodicity and long memory are important determinants for the dynamic modelling of the conditional mean of electricity spot prices. Once an effective description of the conditional mean of spot prices is empirically identified, focus can be directed towards volatility features of the time series. For the older electricity market of Nord Pool in Norway, it is found that a long memory model with periodic coefficients is required to model daily spot prices effectively. Further, strong evidence of conditional heteroskedasticity is found in the mean corrected Nord Pool series. For daily prices at the three emerging electricity markets that we consider (APX in The Netherlands, EEX in Germany and Powernext in France), periodicity in the autoregressive coefficients is also established, but evidence of long memory is not found and dynamic behaviour in the variance of the spot prices is less pronounced. The novel findings in this paper can have important consequences for the modelling and forecasting of mean and variance functions of spot prices for electricity and associated contingent assets. Keywords: GARCH, long memory.
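The long memory ingredient of such ARFIMA-type disturbances rests on the fractional difference operator (1 - L)^d, whose autoregressive weights follow a simple recursion. The sketch below uses an illustrative function name and is not the paper's full RegARFIMA specification.

```python
import numpy as np

def fracdiff_weights(d, n):
    """First n AR weights of the fractional difference operator (1 - L)^d:
    pi_0 = 1 and pi_j = pi_{j-1} * (j - 1 - d) / j, the expansion
    underlying the long memory component of an ARFIMA model."""
    pi = np.empty(n)
    pi[0] = 1.0
    for j in range(1, n):
        pi[j] = pi[j - 1] * (j - 1 - d) / j
    return pi
```

For 0 < d < 0.5 the weights decay hyperbolically rather than geometrically, which is what produces long memory; d = 1 recovers the ordinary first difference.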
Forecasting Daily Time Series using Periodic Unobserved Components Time Series Models
We explore a periodic analysis in the context of unobserved components time series models that decompose a time series into components of interest such as trend and seasonal. Periodic time series models allow dynamic characteristics to depend on the period of the year, month, week or day. In the standard multivariate approach one can interpret periodic time series modelling as a simultaneous analysis of a set of, traditionally, yearly time series where each series is related to a particular season, with a time index in years. Our analysis applies to monthly vector time series in which each series is related to a particular day of the month. We focus on forecasting performance and on the underlying periodic forecast function, defined by the in-sample observation weights used for producing (multi-step) forecasts. These weights facilitate the interpretation of periodic model extensions. We take a statistical state space approach to estimate our model, so that we can identify stochastic unobserved components and deal with irregularly spaced time series. We extend existing algorithms to compute observation weights for forecasting based on state space models with regressor variables. Our methods are illustrated by an application to time series of clearly periodic daily Dutch tax revenues. The dimension of our model is large, as we allow the time series for each day of the month to be subject to a changing seasonal pattern. Nevertheless, even with only five years of data, we find that increased periodic flexibility helps in simulated out-of-sample forecasting for two extra years of data.
Credit cycles and macro fundamentals
We study the relation between the credit cycle and macroeconomic fundamentals in an intensity-based framework. Using rating transition and default data on U.S. corporates from Standard and Poor’s over the period 1980–2005, we directly estimate the credit cycle from the micro rating data. We relate this cycle to the business cycle, bank lending conditions, and financial market variables. In line with earlier studies, these variables appear to explain part of the credit cycle. As our main contribution, we test for the correct dynamic specification of these models. In all cases, the hypothesis of correct dynamic specification is strongly rejected. Moreover, accounting for dynamic mis-specification, many of the variables thought to explain the credit cycle turn out to be insignificant. The main exceptions are GDP growth and, to some extent, stock returns and stock return volatilities. Their economic significance appears low, however. This raises the puzzle of which macroeconomic fundamentals explain default and rating dynamics. JEL Classification: G11, G2