A Dynamic Integer Count Data Model for Financial Transaction Prices
In this paper we develop a dynamic model for integer counts to capture the discreteness of price changes in financial transaction prices. Our model rests on an autoregressive multinomial component for the direction of the price change and a dynamic count data component for the size of the price change. Since the model is capable of capturing a wide range of discrete price movements, it is particularly suited for financial markets where the trading intensity is moderate or low, as on most European exchanges. We present the model at work by applying it to transaction data for the Henkel share traded on the Frankfurt Stock Exchange over a period of six months. In particular, we use the model to test some theoretical implications of market microstructure theory on the relationship between price movements and other marks of the trading process. --Autoregressive conditional multinomial model,GLARMA,transaction prices,count data,market microstructure
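A minimal Python sketch of the decomposition idea: each discrete price change is the product of a direction in {-1, 0, +1}, driven here by an autoregressive logit, and a positive count for its size. All dynamics and parameter values below are illustrative stand-ins, not the paper's specification.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_price_changes(T=1000, alpha=0.6, lam0=0.5):
        changes = np.zeros(T, dtype=int)
        logit_up = 0.0  # autoregressive state for the direction component
        for t in range(1, T):
            # direction probabilities react to the sign of the last change
            logit_up = alpha * logit_up + 0.3 * np.sign(changes[t - 1])
            p_up = 1.0 / (1.0 + np.exp(-logit_up))
            p = np.array([0.4 * (1 - p_up), 0.6, 0.4 * p_up])  # down, zero, up
            direction = rng.choice([-1, 0, 1], p=p / p.sum())
            # size component: a shifted Poisson count, drawn only when the price moves
            size = 1 + rng.poisson(lam0) if direction != 0 else 0
            changes[t] = direction * size
        return changes

    print(np.bincount(np.abs(simulate_price_changes()))[:5])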
Trading volume and the short and long-run components of volatility
This paper investigates the information content of daily trading volume with respect to the long-run, highly persistent and the short-run, transitory components of the volatility of daily stock market returns using bivariate mixture models. For this purpose, the standard bivariate mixture model of Tauchen and Pitts (1983), in which volatility and volume are directed by one latent process of information arrivals, is generalized so that two types of information processes, each endowed with its own dynamic behavior, are allowed to direct volatility and volume. Since the latent information processes are assumed to be autocorrelated, which makes standard estimation methods infeasible, a simulated maximum likelihood approach is applied to estimate the mixture models. The results based on German stock market data reveal that volume mainly provides information about the transitory component of volatility and contains only little information about the highly persistent volatility component.
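The mixing idea admits a short simulation: one latent information-arrival process directs both the return variance and the mean of trading volume, which induces a positive relation between absolute returns and volume. This is the standard one-factor mixture, not the paper's generalized two-process variant, and the lognormal arrivals and parameter values are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    T = 2000
    I = rng.lognormal(mean=0.0, sigma=0.5, size=T)    # latent information arrivals
    returns = rng.normal(0.0, np.sqrt(0.5 * I))       # Var(r_t | I_t) grows with I_t
    volume = rng.normal(2.0 * I, np.sqrt(0.8 * I))    # E(V_t | I_t) grows with I_t

    # mixing induces a positive relation between |returns| and volume
    print(np.corrcoef(np.abs(returns), volume)[0, 1])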
Dynamic bivariate mixture models: Modeling the behavior of prices and trading volume
Bivariate mixture models have been used to explain the stochastic behavior of daily price changes and trading volume on financial markets. In this class of models, price changes and volume follow a mixture of bivariate distributions, with the unobservable number of price-relevant information arrivals serving as the mixing variable. The time series behavior of this mixing variable determines the dynamics of the price-volume system. In this paper, bivariate mixture specifications with a serially correlated mixing variable are estimated by simulated maximum likelihood and analyzed with respect to their ability to account for the observed dynamics on financial markets, especially the persistence in the variance of price changes. The results based on German stock market data reveal that the dynamic bivariate mixture models cannot account for the persistence in the price change variance.
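Why a serially correlated mixing variable requires simulation methods can be seen from a crude simulated-likelihood sketch: with a latent AR(1) log mixing process, the likelihood is a T-dimensional integral, estimated here by averaging the data density over latent paths drawn from the transition density itself. This naive sampler is far noisier than the importance-sampling methods used in the paper; the model and parameters below are illustrative.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)

    def sim_loglik(returns, phi=0.9, s_eta=0.3, s_r=1.0, n_draws=200):
        T = len(returns)
        # draw complete latent AR(1) paths from the transition density
        h = np.zeros((n_draws, T))
        for t in range(1, T):
            h[:, t] = phi * h[:, t - 1] + s_eta * rng.standard_normal(n_draws)
        # per-path log density of the data, averaged on the level scale
        logp = norm.logpdf(returns, scale=s_r * np.exp(0.5 * h)).sum(axis=1)
        m = logp.max()
        return m + np.log(np.mean(np.exp(logp - m)))  # log-sum-exp for stability

    print(sim_loglik(rng.standard_normal(500)))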
Classical and Bayesian Analysis of Univariate and Multivariate Stochastic Volatility Models
In this paper, Efficient Importance Sampling (EIS) is used to perform a classical and Bayesian analysis of univariate and multivariate Stochastic Volatility (SV) models for financial return series. EIS provides a highly generic and very accurate procedure for the Monte Carlo (MC) evaluation of high-dimensional interdependent integrals. It can be used to carry out ML estimation of SV models as well as simulation smoothing, where the latent volatilities are sampled at once. Based on this EIS simulation smoother, a Bayesian Markov Chain Monte Carlo (MCMC) posterior analysis of the parameters of SV models can be performed. --Dynamic Latent Variables,Markov Chain Monte Carlo,Maximum likelihood,Simulation Smoother
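For readers unfamiliar with the model class, a minimal simulation of a univariate SV model, with a latent AR(1) log-volatility driving the return variance, is sketched below; the parameter values are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(3)
    T, mu, phi, s_eta = 1500, -0.5, 0.95, 0.2
    h = np.empty(T)
    h[0] = mu
    for t in range(1, T):
        # latent AR(1) log-volatility
        h[t] = mu + phi * (h[t - 1] - mu) + s_eta * rng.standard_normal()
    returns = np.exp(0.5 * h) * rng.standard_normal(T)
    print(returns.std(), np.abs(returns).mean())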
Improving MCMC Using Efficient Importance Sampling
This paper develops a systematic Markov Chain Monte Carlo (MCMC) framework based upon Efficient Importance Sampling (EIS), which can be used for the analysis of a wide range of econometric models involving integrals without an analytical solution. EIS is a simple, generic and yet accurate Monte-Carlo integration procedure based on sampling densities which are chosen to be global approximations to the integrand. By embedding EIS within MCMC procedures based on Metropolis-Hastings (MH), one can significantly improve their numerical properties, essentially by providing a fully automated selection of critical MCMC components such as auxiliary sampling densities, normalizing constants and starting values. The potential of this integrated MCMC-EIS approach is illustrated with simple univariate integration problems and with the Bayesian posterior analysis of stochastic volatility models and stationary autoregressive processes. --Autoregressive models,Bayesian posterior analysis,Dynamic latent variables,Gibbs sampling,Metropolis Hastings,Stochastic volatility
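The role a global approximation plays inside MH can be sketched with an independence sampler whose proposal approximates the whole target, which is the function EIS-type densities serve in this framework. The toy target and tuning values below are illustrative only.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)

    def log_target(x):  # toy non-Gaussian target density (unnormalized)
        return -0.5 * x**2 + np.log1p(np.cos(x)**2)

    prop_mu, prop_sd = 0.0, 1.2  # global Gaussian approximation to the target
    x, draws = 0.0, []
    for _ in range(20000):
        y = rng.normal(prop_mu, prop_sd)
        # independence-MH acceptance ratio: target(y) q(x) / (target(x) q(y))
        log_alpha = (log_target(y) - log_target(x)
                     + norm.logpdf(x, prop_mu, prop_sd)
                     - norm.logpdf(y, prop_mu, prop_sd))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        draws.append(x)
    print(np.mean(draws), np.std(draws))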
The Multinomial Multiperiod Probit Model: Identification and Efficient Estimation
In this paper we discuss parameter identification and likelihood evaluation for multinomial multiperiod Probit models. It is shown in particular that the standard autoregressive specification used in the literature can be interpreted as a latent common factor model. However, this specification is not invariant with respect to the selection of the baseline category. Hence, we propose an alternative specification which is invariant with respect to such a selection and identifies coefficients characterizing the stationary covariance matrix which are not identified in the standard approach. For likelihood evaluation, which requires high-dimensional truncated integration, we propose to use a generic procedure known as Efficient Importance Sampling (EIS). A special case of our proposed EIS algorithm is the standard GHK probability simulator. To illustrate the relative performance of both procedures, we perform a set of Monte-Carlo experiments. Our results indicate substantial numerical efficiency gains of the ML estimates based on GHK-EIS relative to ML estimates obtained by using GHK. --Discrete choice,Importance sampling,Monte-Carlo integration,Panel data,Parameter identification,Simulated maximum likelihood
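Since the abstract names the GHK probability simulator, a compact sketch of GHK for the orthant probability P(X < b) with X ~ N(0, Sigma) may be helpful: the Cholesky factor turns the problem into sequential draws from truncated standard normals, whose truncation probabilities multiply into the importance-sampling weight. The covariance matrix and truncation points below are illustrative.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)

    def ghk(b, Sigma, n_draws=5000):
        L = np.linalg.cholesky(Sigma)
        d = len(b)
        weights = np.ones(n_draws)
        eta = np.zeros((n_draws, d))
        for j in range(d):
            # upper truncation point for eta_j given the earlier draws
            upper = (b[j] - eta[:, :j] @ L[j, :j]) / L[j, j]
            u = norm.cdf(upper)
            weights *= u                                          # sequential weight
            eta[:, j] = norm.ppf(rng.uniform(size=n_draws) * u)   # truncated draw
        return weights.mean()

    Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
    # analytic value: 1/4 + arcsin(0.5) / (2 pi), roughly 0.3333
    print(ghk(np.array([0.0, 0.0]), Sigma))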
Efficient high-dimensional importance sampling in mixture frameworks
This paper provides high-dimensional and flexible importance sampling procedures for the likelihood evaluation of dynamic latent variable models involving finite or infinite mixtures, leading to possibly heavy-tailed and/or multi-modal target densities. Our approach is based upon the efficient importance sampling (EIS) approach of Richard and Zhang (2007) and exploits the mixture structure of the model by constructing importance sampling distributions as mixtures of distributions. The proposed mixture EIS procedures are illustrated with ML estimation of a Student-t state space model for realized volatilities and a stochastic volatility model with leverage effects and jumps for asset returns. --dynamic latent variable model,importance sampling,marginalized likelihood,mixture,Monte Carlo,realized volatility,stochastic volatility
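The case for mixture importance samplers is easiest to see on a toy bimodal target: a single Gaussian sampler would miss one mode, while a two-component Gaussian mixture covers both. The target, component locations and weights below are made up for illustration and are unrelated to the paper's applications.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(6)

    def target_pdf(x):  # bimodal toy target
        return 0.5 * norm.pdf(x, -3, 1) + 0.5 * norm.pdf(x, 3, 1)

    w = np.array([0.5, 0.5])                       # mixture importance density q
    mus, sds = np.array([-3.0, 3.0]), np.array([1.2, 1.2])

    n = 20000
    comp = rng.choice(2, size=n, p=w)              # pick a mixture component
    x = rng.normal(mus[comp], sds[comp])
    q = w[0] * norm.pdf(x, mus[0], sds[0]) + w[1] * norm.pdf(x, mus[1], sds[1])
    weights = target_pdf(x) / q
    print((weights * x**2).sum() / weights.sum())  # IS estimate of E[x^2], about 10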
Time Series of Count Data : Modelling and Estimation
This paper compares various models for time series of counts which can account for discreteness, overdispersion and serial correlation. Besides observation- and parameter-driven models based upon corresponding conditional Poisson distributions, we also consider a dynamic ordered probit model as a flexible specification to capture the salient features of time series of counts. For all models, we present appropriate efficient estimation procedures. For parameter-driven specifications, this requires Monte Carlo procedures like simulated maximum likelihood or Markov Chain Monte Carlo. The methods, including corresponding diagnostic tests, are illustrated with data on daily admissions for asthma to a single hospital. --Efficient Importance Sampling,GLARMA,Markov Chain Monte-Carlo,Observation-driven model,Parameter-driven model,Ordered Probit
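A minimal observation-driven sketch in the GLARMA spirit may clarify the model class: the log intensity of a conditional Poisson reacts to the previous standardized residual, which generates overdispersion and serial correlation. Parameters are illustrative, not the paper's estimates.

    import numpy as np

    rng = np.random.default_rng(7)
    T, omega, gamma = 1000, 1.0, 0.4
    y = np.zeros(T, dtype=int)
    z = 0.0  # lagged standardized residual
    for t in range(T):
        lam = np.exp(omega + gamma * z)
        y[t] = rng.poisson(lam)
        z = (y[t] - lam) / np.sqrt(lam)  # feeds next period's intensity
    print(y.mean(), y.var())             # variance exceeds the mean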
The Decline in German Output Volatility: A Bayesian Analysis
Empirical evidence suggests a sharp decline in the volatility of the growth of U.S. gross domestic product (GDP) in the mid-1980s. Using Bayesian methods, we analyze whether a volatility reduction can also be detected for German GDP. Since statistical inference for volatility processes critically depends on the specification of the conditional mean, we assume different time series models for GDP growth in our volatility analysis. Across all specifications we find evidence for an output stabilization around 1993, after the downturn following the boom associated with German reunification. However, the different GDP models lead to alternative characterizations of this stabilization: in a linear AR model it shows up as smaller shocks hitting the economy, while regime switching models reveal further sources of stabilization, namely a narrowing gap between growth rates during booms and recessions or flatter trajectories of the GDP growth rates. Furthermore, it appears that reunification interrupted an output stabilization already emerging around 1987. --business cycle models,Gibbs sampling,Markov Chain Monte Carlo,regime switching,structural breaks
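To illustrate the regime-switching characterization, a toy two-state Markov-switching simulation of growth rates is sketched below, with regimes that differ in mean growth and shock size; all numbers are invented and are not estimates from the paper.

    import numpy as np

    rng = np.random.default_rng(8)
    T = 200
    P = np.array([[0.95, 0.05],   # boom      -> {boom, recession}
                  [0.20, 0.80]])  # recession -> {boom, recession}
    mu = np.array([0.8, -0.5])    # mean growth per regime (percent)
    sd = np.array([0.6, 0.9])     # shock size per regime

    s = 0
    growth = np.empty(T)
    for t in range(T):
        s = rng.choice(2, p=P[s])
        growth[t] = mu[s] + sd[s] * rng.standard_normal()
    print(growth.mean(), growth.std())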