A Two-Step Approach for Transforming Continuous Variables to Normal: Implications and Recommendations for IS Research
This article describes and demonstrates a two-step approach for transforming non-normally distributed continuous variables so that they become normally distributed. Step 1 transforms the variable into a percentile rank, which results in uniformly distributed probabilities. Step 2 applies the inverse-normal transformation to the results of Step 1 to form a variable of normally distributed z-scores. The approach is little known outside the statistics literature, has scarcely been used in the social sciences, and has not been used in any IS study. The article illustrates how to implement the approach in Excel, SPSS, and SAS and explains implications and recommendations for IS research.
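The two steps are easy to reproduce outside the packages the article covers. Below is a minimal sketch in Python (an assumed alternative to the Excel/SPSS/SAS implementations the article illustrates), using the common (rank − 0.5)/n fractional-rank convention to keep probabilities strictly inside (0, 1):

```python
import numpy as np
from scipy import stats

def two_step_normalize(x):
    """Two-step transformation:
    step 1: fractional percentile ranks (approximately uniform on (0, 1));
    step 2: inverse-normal (probit) transform to z-scores.
    The (rank - 0.5)/n offset is one common convention; it avoids
    probabilities of exactly 0 or 1 at the sample extremes."""
    x = np.asarray(x, dtype=float)
    ranks = stats.rankdata(x)              # step 1: ranks (ties averaged)
    p = (ranks - 0.5) / len(x)             # fractional ranks in (0, 1)
    return stats.norm.ppf(p)               # step 2: inverse-normal z-scores

# a strongly skewed input becomes approximately standard normal
rng = np.random.default_rng(0)
z = two_step_normalize(rng.exponential(size=1000))
```

Because the transformation is rank-based, it preserves the ordering of observations while discarding the original distribution's shape.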
Asymptotic Normality for EMS Option Price Estimator with Continuous or Discontinuous Payoff Functions
Empirical martingale simulation (EMS) was proposed by Duan and Simonato (Duan, J.-C., J.-G. Simonato. 1998. Empirical martingale simulation for asset prices. Management Sci. 44(9) 1218-1233) as an adjustment to the standard Monte Carlo simulation to reduce simulation errors. The EMS price estimator of derivative contracts was shown to be asymptotically normally distributed in Duan et al. (Duan, J.-C., G. Gauthier, J.-G. Simonato. 2001. Asymptotic distribution of the EMS option price estimator. Management Sci. 47(8) 1122-1132) when the payoffs are piecewise linear and continuous. In this paper, we extend the asymptotic normality result to more general continuous payoffs, and for discontinuous payoffs we make a conjecture.
Keywords: empirical martingale simulation, Monte Carlo, Black-Scholes, GARCH, options, regression analysis, asymptotic normality, coverage rate
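The EMS adjustment itself is simple to sketch. The following example is a hedged illustration in a one-step Black-Scholes setting (with hypothetical parameters, not the paper's GARCH implementation): simulated terminal prices are rescaled so that their discounted sample mean equals the spot price, which is the empirical martingale property, and a European call is then priced on the adjusted sample:

```python
import numpy as np

def ems_call_price(S0, K, r, sigma, T, n_paths=100_000, seed=1):
    """Monte Carlo Black-Scholes call price with the empirical
    martingale simulation (EMS) adjustment of Duan & Simonato (1998):
    terminal prices are rescaled so their discounted sample mean
    equals S0, restoring the martingale property in-sample."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    # EMS adjustment: enforce exp(-rT) * mean(ST_adj) == S0 exactly
    ST_adj = ST * S0 / (np.exp(-r * T) * ST.mean())
    price = np.exp(-r * T) * np.maximum(ST_adj - K, 0.0).mean()
    return price, ST_adj

price, ST_adj = ems_call_price(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
```

The rescaling is a deterministic function of the simulated sample, which is why the asymptotic distribution of the resulting estimator requires the separate analysis the paper provides.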
Essays on the Modelling of S&P 500 Volatility
This dissertation studies the patterns of the term-structure of implied volatility and examines the performance of different specifications of time-series and options-based volatility forecasting models under the influence of the observed market biases. Our research is based primarily upon S&P 500 data for the period 1982-2002. There are three self-contained but related projects in this dissertation. The objectives of this research are: 1) to characterise the term-structure of implied volatility; 2) to compare the performance of asymmetric power ARCH and EGARCH models; 3) to evaluate the forecasting performance of time-series and options-based variance swap valuation models. The observed market anomalies in the term-structure of implied volatility of S&P 500 futures options are investigated between 1983 and 1998. Term-structure evidence indicates that short-term options are most severely mispriced by the Black-Scholes formula. We find evidence that option prices are not consistent with rational expectations under a mean-reverting volatility process. In addition, skewness premium results show that the anomalies in the S&P 500 options market have been gradually worsening since around 1987. As correlation may be responsible for skewness, our diagnostics suggest that leverage and jump-diffusion models are more appropriate for capturing the observed biases in the S&P 500 futures options market. Sixteen years of daily S&P 500 futures series are employed to examine the performance of the APARCH models, which use asymmetric parameterisation and a power transformation on conditional volatility and its absolute residuals to account for the slow decay in returns autocorrelations. No evidence can be found supporting the relatively complex APARCH models. Log-likelihood ratio tests confirm that power transformation and asymmetric parameterisation are not effective in characterising the returns dynamics within the context of APARCH specifications.
Furthermore, results of a 3-state regime-switching model support the notion that the performance of conditional volatility models depends on the volatility state of the returns series. In addition, AIC statistics indicate that EGARCH is best in "noisy" periods whilst GARCH is the top performer in "quiet" periods. Overall, aggregated rankings for the AIC metric show that the EGARCH model is best. Options-based volatility trading exercises also reveal that EGARCH and GARCH can generate statistically significant ex-ante profit in one out of four sample periods after transaction costs. When considering a stochastic volatility model, there seems to be little incentive to look beyond a simple model which allows for volatility clustering and a leverage effect. The volatility forecasting performance of different specifications of time-series and options-based variance swap valuation models on the S&P 500 index is evaluated from three months before to after the 9/11 attacks. By far, the options-based Demeterfi et al. (1999) variance swap valuation framework is the most popular tool to price variance swaps. This framework stipulates that pricing a variance swap can be viewed as an exercise in computing the weighted average of the implied volatilities of the required options, even under the influence of volatility skew. Our research design offers a comprehensive empirical study of the relative merits of competing option pricing models. Based on results from six carefully chosen contract days, we illustrate that implied models may overpredict future variance and underperform time-series models.
The reasons could be: 1) the implied strategy was originally developed for hedging; 2) implied volatility is predominantly a monotonically decreasing function of maturity, and therefore the options-based strategy cannot produce enough variance term-structure patterns; 3) the distributional dynamics implied by option parameters are not consistent with the time-series data, as stipulated by the maximum likelihood estimation of the square-root process. Future research needs to use a larger sample set in order to establish a more statistically significant result to justify our findings. Until then we have a strong reservation about the use of the Demeterfi et al. methodology for variance forecasting.
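For context, the Demeterfi et al. (1999) framework discussed above replicates a variance swap with a 1/K²-weighted strip of out-of-the-money options. The sketch below uses assumed flat Black-Scholes volatility and hypothetical parameters, in which case the fair variance strike should recover σ²; it is an illustration of the replication formula, not the dissertation's empirical implementation:

```python
import numpy as np
from scipy.stats import norm

def bs_price(S0, K, r, T, sigma, kind):
    """Black-Scholes price of a European call or put."""
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if kind == "call":
        return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
    return K * np.exp(-r * T) * norm.cdf(-d2) - S0 * norm.cdf(-d1)

def fair_variance(S0, r, T, sigma, strikes):
    """Demeterfi et al. (1999) replication: the fair variance strike as a
    1/K^2-weighted strip of out-of-the-money option prices, with the
    forward as the put/call boundary."""
    F = S0 * np.exp(r * T)                       # forward price
    dK = np.gradient(strikes)                    # strike spacing
    Q = np.array([bs_price(S0, K, r, T, sigma, "put" if K < F else "call")
                  for K in strikes])
    return (2.0 / T) * np.exp(r * T) * np.sum(dK / strikes**2 * Q)

strikes = np.arange(20.0, 401.0, 1.0)            # wide, dense strike grid
kvar = fair_variance(S0=100.0, r=0.02, T=0.5, sigma=0.25, strikes=strikes)
```

Under a flat volatility surface the strip recovers σ² = 0.0625 up to discretisation and truncation error; the dissertation's point is precisely that with a real skew and real time-series dynamics the two sides need not agree.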
Generalized averaged Gaussian quadrature and applications
A simple numerical method for constructing the optimal generalized averaged Gaussian quadrature formulas will be presented. These formulas exist in many cases in which real positive Gauss-Kronrod formulas do not exist, and can be used as an adequate alternative in order to estimate the error of a Gaussian rule. We also investigate the conditions under which the optimal averaged Gaussian quadrature formulas and their truncated variants are internal.
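As a rough illustration of the averaging idea (Laurie's anti-Gauss construction, not the optimal generalized averaged formulas of the paper): the anti-Gauss rule doubles the last recurrence coefficient in the Jacobi matrix, and averaging the n-point Gauss rule with the (n+1)-point anti-Gauss rule yields an error estimate for the Gauss rule. A sketch for the Legendre weight on [-1, 1]:

```python
import numpy as np

def gauss_legendre_jacobi(n, anti=False):
    """Nodes and weights via Golub-Welsch for the Legendre weight on [-1, 1].
    With anti=True, the last recurrence coefficient beta_{n-1} is doubled,
    giving Laurie's anti-Gauss modification of the Jacobi matrix."""
    k = np.arange(1, n)
    beta = k**2 / (4.0 * k**2 - 1.0)       # Legendre recurrence coefficients
    if anti:
        beta[-1] *= 2.0                    # anti-Gauss modification
    J = np.diag(np.sqrt(beta), 1)
    J = J + J.T                            # symmetric tridiagonal Jacobi matrix
    nodes, V = np.linalg.eigh(J)
    weights = 2.0 * V[0, :]**2             # beta_0 = integral of weight = 2
    return nodes, weights

def averaged_gauss_estimate(f, n):
    """Averaged rule (Gauss_n + AntiGauss_{n+1}) / 2 and the implied
    error estimate for the n-point Gauss rule."""
    xg, wg = gauss_legendre_jacobi(n)
    xa, wa = gauss_legendre_jacobi(n + 1, anti=True)
    G = wg @ f(xg)
    A = wa @ f(xa)
    avg = 0.5 * (G + A)
    return G, avg, abs(avg - G)

G, avg, err_est = averaged_gauss_estimate(np.exp, n=4)
exact = np.e - 1.0 / np.e                  # integral of exp over [-1, 1]
```

The averaged rule exists whenever the modified Jacobi matrix has real eigenvalues, which is what makes this family a practical fallback when a real positive Gauss-Kronrod extension is unavailable.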
MS FT-2-2 7 Orthogonal polynomials and quadrature: Theory, computation, and applications
Quadrature rules find many applications in science and engineering. Their analysis is a classical area of applied mathematics and continues to attract considerable attention. This seminar brings together speakers with expertise in a large variety of quadrature rules. The aim of the seminar is to provide an overview of recent developments in the analysis of quadrature rules. The computation of error estimates and novel applications are also described.
Risk Management for the Future
A large part of the academic literature, the business literature, and real-life practice rests on the assumption that uncertainty and risk do not exist. We all know that this is not true; yet a whole variety of methods, tools and practices are not attuned to the fact that the future is uncertain and that risks are all around us. Moreover, despite risk management entering the agenda some decades ago, it has introduced risks of its own, as illustrated by the financial crisis. Here is a book that goes beyond risk management as it is practised today and discusses what still needs to be improved. The book also offers some cases.