Measuring the risk of a nonlinear portfolio with fat tailed risk factors through probability conserving transformation
This paper presents a new heuristic for fast approximation of VaR (Value-at-Risk) and CVaR (conditional Value-at-Risk) for financial portfolios, where the net worth of a portfolio is a non-linear function of possibly non-Gaussian risk factors. The proposed method is based on mapping non-normal marginal distributions into normal distributions via a probability conserving transformation and then using a quadratic, i.e. Delta–Gamma, approximation for the portfolio value. The method is very general and can deal with a wide range of marginal distributions of risk factors, including non-parametric distributions. Its computational load is comparable with that of the Delta–Gamma–Normal method based on Fourier inversion. However, unlike the Delta–Gamma–Normal method, the proposed heuristic preserves the tail behaviour of the individual risk factors, which may be seen as a significant advantage. We demonstrate the utility of the new method with comprehensive numerical experiments on simulated as well as real financial data.
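The two ingredients the abstract names can be sketched in a few lines: a probability-conserving map z = Φ⁻¹(F(x)) that sends a fat-tailed marginal to a standard normal, and a Delta–Gamma (quadratic) approximation of the portfolio P&L. This is a minimal illustration, not the paper's method: the Student-t marginal, the sensitivities `delta` and `gamma`, and the 99% level are all assumed for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical single risk factor with a fat-tailed Student-t marginal.
nu = 4.0
x = stats.t.rvs(df=nu, size=100_000, random_state=rng)

# Probability-conserving map to a standard normal: z = Phi^{-1}(F_t(x)).
z = stats.norm.ppf(stats.t.cdf(x, df=nu))

# Delta-Gamma approximation of the portfolio P&L in the original factor:
# dV ~= delta * x + 0.5 * gamma * x**2  (illustrative sensitivities).
delta, gamma = -1.0, -0.5
pnl = delta * x + 0.5 * gamma * x ** 2

# 99% VaR as the loss at the 1% quantile of simulated P&L.
var_99 = -np.quantile(pnl, 0.01)
print(f"99% VaR (Delta-Gamma, t marginal): {var_99:.2f}")
```

Because the map is monotone it preserves probabilities, so the transformed factor `z` is (up to sampling noise) exactly standard normal while the tail behaviour of `x` is retained in the P&L.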
Estimating long range dependence: finite sample properties and confidence intervals
A major issue in financial economics is the behavior of asset returns over
long horizons. Various estimators of long range dependence have been proposed.
Even though some have known asymptotic properties, it is important to test
their accuracy by using simulated series of different lengths. We test R/S
analysis, Detrended Fluctuation Analysis and periodogram regression methods on
samples drawn from Gaussian white noise. The DFA statistic turns out to be the
unanimous winner. Unfortunately, no asymptotic distribution theory has been
derived for this statistic so far. We were able, however, to construct
empirical (i.e. approximate) confidence intervals for all three methods. The
obtained values differ substantially from the heuristic values proposed by some
authors for the R/S statistic and are very close to the asymptotic values for
the periodogram regression method.
Comment: 14 pages, 11 figures; new section on application
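The DFA estimator the abstract singles out is easy to reproduce on the same benchmark it uses, Gaussian white noise, for which the fluctuation exponent should come out near 0.5. A minimal sketch (box sizes and series length are arbitrary choices for the example):

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended Fluctuation Analysis: slope of log F(s) vs log s."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    fluct = []
    for s in scales:
        n_boxes = len(y) // s
        t = np.arange(s)
        f2 = 0.0
        for i in range(n_boxes):
            seg = y[i * s:(i + 1) * s]
            coef = np.polyfit(t, seg, 1)   # linear detrend in each box
            f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
        fluct.append(np.sqrt(f2 / n_boxes))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(1)
x = rng.standard_normal(10_000)            # Gaussian white noise
alpha = dfa_exponent(x, scales=[16, 32, 64, 128, 256])
print(f"DFA exponent: {alpha:.3f}")        # close to 0.5 for white noise
```

Repeating this over many simulated series of a fixed length is exactly how empirical confidence intervals of the kind described above can be tabulated.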
Shrinkage Confidence Procedures
The possibility of improving on the usual multivariate normal confidence set
was first discussed in Stein (1962). Using the ideas of shrinkage, through
Bayesian and empirical Bayesian arguments, domination results, both analytic
and numerical, have been obtained. Here we trace some of the developments in
confidence set estimation.
Comment: Published in Statistical Science (http://www.imstat.org/sts/) by the
Institute of Mathematical Statistics (http://www.imstat.org),
http://dx.doi.org/10.1214/10-STS319
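The domination phenomenon behind these shrinkage procedures is easy to see numerically with the classic James–Stein point estimator, which for dimension p ≥ 3 has uniformly smaller squared-error risk than the maximum likelihood estimator X. A small simulation sketch (the true mean, dimension, and trial count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
p, trials = 10, 5000
theta = np.full(p, 1.0)                    # hypothetical true mean

mse_mle, mse_js = 0.0, 0.0
for _ in range(trials):
    x = theta + rng.standard_normal(p)     # X ~ N(theta, I_p)
    # Positive-part James-Stein shrinkage toward the origin.
    shrink = max(0.0, 1.0 - (p - 2) / np.dot(x, x))
    js = shrink * x
    mse_mle += np.sum((x - theta) ** 2)
    mse_js += np.sum((js - theta) ** 2)

print(f"avg risk, MLE:         {mse_mle / trials:.3f}")
print(f"avg risk, James-Stein: {mse_js / trials:.3f}")
```

The confidence procedures surveyed in the abstract go a step further, recentering the usual sphere at a shrinkage estimator to gain coverage or volume; the point-estimation simulation above only illustrates why shrinkage helps at all.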
Experimental analysis of computer system dependability
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
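Of the statistical techniques the survey introduces, importance sampling is the one most easily shown in a few lines: to estimate a rare failure probability, sample from a distribution shifted toward the failure region and reweight by the likelihood ratio. A toy sketch, assuming a standard-normal load and an arbitrary failure threshold of 5 (not an example from the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 100_000
threshold = 5.0                            # hypothetical rare-failure level

# Naive Monte Carlo almost never sees the event: P(Z > 5) ~ 2.9e-7.
z = rng.standard_normal(n)
p_naive = np.mean(z > threshold)

# Importance sampling: draw from N(threshold, 1) and reweight each draw
# by the likelihood ratio phi(z) / phi(z - threshold).
zq = rng.normal(threshold, 1.0, n)
w = np.exp(-threshold * zq + threshold ** 2 / 2)
p_is = np.mean((zq > threshold) * w)

print(f"exact : {stats.norm.sf(threshold):.3e}")
print(f"naive : {p_naive:.3e}")
print(f"IS    : {p_is:.3e}")
```

With the same budget of draws, the naive estimate is typically exactly zero while the importance-sampling estimate is within a few percent of the true value, which is why the technique is used to accelerate dependability simulations of highly reliable systems.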