Financial Risk Measurement for Financial Risk Management
Current practice largely follows restrictive approaches to market risk measurement, such as historical simulation or RiskMetrics. In contrast, we propose flexible methods that exploit recent developments in financial econometrics and are likely to produce more accurate risk assessments, treating both portfolio-level and asset-level analysis. Asset-level analysis is particularly challenging because the demands of real-world risk management in financial institutions, in particular real-time risk tracking in very high-dimensional situations, impose strict limits on model complexity. Hence we stress powerful yet parsimonious models that are easily estimated. In addition, we emphasize the need for a deeper understanding of the links between market risk and macroeconomic fundamentals, focusing primarily on links among equity return volatilities, real growth, and real growth volatilities. Throughout, we strive not only to deepen our scientific understanding of market risk, but also to cross-fertilize the academic and practitioner communities, promoting improved market risk measurement technologies that draw on the best of both.
Keywords: market risk, volatility, GARCH
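The two "restrictive approaches" named above are simple enough to sketch in a few lines. The following is a minimal illustration (not the authors' proposed methodology): a historical-simulation VaR as an empirical quantile, and a RiskMetrics-style EWMA volatility recursion with the standard decay factor 0.94; the simulated return series is an assumption for demonstration only.

```python
import numpy as np

def riskmetrics_ewma_vol(returns, lam=0.94):
    """RiskMetrics-style EWMA volatility: sigma2_t = lam*sigma2_{t-1} + (1-lam)*r_t^2."""
    sigma2 = returns.var()            # initialise with the sample variance
    for r in returns:
        sigma2 = lam * sigma2 + (1 - lam) * r ** 2
    return np.sqrt(sigma2)

def historical_var(returns, alpha=0.01):
    """Historical-simulation VaR: sign-flipped empirical alpha-quantile of past returns."""
    return -np.quantile(returns, alpha)

rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, 1000)       # simulated daily returns with 1% volatility
print(historical_var(r))              # near 2.33 * 0.01 for Gaussian data
print(riskmetrics_ewma_vol(r))        # near 0.01
```

Both estimators condition only weakly (or not at all) on current information, which is precisely the restrictiveness the abstract criticizes.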
Overnight Risk Model: A Unique Capability
We present a novel risk measurement model capable of capturing overnight risk, i.e., the risk encountered between the closing time of the previous day and the opening time of the next day. The model captures both overnight and intraday risk. Statistical models of intraday asset returns must separate the market opening period from the remainder of the day, as these periods follow statistical laws with different properties. Here we present results from our two models for these two distinct periods.
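The decomposition underlying such a model can be sketched as follows: the close-to-close return splits exactly into an overnight (close-to-open) leg and an intraday (open-to-close) leg, and each leg would be modelled separately. The price arrays are hypothetical illustrative data, not the authors' dataset.

```python
import numpy as np

# Hypothetical daily price data: previous close, today's open, today's close.
prev_close = np.array([100.0, 101.0, 99.5])
open_ = np.array([100.5, 100.2, 100.0])
close = np.array([101.0, 99.5, 100.8])

# Overnight (close-to-open) and intraday (open-to-close) log returns.
overnight = np.log(open_ / prev_close)
intraday = np.log(close / open_)

# The two components add up exactly to the close-to-close log return,
# so separate models for each leg still reconcile with daily risk.
total = np.log(close / prev_close)
assert np.allclose(overnight + intraday, total)
```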
Quantile-based methods for prediction, risk measurement and inference
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The focus of this thesis is on the employment of theoretical and practical quantile methods in addressing prediction, risk measurement and inference problems. From a prediction perspective, the problem of creating model-free prediction intervals for a future unobserved value of a random variable drawn from a sample distribution is considered. With the objective of reducing prediction coverage error, two common distribution transformation methods, based on the normal and exponential distributions, are presented; they are theoretically demonstrated to attain exact and error-free prediction intervals, respectively.
The second problem studied is that of estimating expected shortfall via kernel smoothing. The goal here is to introduce methods that reduce the estimation bias of expected shortfall. To this end, several one-step bias-corrected expected shortfall estimators are presented, investigated via simulation studies, and compared with existing one-step estimators.
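The basic idea of kernel-smoothed expected shortfall estimation can be sketched as follows: replace the hard tail indicator 1{r <= q} in the plain empirical estimator with a smooth Gaussian-CDF weight. This is a generic illustration under assumed bandwidth and simulated data; it is not the thesis's specific one-step bias correction.

```python
import numpy as np
from math import erf, sqrt

def gaussian_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def empirical_es(r, alpha=0.05):
    """Plain empirical ES: minus the mean of returns at or below the alpha-quantile."""
    q = np.quantile(r, alpha)
    return -r[r <= q].mean()

def kernel_es(r, alpha=0.05, h=0.002):
    """Kernel-smoothed ES: soft tail membership via a Gaussian CDF with bandwidth h."""
    q = np.quantile(r, alpha)
    w = np.array([gaussian_cdf((q - x) / h) for x in r])  # near 1 deep in the tail, near 0 outside
    return -(r * w).sum() / w.sum()

rng = np.random.default_rng(1)
r = rng.normal(0.0, 0.01, 5000)
print(empirical_es(r), kernel_es(r))  # theoretical value is about 0.0206 for N(0, 0.01) at alpha=0.05
```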
The third problem considered is that of constructing simultaneous confidence bands for quantile regression functions when the predictor variables are constrained within a region. In this context, a method is introduced that makes use of asymmetric Laplace errors in conjunction with a simulation-based algorithm to create confidence bands for quantile and inter-quantile regression functions. Furthermore, the simulation approach is extended to an ordinary least squares framework to build simultaneous bands for quantile functions of the classical regression model, both when the model errors are normally distributed and when this assumption is not fulfilled.
Finally, attention is directed towards the construction of prediction intervals for realised volatility, exploiting an alternative volatility estimator based on the difference of two extreme quantiles. The proposed approach makes use of an AR-GARCH procedure to model time series of intraday quantiles and to forecast the predictive distribution of intraday returns. Moreover, two simple adaptations of an existing model are also presented.
Why and how to integrate liquidity risk into a VaR-framework
We integrate liquidity risk, measured by the weighted spread, into a Value-at-Risk (VaR) framework. The weighted spread measure extracts liquidity costs by order size from the limit order book. We show that it is precise from a risk perspective in a wide range of clearly defined situations. Using a unique, representative data set provided by Deutsche Boerse AG, we find that liquidity risk increases traditionally measured price risk by over 25%, even at standard 10-day horizons and for liquid DAX stocks. We also show that the common approach of simply adding liquidity risk to price risk substantially overestimates total risk, because the correlation between liquidity and price is neglected. Our results are robust with respect to changes in risk measure, sample period and effects of portfolio diversification.
Keywords: asset liquidity, price impact, weighted spread, Xetra Liquidity Measure (XLM), Value-at-Risk, market liquidity risk
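The point about overestimation by simple addition can be sketched numerically: computing VaR jointly on returns net of liquidation costs yields a smaller number than adding the price VaR to an extreme quantile of the liquidity cost, because the worst price move and the worst liquidity state rarely coincide. The distributions below are assumptions for illustration, not the paper's weighted-spread data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
price_ret = rng.normal(0.0, 0.02, n)            # mid-price returns (hypothetical)
liq_cost = np.abs(rng.normal(0.002, 0.001, n))  # relative liquidation cost (hypothetical)

alpha = 0.01
net_ret = price_ret - liq_cost                  # return net of the cost of liquidating

# Joint (correlation-aware) VaR vs. the naive "price VaR + liquidity quantile" add-up.
total_var = -np.quantile(net_ret, alpha)
naive_var = -np.quantile(price_ret, alpha) + np.quantile(liq_cost, 1 - alpha)

print(total_var, naive_var)   # the naive sum is the larger of the two
```

Even with independent components, the naive sum stacks two separate worst cases, which is the overestimation the abstract describes.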
Information entropy and measures of market risk
In this paper we investigate the relationship between the information entropy of the distribution of intraday returns and intraday and daily measures of market risk. Using data on the EUR/JPY exchange rate, we find a negative relationship between entropy and intraday Value-at-Risk, and also between entropy and intraday Expected Shortfall. This relationship is then used to forecast daily Value-at-Risk, using the entropy of the distribution of intraday returns as a predictor.
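The two quantities being related above can each be computed directly from a sample of intraday returns. The sketch below assumes a fixed histogram grid for the entropy and standard empirical VaR/ES; the sign of the entropy-risk relationship depends on such conventions, and the paper's negative relationship is an empirical finding on EUR/JPY data, not reproduced here.

```python
import numpy as np

def shannon_entropy(returns, bin_edges):
    """Shannon entropy (in nats) of the histogram of returns over a fixed bin grid."""
    counts, _ = np.histogram(returns, bins=bin_edges)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

def var_es(returns, alpha=0.01):
    """Empirical Value-at-Risk and Expected Shortfall, reported as positive losses."""
    q = np.quantile(returns, alpha)
    return -q, -returns[returns <= q].mean()

rng = np.random.default_rng(3)
calm = rng.normal(0.0, 0.0005, 2000)   # low-volatility intraday returns (simulated)
wild = rng.normal(0.0, 0.0020, 2000)   # high-volatility intraday returns (simulated)

edges = np.linspace(-0.01, 0.01, 51)   # common grid so the entropies are comparable
print(shannon_entropy(calm, edges), var_es(calm))
print(shannon_entropy(wild, edges), var_es(wild))
```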
Liquidity effects of the events of September 11, 2001
Banks rely heavily on incoming payments from other banks to fund their own payments. The terrorist attacks of September 11, 2001, destroyed facilities in Lower Manhattan, leaving some banks unable to send payments through the Federal Reserve's Fedwire payments system. As a result, many banks received fewer payments than expected, causing unexpected shortfalls in banks' liquidity. These disruptions also made it harder for banks to redistribute balances across the banking system in a timely manner. In this article, the authors measure the payments responses of banks to the receipt of payments from other banks, both under normal circumstances and during the days following the attacks. Their analysis suggests that the significant injections of liquidity by the Federal Reserve, first through the discount window and later through open market operations, were important in allowing banks to reestablish their normal patterns of payments coordination.
Keywords: Fedwire; electronic funds transfers; war, economic aspects; bank liquidity; payment systems
A No-Arbitrage Approach to Range-Based Estimation of Return Covariances and Correlations
We extend the important idea of range-based volatility estimation to the multivariate case. In particular, we propose a range-based covariance estimator that is motivated by financial economic considerations (the absence of arbitrage), in addition to statistical considerations. We show that, unlike other univariate and multivariate volatility estimators, the range-based estimator is highly efficient yet robust to market microstructure noise arising from bid-ask bounce and asynchronous trading. Finally, we provide an empirical example illustrating the value of the high-frequency sample path information contained in the range-based estimates in a multivariate GARCH framework.
Keywords: range-based estimation, volatility, covariance, correlation, absence of arbitrage, exchange rates, stock returns, bond returns, bid-ask bounce, asynchronous trading
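The univariate building block being extended here is the classical range-based variance estimator of Parkinson (1980), which uses only the daily high and low. The sketch below illustrates that building block on simulated paths; it is not the paper's no-arbitrage covariance construction.

```python
import numpy as np

def parkinson_vol(high, low):
    """Parkinson (1980) range-based volatility: sigma2_hat = mean(ln(H/L)^2) / (4 ln 2)."""
    x = np.log(np.asarray(high) / np.asarray(low)) ** 2
    return np.sqrt(x.mean() / (4 * np.log(2)))

# Simulate intraday log-price paths to obtain daily highs and lows,
# then check the estimator against the known daily volatility.
rng = np.random.default_rng(4)
true_sigma = 0.01
n_days, n_ticks = 500, 390
paths = np.cumsum(rng.normal(0, true_sigma / np.sqrt(n_ticks), (n_days, n_ticks)), axis=1)
high, low = np.exp(paths.max(axis=1)), np.exp(paths.min(axis=1))
print(parkinson_vol(high, low))  # close to true_sigma, slightly below due to discrete sampling
```

The high-low range summarizes the whole intraday path, which is why range-based estimators are far more efficient than close-to-close ones.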