5,439 research outputs found

    Value at risk for a mixture of normal distributions: the use of quasi- Bayesian estimation techniques

    This article proposes a methodology for measuring value at risk for fat-tailed asset return distributions. Simulation-based results indicate that this approach provides better estimates of risk than one based on the assumption that asset returns are normally distributed.
    Keywords: Econometric models; Risk
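    As a rough illustration of the simulation-based idea (not the article's quasi-Bayesian estimator), the sketch below draws returns from a two-component normal mixture with made-up parameters and reads off VaR as an empirical quantile, next to a normal-assumption benchmark.

```python
# Minimal illustration (not the article's estimator): VaR of a two-component
# normal mixture obtained by simulation. All parameter values are invented.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative mixture: a "calm" regime and a rarer, more volatile regime.
weights = np.array([0.9, 0.1])          # mixing probabilities
means   = np.array([0.0005, -0.002])    # daily mean return per regime
stdevs  = np.array([0.01, 0.04])        # daily volatility per regime

n = 1_000_000
regime = rng.choice(len(weights), size=n, p=weights)
returns = rng.normal(means[regime], stdevs[regime])

alpha = 0.01  # 99% VaR
var_mixture = -np.quantile(returns, alpha)

# Normal benchmark fitted to the same simulated sample.
var_normal = -(returns.mean() + returns.std() * norm.ppf(alpha))

print(f"99% VaR, mixture simulation: {var_mixture:.4f}")
print(f"99% VaR, normal assumption:  {var_normal:.4f}")  # typically smaller
```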

    The GPU vs Phi Debate: Risk Analytics Using Many-Core Computing

    The risk of reinsurance portfolios covering globally occurring natural catastrophes, such as earthquakes and hurricanes, is quantified by employing simulations. These simulations are computationally intensive and require large amounts of data to be processed. Many-core hardware accelerators, such as the Intel Xeon Phi and the NVIDIA Graphics Processing Unit (GPU), are desirable for achieving high-performance risk analytics. In this paper, we investigate how accelerators can be employed in risk analytics, focusing on developing parallel algorithms for Aggregate Risk Analysis, a simulation which computes the Probable Maximum Loss of a portfolio taking both primary and secondary uncertainties into account. The key result is that both hardware accelerators are useful in different contexts: without taking data transfer times into account, the Phi had the lowest execution times when used independently, while the GPU together with a host in a hybrid platform yielded the best performance.
    Comment: A modified version of this article is accepted to the Computers and Electrical Engineering Journal under the title "The Hardware Accelerator Debate: A Financial Risk Case Study Using Many-Core Computing"; Blesson Varghese, "The Hardware Accelerator Debate: A Financial Risk Case Study Using Many-Core Computing," Computers and Electrical Engineering, 201
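    For readers unfamiliar with Aggregate Risk Analysis, the following sequential sketch shows the kind of computation being accelerated: event counts (primary uncertainty) and event severities (secondary uncertainty) are simulated per trial and the Probable Maximum Loss is read off the resulting annual-loss distribution. The frequency and severity parameters are illustrative, not the paper's data or implementation.

```python
# Sequential sketch of an aggregate loss simulation of the kind the paper
# accelerates on GPUs and the Phi. All distributional choices are illustrative.
import numpy as np

rng = np.random.default_rng(1)

n_trials = 100_000               # simulated years
event_rate = 3.0                 # expected events per year (primary uncertainty)
loss_mu, loss_sigma = 15.0, 1.0  # lognormal loss-given-event (secondary uncertainty)

year_losses = np.empty(n_trials)
for t in range(n_trials):
    n_events = rng.poisson(event_rate)                     # how many events occur
    losses = rng.lognormal(loss_mu, loss_sigma, n_events)  # how large each event is
    year_losses[t] = losses.sum()                          # aggregate annual loss

# Probable Maximum Loss at a 1-in-250-year return period.
pml_250 = np.quantile(year_losses, 1 - 1 / 250)
print(f"PML (250-year return period): {pml_250:,.0f}")
```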

    Risk measurement: an introduction to value at risk

    This paper is a self-contained introduction to the concept and methodology of "value at risk," which is a new tool for measuring an entity's exposure to market risk. We explain the concept of value at risk, and then describe in detail the three methods for computing it: historical simulation; the variance-covariance method; and Monte Carlo or stochastic simulation. We then discuss the advantages and disadvantages of the three methods for computing value at risk. Finally, we briefly describe some alternative measures of market risk.
    Keywords: Risk and Uncertainty
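    The three methods the paper describes can be sketched in a few lines; the snippet below applies each to a simulated return history (a stand-in for real data) for a single position.

```python
# Compact sketch of the three VaR approaches the paper describes, applied to a
# single position. The return history is simulated Student-t data for illustration.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
returns = 0.01 * rng.standard_t(df=4, size=2000)   # stand-in for observed daily returns
position = 1_000_000                               # portfolio value
alpha = 0.05                                       # 95% confidence level

# 1. Historical simulation: empirical quantile of past portfolio returns.
var_hist = -np.quantile(returns, alpha) * position

# 2. Variance-covariance: assume normal returns, use only mean and volatility.
var_param = -(returns.mean() + returns.std() * norm.ppf(alpha)) * position

# 3. Monte Carlo: draw from a fitted model (here a simple normal fit) and take
#    the quantile of the simulated outcomes.
sims = rng.normal(returns.mean(), returns.std(), 100_000)
var_mc = -np.quantile(sims, alpha) * position

print(f"Historical VaR:          {var_hist:,.0f}")
print(f"Variance-covariance VaR: {var_param:,.0f}")
print(f"Monte Carlo VaR:         {var_mc:,.0f}")
```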

    Parallel Simulations for Analysing Portfolios of Catastrophic Event Risk

    At the heart of the analytical pipeline of a modern quantitative insurance/reinsurance company is a stochastic simulation technique for portfolio risk analysis and pricing referred to as Aggregate Analysis. Aggregate Analysis supports the computation of risk measures, including Probable Maximum Loss (PML) and Tail Value at Risk (TVaR), for a variety of complex property catastrophe insurance contracts, including Cat eXcess of Loss (XL), Per-Occurrence XL, Aggregate XL, and contracts that combine these. In this paper, we explore parallel methods for aggregate risk analysis. A parallel aggregate risk analysis algorithm and an engine based on the algorithm are proposed. The engine is implemented in C and OpenMP for multi-core CPUs and in C and CUDA for many-core GPUs. Performance analysis indicates that GPUs offer a cost-effective HPC solution for aggregate risk analysis. The optimised algorithm on the GPU performs a one-million-trial aggregate simulation with 1,000 catastrophic events per trial on a typical exposure set and contract structure in just over 20 seconds, approximately 15x faster than the sequential counterpart. This can sufficiently support the real-time pricing scenario in which an underwriter analyses different contractual terms and pricing while discussing a deal with a client over the phone.
    Comment: Proceedings of the Workshop at the International Conference for High Performance Computing, Networking, Storage and Analysis (SC), 2012, 8 page
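    The per-trial contract logic can be illustrated with a sequential NumPy sketch (the paper's engine itself is written in C/OpenMP and CUDA); the layer terms and event-loss model below are invented for demonstration.

```python
# Illustrative, sequential version of the per-trial contract logic in an
# aggregate analysis: apply a per-occurrence XL layer and an aggregate XL layer
# to simulated event losses, then report PML and TVaR. Not the paper's engine.
import numpy as np

rng = np.random.default_rng(3)
n_trials, events_per_trial = 10_000, 1000
event_losses = rng.lognormal(12.0, 1.5, size=(n_trials, events_per_trial))

# Per-occurrence XL: each event's recovery sits between attachment and limit.
occ_attach, occ_limit = 1e6, 5e6
occ_recovery = np.clip(event_losses - occ_attach, 0.0, occ_limit)

# Aggregate XL applied to the annual sum of per-occurrence recoveries.
agg_attach, agg_limit = 20e6, 100e6
annual = occ_recovery.sum(axis=1)
ceded = np.clip(annual - agg_attach, 0.0, agg_limit)

# Risk measures over the trial distribution of ceded losses.
q = 0.99
pml = np.quantile(ceded, q)          # Probable Maximum Loss at 1-in-100
tvar = ceded[ceded >= pml].mean()    # Tail Value at Risk beyond that point
print(f"PML(99%): {pml:,.0f}   TVaR(99%): {tvar:,.0f}")
```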

    Asset Management in Volatile Markets

    The 27th SUERF Colloquium, held in Munich in June 2008 under the title "New Trends in Asset Management: Exploring the Implications", was already topical in the summer of 2008. The subsequent dramatic events in the autumn of 2008 made the presentations in Munich even more relevant to investors and bankers who want to understand what happens in their investment universe. In the present SUERF Study, we have collected a sample of outstanding colloquium contributions under the fitting headline: Asset Management in Volatile Markets.
    Keywords: derivatives, financial innovation, asset management, finance-growth nexus; Relative Value Strategy, Pair Trading, Slippage, Implementation Shortfall, Asset Management, Fin4Cast

    Why VAR Fails: Long Memory and Extreme Events in Financial Markets

    The Value-at-Risk (VaR) measure is based on only the second moment of a rate-of-return distribution. It is an insufficient risk performance measure, since it ignores both the higher moments of the pricing distributions, like skewness and kurtosis, and all the fractional moments resulting from the long-term dependencies (long memory) of dynamic market pricing. Not coincidentally, the VaR methodology also devotes insufficient attention to the truly extreme financial events, i.e., those events that are catastrophic and that cluster because of this long memory. Since the usual stationarity and i.i.d. assumptions of classical asset-returns theory are not satisfied in reality, more attention should be paid to measuring the degree of dependence to determine the true risks to which any investment portfolio is exposed: the return distributions are time-varying, and skewness and kurtosis occur and change over time. Conventional mean-variance diversification does not apply when the tails of the return distributions are too fat, i.e., when many more than normal extreme events occur. Regrettably, also, Extreme Value Theory is empirically not valid, because it is based on the uncorroborated i.i.d. assumption.
    Keywords: Long memory, Value at Risk, Extreme Value Theory, Portfolio Management, Degrees of Persistence
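    The abstract's central claim, that a variance-only VaR understates fat tails, can be illustrated numerically; the snippet below uses Student-t data as a stand-in for real returns and compares the normal-assumption VaR with the empirical tail quantile.

```python
# Numerical illustration of the point made above: for a fat-tailed return
# series, a variance-based VaR understates the empirical tail. The Student-t
# data is a stand-in for real returns, not the authors' data.
import numpy as np
from scipy.stats import norm, skew, kurtosis

rng = np.random.default_rng(4)
returns = 0.01 * rng.standard_t(df=3, size=5000)   # heavy-tailed stand-in

print("skewness:", round(skew(returns), 2))
print("excess kurtosis:", round(kurtosis(returns), 2))   # well above 0 for fat tails

alpha = 0.01
var_normal = -(returns.mean() + returns.std() * norm.ppf(alpha))
var_empirical = -np.quantile(returns, alpha)
print(f"99% VaR, normal assumption: {var_normal:.4f}")
print(f"99% VaR, empirical tail:    {var_empirical:.4f}")  # usually larger
```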

    The optimal currency composition of external debt

    The increased volatility of exchange rates, interest rates and goods prices has focused fresh attention on the importance for developing countries of reducing their risks in these markets. Although these countries generally cannot use such conventional hedging instruments as currency and commodity futures, they can use the currency composition of their external debt to hedge against exchange rate and commodity price movements. Along these lines, this paper uses findings from the literature on optimal portfolio theory to discuss the optimal currency composition of external debt. The analysis considers a small open economy facing a perfect world capital market and a large number of perfect commodity markets. The paper derives the optimal currency composition of the country's aggregate assets and external liabilities and describes the necessary estimations and computations, including how to take into account the currency composition of existing external liabilities.
    Keywords: Economic Theory & Research, Environmental Economics & Policies, Fiscal & Monetary Policy, Financial Intermediation
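    A stylised version of the hedging idea, under illustrative assumptions rather than the paper's model, is to choose the currency shares of external debt that minimise the variance of net wealth (export revenue minus debt service); the sketch below does this numerically for three currencies with made-up covariances.

```python
# Hedging-portfolio sketch in the spirit of the paper: choose currency shares
# of external debt to minimise the variance of net wealth. Currencies,
# covariances, and the objective are illustrative assumptions only.
import numpy as np
from scipy.optimize import minimize

currencies = ["USD", "EUR", "JPY"]
# Covariance of currency returns against the home currency (made-up numbers).
Sigma = np.array([[0.010, 0.004, 0.002],
                  [0.004, 0.012, 0.003],
                  [0.002, 0.003, 0.015]])
# Covariance of each currency's return with the country's export revenue.
cov_rev = np.array([0.006, 0.002, 0.001])
debt = 1.0  # external debt normalised to total revenue

def net_variance(w):
    # Var(revenue - debt service), dropping the constant revenue-variance term:
    # debt denominated in currencies that co-move with revenue acts as a hedge.
    return debt**2 * w @ Sigma @ w - 2 * debt * w @ cov_rev

res = minimize(net_variance, x0=np.full(3, 1/3),
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}],
               bounds=[(0, 1)] * 3)
for c, w in zip(currencies, res.x):
    print(f"{c}: {w:.2%} of external debt")
```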

    Asset pricing and investor risk in subordinated asset securitisation

    As a sign of ambivalence in the regulatory definition of capital adequacy for credit risk and the quest for more efficient refinancing sources, collateralised loan obligations (CLOs) have become a prominent securitisation mechanism. This paper presents a loss-based asset pricing model for the valuation of constituent tranches within a CLO-style security design. The model specifically examines how tranche subordination translates securitised credit risk into investment risk of issued tranches as beneficial interests on a designated loan pool typically underlying a CLO transaction. We obtain a tranche-specific term structure from an intensity-based simulation of defaults under both robust statistical analysis and extreme value theory (EVT). Loss sharing between issuers and investors according to a simplified subordination mechanism allows issuers to decompose securitised credit risk exposures into a collection of default-sensitive debt securities with divergent risk profiles and expected investor returns. Our estimation results suggest a dichotomous effect of loss cascading, with the default term structure of the most junior tranche of CLO transactions (“first loss position”) being distinctly different from that of the remaining, more senior “investor tranches”. The first loss position carries a large expected loss (with a high investor return) and low leverage, whereas all other tranches mainly suffer from loss volatility (unexpected loss). These findings might explain why issuers retain the most junior tranche as credit enhancement to attenuate asymmetric information between issuers and investors. At the same time, issuer discretion in the configuration of loss subordination within a particular security design might give rise to implicit investment risk in senior tranches in the event of systemic shocks.
    JEL Classifications: C15, C22, D82, F34, G13, G18, G2
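    The loss-cascading mechanism can be illustrated with a stylised waterfall: simulate correlated pool losses with a one-factor model (a simplification of the paper's intensity-based default simulation), allocate them bottom-up across attachment points, and compare expected loss with loss volatility per tranche. All parameters and tranche boundaries below are illustrative assumptions.

```python
# Stylised subordination/waterfall sketch, not the paper's model: correlated
# defaults via a one-factor Gaussian model, losses allocated bottom-up to
# tranches, then expected loss vs. loss volatility per tranche.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_loans, p_def, lgd, rho = 200, 0.02, 0.5, 0.2
n_sims = 50_000

z = rng.standard_normal(n_sims)                        # systematic factor
eps = rng.standard_normal((n_sims, n_loans))           # idiosyncratic shocks
latent = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
defaults = latent < norm.ppf(p_def)
pool_loss = defaults.sum(axis=1) * lgd / n_loans       # fraction of pool notional lost

# Tranche boundaries as (attachment, detachment) fractions of the pool.
tranches = {"first loss": (0.00, 0.03),
            "mezzanine":  (0.03, 0.07),
            "senior":     (0.07, 1.00)}

for name, (att, det) in tranches.items():
    # Loss absorbed by the tranche, per unit of tranche notional.
    tranche_loss = np.clip(pool_loss - att, 0.0, det - att) / (det - att)
    el, vol = tranche_loss.mean(), tranche_loss.std()
    print(f"{name:10s}  expected loss {el:6.2%}   loss volatility {vol:6.2%}")
```

    In this toy setup the first loss tranche shows a high expected loss, while the more senior tranches show small expected losses dominated by loss volatility, mirroring the dichotomy the abstract describes.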