Computation of vector sublattices and minimal lattice-subspaces of R^k. Applications in finance
In this article we perform a computational study of the Polyrakis algorithms
presented in [12,13]. These algorithms determine the vector sublattice and the
minimal lattice-subspace generated by a finite set of positive vectors of R^k.
The study demonstrates that our findings can be very useful in economics,
especially in the completion of security markets by options and in portfolio
insurance.
Comment: 22 pages
Parallel Simulations for Analysing Portfolios of Catastrophic Event Risk
At the heart of the analytical pipeline of a modern quantitative
insurance/reinsurance company is a stochastic simulation technique for
portfolio risk analysis and pricing process referred to as Aggregate Analysis.
Aggregate Analysis supports the computation of risk measures, including the
Probable Maximum Loss (PML) and the Tail Value at Risk (TVaR), for a variety of
complex property catastrophe insurance contracts, including Cat eXcess of Loss
(XL), Per-Occurrence XL, and Aggregate XL, as well as contracts that combine
these structures.
In this paper, we explore parallel methods for aggregate risk analysis. A
parallel aggregate risk analysis algorithm and an engine based on the algorithm
are proposed. The engine is implemented in C and OpenMP for multi-core CPUs and
in C and CUDA for many-core GPUs. Performance analysis of the algorithm
indicates that GPUs offer a cost-effective alternative HPC solution for
aggregate risk analysis. The optimised algorithm on the GPU performs a
one-million-trial aggregate simulation with 1,000 catastrophic events per trial
on a typical exposure set and contract structure in just over 20 seconds, which
is approximately 15 times faster than its sequential counterpart. This is
sufficient to support the real-time pricing scenario in which an underwriter
analyses different contractual terms and pricing while discussing a deal with a
client over the phone.
Comment: Proceedings of the Workshop at the International Conference for High
Performance Computing, Networking, Storage and Analysis (SC), 2012, 8 pages
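As a rough sketch of the trial-based structure described above (not the authors' C/OpenMP/CUDA engine), the following code runs a small aggregate simulation: each trial draws a year of event losses, applies a hypothetical per-occurrence XL layer (the attachment and limit values, and the lognormal loss model, are illustrative), and the resulting year-loss table yields the PML as a quantile and the TVaR as the mean loss beyond it.

```python
import numpy as np

def aggregate_analysis(num_trials=10_000, events_per_trial=100,
                       attachment=5e6, limit=20e6, seed=42):
    """Toy aggregate analysis with a per-occurrence XL layer.

    The paper runs 1 million trials with 1,000 events per trial; the defaults
    here are scaled down so the sketch runs quickly in pure Python.
    """
    rng = np.random.default_rng(seed)
    year_losses = np.empty(num_trials)
    for t in range(num_trials):
        # Ground-up event losses for one simulated year (lognormal is a placeholder).
        gross = rng.lognormal(mean=13.0, sigma=1.5, size=events_per_trial)
        # Per-occurrence XL: each event recovers min(max(loss - attachment, 0), limit).
        recovered = np.clip(gross - attachment, 0.0, limit)
        year_losses[t] = recovered.sum()
    return year_losses

losses = aggregate_analysis()
pml_99 = np.quantile(losses, 0.99)          # Probable Maximum Loss at 1-in-100
tvar_99 = losses[losses >= pml_99].mean()   # Tail Value at Risk beyond the PML
print(f"PML(99%) = {pml_99:,.0f}  TVaR(99%) = {tvar_99:,.0f}")
```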
Copulas in finance and insurance
Copulas provide a potentially useful modeling tool for representing the
dependence structure among variables and for generating joint distributions by
combining given marginal distributions. Simulations play an important role in
finance and insurance: they are used to replicate efficient frontiers or
extremal values, to price options, to estimate joint risks, and so on. Using
copulas, it is easy to construct and simulate from multivariate distributions
based on almost any choice of marginals and any type of dependence structure.
In this paper we outline recent contributions of statistical modeling using
copulas in finance and insurance. We review issues related to the notion of
copulas, copula families, copula-based dynamic and static dependence
structures, copulas and latent factor models, and the simulation of copulas.
Finally, we outline hot topics in copulas, with a special focus on model
selection and goodness-of-fit testing.
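To make the simulation point concrete, here is a minimal sketch (our own illustration, not taken from the paper) of sampling from a Gaussian copula and attaching arbitrary marginals, here an exponential and a Student-t, via the inverse-CDF transform.

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(n, rho, seed=0):
    """Draw n samples of (U1, U2) from a bivariate Gaussian copula with correlation rho."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return stats.norm.cdf(z)              # probability-integral transform to uniforms

# Couple an exponential marginal (e.g. claim sizes) with a heavy-tailed Student-t
# marginal (e.g. asset returns) through the same dependence structure.
u = gaussian_copula_sample(10_000, rho=0.7)
x = stats.expon.ppf(u[:, 0], scale=2.0)   # inverse CDF of the first marginal
y = stats.t.ppf(u[:, 1], df=4)            # inverse CDF of the second marginal
print(np.corrcoef(x, y)[0, 1])            # induced (Pearson) correlation
```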
Variance Allocation and Shapley Value
Motivated by the problem of utility allocation in a portfolio under a
Markowitz mean-variance choice paradigm, we propose an allocation criterion for
the variance of the sum of possibly dependent random variables. This
criterion, the Shapley value, requires translating the problem into a
cooperative game. The Shapley value has nice properties but is, in general,
computationally demanding. The main result of this paper shows that in our
particular case the Shapley value has a very simple form that can be easily
computed. The same criterion is also used to allocate the standard deviation of
the sum of random variables, and a conjecture about the relation between the
values in the two games is formulated.
Comment: 20 pages
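For the variance game v(S) = Var(Σ_{i∈S} X_i), the Shapley value is known to reduce to φ_i = Cov(X_i, Σ_j X_j), i.e. a row sum of the covariance matrix; we believe this is the simple form the abstract alludes to, though the sketch below is our own illustration rather than the paper's derivation. It verifies the closed form against a brute-force computation over all orderings.

```python
import itertools
import math
import numpy as np

def shapley_variance(cov):
    """Brute-force Shapley value of the game v(S) = Var(sum_{i in S} X_i),
    where v(S) = sum over i, j in S of Cov(X_i, X_j)."""
    n = cov.shape[0]
    def v(S):
        idx = list(S)
        return cov[np.ix_(idx, idx)].sum()
    phi = np.zeros(n)
    for perm in itertools.permutations(range(n)):
        S = set()
        for i in perm:
            phi[i] += v(S | {i}) - v(S)   # marginal contribution of player i
            S.add(i)
    return phi / math.factorial(n)

# A covariance matrix of three dependent risks (illustrative numbers).
cov = np.array([[ 4.0, 1.0, -0.5],
                [ 1.0, 2.0,  0.3],
                [-0.5, 0.3,  1.0]])
brute = shapley_variance(cov)
closed_form = cov.sum(axis=1)             # Cov(X_i, sum_j X_j) = row sums
print(np.allclose(brute, closed_form))    # True: the allocation is a row sum
```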
A machine learning approach to portfolio pricing and risk management for high-dimensional problems
We present a general framework for portfolio risk management in discrete
time, based on a replicating martingale. This martingale is learned from a
finite sample in a supervised setting. The model learns the features necessary
for an effective low-dimensional representation, overcoming the curse of
dimensionality common to function approximation in high-dimensional spaces. We
show results based on polynomial and neural network bases. Both offer results
superior to naive Monte Carlo and to existing methods such as least-squares
Monte Carlo and replicating portfolios.
Comment: 30 pages (main), 10 pages (appendix), 3 figures, 22 tables
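At its core, the learning step is a regression of future portfolio cashflows onto basis functions of the current risk factors. The following sketch (our own simplification under a toy one-factor model, not the paper's replicating-martingale construction) fits a polynomial value surrogate in the spirit of least-squares Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-factor model: payoff of a call on S_T = S_0 * exp(sigma*Z - sigma^2/2).
n, s0, sigma, strike = 50_000, 100.0, 0.2, 100.0
z = rng.standard_normal(n)
s_t = s0 * np.exp(sigma * z - 0.5 * sigma**2)
payoff = np.maximum(s_t - strike, 0.0)

# Intermediate risk factor observed at an earlier date (here: a noisy signal of Z).
x = 0.5 * z + np.sqrt(1 - 0.25) * rng.standard_normal(n)

# Supervised step: regress the payoff onto a polynomial basis of the risk factor.
degree = 4
basis = np.vander(x, degree + 1)          # columns x^4, x^3, ..., 1
coeffs, *_ = np.linalg.lstsq(basis, payoff, rcond=None)
value_estimate = basis @ coeffs           # low-dimensional value surrogate

# The fitted surface approximates E[payoff | x]; its sample mean matches the price.
print(payoff.mean(), value_estimate.mean())
```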
The GPU vs Phi Debate: Risk Analytics Using Many-Core Computing
The risk of reinsurance portfolios covering globally occurring natural
catastrophes, such as earthquakes and hurricanes, is quantified by employing
simulations. These simulations are computationally intensive and require large
amounts of data to be processed. The use of many-core hardware accelerators,
such as the Intel Xeon Phi and the NVIDIA Graphics Processing Unit (GPU), is
desirable for achieving high-performance risk analytics. In this paper, we set
out to investigate how accelerators can be employed in risk analytics, focusing
on developing parallel algorithms for Aggregate Risk Analysis, a simulation
which computes the Probable Maximum Loss of a portfolio taking both primary and
secondary uncertainties into account. The key result is that both hardware
accelerators are useful in different contexts: without taking data transfer
times into account, the Phi had the lowest execution times when used
independently, while the GPU along with a host in a hybrid platform yielded the
best performance.
Comment: A modified version of this article is accepted to the Computers and
Electrical Engineering journal under the title "The Hardware Accelerator
Debate: A Financial Risk Case Study Using Many-Core Computing"; Blesson
Varghese, "The Hardware Accelerator Debate: A Financial Risk Case Study Using
Many-Core Computing," Computers and Electrical Engineering, 201
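The workload described here is embarrassingly parallel across simulation trials. As a minimal illustration of that decomposition (in Python rather than the OpenMP/CUDA implementations the paper compares, and with an invented lognormal loss kernel), the sketch below partitions the trial range across worker processes and reduces the per-chunk results into a single PML estimate.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def run_trials(args):
    """Worker: simulate one chunk of trials and return its year losses.
    The lognormal loss kernel is a stand-in for the real event-loss lookup."""
    count, seed = args
    rng = np.random.default_rng(seed)
    return rng.lognormal(mean=13.0, sigma=1.5, size=(count, 100)).sum(axis=1)

def parallel_pml(num_trials=100_000, workers=8, quantile=0.99):
    chunk = num_trials // workers
    jobs = [(chunk, w) for w in range(workers)]   # independent slices of trials
    with ProcessPoolExecutor(max_workers=workers) as pool:
        losses = np.concatenate(list(pool.map(run_trials, jobs)))
    return np.quantile(losses, quantile)          # reduce: PML across all trials

if __name__ == "__main__":
    print(f"PML(99%) = {parallel_pml():,.0f}")
```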