The History of the Quantitative Methods in Finance Conference Series. 1992-2007
This report charts the history of the Quantitative Methods in Finance (QMF) conference from its beginning in 1993 to the 15th conference in 2007. It lists alphabetically the 1037 speakers who presented across the 15 conferences, together with the titles of their papers.
Multi-asset minority games
We study analytically and numerically Minority Games in which agents may invest in different assets (or markets), considering both the canonical and the grand-canonical versions. We find that the likelihood of agents trading in a given asset depends on the relative amount of information available in that market. More specifically, in the canonical game players preferentially trade the stock with less information. The same holds in the grand-canonical game when agents have positive incentives to trade, whereas when agents' payoffs are tied solely to their speculative ability they display a larger propensity to invest in the information-rich asset. Furthermore, this model exhibits a globally predictable phase with broken ergodicity.
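For readers unfamiliar with the underlying model, a minimal sketch of the canonical single-asset Minority Game may help (the multi-asset version studied here additionally lets each agent choose a market). All parameter values below are illustrative conventions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N, S, m, T = 101, 2, 3, 2000          # agents, strategies per agent, memory, rounds
P = 2 ** m                            # number of possible m-bit histories

# Each strategy is a lookup table: history index -> action in {-1, +1}
strategies = rng.choice([-1, 1], size=(N, S, P))
scores = np.zeros((N, S))
history = int(rng.integers(P))        # current history encoded as an integer

attendance = []
for _ in range(T):
    best = scores.argmax(axis=1)                      # each agent uses its best-scoring strategy
    actions = strategies[np.arange(N), best, history]
    A = actions.sum()                                 # aggregate action; the minority side wins
    attendance.append(A)
    # virtual payoff: a strategy gains when it would have joined the minority
    scores -= strategies[:, :, history] * np.sign(A)
    winner = 0 if A > 0 else 1                        # bit recording the minority outcome
    history = (2 * history + winner) % P              # slide the m-bit history window

sigma2 = np.var(attendance) / N       # volatility per agent, the standard MG observable
```

With N odd the aggregate A is never zero, so the minority side is always well defined; sigma2 is the quantity whose dependence on m and N organizes the game's phase diagram.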
Haar Wavelets-Based Methods for Credit Risk Portfolio Modeling
In this dissertation we investigate the credit risk measurement of a credit portfolio by means of wavelet theory. Banks are subject to regulatory capital requirements under the Basel Accords and also to the supervisory review process of capital adequacy, which concerns economic capital. Concentration risks in credit portfolios arise from an unequal distribution of loans to single borrowers (name concentration) or to different industry or regional sectors (sector concentration) and may lead banks to face bankruptcy.

The Merton model, the basis of the Basel II approach, is a Gaussian one-factor model in which default events are driven by a latent common factor assumed to follow a Gaussian distribution. Under this model, loss occurs only when an obligor defaults within a fixed time horizon. If certain homogeneity conditions are assumed, this one-factor model leads to a simple analytical asymptotic approximation of the loss distribution function and of the VaR. The VaR at a high confidence level is the measure chosen in Basel II to calculate regulatory capital. This approximation, usually called the Asymptotic Single Risk Factor (ASRF) model, works well for a large number of small exposures but can underestimate risk in the presence of exposure concentrations. Hence, the ASRF model does not provide an appropriate quantitative framework for the computation of economic capital. Monte Carlo simulation is the standard method for measuring credit portfolio risk when concentration risks must be taken into account. However, this method becomes very time consuming as the size of the portfolio increases, making the computation unworkable in many situations. In summary, credit risk managers are interested in how concentration risk can be quantified quickly and how the contributions of individual transactions to the total risk can be computed.

Since the loss variable can take only a finite number of discrete values, the cumulative distribution function (CDF) is discontinuous, and Haar wavelets are particularly well suited to such step-shaped functions. For this reason, we have developed a new method for numerically inverting the Laplace transform of the density function, once the CDF has been approximated by a finite sum of Haar wavelet basis functions. In mathematical analysis, wavelets denote a kind of orthonormal basis with remarkable approximation properties. The difference between the usual sine wave and a wavelet is the localization property: while the sine wave is localized in the frequency domain but not in the time domain, a wavelet is localized in both. Once the CDF has been computed, we are able to calculate the VaR at a high loss level. Furthermore, we have also computed the Expected Shortfall (ES), since VaR is not a coherent risk measure in the sense that it is not sub-additive. We have shown that, for a wide variety of portfolios, these measures are computed quickly and accurately, with a relative error below 1% when compared with Monte Carlo. We have also extended this methodology to the estimation of the risk contributions to the VaR and the ES, by taking partial derivatives with respect to the exposures, again with high accuracy. Some technical improvements have also been implemented in the computation of the Gauss-Hermite integration formula used to obtain the coefficients of the approximation, making the method faster while preserving its accuracy. Finally, we have extended the wavelet approximation method to the multi-factor setting by means of Monte Carlo and quasi-Monte Carlo methods.
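As an illustration of why Haar bases suit step-shaped CDFs, the sketch below projects an empirical loss CDF onto Haar scaling functions at a single resolution level and reads VaR and ES off the reconstruction. The Beta-distributed "losses" and all parameters are stand-ins for illustration; the dissertation itself obtains the coefficients by inverting a Laplace transform with Gauss-Hermite quadrature, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy portfolio: simulated normalized losses on [0, 1] (a stand-in for the
# discrete loss variable whose CDF the thesis recovers analytically).
losses = rng.beta(2, 8, size=50_000)

def empirical_cdf(x, sample):
    """F(x) estimated from the sample."""
    return np.searchsorted(np.sort(sample), x, side="right") / len(sample)

# Haar scaling-function approximation at resolution level J:
#   F(x) ~ sum_k c_{J,k} phi_{J,k}(x),  phi_{J,k} = 2^{J/2} 1_{[k/2^J, (k+1)/2^J)}
J = 8
n = 2 ** J
edges = np.linspace(0.0, 1.0, n + 1)
mid = 0.5 * (edges[:-1] + edges[1:])
# c_{J,k} = <F, phi_{J,k}>, approximated by the midpoint rule on each dyadic cell
coeffs = empirical_cdf(mid, losses) * 2 ** (-J / 2)

def haar_cdf(x):
    """Piecewise-constant Haar reconstruction of the CDF."""
    k = np.clip((np.asarray(x) * n).astype(int), 0, n - 1)
    return coeffs[k] * 2 ** (J / 2)

# VaR at level alpha: first grid point where the approximated CDF reaches alpha;
# ES: average loss beyond that threshold.
alpha = 0.99
var_99 = mid[np.argmax(haar_cdf(mid) >= alpha)]
es_99 = losses[losses >= var_99].mean()
```

Because each phi_{J,k} is itself an indicator of a dyadic interval, the reconstruction reproduces jumps without the oscillation a Fourier basis would introduce near the discontinuities.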
CVA and vulnerable options pricing by correlation expansions
We consider the problem of computing the Credit Value Adjustment ({CVA}) of a
European option in the presence of Wrong Way Risk ({WWR}) in a default
intensity setting. Namely, we model the asset price evolution as the solution to a
linear equation that may depend on several stochastic factors, and we
provide an approximate evaluation of the option's price by exploiting a
correlation expansion approach introduced in \cite{AS}. We compare the
numerical performance of this method with that recently proposed by Brigo et
al. (\cite{BR18}, \cite{BRH18}) in the case of a call option driven by a GBM
correlated with a CIR default intensity. We additionally report some
numerical evaluations obtained by other methods.
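For context, the quantity being approximated can be benchmarked by plain Monte Carlo: discretize the unilateral CVA formula CVA = (1 - R) E[sum_i D(t_i) V(t_i) (P_surv(t_{i-1}) - P_surv(t_i))] for a call under a GBM asset correlated with a CIR intensity. The sketch below is only this brute-force benchmark, not the correlation-expansion method of the paper, and every parameter value is illustrative:

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, r, sigma, tau):
    """Black-Scholes call value with time to maturity tau (vectorized)."""
    tau = np.maximum(tau, 1e-12)
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d2)

rng = np.random.default_rng(2)
n_paths, n_steps, T = 20_000, 100, 1.0
dt = T / n_steps
S0, K, r, sigma = 100.0, 100.0, 0.02, 0.25          # GBM asset parameters
lam0, kappa, theta, xi = 0.02, 0.5, 0.05, 0.1       # CIR default intensity
rho, R = 0.6, 0.4                                   # WWR correlation, recovery

S = np.full(n_paths, S0)
lam = np.full(n_paths, lam0)
cum_lam = np.zeros(n_paths)        # integrated intensity, for survival probs
surv_prev = np.ones(n_paths)
cva = np.zeros(n_paths)

for i in range(1, n_steps + 1):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    S *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z1)
    lam += kappa * (theta - np.maximum(lam, 0)) * dt \
           + xi * np.sqrt(np.maximum(lam, 0) * dt) * z2   # full-truncation Euler
    cum_lam += np.maximum(lam, 0) * dt
    surv = np.exp(-cum_lam)
    t = i * dt
    exposure = bs_call(S, K, r, sigma, T - t)       # positive exposure of the call
    cva += np.exp(-r * t) * exposure * (surv_prev - surv)
    surv_prev = surv

cva_wwr = (1 - R) * cva.mean()
```

With rho > 0 the intensity rises on high-asset paths, which for a call is exactly the wrong-way configuration; setting rho = 0 recovers the independent (no-WWR) CVA for comparison.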
Monetary policy in a non-representative agent economy: A survey
It is well-known that central bank policies affect not only macroeconomic aggregates, but also their distribution across economic agents. Similarly, a number of papers have demonstrated that the heterogeneity of agents may matter for the transmission of monetary policy to macro variables. Despite this, the mainstream monetary economics literature has so far been dominated by dynamic stochastic general equilibrium (DSGE) models with representative agents. This article aims to tilt this imbalance towards heterogeneous agent setups by surveying the main positive and normative findings of this line of the literature and suggesting areas in which these models could be implemented. In particular, we review studies that analyze the heterogeneity of (i) households' income, (ii) households' preferences, (iii) consumers' age, (iv) expectations, and (v) firms' productivity and financial position. We highlight results on issues that, by construction, cannot be investigated in a representative agent framework, and discuss important papers that modify the findings of the representative agent literature.

Keywords: Heterogeneous Agents; Monetary Policy
Bayesian Model Choice of Grouped t-copula
One of the most popular copulas for modeling dependence structures is the
t-copula. Recently the grouped t-copula was generalized to allow each group to
have one member only, so that a priori grouping is not required and the
dependence modeling is more flexible. This paper describes a Markov chain Monte
Carlo (MCMC) method under the Bayesian inference framework for estimating and
choosing t-copula models. Using historical data of foreign exchange (FX) rates
as a case study, we found that Bayesian model choice criteria overwhelmingly
favor the generalized t-copula. In addition, all the criteria also agree on the
second most likely model and these inferences are all consistent with classical
likelihood ratio tests. Finally, we demonstrate the impact of model choice on
the conditional Value-at-Risk for portfolios of six major FX rates.
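A sketch of the sampling side may clarify what "each group has one member" means in practice. The construction below is a common textbook form of the grouped t-copula (not code from the paper): every margin j carries its own degrees-of-freedom parameter, all driven by one shared uniform, so no a priori grouping is required. The correlation matrix, degrees of freedom, and FX-style portfolio are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def sample_grouped_t_copula(n, corr, dof, seed=0):
    """Grouped t-copula with one degrees-of-freedom parameter per margin.

    Z ~ N(0, corr); a single uniform U drives the mixing variables
    W_j = chi2_{dof_j}^{-1}(U); then X_j = sqrt(dof_j / W_j) * Z_j has a
    t_{dof_j} margin, and the copula sample is its t-cdf transform.
    With all dof equal this reduces to the ordinary t-copula.
    """
    rng = np.random.default_rng(seed)
    dof = np.asarray(dof, dtype=float)
    L = np.linalg.cholesky(corr)
    Z = rng.standard_normal((n, corr.shape[0])) @ L.T
    U = rng.uniform(size=(n, 1))
    W = stats.chi2.ppf(U, dof)              # broadcasts to one chi2 per margin
    X = np.sqrt(dof / W) * Z
    return stats.t.cdf(X, dof)              # uniform margins on (0, 1)

# Illustrative 3-rate FX-style example, one dof per margin (no grouping)
corr = np.array([[1.0, 0.5, 0.3],
                 [0.5, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])
dof = [3.0, 7.0, 12.0]
u = sample_grouped_t_copula(100_000, corr, dof)

# Map to variance-standardized t returns and read off a 99% portfolio VaR
rets = stats.t.ppf(u, dof) / np.sqrt(np.array(dof) / (np.array(dof) - 2))
loss = -rets.mean(axis=1)
var_99 = float(np.quantile(loss, 0.99))
```

The shared uniform U is what couples the tails of all margins; fitting a separate dof per margin is the flexibility that the Bayesian model choice in the paper adjudicates against the ordinary (single-dof) t-copula.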