Heuristic optimisation in financial modelling
There are many optimisation problems in theoretical and applied finance that are difficult to solve because they exhibit multiple local optima or are not 'well-behaved' in other ways (e.g., discontinuities in the objective function). One way to deal with such problems is to adjust and simplify them, for instance by dropping constraints, until they can be solved with standard numerical methods. We argue that an alternative approach is the application of optimisation heuristics such as Simulated Annealing or Genetic Algorithms. These methods have been shown to be capable of handling non-convex optimisation problems with all kinds of constraints. To motivate the use of such techniques in finance, we present several actual problems where classical methods fail. Next, we describe several well-known heuristic techniques that may be deployed in such cases. Since these presentations are quite general, we then describe in some detail how a particular problem, portfolio selection, can be tackled by a particular heuristic method, Threshold Accepting. Finally, the stochastics of the solutions obtained from heuristics are discussed. We show, again for the portfolio-selection example, how the random character of the solutions can be exploited to inform the distribution of computation.
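The Threshold Accepting method named in the abstract can be sketched compactly: it is a local search that, unlike plain descent, accepts a candidate solution that worsens the objective by less than a deterministic threshold, with the thresholds shrinking towards zero. The toy objective, covariance figures, and neighbour move below are illustrative assumptions, not the paper's actual specification:

```python
import random

def threshold_accepting(objective, neighbour, x0, thresholds, steps_per_round=100, seed=42):
    """Minimise `objective`: accept any move that worsens it by less
    than the current threshold; thresholds decrease towards zero."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    for tau in thresholds:
        for _ in range(steps_per_round):
            y = neighbour(x, rng)
            fy = objective(y)
            if fy - fx < tau:          # accept small deteriorations too
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
    return best, fbest

# Toy portfolio problem: minimise variance of long-only weights for
# three assets with a diagonal covariance (hypothetical numbers).
var = [0.04, 0.09, 0.16]

def objective(w):
    return sum(v * wi * wi for v, wi in zip(var, w))

def neighbour(w, rng):
    # shift a small amount of weight from one asset to another,
    # preserving the budget constraint sum(w) == 1 and w >= 0
    w = list(w)
    i, j = rng.sample(range(len(w)), 2)
    d = min(0.05, w[i])
    w[i] -= d
    w[j] += d
    return w

w0 = [1/3, 1/3, 1/3]
thresholds = [0.01 * (1 - k / 10) for k in range(10)]  # decreasing sequence
w_star, f_star = threshold_accepting(objective, neighbour, w0, thresholds)
```

Constraints are handled here by construction: the neighbour move never leaves the feasible set, which is one of the standard ways heuristics accommodate constraints that break classical solvers.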
View fusion vis-à-vis a Bayesian interpretation of Black–Litterman for portfolio allocation
The Black–Litterman model extends the framework of Markowitz modern portfolio theory to incorporate investor views. The authors consider a case in which multiple view estimates, including uncertainties, are given for the same underlying subset of assets at a point in time. This motivates their consideration of data fusion techniques for combining information from multiple sources. In particular, they consider consistency-based methods that yield fused view and uncertainty pairs; such methods are not common in the quantitative finance literature. They show a relevant, modern case of incorporating machine learning model-derived view and uncertainty estimates, and the impact on portfolio allocation, with an example subsuming arbitrage pricing theory. Hence, they show the value of the Black–Litterman model in combination with information fusion and artificial intelligence-grounded prediction methods.
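The abstract does not spell out the fusion rule, so as a minimal sketch, one common consistency-based choice is precision (inverse-variance) weighting of independent Gaussian view estimates on the same quantity, which is also how a Bayesian reading of Black–Litterman combines a view with the prior. The numbers are hypothetical:

```python
def fuse_views(means, variances):
    """Precision-weighted fusion of independent Gaussian (mean, variance)
    view estimates of the same quantity: the fused precision is the sum
    of the individual precisions, and the fused mean is the
    precision-weighted average of the view means."""
    precisions = [1.0 / v for v in variances]
    p = sum(precisions)
    mu = sum(m * w for m, w in zip(means, precisions)) / p
    return mu, 1.0 / p

# Two sources' views on the same asset's expected return (hypothetical):
mu, var = fuse_views([0.05, 0.02], [0.01, 0.02])
# -> fused mean 0.04, fused variance 1/150: the result lies between the
#    views, closer to the more confident one, and is tighter than either.
```

The fused (mean, variance) pair can then be fed into the Black–Litterman machinery as a single view with its own uncertainty.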
Measuring risks in the financial services industry
Traditionally, financial risk management has mainly focused on the types of risk that can be identified and measured. Many actuarial and statistical theories and models have been developed in the past to quantify such risks. However, high-profile events such as Black Monday, the Asian financial crisis, the 9/11 terrorist attacks, the Enron scandal, and more recently the Global Financial Crisis, have repeatedly proven to the financial world that the risks which matter to the stability of financial firms are often immeasurable and unidentifiable. Hence, simply focusing on the measurable risks is inadequate for sound management of financial risks. In this thesis, we develop a holistic framework to identify (if possible), measure (if possible), and manage the measurable, as well as the immeasurable and the unidentifiable, risks. We identify four realms of financial uncertainty and point out that each realm poses a unique set of challenges to risk management. Possible tools to grapple with each realm of uncertainty, and their limitations, are discussed by drawing on risk management techniques used in various fields of science and other industries. Those tools are then applied to two financial problems: 1) estimating operational risk capital for banks by using external data, and 2) assessing the adequacy of the Australian superannuation guarantee system under market uncertainty.
Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications
The monograph investigates the misapplication of conventional statistical techniques to fat tailed distributions and looks for remedies, when possible. Switching from thin tailed to fat tailed distributions requires more than "changing the color of the dress". Traditional asymptotics deal mainly with either n = 1 or n = ∞, and the real world is in between, under the "laws of the medium numbers" -- which vary widely across specific distributions. Both the law of large numbers and the generalized central limit mechanisms operate in highly idiosyncratic ways outside the standard Gaussian or Levy-Stable basins of convergence.
A few examples:
+ The sample mean is rarely in line with the population mean, with effect on "naive empiricism", but can sometimes be estimated via parametric methods.
+ The "empirical distribution" is rarely empirical.
+ Parameter uncertainty has compounding effects on statistical metrics.
+ Dimension reduction (principal components) fails.
+ Inequality estimators (Gini or quantile contributions) are not additive and produce wrong results.
+ Many "biases" found in psychology become entirely rational under more sophisticated probability distributions.
+ Most of the failures of financial economics, econometrics, and behavioral economics can be attributed to using the wrong distributions.
This book, the first volume of the Technical Incerto, weaves a narrative around published journal articles.
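The first bullet, the unreliability of the sample mean under fat tails, can be illustrated with a small simulation (a sketch, not taken from the book): sample means of a Pareto distribution with tail exponent just above 1 still scatter wildly at n = 1000, while a thin-tailed case settles quickly around its population mean.

```python
import random

def pareto_mean_estimates(alpha, n, trials, seed=0):
    """Sample means of a Pareto(alpha) distribution with x_min = 1.
    The population mean alpha/(alpha - 1) exists for alpha > 1, but for
    alpha just above 1 the sample mean converges extremely slowly."""
    rng = random.Random(seed)
    out = []
    for _ in range(trials):
        s = sum(rng.paretovariate(alpha) for _ in range(n))
        out.append(s / n)
    return out

thin = pareto_mean_estimates(alpha=10.0, n=1000, trials=200)  # mild tail
fat  = pareto_mean_estimates(alpha=1.2,  n=1000, trials=200)  # fat tail
# The alpha = 10 estimates cluster tightly around 10/9; the alpha = 1.2
# estimates are dispersed far around the population mean 1.2/0.2 = 6.
```

The same contrast underlies the "naive empiricism" point: averaging a fat-tailed sample at realistic n says little about the population mean.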
Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain
The present paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units, located in Portugal, is established using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are put forward for each hotel studied.
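The separation of noise from inefficiency that SFA performs can be sketched with the standard normal/half-normal stochastic frontier model, fitted by maximum likelihood on simulated data. The data-generating values and the single-input frontier below are hypothetical assumptions, not the paper's hotel data or specification:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def sfa_negloglik(theta, x, y):
    """Negative log-likelihood of the normal/half-normal stochastic
    production frontier y = b0 + b1*x + v - u, where v ~ N(0, sv^2) is
    symmetric noise and u ~ |N(0, su^2)| is one-sided inefficiency."""
    b0, b1, log_sv, log_su = theta
    sv, su = np.exp(log_sv), np.exp(log_su)   # enforce positivity
    sigma = np.hypot(sv, su)                  # sqrt(sv^2 + su^2)
    lam = su / sv                             # signal-to-noise ratio
    eps = y - b0 - b1 * x                     # composed residual v - u
    ll = (np.log(2) - np.log(sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

# Simulated data under the assumed model (hypothetical parameters)
rng = np.random.default_rng(1)
n = 500
x = rng.uniform(0, 2, n)
u = np.abs(rng.normal(0, 0.3, n))   # inefficiency, pulls output down
v = rng.normal(0, 0.2, n)           # measurement error, either sign
y = 1.0 + 0.8 * x + v - u

res = minimize(sfa_negloglik, x0=[0.0, 0.0, 0.0, 0.0], args=(x, y),
               method="Nelder-Mead", options={"maxiter": 5000})
b0_hat, b1_hat = res.x[0], res.x[1]
```

The skewness of the composed residual is what identifies inefficiency separately from noise; with the fitted parameters, unit-level inefficiency scores (and hence a ranking like the paper's) are usually obtained via the Jondrow et al. conditional estimator E[u | v - u].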