
    Building a Better Fund of Hedge Funds: A Fractal and Alpha-Stable Distribution Approach

    Markowitz’s (1952) portfolio theory has permeated financial institutions over the past 50 years. Assuming that returns are normally distributed, Markowitz suggests that portfolio optimization should be performed in a mean-variance framework. With the emergence of hedge funds and their non-normally distributed returns, mean-variance portfolio optimization is no longer adequate. Here, hedge fund returns are modeled with the alpha-stable distribution and a mean-CVaR portfolio optimization is performed. Results indicate that by using the alpha-stable distribution, a more efficient fund of hedge funds portfolio can be created than by assuming a normal distribution. To further increase efficiency, the Hurst exponent is considered as a filtering tool, and it is found that combining hedge fund strategies within a range of Hurst exponents leads to more efficient portfolios, as characterized by higher risk-adjusted ratios. These findings open the door for further study of econophysics tools in the analysis of hedge fund returns.
    Keywords: hedge funds, fund of funds, portfolio optimization, conditional value at risk, alpha-stable distribution, Hurst exponent, fractals
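
    As a rough illustration of the tools this abstract names, the sketch below (not the paper's code) draws alpha-stable returns with SciPy, computes an empirical CVaR, and estimates a Hurst exponent with a basic rescaled-range procedure; the distribution parameters and sample length are hypothetical.

```python
# Illustrative sketch, not the paper's implementation: heavy-tailed hedge fund
# returns via an alpha-stable distribution, empirical CVaR, and a rough R/S
# Hurst exponent estimate. All parameter values below are hypothetical.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

# Hypothetical monthly returns; alpha < 2 gives heavier tails than the normal.
alpha, beta, loc, scale = 1.7, 0.0, 0.005, 0.02
returns = levy_stable.rvs(alpha, beta, loc=loc, scale=scale, size=240, random_state=rng)

def cvar(x, level=0.95):
    """Empirical conditional value at risk: mean loss beyond the VaR quantile."""
    losses = -np.asarray(x)
    var = np.quantile(losses, level)
    return losses[losses >= var].mean()

def hurst_rs(x, min_chunk=8):
    """Rough Hurst exponent estimate from rescaled-range (R/S) statistics."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = [s for s in (8, 16, 32, 64, 120) if min_chunk <= s <= n // 2]
    log_s, log_rs = [], []
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):
            chunk = x[start:start + s]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()
            sd = chunk.std(ddof=1)
            if sd > 0:
                rs_vals.append(r / sd)
        if rs_vals:
            log_s.append(np.log(s))
            log_rs.append(np.log(np.mean(rs_vals)))
    # Slope of log(R/S) against log(window size) approximates H.
    return np.polyfit(log_s, log_rs, 1)[0]

print(f"95% CVaR: {cvar(returns):.4f}")
print(f"Hurst exponent estimate: {hurst_rs(returns):.2f}")
```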

    A discussion of data enhancement and optimization techniques for a fund of hedge funds portfolio

    The aim of this thesis is to provide an overview and discussion, illustrated with experiments, of data enhancement and optimization techniques for a fund of hedge funds portfolio. Special emphasis is placed on the interaction of the different data enhancement and optimization techniques. First building blocks for a computer-based asset allocation tool are provided and documented, and ideas for future development and research are presented. Two main points distinguish this thesis from similar work: it operates at the individual fund level, and it covers the whole process, from data enhancement and parameter estimation through optimization to proper evaluation of the outcomes. The first chapter places the topic in the broader context of finance, defines the term “hedge fund” and motivates the relevance of the problem. Besides the rapid growth of the hedge fund industry, the increasing interest of institutional investors is an important reason to provide decision-support methods based on quantitative models and scientific findings. The second chapter deals with data enhancement. The proverb “garbage in, garbage out” holds for every optimization algorithm, and it applies especially to hedge funds, where the data situation is difficult: only monthly returns are provided and little information about risk exposures is available. After a short literature overview of hedge-fund-specific data problems and biases, descriptive statistics are provided for the two databases used in this thesis. In the data enhancement step, special emphasis is put on the high autocorrelation in hedge fund returns and on filling up the short track records of young funds: the former because high autocorrelation contradicts fundamental principles of modern finance, the latter because it leads to a better understanding of a fund's risk profile. For the purpose of filling up track records, factor model approaches and the use of cluster analysis are proposed. After a review of the risk factors considered in the literature, the modeling of non-linear dependencies, for example via option structures, is discussed in more depth, as this topic is central to the thesis. Important original contributions in this context are the motivation and economic interpretation of the favored option structure model and first experiments on automatic model selection and on integrating qualitative data via cluster analysis. The third chapter is devoted to optimization. The main challenge is that hedge fund returns are usually not normally distributed; since the traditional concepts of finance are built on exactly this assumption, alternative concepts have to be used. After a short overview of classical mean-variance optimization and of ways to obtain more robust results, essentially two alternative concepts are introduced: parametric approaches that take the higher moments (skewness and kurtosis) of the distribution into account, and non-parametric approaches that work with historical or simulated scenarios and the discrete distributions resulting from them. With the latter, investor preferences can be captured via a dispersion measure, a quantile measure, or a combination of both. The chapter then considers how simple linear and more complex logical constraints can be used and how the presented concepts can be integrated, in particular which data enhancement techniques fit with which optimization techniques. In the last part of the third chapter, extensive optimization experiments are conducted and the findings interpreted. The central results are that the choice of the risk measure has little impact on the ultimate evaluation criterion, the risk-adjusted out-of-sample performance, and that filling up short track records significantly improves out-of-sample risk. Finally, the findings are summarized and an outlook on future research is given.
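
    One of the data enhancement steps described above, backfilling short track records with a factor model, can be sketched as follows. This is a minimal illustration under assumed data, not the thesis's implementation; the factor proxies and all figures are placeholders.

```python
# Minimal sketch of backfilling a short hedge fund track record with a linear
# factor model. Data, factor proxies, and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

n_long, n_short = 120, 36            # months of factor history vs. fund history
factors = rng.normal(0.004, 0.03, size=(n_long, 3))   # e.g. equity, credit, trend proxies
true_beta = np.array([0.4, 0.2, 0.1])
fund = factors[-n_short:] @ true_beta + rng.normal(0, 0.01, n_short)

# Fit betas (with intercept) on the overlapping window only.
X = np.column_stack([np.ones(n_short), factors[-n_short:]])
coef, *_ = np.linalg.lstsq(X, fund, rcond=None)
residuals = fund - X @ coef

# Backfill earlier months: fitted factor exposure plus a bootstrapped residual,
# so the synthetic history keeps the fund's idiosyncratic noise level.
X_early = np.column_stack([np.ones(n_long - n_short), factors[:-n_short]])
backfill = X_early @ coef + rng.choice(residuals, size=n_long - n_short, replace=True)

full_track = np.concatenate([backfill, fund])
print(full_track.shape)  # (120,) -> a full-length series for the optimizer
```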

    Copula-based multivariate input modeling

    In this survey, we review the copula-based input models that are well suited to provide multivariate input-modeling support for stochastic simulations with dependent inputs. Specifically, we consider the situation in which the dependence between pairs of simulation input random variables is measured by tail dependence (i.e., the amount of dependence in the tails of a bivariate distribution) and review the techniques to construct copula-based input models representing positive tail dependencies. We complement the review with the parameter estimation from multivariate input data and the random-vector generation from the estimated input model with the purpose of driving the simulation.
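
    As a concrete example of the kind of input model the survey covers, the sketch below samples a Clayton copula (one standard construction with positive lower tail dependence) via the Marshall-Olkin frailty algorithm and maps the uniforms to assumed marginals; the copula parameter and the marginals are illustrative, not taken from the paper.

```python
# Illustrative copula-based input model with lower tail dependence: a Clayton
# copula sampled with the Marshall-Olkin algorithm, then mapped to arbitrary
# marginals for simulation input. Marginals and theta are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def clayton_sample(theta, dim, n, rng):
    """Sample n vectors of dim uniforms from a Clayton copula (theta > 0).
    Lower tail dependence is lambda_L = 2**(-1/theta)."""
    v = rng.gamma(shape=1.0 / theta, scale=1.0, size=(n, 1))   # frailty variable
    e = rng.exponential(size=(n, dim))
    return (1.0 + e / v) ** (-1.0 / theta)

theta = 2.0                       # Kendall's tau = theta / (theta + 2) = 0.5
u = clayton_sample(theta, dim=2, n=10_000, rng=rng)

# Map uniforms to the desired marginals via inverse CDFs, e.g. two service times.
x1 = stats.lognorm(s=0.5, scale=np.exp(1.0)).ppf(u[:, 0])
x2 = stats.gamma(a=2.0, scale=3.0).ppf(u[:, 1])

tau, _ = stats.kendalltau(x1, x2)
print("empirical Kendall tau:", round(tau, 2))   # close to 0.5 by construction
```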

    Quantifying Impact of Cyber Actions on Missions or Business Processes: A Multilayer Propagative Approach

    Ensuring the security of cyberspace is one of the most significant challenges of the modern world because of its complexity. As the cyber environment becomes more integrated with the real world, cybersecurity problems increasingly have a direct impact on actual business. Therefore, operational and strategic decision makers in particular need to understand the cyber environment and its potential impact on business. Cyber risk has become a top agenda item for businesses all over the world and is listed as one of the most serious global risks, with significant financial implications for businesses. Risk analysis is one of the primary tools used in this endeavor. Impact assessment, as an integral part of risk analysis, tries to estimate the possible damage of a cyber threat to business. It provides the main insight into risk prioritization, as it incorporates business requirements into risk analysis for a better balance of security and usability. Moreover, impact assessment constitutes the main body of information flow between technical people and business leaders. It therefore requires an effective synergy of the technological and business aspects of cybersecurity for protection against cyber threats. The purpose of this research is to develop a methodology to quantify the impact of cybersecurity events, incidents, and threats. The developed method addresses the issue of impact quantification from an interdependent system-of-systems point of view. The objectives of this research are (1) developing a quantitative model to determine impact propagation within a layer of an enterprise (i.e., the asset, service, or business process layer); (2) developing a quantitative model to determine impact propagation among different layers within an enterprise; and (3) developing an approach to estimate the economic cost of a cyber incident or event. Although there are various studies in cybersecurity risk quantification, only a few focus on impact assessment at the business process layer by considering ripple effects across both the horizontal and vertical layers. This research develops an approach that quantifies the economic impact of cyber incidents, events, and threats to business processes by considering the horizontal and vertical interdependencies and the impact propagation within and among layers.
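
    A deliberately simplified sketch of the multilayer idea described above (not the dissertation's model): an initial impact on assets propagates through intra-layer dependencies and is then mapped up to services and business processes, ending in a rough cost estimate. All dependency weights and cost figures are hypothetical.

```python
# Toy multilayer impact propagation: asset -> service -> business process.
# Dependency matrices and the hourly revenue figure are hypothetical.
import numpy as np

# Rows depend on columns; entries are impact transfer fractions in [0, 1].
asset_dep = np.array([[0.0, 0.5, 0.0],      # asset A1 depends on A2
                      [0.0, 0.0, 0.3],      # A2 depends on A3
                      [0.0, 0.0, 0.0]])
svc_from_asset = np.array([[0.8, 0.0, 0.2],   # service S1 relies on A1, A3
                           [0.0, 0.6, 0.4]])  # S2 relies on A2, A3
bp_from_svc = np.array([[0.7, 0.3]])          # business process B1 uses S1, S2

def propagate_within(dep, impact, rounds=5):
    """Iteratively spread impact along intra-layer dependencies, capped at 1."""
    x = impact.copy()
    for _ in range(rounds):
        x = np.minimum(1.0, np.maximum(x, dep @ x))
    return x

asset_impact = np.array([0.0, 0.0, 0.9])      # incident degrades asset A3 by 90%
asset_impact = propagate_within(asset_dep, asset_impact)
service_impact = np.minimum(1.0, svc_from_asset @ asset_impact)
process_impact = np.minimum(1.0, bp_from_svc @ service_impact)

hourly_revenue = 12_000.0                     # hypothetical cost basis
print("business process impact:", process_impact.round(2))
print("estimated loss per hour:", round(float(process_impact[0] * hourly_revenue), 2))
```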

    A method for treating dependencies between variables in a simulation risk analysis model

    This thesis explores the need to recognise and represent accurately the interdependencies between uncertain quantitative components in a simulation model. Helping to fill the gap between acknowledging the importance of modelling correlation and actually specifying and implementing a procedure for modelling accurate measures of Pearson's correlation therefore became the main aim of this research. Two principal objectives are stated for the developed Research Correlation Model ("RCM"): (1) it is to generate Pearson-correlated paired samples of two continuous variables for which the sample correlation is a good approximation to the target correlation; and (2) the sampled values of the two individual variables must have very accurate means and variances. The research results conclude that the samples from the four chosen distributions generated by the RCM have highly acceptable levels of precision when tested using χ² tests and others. The results also show that the average improvement in the precision of correlation modelling was over 96 percent. Even with samples as small as 10, the worst-case correction factor is only just below 90 percent, with the average correction factor being over 96 percent overall, so the contribution made by the RCM here is quite impressive. Overall, the analysis shows that when the sample size is 10, the RCM consistently generates samples whose correlation is far more precise than that generated by @RISK. The smallest of all the observed ratios of improvement of the RCM in comparison with the use of @RISK is 2.3:1, in just one case when the medians were compared. The average improvement ratio exceeded 100. It is concluded that the aim of specifying, formulating and developing a Pearson correlation model between a pair of continuous variables which can be incorporated into simulation models of complex applications has been achieved successfully.
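
    The RCM itself is not reproduced here, but the kind of baseline it improves on can be sketched: a Gaussian-copula (Cholesky) construction that induces an approximate Pearson correlation between two non-normal variables, which for small samples drifts from the target in exactly the way the thesis measures. The distributions and the 0.7 target below are hypothetical.

```python
# Baseline sketch, not the thesis's RCM: inducing dependence between two
# non-normal variables with a Gaussian copula. The induced Pearson correlation
# only approximates the target, especially for small samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
target_rho, n = 0.7, 10

# Correlated standard normals, mapped through the normal CDF to uniforms and
# through inverse CDFs to the desired marginals.
cov = np.array([[1.0, target_rho], [target_rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)
x = stats.triang(c=0.3, loc=10, scale=40).ppf(u[:, 0])   # e.g. a cost item
y = stats.lognorm(s=0.8, scale=20).ppf(u[:, 1])          # e.g. a duration

sample_rho = np.corrcoef(x, y)[0, 1]
print(f"target {target_rho:.2f}, achieved {sample_rho:.2f}")   # drifts for small n
```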

    Modeling cost and time uncertainty in rail line construction

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2011. Includes bibliographical references (p. 437-443). Transportation construction projects are often plagued by cost overruns and delays. Technical, economic-political, psychological, and legal causes explain the frequent underestimations. To counteract such underestimations, the author developed an innovative approach to capture cost and time uncertainty in rail line projects, and applied this to the construction of a new high speed rail line in Portugal. The construction of the four main types of structures in rail lines (tunnels, viaducts, cuts and embankments) is modeled bottom-up, from the single activity to the entire rail line. Sub-networks of activities are combined into structure networks to model the rail line structures; in turn, the structure networks are organized in the construction network to represent the rail line. For the first time, three sources of uncertainty (variability in the construction process, correlations between the costs of repeated activities, and disruptive events) are modeled jointly at the level of the single activity. These uncertainties are propagated to the total construction cost and time through the combination of the individual activity costs and times. The Construction and Uncertainty Models are integrated in the Decision Aids for Tunneling (DAT), which have been extended beyond tunneling to consider different structures and different uncertainty types. Based on historical input data and expert estimations, the cost and time uncertainty in the construction of four alignments of the new Portuguese high speed rail line is simulated. The three sources of uncertainty cause different cost and time impacts depending on the type of structure, suggesting structure-specific mitigation measures. Most importantly, their cumulative impact causes significant increases in construction cost and time compared to the deterministic estimates: 58% in the construction cost of tunnels, and 94% in the construction time of cuts and embankments. The Construction and Uncertainty Models and their integrated implementation in the DAT provide transportation agencies with a modeling tool to tackle cost and time uncertainty in the construction of rail lines and other linear/networked infrastructure projects. By Yvonne Moret, Ph.D.
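
    A toy Monte Carlo sketch of the three uncertainty sources the abstract lists (activity variability, correlation between repeated activities, and disruptive events) for a short chain of excavation rounds follows; the figures are hypothetical and the model is far simpler than the DAT implementation.

```python
# Toy Monte Carlo for construction cost and time with three uncertainty sources:
# per-activity variability, a shared factor that correlates repeated activities,
# and a rare disruptive event. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
n_sims, n_rounds = 10_000, 50

totals_cost, totals_time = [], []
for _ in range(n_sims):
    market_factor = rng.normal(1.0, 0.08)            # shared factor -> correlated costs
    unit_cost = rng.normal(2_000.0, 200.0, n_rounds) * market_factor   # cost per round
    advance_days = rng.lognormal(mean=np.log(1.0), sigma=0.2, size=n_rounds)
    # Rare disruptive event (e.g. water inflow) adds a lump cost and a delay.
    if rng.random() < 0.05:
        unit_cost[rng.integers(n_rounds)] += 50_000.0
        advance_days[rng.integers(n_rounds)] += 20.0
    totals_cost.append(unit_cost.sum())
    totals_time.append(advance_days.sum())

print("P50 / P90 cost:", np.percentile(totals_cost, [50, 90]).round(0))
print("P50 / P90 time (days):", np.percentile(totals_time, [50, 90]).round(1))
```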

    The development of the personal loan department in commercial banks


    Applying Blockchain Technology to Financial Market’s Infrastructure

    The utilization of blockchain technology has gained widespread acceptance across various domains in recent years. Among them, blockchain integration in the financial sector is particularly noteworthy. Blockchain technology offers a range of features that can address various challenges in the financial industry, including decentralization, transparency, enhanced security, and tamper-proofing. This thesis therefore aims to investigate the issues that persist in academia and industry and to address them through blockchain technology. The research for this thesis was divided into three major stages. The first stage involved conducting an academic survey through a comprehensive literature review, with the aim of identifying the pain points that academics have identified and narrowing down the problems that concern the academic community. The second stage involved collecting requirements from industry experts, which helped to identify the real-world issues that currently exist in the financial industry. Based on these issues, the research moved on to the next stage. The third stage involved an experimental study, further divided into two parts. Part 1 involved designing and developing a blockchain-based issuance and trading system for financial products; this system aimed to enhance participant trust, reduce costs, and increase efficiency. Part 2 involved the development of a risk monitoring system for blockchain-based financial products; this system aimed to assist participants in monitoring market risks, provide them with risk warning coefficients, and reduce the probability of systemic risks in the market. The results of this thesis demonstrate, from an experimental perspective, that blockchain integration is feasible and can positively impact financial markets, and that adopting blockchain technology can be helpful for the financial and FinTech industries.
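
    A miniature, hypothetical sketch of the two experimental ingredients described above: a hash-chained record of product trades and a toy risk-warning coefficient computed over the recorded prices. The product identifier, block structure, and coefficient formula are illustrative assumptions, not the thesis's system.

```python
# Toy hash-chained ledger plus a toy risk-warning coefficient. Everything here
# (block fields, product id, the volatility-based coefficient) is hypothetical.
import hashlib
import json
import statistics
import time

class Ledger:
    def __init__(self):
        self.blocks = [{"index": 0, "prev": "0" * 64, "data": "genesis", "ts": time.time()}]

    def add(self, data):
        # Link each new record to the hash of the previous block.
        prev_hash = hashlib.sha256(json.dumps(self.blocks[-1], sort_keys=True).encode()).hexdigest()
        self.blocks.append({"index": len(self.blocks), "prev": prev_hash, "data": data, "ts": time.time()})

def risk_warning(prices, window=5):
    """Toy risk coefficient: recent relative volatility scaled into [0, 1]."""
    recent = prices[-window:]
    vol = statistics.pstdev(recent) / max(statistics.mean(recent), 1e-9)
    return min(1.0, vol * 10)

ledger = Ledger()
prices = [100, 101, 99, 103, 96, 108]
for p in prices:
    ledger.add({"product": "BOND-X", "trade_price": p})   # hypothetical product id
print("blocks:", len(ledger.blocks), "risk warning:", round(risk_warning(prices), 2))
```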

    Institutional factors of economic changes: development trajectories of Douglass North’s theory

    The present article considers the contribution of Douglass North to general institutional economic theory, with special emphasis on research into the institutional factors of economic change. It is shown how agents’ interaction models are formed and how institutions influence agents’ behavior and change these models. It is especially important that D. North’s conclusions and their development in modern economic theory have led to the emergence of so-called institutional macroeconomics, a scientific discipline that explains economic change over long intervals of evolution. The main emphasis in the article is put on two of D. North’s works: “Institutions, Institutional Change and Economic Performance” (1990) and “Understanding the Process of Economic Change” (2005). The advantages and possible disadvantages of D. North’s new institutional approach, devoted to long intervals of social evolution, are demonstrated. In particular, the relations between formal and informal institutions are discussed, and enforcement (compulsion) mechanisms, which according to D. North are special institutions that play a significant role in describing institutional change over long time intervals, are considered. The role of government and of government regulation mechanisms is studied on the basis of the mechanisms that compel observance of rules and norms in the economy. We study the connection between transaction and transformation costs, which are conventionally considered independently within the framework of the new institutionalism. The authors’ view of technologies as special and relatively stable institutions in particular time periods is substantiated; this view is opposed to the position, following D. North, that institutions and technologies are interconnected factors of institutional change. From the authors’ point of view, D. North’s model approach provides a restricted representation of the institutional changes of a complex social and economic system. Moreover, there is a redistribution of the weights of the change factors, and D. North’s theory is not able to determine the regularities in how these weights change. In turn, the introduction of the trust factor by G. Akerlof and R. Shiller is not a sufficient solution, since various social institutions have macroeconomic importance: these institutions predetermine the development of the economy and the forms of trust that arise between agents. Technologies, for example, are such institutions, and the technological character of the economy as a system predetermines its economic dynamics and the demand for replacing some institutions with others, that is, the regime of institutional change. The task under consideration is thus difficult and has not yet been solved in economics. The possibility of addressing the above challenges using the postulates of “institutional macroeconomics” as a branch of scientific analysis is substantiated.