272 research outputs found

    Mean-CVaR portfolio: a mixture-copula approach

    The current study conducts a comparative analysis of portfolio optimization techniques in the context of financial applications. The proposed approach uses mixture copulas as an alternative to mitigate the risks inherent in stock market investments, particularly during times of economic crisis. Data from 19 country index ETFs were sourced from Historical Market Data - Stooq, spanning the period from 2013 to 2023. The study employed Mean-CVaR portfolio optimization, and the structural dependence between assets was modeled using a mixture of copulas (specifically Clayton-t-Gumbel), with marginals fitted by an AR(1)-GARCH(1,1) model. The results of simulations based on this strategy were compared with two benchmark portfolios, one using Gaussian copulas and one equally weighted, across three time horizons: one, two, and five years. Portfolios generated from returns simulated with the mixture-copula technique demonstrated superior risk-return performance relative to the benchmarks. At the same time, a reduction in financial losses was observed, with equivalent or superior returns in the comparison, particularly over the longer horizons, where the estimates were more accurate.
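The Mean-CVaR step described above is typically solved as the Rockafellar–Uryasev linear program over simulated return scenarios. A minimal sketch, assuming the scenario matrix (which would come from the copula simulation) is already given; this is an illustration of the technique, not the study's own code:

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_weights(returns, beta=0.95, target=0.0):
    """Minimise portfolio CVaR_beta via the Rockafellar-Uryasev LP.

    returns : (S, n) array of simulated scenario returns
    target  : required mean portfolio return
    """
    S, n = returns.shape
    mu = returns.mean(axis=0)
    # decision vector x = [w (n weights), alpha (VaR level), u (S tail slacks)]
    c = np.concatenate([np.zeros(n), [1.0], np.full(S, 1.0 / ((1 - beta) * S))])
    # u_s >= -r_s . w - alpha   <=>   -R w - alpha - u <= 0
    A_ub = np.hstack([-returns, -np.ones((S, 1)), -np.eye(S)])
    b_ub = np.zeros(S)
    # mean-return constraint: mu . w >= target
    A_ub = np.vstack([A_ub, np.concatenate([-mu, [0.0], np.zeros(S)])])
    b_ub = np.append(b_ub, -target)
    # budget constraint: weights sum to one, long-only
    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)
    b_eq = np.array([1.0])
    bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S
    res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds, method="highs")
    return res.x[:n], res.fun  # optimal weights and minimised CVaR
```

The same LP is solved for each benchmark; only the scenario-generating model (mixture copula vs. Gaussian copula) changes.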

    The GARCH-EVT-Copula model and simulation in scenario-based asset allocation

    Financial market integration, in particular portfolio allocations from advanced economies to South African markets, continues to strengthen volatility linkages and quicken volatility transmissions between participating markets. Largely as a result, South African portfolios are net recipients of return and volatility shocks emanating from major world markets. In light of these and other sources of risk, this dissertation proposes a methodology to improve risk management systems in funds by building a contemporary asset allocation framework that offers practitioners an opportunity to explicitly model combinations of hypothesised global risks and their effects on investments. The framework models portfolio return variables and their key risk driver variables separately and then joins them to model their combined dependence structure. The separate modelling of univariate and multivariate (MV) components offers the benefit of capturing the data-generating processes with improved accuracy. Univariate variables were modelled using ARMA-GARCH-family structures paired with a variety of skewed and leptokurtic conditional distributions. Model residuals were fitted using the Peaks-over-Threshold method from Extreme Value Theory for the tails and a non-parametric kernel density for the interior, forming a complete semi-parametric distribution (SPD) for each variable. Asset and risk factor returns were then combined and their dependence structure jointly modelled with a MV Student t copula. Finally, the SPD margins and Student t copula were used to construct a MV meta t distribution. Monte Carlo simulations were generated from the fitted MV meta t distribution, on which an out-of-sample test was conducted. The 2014–2015 horizon served as an out-of-sample, forward-looking scenario for a set of key risk factors against which a hypothetical, diversified portfolio was optimised.
    Traditional mean-variance and contemporary mean-CVaR optimisation techniques were used and their results compared. As an addendum, performance over the in-sample 2008 financial crisis was reported.
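The semi-parametric distribution (SPD) construction, with parametric Peaks-over-Threshold tails spliced onto a non-parametric interior, can be sketched in a few lines. This toy version is a simplification of the dissertation's approach: it uses the empirical CDF rather than a kernel density for the interior and fits only the upper tail; the threshold quantile is an illustrative choice:

```python
import numpy as np
from scipy import stats

def semiparametric_cdf(data, u_quantile=0.9):
    """Semi-parametric CDF: empirical interior, GPD (POT) upper tail."""
    u = np.quantile(data, u_quantile)
    exceed = data[data > u] - u
    # fit a Generalized Pareto to peaks over the threshold (location fixed at 0)
    xi, _, sigma = stats.genpareto.fit(exceed, floc=0)
    sorted_data = np.sort(data)

    def cdf(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        out = np.empty_like(x)
        below = x <= u
        # interior: empirical distribution function
        out[below] = np.searchsorted(sorted_data, x[below], side="right") / len(data)
        # tail: F(x) = 1 - (1 - F(u)) * GPD survival of the exceedance
        out[~below] = 1 - (1 - u_quantile) * stats.genpareto.sf(x[~below] - u, xi, 0, sigma)
        return out

    return cdf
```

In the dissertation's framework one such SPD is built per residual series, and the probability-integral transforms it produces feed the Student t copula fit.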

    Hedge fund return predictability: to combine forecasts or combine information?

    While the majority of the predictability literature has been devoted to traditional asset classes, the literature on the predictability of hedge fund returns is quite scant. We focus on assessing the out-of-sample predictability of hedge fund strategies by employing an extensive list of predictors. Aiming to reduce the uncertainty risk associated with a single-predictor model, we first combine the individual forecasts. We consider various combining methods, ranging from simple averaging schemes to more sophisticated ones such as discounting forecast errors, cluster combining, and principal components combining. Our second approach combines the information in the predictors and applies kitchen-sink, bootstrap aggregating (bagging), lasso, ridge, and elastic net specifications. Our statistical and economic evaluation findings point to the superiority of simple combination methods. We also provide evidence on the use of hedge fund return forecasts for hedge fund risk measurement and portfolio allocation. Dynamically constructing portfolios based on the combination forecasts of hedge fund returns leads to considerably improved portfolio performance.
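Two of the combination schemes mentioned, the simple average and discounted-MSFE weighting, are straightforward to implement. A hedged sketch, where the discount factor `delta` and the window handling are illustrative choices rather than the paper's exact specification:

```python
import numpy as np

def combine_forecasts(F, y, delta=0.9):
    """Combine k individual forecasts of a target series.

    F : (T, k) array; row t holds the k model forecasts of y[t]
    y : (T,) realised values; y[-1] is assumed unknown at forecast time
    Returns (simple average, discounted-MSFE weighted combination) for period T.
    """
    T, k = F.shape
    simple = F[-1].mean()
    # discount factors: most recent past error gets weight delta**0 = 1
    disc = delta ** np.arange(T - 2, -1, -1)
    # discounted mean squared forecast error per model over periods 0..T-2
    dmsfe = (disc[:, None] * (y[:-1, None] - F[:-1]) ** 2).sum(axis=0)
    # weight each model by the inverse of its discounted error
    w = (1.0 / dmsfe) / (1.0 / dmsfe).sum()
    return simple, F[-1] @ w
```

With `delta = 1` this reduces to plain inverse-MSE weighting; the simple average ignores past accuracy entirely, which is exactly why its strong performance in the paper is notable.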

    Distributionally robust optimization with applications to risk management

    Many decision problems can be formulated as mathematical optimization models. While deterministic optimization problems include only known parameters, real-life decision problems almost invariably involve parameters that are subject to uncertainty. Failure to take this uncertainty into consideration may yield decisions that lead to unexpected or even catastrophic results if certain scenarios are realized. While stochastic programming is a sound approach to decision making under uncertainty, it assumes that the decision maker has complete knowledge of the probability distribution that governs the uncertain parameters. This assumption is usually unjustified as, for most realistic problems, the probability distribution must be estimated from historical data and is therefore itself uncertain. Failure to take this distributional modeling risk into account can result in unduly optimistic risk assessments and suboptimal decisions. Furthermore, for most distributions, stochastic programs involving chance constraints cannot be solved using polynomial-time algorithms. In contrast to stochastic programming, distributionally robust optimization explicitly accounts for distributional uncertainty. In this framework, it is assumed that the decision maker has access to only partial distributional information, such as the first- and second-order moments and the support. The problem is then solved under the worst-case distribution that complies with this partial information. This worst-case approach effectively immunizes the problem against distributional modeling risk. The objective of this thesis is to investigate how robust optimization techniques can be used for quantitative risk management. In particular, we study how the risk of large-scale derivative portfolios can be computed as well as minimized, while making minimal assumptions about the probability distribution of the underlying asset returns.
Our interest in derivative portfolios stems from the fact that careless investment in derivatives can yield large losses or even bankruptcy. We show that by employing robust optimization techniques we are able to capture the substantial risks involved in derivative investments. Furthermore, we investigate how distributionally robust chance constrained programs can be reformulated or approximated as tractable optimization problems. Throughout the thesis, we aim to derive tractable models that are scalable to industrial-size problems
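To make the "worst case over partial moment information" idea concrete: when only the mean and covariance of the return vector are known, the worst-case Value-at-Risk of a portfolio admits a closed form. This is the classical Chebyshev-type bound of El Ghaoui, Oks and Oustry (2003), quoted for illustration rather than taken from this thesis; for a portfolio with weights $w$, return vector $r$ with mean $\mu$ and covariance $\Sigma$, and confidence level $\epsilon$:

```latex
\sup_{\mathbb{P}\,:\,\mathbb{E}[r]=\mu,\ \operatorname{Cov}[r]=\Sigma}
  \operatorname{VaR}_{\epsilon}\!\left(-w^{\top} r\right)
  \;=\; -\,w^{\top}\mu \;+\; \sqrt{\frac{1-\epsilon}{\epsilon}}\,\sqrt{w^{\top}\Sigma\, w}
```

The bound is tractable because it depends on the unknown distribution only through $\mu$ and $\Sigma$, which is precisely the structure that distributionally robust formulations exploit.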

    Tail Risk Hedging and Regime Switching

    In this paper, we analyze futures-based hedging strategies which minimize tail risk measured by Value-at-Risk (VaR) and Conditional-Value-at-Risk (CVaR). In particular, we first deduce general characterizations of VaR- and CVaR-minimal hedging policies from results on quantile derivatives. We then derive first-order conditions for tail-risk-minimal hedging in mixture and regime-switching (RS) models. Using cross hedging examples, we show that CVaR-minimal hedging can noticeably deviate from standard minimum-variance hedging if the return data exhibit nonelliptical features. In our examples, we find an increase in hedging amounts if RS models identify a joint crash scenario, and we confirm a reduction in tail risk using empirical and EVT-based risk estimators. These results imply that switching from minimum-variance to CVaR-minimal hedging can cut losses during financial crises and reduce capital requirements for institutional investors.
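The baseline the paper deviates from, the minimum-variance hedge ratio Cov(s, f)/Var(f), and an empirical CVaR-minimal alternative can be compared in a few lines. This sketch uses a plain grid search on historical scenarios; the paper's own results rest on quantile derivatives and regime-switching models, which are not reproduced here:

```python
import numpy as np

def cvar(losses, beta=0.95):
    """Empirical CVaR: mean loss at or beyond the beta-quantile."""
    var = np.quantile(losses, beta)
    return losses[losses >= var].mean()

def hedge_ratios(spot, fut, beta=0.95, grid=None):
    """Minimum-variance vs. empirical CVaR-minimal futures hedge ratio.

    The hedged return for ratio h is  spot - h * fut.
    """
    h_mv = np.cov(spot, fut)[0, 1] / np.var(fut, ddof=1)
    if grid is None:
        grid = np.linspace(h_mv - 1.0, h_mv + 1.0, 401)
    # loss = negative hedged return; pick the ratio minimising empirical CVaR
    cvars = [cvar(-(spot - h * fut), beta) for h in grid]
    return h_mv, grid[int(np.argmin(cvars))]
```

For jointly elliptical (e.g. Gaussian) returns the two ratios coincide up to sampling noise; the paper's point is that they separate when the joint distribution is nonelliptical, as in crash regimes.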

    Asset allocation: analysis of theory and practice in the Australian investment management industry

    Asset allocation is the decision on how much of the investment portfolio to place in each of the broad asset classes (e.g. cash, fixed interest securities, property, equities). It is a key decision area in the investment management industry, where professional investors manage pooled investments. The present research examines whether a dichotomy exists between the theory and practice of asset allocation in the Australian investment management industry; studying the two in relation to one another may lead to ways of improving both. The research identifies gaps between theory and practice and the reasons for their existence, and makes recommendations that may help reduce the gap. It surveys the available body of research on Modern Portfolio Theory, from the seminal Markowitz mean-variance formulation to subsequent research strands, and utilises a combination of qualitative and quantitative methods to examine the level of awareness and usage of asset allocation theories and theory-based methods among investment management industry practitioners.

    Survey of quantitative investment strategies in the Russian stock market: special interest in tactical asset allocation

    Russia’s financial markets have been an uncharted area when it comes to exploring the performance of investment strategies based on modern portfolio theory. In this thesis, we focus on the country’s stock market and study whether profitable investments can be made while taking uncertainties, risks, and dependencies into account. We also pay particular attention to tactical asset allocation, whose benefit is that time series forecasting methods can be used to produce trading signals in addition to optimization methods. We use two datasets in our empirical applications. The first consists of nine sectoral indices covering the period 2008–2017; the other includes 42 stocks listed on the Moscow Exchange covering the years 2011–2017. The strategies considered are divided into five parts. In the first part, we study classical and robust mean-risk portfolios and the modeling of transaction costs. We find that the expected return should be maximized per unit of expected shortfall while simultaneously requiring that each asset contributes equally to the portfolio’s tail risk. Secondly, we show that using robust covariance estimators can improve the risk-adjusted returns of minimum-variance portfolios. Thirdly, we note that robust optimization techniques are best suited to conservative investors because of the low-volatility allocations they produce. In the second part, we employ statistical factor models to estimate higher-order comoments and demonstrate the benefit of the proposed method in constructing risk-optimal and expected-utility-maximizing portfolios. In the third part, we utilize the Almgren–Chriss framework and sort the expected returns according to the assumed momentum anomaly. We discover that this method produces stable allocations that perform exceptionally well in market upturns.
In the fourth part, we show that forecasts produced by VECM and GARCH models can be used profitably in optimizations based on the Black–Litterman, copula opinion pooling, and entropy pooling models. In the final part, we develop a wealth protection strategy capable of timing market changes thanks to return predictions based on an ARIMA model. It can therefore be stated that it has been possible to make safe and profitable investments in the Russian stock market, even when reasonable transaction costs are taken into account. We also argue that market inefficiencies could have been exploited by structuring statistical arbitrage and other strategies related to tactical asset allocation.
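The robust-covariance idea from the first part can be illustrated by computing minimum-variance weights from a shrunk covariance matrix. The fixed shrinkage intensity toward a scaled identity used here is a deliberate simplification; estimators such as Ledoit–Wolf choose the intensity from the data, and the thesis's own estimator choices may differ:

```python
import numpy as np

def shrunk_min_variance(returns, shrink=0.2):
    """Minimum-variance weights from a shrinkage covariance estimate.

    returns : (T, n) array of asset returns
    shrink  : fixed intensity in [0, 1] toward a scaled identity target
    """
    S = np.cov(returns, rowvar=False)
    n = S.shape[0]
    # shrinkage target: identity scaled to the average variance
    target = np.eye(n) * np.trace(S) / n
    sigma = (1 - shrink) * S + shrink * target
    # unconstrained minimum-variance solution w ∝ Σ⁻¹ 1, normalised
    ones = np.ones(n)
    w = np.linalg.solve(sigma, ones)
    return w / w.sum()
```

At `shrink=0` this is the classical sample-covariance minimum-variance portfolio; at `shrink=1` it collapses to equal weights, so the intensity interpolates between the two extremes the thesis compares.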

    Decision Sciences, Economics, Finance, Business, Computing, and Big Data: Connections

    This paper provides a review of some of the literature connecting Decision Sciences, Economics, Finance, Business, Computing, and Big Data. We then discuss research related to these six cognate disciplines. Academics could develop theoretical models and subsequent econometric and statistical models to estimate the parameters in the associated models. They could then conduct simulations to examine whether the estimators or statistics in the new theories of estimation and hypothesis testing have small size and high power. Thereafter, academics and practitioners could apply these theories to analyze interesting problems and issues in the six disciplines and other cognate areas.