
    Using Medians in Portfolio Optimization

    Abstract: This paper formulates a number of new portfolio optimization models by adopting the sample median instead of the sample mean as the efficiency measure. The reasoning is that the median is a robust statistic, less affected by outliers than the mean. In portfolio models this is particularly relevant, as data are often characterized by skewness, fat tails and jumps that are incompatible with the normality assumption. Here, we demonstrate that median portfolio models have a greater level of diversification than mean portfolios and that, when tested on real financial data, they give better results in terms of risk calculation and actual profits.
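A minimal sketch of the robustness argument behind the median models. All numbers and the outlier mechanism are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily returns: a roughly normal body plus a few jump-like
# negative outliers, the kind of feature the abstract says breaks normality.
returns = rng.normal(0.0005, 0.01, size=1000)
returns[:5] = -0.15  # injected jump-like outliers

mean_est = returns.mean()      # dragged down by the five outliers
median_est = np.median(returns)  # barely moved by them
```

Under jump-like outliers the sample mean shifts noticeably while the sample median stays near the centre of the return distribution, which is the robustness property the paper exploits.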

    A Closed-Form Solution of the Multi-Period Portfolio Choice Problem for a Quadratic Utility Function

    In the present paper, we derive a closed-form solution of the multi-period portfolio choice problem for a quadratic utility function, with and without a riskless asset. All results are derived under weak conditions on the asset returns: no assumption on the correlation structure between different time points is needed, and no distributional assumption is imposed. All expressions are presented in terms of the conditional mean vectors and the conditional covariance matrices. If the multivariate process of the asset returns is independent, it is shown that in the case without a riskless asset the solution is a sequence of optimal portfolio weights obtained by solving the single-period Markowitz optimization problem; the process dynamics enter only through the shape parameter of the utility function. If a riskless asset is present, the multi-period optimal portfolio weights are proportional to the single-period solutions multiplied by time-varying constants that depend on the process dynamics. Remarkably, in the case of portfolio selection with the tangency portfolio, the multi-period solution coincides with the sequence of single-period solutions. Finally, we compare the suggested strategies with existing multi-period portfolio allocation methods on real data. Comment: 38 pages, 9 figures, 3 tables; changes: VAR(1)-CCC-GARCH(1,1) process dynamics and the analysis of increasing horizon are included in the simulation study; under revision in Annals of Operations Research
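In the independent case described above, each period reduces to a single-period Markowitz problem. A sketch of that single-period building block; the moments mu and Sigma, the shape parameter gamma and the riskless rate r are assumed values, not taken from the paper:

```python
import numpy as np

# Illustrative conditional moments for one period (assumed values)
mu = np.array([0.05, 0.08, 0.03])            # conditional mean vector
Sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.12, 0.03],
                  [0.01, 0.03, 0.08]])        # conditional covariance matrix
gamma = 3.0                                   # risk-aversion (shape) parameter
r = 0.01                                      # riskless rate

# Classic single-period mean-variance weights with a riskless asset:
#   w = (1/gamma) * Sigma^{-1} (mu - r*1)
# Per the abstract, the multi-period weights are such single-period
# solutions scaled by time-varying constants.
w = np.linalg.solve(Sigma, mu - r) / gamma
```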

    Combination of multivariate volatility forecasts

    This paper proposes a novel approach to the combination of conditional covariance matrix forecasts based on the Generalized Method of Moments (GMM). It is shown how the procedure can be generalized to deal with large-dimensional systems by means of a two-step strategy. The finite-sample properties of the GMM estimator of the combination weights are investigated by Monte Carlo simulations. Finally, in order to give an appraisal of the economic implications of the combined volatility predictor, the results of an application to tactical asset allocation are presented. Keywords: Multivariate GARCH, Forecast Combination, GMM, Portfolio Optimization
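The GMM machinery itself is beyond a short snippet, but the basic idea, choosing combination weights so that the combined forecast matches realized second moments, can be sketched with a drastically simplified least-squares stand-in. The data, the two candidate forecasts and the one-parameter combination are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
T, n = 500, 2
true_cov = np.array([[1.0, 0.3],
                     [0.3, 1.0]])
x = rng.multivariate_normal(np.zeros(n), true_cov, size=T)

# Two candidate covariance forecasts (purely illustrative):
F1 = np.eye(n)                # naive identity forecast
F2 = np.cov(x, rowvar=False)  # sample covariance forecast

# Realized second-moment matrix, the "target" of the moment condition
realized = x.T @ x / T

# One weight w on F1 (and 1-w on F2), chosen by projecting the realized
# matrix: a least-squares stand-in for the paper's GMM estimator.
d = (F1 - F2).ravel()
w = d @ (realized - F2).ravel() / (d @ d)
combined = w * F1 + (1 - w) * F2
```

Because F2 already tracks the realized second moments closely here, the estimated weight on the naive forecast comes out near zero, which is the behaviour one would want from a data-driven combination.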

    Measuring the Loss Aversion of a Private Investor (Měření averze ke ztrátě soukromého investora)

    Purpose of the article: This paper gives an empirical view of the behaviour of a loss-averse private investor and asks whether such an investor should invest in risky assets such as equities. The main focus is on the use of robust statistical methods and prospect theory to estimate selected characteristics of equity indices, mainly risk characteristics. The paper contains a detailed discussion of which risk metric seems suitable for a loss-averse private investor. Scientific aim of this article: The aim is to critically describe the problems related to a private investor's loss aversion and how the concept of loss aversion should be applied to investment in equities (or equity indices). The crucial problem is how to measure the loss aversion of a private investor investing in equities. Methodology/methods: Both primary and secondary research were applied. Selected scientific articles and other literature on prospect theory and risk measurement are used to support a critical analysis of how a private investor's loss aversion should be defined and measured in practice in the financial/investment area. The primary research was then carried out on selected equity indices. The chosen indices include not only a "typical" representative such as the MSCI World index but mainly derivatives of indices that track a dividend strategy (indices comprising stocks of companies that pay dividends). Findings: A loss-averse investor worries about any loss in the value of their wealth. If such investors choose to invest in stocks, they should prefer stock indices with downside risk close to zero, or those whose downside risk is lowest among all. This downside risk should be measured using below-target semivariance. The standard deviation is less appropriate as a risk measure for a loss-averse investor because it treats large positive outcomes as equally risky as large negative ones; in practice, positive outliers should be regarded as a bonus and not as a risk. Conclusions: A loss-averse investor should invest part of his or her wealth in equity indices (perhaps 15%, at most 25%). The Natural Monopoly Index 30 Infrastructure Global, which has the smallest downside risk, was chosen as the best equity index for a loss-averse investor.
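The below-target semivariance recommended in the findings is straightforward to compute. This sketch uses a made-up return series and a zero target, chosen purely for illustration:

```python
import numpy as np

def below_target_semivariance(returns, target=0.0):
    """Average squared shortfall below the target; only losses count as risk."""
    shortfall = np.minimum(returns - target, 0.0)
    return np.mean(shortfall ** 2)

r = np.array([0.02, -0.01, 0.03, -0.04, 0.01])
dsr = below_target_semivariance(r)  # only -0.01 and -0.04 contribute
var = np.var(r)                     # ordinary variance counts upside as risk too
```

Note that the variance penalizes the large positive outcomes as well, which is exactly the objection the abstract raises against the standard deviation for a loss-averse investor.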

    An Integrated Model for Liquidity Management and Short-Term Asset Allocation in Commercial Banks

    This work develops an integrated model for optimal asset allocation in commercial banks that incorporates uncertain liquidity constraints currently ignored by RAROC and EVA models. While the economic profit accounts for the opportunity cost of risky assets, and may even incorporate a market liquidity premium, it neglects the risk of failure due to the lack of sufficient funds to cope with unexpected cash demands arising from bank runs, drawdowns, or market, credit and operational losses, which may occur alongside credit rationing episodes or systemic liquidity dry-ups. Given a liquidity constraint that can incorporate these factors, there is a failure probability Pf that the constraint will not hold, resulting in a value loss for the bank, represented by a stochastic failure loss. By assuming that bankers are risk neutral in their decision about the size of the liquidity cushion, the economic profit less the possible losses due to the lack of liquidity is optimized, yielding a short-term asset allocation model that integrates market, credit and operational risks in the liquidity management of banks. Even though a general approach is suggested through simulation, I provide a closed-form solution for Pf, under some simplifying assumptions, that may be useful for research and supervision purposes as an indicator of liquidity management adequacy in the banking system. I also suggest an extreme value theory approach for estimating the failure loss, departing from other liquidity management models that use a penalty rate on the cash demand that exceeds the available liquid resources. The model was applied to data on Brazilian banks, producing gains over optimization without liquidity considerations that are robust under several tests, giving empirical indications that the model may have a relevant impact on value creation in banks.
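The failure probability Pf can be approximated by simulation, in line with the general approach the abstract suggests. This sketch assumes a lognormal distribution of unexpected cash demands and an arbitrary cushion size purely for illustration; neither the distribution nor the numbers come from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

cushion = 120.0       # liquid assets held as a cushion (illustrative units)
n_sims = 100_000

# Hypothetical unexpected cash demands (runs, drawdowns, losses),
# drawn from an assumed lognormal distribution
demand = rng.lognormal(mean=4.0, sigma=0.5, size=n_sims)

# Pf: probability that the cushion is insufficient for the realized demand
pf = np.mean(demand > cushion)
```

A larger cushion lowers Pf but raises the opportunity cost of holding liquid assets, which is the trade-off the optimization in the paper resolves.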

    Estimating Intertemporal Allocation Parameters using Synthetic Residual Estimation

    We present a novel structural estimation procedure for models of intertemporal allocation, based on modelling expectation errors directly; we refer to it as Synthetic Residual Estimation (SRE). The flexibility of SRE allows us to account for measurement error in consumption and for heterogeneity in discount factors and coefficients of relative risk aversion. An investigation of the small-sample properties of the SRE estimator indicates that it dominates GMM estimation of both exact and approximate Euler equations when panels are short and consumption data are noisy. We apply SRE to two panels drawn from the PSID and estimate the joint distribution of the discount factor and the coefficient of relative risk aversion. We strongly reject homogeneity of the discount factor and of the coefficient of relative risk aversion. We find that, on average, the more educated are more patient and more risk averse than the less educated. Within education strata, patience and risk aversion are negatively correlated.

    The Housing Market(s) of San Diego

    This paper uses an assignment model to understand the cross section of house prices within a metro area. Movers’ demand for housing is derived from a lifecycle problem with credit market frictions. Equilibrium house prices adjust to assign houses that differ by quality to movers who differ by age, income and wealth. To quantify the model, we measure distributions of house prices, house qualities and mover characteristics from micro data on San Diego County during the 2000s boom. The main result is that cheaper credit for poor households was a major driver of prices, especially at the low end of the market.
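The assignment logic, in which equilibrium matches higher-quality houses to movers with greater resources, can be illustrated with a simple positive assortative matching sketch. The data and the one-dimensional sorting rule are illustrative simplifications, not the paper's estimated lifecycle model:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10
house_quality = rng.uniform(0.0, 1.0, size=n)  # hypothetical quality index
mover_wealth = rng.uniform(0.0, 1.0, size=n)   # hypothetical mover wealth

# Positive assortative assignment: the wealthiest mover gets the
# highest-quality house, the poorest the lowest-quality one.
order_houses = np.argsort(house_quality)
order_movers = np.argsort(mover_wealth)
assignment = np.empty(n, dtype=int)
assignment[order_movers] = order_houses  # mover i -> house assignment[i]
```

In the paper the matching runs on age, income and wealth jointly, with prices adjusting so that each mover prefers their assigned quality tier; sorting on a single index is only the one-dimensional special case.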