3,784 research outputs found

    Entropy-Based Financial Asset Pricing

    We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios more simply and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model (CAPM). For asset pricing we define continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, much like the standard deviation, and that efficient portfolios lie on a hyperbola in the expected return - entropy plane. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the CAPM beta. Furthermore, we show the time-varying behaviour of beta along with that of entropy.
    Comment: 21 pages, 6 figures, 3 tables and 4 supporting files
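
    The abstract's core idea can be illustrated with a small numerical sketch. The Python snippet below is illustrative only: it assumes a histogram-based estimate of the continuous (differential) entropy of daily returns and simulated Gaussian data; the paper's exact discretisation, sample, and regression design are not reproduced. It merely shows entropy moving with risk in the same direction as the standard deviation:

        # Histogram-based differential entropy of simulated daily returns,
        # compared with standard deviation as a risk measure. The bin count
        # and the simulated data are illustrative assumptions.
        import numpy as np

        def differential_entropy(returns, bins=50):
            """Estimate H(X) = -E[log p(X)] from a histogram."""
            density, edges = np.histogram(returns, bins=bins, density=True)
            widths = np.diff(edges)
            p = density * widths                  # probability mass per bin
            nz = p > 0
            return -np.sum(p[nz] * np.log(density[nz]))

        rng = np.random.default_rng(0)
        calm     = rng.normal(0.0, 0.01, 5000)    # low-volatility security
        volatile = rng.normal(0.0, 0.03, 5000)    # high-volatility security
        for name, r in [("calm", calm), ("volatile", volatile)]:
            print(name, "std:", round(r.std(), 4),
                  "entropy:", round(differential_entropy(r), 3))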

    Information-Theoretic Estimation of Preference Parameters: Macroeconomic Applications and Simulation Evidence

    This paper investigates the behaviour of estimators based on the Kullback-Leibler information criterion (KLIC) as an alternative to the generalized method of moments (GMM). We first study the estimators in a Monte Carlo simulation model of consumption growth with power utility. We then compare KLIC and GMM estimators in macroeconomic applications in which preference parameters are estimated from aggregate data. KLIC probability measures serve as useful diagnostics. With dependent data, tests of overidentifying restrictions in the KLIC framework have size properties comparable to those of the J-test in iterated GMM, but superior size-adjusted power.
    Keywords: KLIC estimation; generalized method of moments; Monte Carlo
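
    A KLIC estimator of the kind discussed here can be sketched in a few lines. The snippet below is a minimal illustration, assuming the Kitamura-Stutzer exponential-tilting form of the KLIC estimator, a simulated consumption-growth series, a fixed discount factor delta, and a two-asset Euler-equation moment vector; none of these specific choices come from the paper itself:

        # Exponential-tilting (KLIC) estimation of the risk-aversion
        # parameter gamma in E[delta * (C_{t+1}/C_t)^(-gamma) * R - 1] = 0.
        # Simulated data, delta, and the two-asset design are assumptions.
        import numpy as np
        from scipy.optimize import minimize, minimize_scalar

        rng = np.random.default_rng(1)
        T, delta, gamma_true = 500, 0.98, 3.0
        cg = np.exp(rng.normal(0.02, 0.02, T))        # gross consumption growth
        R1 = cg**gamma_true / delta * np.exp(rng.normal(0, 0.01, T))
        R2 = cg**gamma_true / delta * np.exp(rng.normal(0, 0.02, T))

        def moments(gamma):                           # T x 2 moment matrix
            return delta * cg[:, None]**(-gamma) * np.column_stack([R1, R2]) - 1.0

        def inner(gamma):
            # Convex inner problem: min over lambda of mean exp(lambda'g_t).
            g = moments(gamma)
            return minimize(lambda lam: np.mean(np.exp(g @ lam)),
                            np.zeros(2), method="BFGS").fun

        # The KLIC estimator maximises the inner minimum over gamma.
        res = minimize_scalar(lambda gm: -inner(gm), bounds=(0.1, 10.0),
                              method="bounded")
        print("estimated gamma:", round(res.x, 2))    # close to gamma_true

    The tilted weights recovered at the optimum, proportional to exp(lambda'g_t), are the KLIC probability measures the abstract mentions as diagnostics.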

    Information Theory and Knowledge-Gathering

    It is assumed that human knowledge-building depends on a discrete, sequential decision-making process subjected to a stochastic, information-transmitting environment. This environment randomly transmits Shannon-type information packets to the decision-maker, who examines each of them for relevancy and then determines his optimal choices. Using this set of relevant information packets, the decision-maker adapts over time to the stochastic nature of his environment and optimizes the subjective expected rate of growth of knowledge. The decision-maker's optimal actions lead to a decision function that involves his view of the subjective entropy of the environmental process and other important parameters at each stage of the process. Using this model of human behavior, one could create psychometric experiments, combining computer simulation with real decision-makers playing programmed games, to measure the resulting human performance.
    Keywords: decision-making; dynamic programming; entropy; epistemology; information theory; knowledge; sequential processes; subjective probability
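
    The model lends itself to exactly the kind of computer simulation the abstract proposes. The toy sketch below runs under stated assumptions: the environment's packet law, the uniform prior pseudo-counts, and the surprisal-above-entropy relevancy rule are all invented for illustration and are not the paper's decision function:

        # Toy simulation: a decision-maker filters a stochastic stream of
        # Shannon-type packets, tracking the subjective entropy of the
        # environment and the accumulated "knowledge" in bits.
        import numpy as np

        rng = np.random.default_rng(2)
        true_p = np.array([0.5, 0.3, 0.15, 0.05])     # environment's packet law
        counts = np.ones_like(true_p)                 # uniform prior pseudo-counts

        knowledge = 0.0
        for t in range(2000):
            packet = rng.choice(4, p=true_p)
            p_hat = counts / counts.sum()             # subjective distribution
            H = -np.sum(p_hat * np.log2(p_hat))       # subjective entropy (bits)
            surprisal = -np.log2(p_hat[packet])       # information content
            if surprisal >= H:                        # invented relevancy rule
                knowledge += surprisal
            counts[packet] += 1                       # adapt to the environment

        print("subjective entropy:", round(H, 3), "bits")
        print("true entropy:", round(-np.sum(true_p * np.log2(true_p)), 3), "bits")
        print("knowledge growth rate:", round(knowledge / 2000, 4), "bits/stage")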

    Nonparametric estimation of composite functions

    We study the problem of nonparametric estimation of a multivariate function $g:\mathbb{R}^d\to\mathbb{R}$ that can be represented as a composition $g = f \circ G$ of two unknown smooth functions $f:\mathbb{R}\to\mathbb{R}$ and $G:\mathbb{R}^d\to\mathbb{R}$. We suppose that $f$ and $G$ belong to known smoothness classes of functions, with smoothness $\gamma$ and $\beta$, respectively. We obtain the full description of the minimax rates of estimation of $g$ in terms of $\gamma$ and $\beta$, and propose rate-optimal estimators for the sup-norm loss. For the construction of such estimators, we first prove an approximation result for composite functions that may be of independent interest, and then a result on adaptation to the local structure. Interestingly, the construction of rate-optimal estimators for composite functions (with given, fixed smoothness) needs adaptation, but not in the traditional sense: it is now adaptation to the local structure. We prove that composition models generate only two types of local structure: the local single-index model and the local model with roughness isolated to a single dimension (i.e., a model containing elements of both additive and single-index structure). We also find the zones of $(\gamma, \beta)$ where no local structure is generated, as well as the zones where composition modelling leads to faster rates than the classical nonparametric rates, which depend only on the overall smoothness of $g$.
    Comment: Published at http://dx.doi.org/10.1214/08-AOS611 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
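
    The gain from exploiting composite structure can be seen in a small experiment. The sketch below is illustrative only: it assumes a known (oracle) index direction for a linear $G$ and fixed bandwidths, whereas the paper's estimators adapt to an unknown local structure; it typically shows a 1-D smoother along the index beating a full d-dimensional smoother in sup-norm error:

        # Compare a full 2-D Nadaraya-Watson smoother with a 1-D smoother
        # along the (oracle) index of G, for g(x) = f(G(x)) with G linear.
        import numpy as np

        rng = np.random.default_rng(3)
        n, d = 2000, 2
        X = rng.uniform(-1, 1, (n, d))
        theta = np.array([1.0, 1.0]) / np.sqrt(2)     # index direction of G
        f = lambda u: np.sin(3 * u)                   # outer function
        y = f(X @ theta) + rng.normal(0, 0.3, n)

        def nw(xs, ys, xq, h):
            """Nadaraya-Watson regression with a Gaussian kernel."""
            d2 = ((xq[:, None, :] - xs[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2 / (2 * h * h))
            return (w @ ys) / w.sum(axis=1)

        Xq = rng.uniform(-0.8, 0.8, (500, d))         # interior query points
        g_true = f(Xq @ theta)

        err2d = np.abs(nw(X, y, Xq, 0.15) - g_true).max()
        err1d = np.abs(nw((X @ theta)[:, None], y,
                          (Xq @ theta)[:, None], 0.08) - g_true).max()
        print("sup-norm error, full 2-D smoother:", round(err2d, 3))
        print("sup-norm error, 1-D index smoother:", round(err1d, 3))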