
    On Empirical Entropy

    We propose a compression-based version of the empirical entropy of a finite string over a finite alphabet. Whereas previous approaches consider only the bare entropy of (possibly higher-order) Markov processes, we consider the sum of the description length of the random variable involved and the entropy it induces. We assume only that the distribution involved is computable. To test the new notion, we compare the Normalized Information Distance (the similarity metric) with a related measure based on Mutual Information in Shannon's framework. In this way, the similarities and differences between the two concepts are exposed. Comment: 14 pages, LaTeX
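
    As a rough illustration of the compression-based viewpoint, the sketch below computes the order-0 empirical entropy of a string and the Normalized Compression Distance (NCD), a practical stand-in for the Normalized Information Distance, using zlib as the compressor. This is a sketch of the general framework only, not the paper's specific definition.

```python
# Sketch: order-0 empirical entropy of a string and the Normalized
# Compression Distance (NCD), a practical stand-in for the Normalized
# Information Distance; zlib serves as the compressor C. This illustrates
# the general compression-based framework, not the paper's specific notion.
import math
import zlib
from collections import Counter

def empirical_entropy(s: str) -> float:
    """Order-0 empirical entropy of the string, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: lower values indicate greater similarity."""
    cx, cy, cxy = (len(zlib.compress(b)) for b in (x, y, x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

if __name__ == "__main__":
    print(empirical_entropy("abracadabra"))                    # ~2.04 bits/symbol
    print(ncd(b"to be or not to be", b"that is the question"))
```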

    Distance entropy cartography characterises centrality in complex networks

    We introduce distance entropy as a measure of homogeneity in the distribution of path lengths between a given node and its neighbours in a complex network. Distance entropy defines a new centrality measure whose properties are investigated for a variety of synthetic network models. By coupling distance entropy with closeness centrality, we introduce a network cartography that reduces the degeneracy of rankings based on closeness alone. We apply this methodology to the empirical multiplex lexical network encoding the linguistic relationships known to English-speaking toddlers. We show that the distance entropy cartography predicts how children learn words better than closeness centrality does. Our results highlight the importance of distance entropy for gaining insight from distance patterns in complex networks. Comment: 11 pages
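
    A minimal sketch of a node-level distance entropy, read directly from the abstract's description: the Shannon entropy of the distribution of shortest-path lengths from a node, normalized here by its maximum value (the paper's exact normalization may differ). The synthetic Erdős-Rényi graph stands in for the multiplex lexical network, which is not reproduced here.

```python
# Sketch of a node-level distance entropy as described in the abstract: the
# Shannon entropy of the distribution of shortest-path lengths from a node,
# normalized here by its maximum value (the paper's exact normalization may
# differ). The Erdos-Renyi graph is a placeholder for the lexical network.
import math
from collections import Counter

import networkx as nx

def distance_entropy(G: nx.Graph, node) -> float:
    """Normalized entropy of the shortest-path-length distribution of `node`."""
    lengths = nx.single_source_shortest_path_length(G, node)
    dist = Counter(d for n, d in lengths.items() if n != node)
    if len(dist) <= 1:
        return 0.0
    total = sum(dist.values())
    h = -sum((c / total) * math.log(c / total) for c in dist.values())
    return h / math.log(len(dist))

if __name__ == "__main__":
    G = nx.erdos_renyi_graph(100, 0.05, seed=1)
    G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
    # Couple distance entropy with closeness centrality to build the cartography.
    for v in list(G.nodes)[:5]:
        print(v, round(distance_entropy(G, v), 3),
              round(nx.closeness_centrality(G, v), 3))
```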

    Vibrational thermodynamics: coupling of chemical order and size effects

    The effects of chemical order on the vibrational entropy have been studied using first-principles and semi-empirical potential methods. Pseudopotential calculations on the Pd_3V system show that the vibrational entropy decreases by 0.07 k_B upon disordering in the high-temperature limit. This decrease contradicts what would be expected from simple bonding arguments, but can be explained by the influence of size effects on the vibrations. In addition, the embedded-atom method is used to study the effects of local environments on the entropic contributions of individual Ni and Al atoms in Ni_3Al. It is found that an increasing number of Al nearest neighbours decreases the vibrational entropy of an atom when relaxations are not included. When the system is relaxed, this effect disappears and the local entropy remains approximately uniform as the number of Al neighbours increases. These results are explained in terms of the large size mismatch between Ni and Al. Finally, a local cluster expansion is used to show how the relaxations increase the importance of long-range and multisite interactions.
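
    For context, the sketch below evaluates the standard harmonic vibrational entropy, S_vib/k_B = sum_i [x_i/(exp(x_i)-1) - ln(1-exp(-x_i))] with x_i = hbar*omega_i/(k_B*T), and its high-temperature limit. The mode frequencies used are placeholder values, not the first-principles phonon spectra of Pd_3V or Ni_3Al computed in the study.

```python
# Sketch of the standard harmonic vibrational entropy,
#   S_vib / k_B = sum_i [ x_i / (exp(x_i) - 1) - ln(1 - exp(-x_i)) ],
# with x_i = hbar * omega_i / (k_B * T), plus its high-temperature limit.
# The mode frequencies below are placeholder values, not the first-principles
# phonon spectra of Pd_3V or Ni_3Al used in the study.
import numpy as np

HBAR = 1.054571817e-34  # J s
KB = 1.380649e-23       # J / K

def vibrational_entropy(freqs_thz, temperature):
    """Harmonic vibrational entropy summed over the given modes, in units of k_B."""
    omega = np.asarray(freqs_thz) * 1e12 * 2.0 * np.pi  # THz -> rad/s
    x = HBAR * omega / (KB * temperature)
    return np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))

def high_t_limit(freqs_thz, temperature):
    """High-temperature limit: S / k_B -> sum_i [1 - ln(x_i)]."""
    omega = np.asarray(freqs_thz) * 1e12 * 2.0 * np.pi
    x = HBAR * omega / (KB * temperature)
    return np.sum(1.0 - np.log(x))

if __name__ == "__main__":
    ordered = [4.0, 5.5, 7.0]      # hypothetical ordered-state mode frequencies (THz)
    disordered = [3.8, 5.6, 7.3]   # hypothetical disordered-state mode frequencies (THz)
    d_s = vibrational_entropy(disordered, 1200.0) - vibrational_entropy(ordered, 1200.0)
    d_s_ht = high_t_limit(disordered, 1200.0) - high_t_limit(ordered, 1200.0)
    print(f"Delta S_vib upon disordering: {d_s:+.3f} k_B (high-T limit: {d_s_ht:+.3f} k_B)")
```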

    Entropy-Based Financial Asset Pricing

    We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios more simply and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios lie on a hyperbola in the expected return versus entropy plane. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behaviour of beta along with that of entropy. Comment: 21 pages, 6 figures, 3 tables and 4 supporting files
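
    As a hedged illustration of entropy used as a risk measure, the sketch below estimates the differential entropy of return series with a simple histogram estimator and compares it with the standard deviation for a single asset and an equally weighted portfolio. The returns are synthetic; the paper's own estimator and its 150-security, 27-year data set are not reproduced.

```python
# Sketch of entropy as a risk measure alongside the standard deviation.
# Differential entropy is estimated with a simple histogram estimator on
# synthetic Gaussian returns; the paper's own estimator and its data set of
# 150 securities over 27 years are not reproduced here.
import numpy as np

def histogram_entropy(returns, bins=50):
    """Histogram estimate of differential entropy (nats): -sum p_i * ln(p_i / width_i)."""
    counts, edges = np.histogram(returns, bins=bins)
    widths = np.diff(edges)
    probs = counts / counts.sum()
    nz = probs > 0
    return -np.sum(probs[nz] * np.log(probs[nz] / widths[nz]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    single = rng.normal(0.0005, 0.02, size=5000)                         # one security
    portfolio = rng.normal(0.0005, 0.02, size=(5000, 25)).mean(axis=1)   # 25 securities, equal weights
    for name, r in [("single security", single), ("25-security portfolio", portfolio)]:
        print(f"{name:>22}: std = {r.std():.4f}, entropy = {histogram_entropy(r):.3f} nats")
    # Both risk measures shrink as the number of independent securities grows.
```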

    A STRUCTURAL-EQUATION GME ESTIMATOR

    A generalized maximum entropy (GME) estimator is developed for the linear simultaneous-equations system model. We provide results on the large- and small-sample properties of the estimator. Empirical results illustrate the efficiency advantages of the proposed generalized maximum entropy estimator over traditional estimators (e.g., 2SLS and 3SLS).
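
    As a hedged illustration of the generalized maximum entropy idea, the sketch below estimates a single linear equation by GME: coefficients and errors are reparameterized as expectations over fixed supports, and the entropy of the support weights is maximized subject to the data and adding-up constraints. It is a toy version of the standard GME formulation for linear models, not the paper's structural-equation estimator; the supports, sample size, and simulated data are assumptions.

```python
# Minimal generalized maximum entropy (GME) sketch for a single linear
# equation: each coefficient and each error term is an expectation over a
# fixed support, and the entropy of the support weights is maximized subject
# to the data and adding-up constraints. Supports, sample size and the
# simulated data are illustrative assumptions, not the paper's
# structural-equations setup.
import numpy as np
from scipy.optimize import minimize
from scipy.special import xlogy

rng = np.random.default_rng(0)
T, K = 20, 2
X = np.column_stack([np.ones(T), rng.normal(size=T)])
y = X @ np.array([1.0, 0.5]) + rng.normal(scale=0.3, size=T)

z = np.linspace(-5.0, 5.0, 5)        # support points for each coefficient
v = np.array([-1.5, 0.0, 1.5])       # support points for each error term
M, J = len(z), len(v)

def unpack(theta):
    p = theta[:K * M].reshape(K, M)   # coefficient support weights
    w = theta[K * M:].reshape(T, J)   # error support weights
    return p, w

def neg_entropy(theta):
    p, w = unpack(np.clip(theta, 0.0, 1.0))  # guard against tiny negative steps
    return xlogy(p, p).sum() + xlogy(w, w).sum()

def data_constraint(theta):           # y_t = x_t' (Z p) + v' w_t for every t
    p, w = unpack(theta)
    return y - X @ (p @ z) - w @ v

def adding_up(theta):                 # each weight vector sums to one
    p, w = unpack(theta)
    return np.concatenate([p.sum(axis=1) - 1.0, w.sum(axis=1) - 1.0])

theta0 = np.concatenate([np.full(K * M, 1.0 / M), np.full(T * J, 1.0 / J)])
res = minimize(neg_entropy, theta0, method="SLSQP",
               bounds=[(0.0, 1.0)] * theta0.size,
               constraints=[{"type": "eq", "fun": data_constraint},
                            {"type": "eq", "fun": adding_up}],
               options={"maxiter": 500})
p_hat, _ = unpack(res.x)
print("GME coefficient estimates:", p_hat @ z)   # true simulated values were [1.0, 0.5]
```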