
    On the globalization of stock markets: An application of Vector Error Correction Model, Mutual Information and Singular Spectrum Analysis to the G7 countries

    This paper analyzes stock market relationships among the G7 countries between 1973 and 2009 using three different approaches: (i) a linear approach based on cointegration, Vector Error Correction Models (VECM) and Granger Causality; (ii) a nonlinear approach based on Mutual Information and the Global Correlation Coefficient; and (iii) a nonlinear approach based on Singular Spectrum Analysis (SSA). While the cointegration tests are based on regression models and capture linearities in the data, Mutual Information and Singular Spectrum Analysis capture nonlinear relationships in a non-parametric way. The framework of this paper is based on the notion of market integration and uses stock market correlations and linkages both in price levels and in returns. The main results show that significant co-movements occur among most of the G7 countries over the period analyzed, and that Mutual Information and the Global Correlation Coefficient actually seem to provide more information about the market relationships than the Vector Error Correction Model and Granger Causality. However, unlike the latter, the direction of causality is difficult to distinguish with Mutual Information and the Global Correlation Coefficient. In this respect, the nonlinear Singular Spectrum Analysis technique displays several advantages, since it enables us to capture nonlinear causality in both directions, whereas Granger Causality only captures causality in a linear way. The results also show that stock markets are closely linked both in terms of price levels and returns (as well as lagged returns) over the 36 years analyzed.
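The linear leg of such a comparison, a pairwise Granger-causality test between two return series, can be sketched in a minimal, self-contained way. The synthetic series, lag length and coefficients below are illustrative assumptions, not the paper's G7 data: a restricted autoregression of y on its own lag is compared against an unrestricted one that adds the lag of x, and the F-statistic for the added regressor decides the test.

```python
# Minimal lag-1 linear Granger-causality F-test on synthetic data
# (names and coefficients are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * x[t - 1] + 0.1 * rng.normal()  # lagged x drives y

def rss(X, z):
    """Residual sum of squares of an OLS fit of z on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    r = z - X @ beta
    return float(r @ r)

ones = np.ones(n - 1)
restricted = np.column_stack([ones, y[:-1]])            # y_t ~ 1 + y_{t-1}
unrestricted = np.column_stack([ones, y[:-1], x[:-1]])  # ... + x_{t-1}
rss_r = rss(restricted, y[1:])
rss_u = rss(unrestricted, y[1:])

# F-statistic for the single added regressor x_{t-1}
f_stat = (rss_r - rss_u) / (rss_u / (n - 1 - 3))
print(f_stat > 4.0)  # far above the 5% critical value: x Granger-causes y
```

In practice one would use a packaged implementation (e.g. `statsmodels.tsa.stattools.grangercausalitytests`) and select the lag order by an information criterion; the sketch only shows the mechanics of the restricted-vs-unrestricted comparison.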

    The chain rule implies Tsirelson's bound: an approach from generalized mutual information

    In order to analyze an information-theoretic derivation of Tsirelson's bound based on information causality, we introduce a generalized mutual information (GMI), defined as the optimal coding rate of a channel with classical inputs and general probabilistic outputs. In the case where the outputs are quantum, the GMI coincides with the quantum mutual information. In general, the GMI does not necessarily satisfy the chain rule. We prove that Tsirelson's bound can be derived by imposing the chain rule on the GMI. We formulate a principle, which we call the no-supersignalling condition, which states that the assistance of nonlocal correlations does not increase the capability of classical communication. We prove that this condition is equivalent to the no-signalling condition. As a result, we show that Tsirelson's bound is implied by the nonpositivity of the quantitative difference between information causality and no-supersignalling.
    Comment: 23 pages, 8 figures. Added Section 2 and Appendix B (results unchanged); added reference.
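For orientation, two standard facts the abstract leans on (stated here for context, not taken from the paper): the chain rule that the GMI may fail to satisfy, and Tsirelson's bound on the CHSH correlator, which sits between the classical value 2 and the general no-signalling value 4:

```latex
% Chain rule for mutual information (a property the GMI need not satisfy)
I(A : BC) = I(A : B) + I(A : C \mid B)

% Tsirelson's bound on the CHSH expression for quantum correlations
\left| \langle A_0 B_0 \rangle + \langle A_0 B_1 \rangle
     + \langle A_1 B_0 \rangle - \langle A_1 B_1 \rangle \right| \le 2\sqrt{2}
```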

    On the globalization of stock markets: An application of VECM, SSA technique and mutual information to the G7?

    This paper analyzes the process of stock market globalization on the basis of two different approaches: (i) a linear one, based on cointegration tests and vector error correction models (VECM); and (ii) a nonlinear approach, based on Singular Spectrum Analysis (SSA) and mutual information tests. While the cointegration tests are based on regression models and typically capture linearities in the data, mutual information and SSA are well suited to capturing global non-parametric relationships in the data without imposing any structure or restriction on the model. The data used in our empirical analysis were drawn from DataStream and comprise the natural logarithms of relative stock market indexes since 1973 for the G7 countries. The main results point to the conclusion that significant causal effects occur in this context and that mutual information and the global correlation coefficient actually provide more information on this process than VECM, although the direction of causality is difficult to distinguish in the former case. In this field, SSA shows some advantages, since it enables us to capture nonlinear causality in both directions. In all cases, however, there is evidence that stock markets are closely related in the long run over the 36 years analyzed and, in this sense, one may say that they are globalized.
    Keywords: Globalization; Market integration; VECM; Mutual information; SSA technique.
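The SSA step can be illustrated with a minimal, self-contained sketch: embed a series in a Hankel trajectory matrix, take its SVD, and reconstruct the leading component by anti-diagonal averaging. The window length and the toy trend-plus-noise series are assumptions for illustration, not the paper's settings.

```python
# Basic Singular Spectrum Analysis on a synthetic trend-plus-noise series.
import numpy as np

rng = np.random.default_rng(1)
n, window = 200, 40
t = np.arange(n)
series = 0.05 * t + rng.normal(scale=0.5, size=n)  # linear trend + noise

# Trajectory (Hankel) matrix: columns are lagged windows of the series
k = n - window + 1
X = np.column_stack([series[i:i + window] for i in range(k)])

U, s, Vt = np.linalg.svd(X, full_matrices=False)
X1 = s[0] * np.outer(U[:, 0], Vt[0])  # rank-1 leading component

# Anti-diagonal averaging maps the rank-1 matrix back to a length-n series
recon = np.zeros(n)
counts = np.zeros(n)
for i in range(window):
    for j in range(k):
        recon[i + j] += X1[i, j]
        counts[i + j] += 1
recon /= counts

# The leading component should track the underlying linear trend closely
corr = np.corrcoef(recon, 0.05 * t)[0, 1]
print(corr > 0.9)
```

Further singular components would capture oscillatory structure; grouping and comparing reconstructed components across two series is what makes an SSA-based causality analysis possible.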

    Information network modeling for U.S. banking systemic risk

    In this work we investigate whether information-theoretic measures such as mutual information and transfer entropy, extracted from a bank network, Granger-cause financial stress indexes such as the LIBOR-OIS (London Interbank Offered Rate-Overnight Index Swap) spread, the STLFSI (St. Louis Fed Financial Stress Index) and the USD/CHF (US Dollar/Swiss Franc) exchange rate. The information theory measures are extracted from a Gaussian Graphical Model constructed from daily stock time series of the top 74 listed US banks. The graphical model is calculated with a recently developed algorithm (LoGo), which provides a very fast inference model and allows us to update the graphical model each market day. We can therefore generate daily time series of mutual information and transfer entropy for each bank in the network. The Granger causality between the bank-related measures and the financial stress indexes is investigated with both standard Granger causality and partial Granger causality conditioned on control measures representative of general economic conditions.
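Under a Gaussian model, the mutual information between two nodes reduces to a closed form in their correlation, I(X;Y) = -1/2 ln(1 - rho^2), which is what makes extracting daily MI series from an updated graphical model cheap. A minimal sketch of that identity on synthetic data (the correlation value is illustrative, not taken from the bank network):

```python
# Gaussian mutual information from correlation: I(X;Y) = -0.5*ln(1 - rho^2).
import numpy as np

rng = np.random.default_rng(2)
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
sample = rng.multivariate_normal([0.0, 0.0], cov, size=50_000)

rho_hat = np.corrcoef(sample.T)[0, 1]          # sample correlation
mi_hat = -0.5 * np.log(1.0 - rho_hat**2)       # plug-in Gaussian MI (nats)
mi_true = -0.5 * np.log(1.0 - rho**2)          # analytic value, ~0.223 nats

print(abs(mi_hat - mi_true) < 0.02)  # sample estimate close to analytic MI
```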

    Beyond Normal: On the Evaluation of Mutual Information Estimators

    Mutual information is a general statistical dependency measure which has found applications in representation learning, causality, domain generalization and computational biology. However, mutual information estimators are typically evaluated on simple families of probability distributions, namely the multivariate normal distribution and selected distributions with one-dimensional random variables. In this paper, we show how to construct a diverse family of distributions with known ground-truth mutual information and propose a language-independent benchmarking platform for mutual information estimators. We discuss the general applicability and limitations of classical and neural estimators in settings involving high dimensions, sparse interactions, long-tailed distributions, and high mutual information. Finally, we provide guidelines for practitioners on how to select an estimator appropriate to the difficulty of the problem at hand, and on the issues to consider when applying an estimator to a new data set.
    Comment: Accepted at NeurIPS 2023. Code available at https://github.com/cbg-ethz/bm
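The evaluation protocol can be illustrated with a deliberately simple baseline: a plug-in (binned) estimator checked against the analytic ground truth for a bivariate normal, I = -1/2 ln(1 - rho^2). The sample size and bin count are assumptions for illustration; the paper's platform and estimators are not reproduced here. Because MI is invariant under invertible transforms of each marginal, the same ground truth survives, e.g., monotone warping of the marginals, which is one way test cases with long tails and known MI can be built.

```python
# Benchmark a simple plug-in MI estimator against a known ground truth.
import numpy as np

def binned_mi(x, y, bins=30):
    """Plug-in mutual-information estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(3)
rho = 0.8
xy = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=100_000)

mi_true = -0.5 * np.log(1 - rho**2)      # analytic ground truth, ~0.511 nats
mi_est = binned_mi(xy[:, 0], xy[:, 1])   # plug-in estimate

print(abs(mi_est - mi_true) < 0.1)
```

The binned estimator is known to degrade in high dimensions and under heavy tails, which is exactly the regime where the paper's benchmark discriminates between classical and neural estimators.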

    Information Flow in Computational Systems

    We develop a theoretical framework for defining and identifying flows of information in computational systems. Here, a computational system is assumed to be a directed graph, with "clocked" nodes that send transmissions to each other along the edges of the graph at discrete points in time. We are interested in a definition that captures the dynamic flow of information about a specific message, and which guarantees an unbroken "information path" between appropriately defined inputs and outputs in the directed graph. Prior measures, including those based on Granger Causality and Directed Information, fail to provide clear assumptions and guarantees about when they correctly reflect information flow about a message. We take a systematic approach, iterating through candidate definitions and counterexamples, to arrive at a definition of information flow that is based on conditional mutual information and satisfies desirable properties, including the existence of information paths. Finally, we describe how information flow might be detected in a noiseless setting, and provide an algorithm to identify information paths on the time-unrolled graph of a computational system.
    Comment: Significantly revised version accepted for publication in the IEEE Transactions on Information Theory.
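The role of conditional mutual information in a flow definition can be illustrated with a toy relay X -> Z -> Y (a hypothetical two-channel chain, not the paper's construction): information about X reaches Y, so I(X;Y) > 0, but conditioning on the intermediate node Z, which carries the message, gives I(X;Y|Z) = 0. The computation below is exact over a small discrete joint distribution, with no sampling.

```python
# Conditional mutual information on a Markov chain X -> Z -> Y.
from itertools import product
import numpy as np

def flip(p):
    """Channel matrix of a binary symmetric channel with flip probability p."""
    return np.array([[1 - p, p], [p, 1 - p]])

px = np.array([0.5, 0.5])
pz_x, py_z = flip(0.1), flip(0.1)

# Joint p(x, z, y) = p(x) * p(z|x) * p(y|z)
pxzy = np.zeros((2, 2, 2))
for x, z, y in product(range(2), repeat=3):
    pxzy[x, z, y] = px[x] * pz_x[z, x] * py_z[y, z]

def mi(pab):
    """Mutual information (nats) of a 2-D joint probability table."""
    pa = pab.sum(axis=1, keepdims=True)
    pb = pab.sum(axis=0, keepdims=True)
    m = pab > 0
    return float((pab[m] * np.log(pab[m] / (pa * pb)[m])).sum())

i_xy = mi(pxzy.sum(axis=1))  # marginalize out Z: information does flow

# I(X;Y|Z) = sum_z p(z) * I(X;Y | Z=z) -- zero, since Z screens X from Y
i_xy_given_z = 0.0
for z in range(2):
    pz = pxzy[:, z, :].sum()
    i_xy_given_z += pz * mi(pxzy[:, z, :] / pz)

print(i_xy > 0.2, i_xy_given_z < 1e-9)
```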

    Comments on the black hole information problem

    String theory provides numerous examples of duality between gravitational theories and unitary gauge theories. To resolve the black hole information paradox in this setting, it is necessary to better understand how unitarity is implemented on the gravity side. We argue that unitarity is restored by nonlocal effects whose initial magnitude is suppressed by the exponential of the Bekenstein-Hawking entropy. Time-slicings for which effective field theory is valid are obtained by demanding that the mutual back-reaction of quanta be small. The resulting bounds imply that nonlocal effects do not lead to observable violations of causality or conflict with the equivalence principle for infalling observers, yet they implement information retrieval for observers who stay outside the black hole.
    Comment: 18 pages, 2 figures, RevTeX. v2: figure added and some improvements to presentation.
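For orientation (standard definitions, not results of the paper): the Bekenstein-Hawking entropy of a black hole with horizon area A, and the corresponding suppression scale of the nonlocal effects the abstract refers to, are

```latex
S_{\mathrm{BH}} = \frac{A}{4 G \hbar} \quad (c = k_B = 1),
\qquad \text{nonlocal effects} \;\sim\; e^{-S_{\mathrm{BH}}}
```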