
    Two Measures of Dependence

    Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
    Comment: 40 pages; 1 figure; published in Entropy
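
    For reference, the Rényi divergence of order α underlying the first measure has the standard discrete form below; the minimization over product distributions that follows is a sketch of the kind of construction the abstract suggests, not a verbatim quotation of the paper.

        D_\alpha(P \,\|\, Q)
          = \frac{1}{\alpha - 1}
            \log \sum_{x} P(x)^{\alpha} \, Q(x)^{1 - \alpha},
        \qquad \alpha \in (0, 1) \cup (1, \infty),

        J_\alpha(X; Y)
          = \min_{Q_X,\, Q_Y}
            D_\alpha\bigl( P_{XY} \,\big\|\, Q_X \times Q_Y \bigr),

    with J_α(X;Y) recovering Shannon's mutual information I(X;Y) as α tends to one.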

    Asymptotic behavior of measures of dependence for ARMA(1,2) models with stable innovations. Stationary and non-stationary coefficients

    We derive the asymptotic behavior of two measures of dependence, the codifference and the covariation, for ARMA(1,2) models with symmetric alpha-stable innovations and non-stationary coefficients.
    Keywords: ARMA model; stable distribution; codifference; covariation
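
    Since the abstract does not restate the definitions, the following is a minimal numerical sketch of the codifference in its standard characteristic-function form, with expectations replaced by sample means; the heavy-tailed AR(1) simulation is an illustrative stand-in (Student-t innovations in place of alpha-stable ones), not data from the paper.

        import numpy as np

        def codifference(x, y):
            # Empirical codifference:
            # tau(X, Y) = ln E[exp(i(X - Y))] - ln E[exp(iX)] - ln E[exp(-iY)].
            # Sample means replace the expectations; the real part is taken
            # because the estimate is real up to sampling noise.
            e_xy = np.mean(np.exp(1j * (x - y)))
            e_x = np.mean(np.exp(1j * x))
            e_y = np.mean(np.exp(-1j * y))
            return np.real(np.log(e_xy) - np.log(e_x) - np.log(e_y))

        # Illustrative use: dependence at lag 1 for a heavy-tailed AR(1).
        rng = np.random.default_rng(0)
        n, phi = 100_000, 0.6
        eps = rng.standard_t(df=1.5, size=n)  # hypothetical stand-in innovations
        z = np.empty(n)
        z[0] = eps[0]
        for t in range(1, n):
            z[t] = phi * z[t - 1] + eps[t]
        print(codifference(z[1:], z[:-1]))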

    Estimation and comparison of signed symmetric covariation coefficient and generalized association parameter for alpha-stable dependence modeling

    Accepted for publication in Communications in Statistics - Theory and Methods. In this paper we study the estimators of two measures of dependence: the signed symmetric covariation coefficient proposed by Garel and Kodia, and the generalized association parameter put forward by Paulauskas. In the sub-Gaussian case, the signed symmetric covariation coefficient and the generalized association parameter coincide. The estimator of the signed symmetric covariation coefficient proposed here is based on fractional lower-order moments. The estimator of the generalized association parameter is based on estimation of a stable spectral measure. We investigate the relative performance of these estimators by comparing results from simulations.
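
    As a concrete illustration of the fractional lower-order moment (FLOM) approach mentioned above, the sketch below estimates the classical covariation coefficient lambda(X, Y) = E[X · Y^<p-1>] / E[|Y|^p], with y^<q> = |y|^q · sign(y) and 1 <= p < alpha; this is the standard FLOM construction, not necessarily the exact signed symmetric coefficient of Garel and Kodia, and the simulated factor data are hypothetical.

        import numpy as np

        def signed_power(y, q):
            # The signed power y^<q> = |y|^q * sign(y).
            return np.sign(y) * np.abs(y) ** q

        def covariation_coefficient(x, y, p=1.2):
            # FLOM estimator of lambda(X, Y) = E[X * Y^<p-1>] / E[|Y|^p],
            # valid for 1 <= p < alpha (the stability index).
            return np.mean(x * signed_power(y, p - 1)) / np.mean(np.abs(y) ** p)

        # Illustrative use on dependent heavy-tailed samples sharing a factor.
        rng = np.random.default_rng(1)
        n = 200_000
        common = rng.standard_t(df=1.6, size=n)  # shared heavy-tailed factor
        x = common + 0.5 * rng.standard_t(df=1.6, size=n)
        y = common + 0.5 * rng.standard_t(df=1.6, size=n)
        print(covariation_coefficient(x, y, p=1.2))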

    Conditional Rényi entropy and the relationships between Rényi capacities

    The analogues of Arimoto's definition of conditional Rényi entropy and Rényi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out some such basic properties and the relations to Rényi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin-Csiszár, and Lapidoth-Pfister, as well as the corresponding capacities, are explored.
    Comment: 17 pages, 1 figure
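
    For orientation, Arimoto's definition in the discrete case, which the paper extends to abstract alphabets, is the textbook form

        H_\alpha^{\mathrm{A}}(X \mid Y)
          = \frac{\alpha}{1 - \alpha}
            \log \sum_{y} P_Y(y)
            \left( \sum_{x} P_{X \mid Y}(x \mid y)^{\alpha} \right)^{1/\alpha},
        \qquad \alpha \in (0, 1) \cup (1, \infty),

    with the corresponding Arimoto mutual information I_α(X;Y) = H_α(X) - H_α^A(X|Y); the abstract-alphabet versions, which depend on a reference measure, are left to the paper itself.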

    Temporal structure and gain/loss asymmetry for real and artificial stock indices

    We demonstrate that the gain/loss asymmetry observed for stock indices vanishes if the temporal dependence structure is destroyed by scrambling the time series. We also show that an artificial index constructed as a simple average of a number of individual stocks displays gain/loss asymmetry; this allows us to explicitly analyze the dependence between the index constituents. We consider mutual-information- and correlation-based measures and show that the stock returns indeed have a higher degree of dependence in times of market downturns than upturns.
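
    The two checks described above can be sketched on synthetic data: randomly permuting (scrambling) an index return series preserves its marginal distribution but destroys temporal dependence, and the average pairwise correlation between constituents can be compared on index-down versus index-up days. Everything below is a simulated toy market, not the data studied in the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        n_days, n_stocks, phi = 5_000, 20, 0.3

        # Toy market factor with mild temporal structure (AR(1)).
        eps = rng.normal(size=n_days)
        factor = np.empty(n_days)
        factor[0] = eps[0]
        for t in range(1, n_days):
            factor[t] = phi * factor[t - 1] + eps[t]

        # Constituents load more heavily on the factor on downturn days,
        # mimicking stronger dependence in falling markets.
        loading = np.where(factor < 0, 1.5, 0.8)
        returns = loading[:, None] * factor[:, None] \
            + rng.normal(size=(n_days, n_stocks))
        index = returns.mean(axis=1)

        # Scrambling keeps the marginal law but kills temporal dependence.
        scrambled = rng.permutation(index)

        def lag1(x):
            # Lag-1 autocorrelation.
            return np.corrcoef(x[:-1], x[1:])[0, 1]

        print("lag-1 autocorr, original :", lag1(index))
        print("lag-1 autocorr, scrambled:", lag1(scrambled))

        # Dependence between constituents, conditional on market direction.
        def mean_pairwise_corr(r):
            c = np.corrcoef(r, rowvar=False)
            return c[~np.eye(c.shape[0], dtype=bool)].mean()

        print("mean pairwise corr, down days:", mean_pairwise_corr(returns[index < 0]))
        print("mean pairwise corr, up days  :", mean_pairwise_corr(returns[index >= 0]))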