    Distance entropy cartography characterises centrality in complex networks

    We introduce distance entropy as a measure of the homogeneity of the distribution of path lengths between a given node and its neighbours in a complex network. Distance entropy defines a new centrality measure whose properties are investigated for a variety of synthetic network models. By coupling distance entropy information with closeness centrality, we introduce a network cartography which allows one to reduce the degeneracy of rankings based on closeness alone. We apply this methodology to the empirical multiplex lexical network encoding the linguistic relationships known to English-speaking toddlers. We show that the distance entropy cartography better predicts how children learn words compared to closeness centrality. Our results highlight the importance of distance entropy for gaining insights from distance patterns in complex networks. Comment: 11 pages
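
    As an illustration of the idea, here is a minimal sketch of distance entropy in Python using networkx; the entropy convention (natural log, self-distance excluded) and the tie-breaking of closeness by distance entropy are our assumptions, not the authors' reference implementation:

        import math
        from collections import Counter
        import networkx as nx

        def distance_entropy(G, node):
            # Shannon entropy of the shortest-path-length distribution from `node`:
            # a homogeneous spread of distances scores high, a concentrated one low.
            lengths = nx.single_source_shortest_path_length(G, node)
            del lengths[node]                   # drop the zero self-distance
            counts = Counter(lengths.values())  # number of nodes at each distance
            total = sum(counts.values())
            return -sum((c / total) * math.log(c / total) for c in counts.values())

        # Cartography-style ranking: closeness first, distance entropy to break
        # the degeneracy among nodes with (near-)equal closeness.
        G = nx.erdos_renyi_graph(200, 0.05, seed=1)
        closeness = nx.closeness_centrality(G)
        ranking = sorted(G, key=lambda v: (closeness[v], distance_entropy(G, v)),
                         reverse=True)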

    The entropy distance between the Wiener and stationary Gaussian measures

    Investigating the entropy distance between the Wiener measure $W_{t_0,\tau}$ and stationary Gaussian measures $Q_{t_0,\tau}$ on the space of continuous functions $C[t_0-\tau, t_0+\tau]$, we show that in some cases this distance can essentially be computed. This is done by explicitly computing a related quantity which is in effect a valid approximation of the entropy distance, provided it is sufficiently small; this will be the case if $\tau/t_0$ is small. We prove that $H(W_{t_0,\tau}, Q_{t_0,\tau}) > \tau/(2t_0)$, and then show that $\tau/(2t_0)$ is essentially the typical size of this entropy distance, provided the mean and the variance of the stationary measures are set "appropriately". Using a similar technique, we estimate the entropy distance between the Ornstein-Uhlenbeck measure and other stationary Gaussian measures on $C[1-\tau, 1+\tau]$. Combining this result with a variant of the triangle inequality for the entropy distance, which we devise, yields an upper bound on the entropy distance between stationary Gaussian measures which are absolutely continuous with respect to the Wiener measure
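
    For orientation: "entropy distance" in this line of work is commonly the symmetrized relative entropy (Jeffreys divergence); assuming that convention here, the quantity being bounded is

        H(P, Q) \;=\; \int \log\frac{dP}{dQ}\, dP \;+\; \int \log\frac{dQ}{dP}\, dQ
               \;=\; D(P \,\|\, Q) + D(Q \,\|\, P),

    which is finite only when $P$ and $Q$ are mutually absolutely continuous, consistent with the characterization quoted in the fourth abstract below.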

    Entropy Distance: New Quantum Phenomena

    We study a curve of Gibbsian families of complex 3×3 matrices and point out new features, absent in commutative finite-dimensional algebras: a discontinuous maximum-entropy inference, a discontinuous entropy distance and non-exposed faces of the mean value set. We analyze these problems from various aspects, including convex geometry, topology and information geometry. This research is motivated by a theory of info-max principles, to which we contribute by computing first-order optimality conditions of the entropy distance. Comment: 34 pages, 5 figures
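
    The inference map in question can be made concrete in a small sketch. Below is a minimal numerical illustration of maximum-entropy inference for a single 2×2 observable (a toy commutative setting, not the 3×3 Gibbsian families where the paper locates the discontinuity); the observable and the bisection inversion are our choices:

        import numpy as np
        from scipy.linalg import expm, logm

        # Maximum-entropy inference: among density matrices with prescribed mean
        # value tr(rho A) = m, the entropy maximizer is the Gibbs state
        # rho(theta) = exp(theta A) / tr exp(theta A).
        A = np.diag([1.0, -1.0])  # illustrative observable (Pauli-Z)

        def gibbs(theta):
            rho = expm(theta * A)
            return rho / np.trace(rho)

        def mean_value(theta):
            return np.trace(gibbs(theta) @ A).real

        def max_entropy_state(m, lo=-50.0, hi=50.0):
            # Invert m -> theta by bisection; mean_value is monotone in theta.
            for _ in range(200):
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if mean_value(mid) < m else (lo, mid)
            return gibbs(0.5 * (lo + hi))

        rho = max_entropy_state(0.3)
        S = -np.trace(rho @ logm(rho)).real  # von Neumann entropy, in nats

    In this one-observable case the map $m \mapsto \rho(m)$ is continuous; the paper's point is that for suitable curves of 3×3 families it fails to be.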

    Absolute continuity between the Wiener and stationary Gaussian measures

    It is known that the entropy distance between two Gaussian measures is finite if, and only if, they are absolutely continuous with respect to one another. Shepp (1966) characterized the correlations corresponding to stationary Gaussian measures that are absolutely continuous with respect to the Wiener measure. By analyzing the entropy distance, we show that one of his conditions, involving the spectrum of an associated operator, is essentially extraneous, providing a simple criterion for finite entropy distance in this case

    Continuity bounds on the quantum relative entropy

    The quantum relative entropy is frequently used as a distance, or distinguishability measure, between two quantum states. In this paper we study the relation between this measure and a number of other measures used for that purpose, including the trace norm distance. More precisely, we derive lower and upper bounds on the relative entropy in terms of various distance measures for the difference of the states based on unitarily invariant norms. The upper bounds can be considered statements of continuity of the relative entropy distance in the sense of Fannes. We employ methods from optimisation theory to obtain bounds that are as sharp as possible. Comment: 13 pages (ReVTeX), 3 figures, replaced with published version
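
    As a numerical companion (our construction, with the natural-log convention and full-rank random states), the sketch below evaluates the relative entropy and the trace-norm distance and checks the quantum Pinsker inequality $S(\rho\|\sigma) \ge \frac{1}{2}\|\rho-\sigma\|_1^2$, one standard lower bound of the kind studied there:

        import numpy as np
        from scipy.linalg import logm

        def rand_state(d, rng):
            # Random full-rank density matrix (Wishart-type construction).
            X = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
            rho = X @ X.conj().T
            return rho / np.trace(rho).real

        def rel_entropy(rho, sigma):
            # Umegaki relative entropy tr rho (log rho - log sigma), in nats.
            return np.trace(rho @ (logm(rho) - logm(sigma))).real

        def trace_norm_distance(rho, sigma):
            return np.abs(np.linalg.eigvalsh(rho - sigma)).sum()  # ||rho - sigma||_1

        rng = np.random.default_rng(0)
        rho, sigma = rand_state(3, rng), rand_state(3, rng)
        assert rel_entropy(rho, sigma) >= 0.5 * trace_norm_distance(rho, sigma) ** 2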

    On Variational Expressions for Quantum Relative Entropies

    Distance measures between quantum states, like the trace distance and the fidelity, can naturally be defined by optimizing a classical distance measure over all measurement statistics that can be obtained from the respective quantum states. In contrast, Petz showed that the measured relative entropy, defined as a maximization of the Kullback-Leibler divergence over projective measurement statistics, is strictly smaller than Umegaki's quantum relative entropy whenever the states do not commute. We extend this result in two ways. First, we show that Petz's conclusion remains true if we allow general positive operator valued measures. Second, we extend the result to Rényi relative entropies and show that for non-commuting states the sandwiched Rényi relative entropy is strictly larger than the measured Rényi relative entropy for $\alpha \in (\frac{1}{2}, \infty)$, and strictly smaller for $\alpha \in [0, \frac{1}{2})$. The latter statement provides counterexamples to the data-processing inequality for the sandwiched Rényi relative entropy for $\alpha < \frac{1}{2}$. Our main tool is a new variational expression for the measured Rényi relative entropy, which we further exploit to show that certain lower bounds on quantum conditional mutual information are superadditive. Comment: v2: final published version
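
    The maximization over measurements is what the paper's variational expression handles; the sketch below (our example states and measurement, not the paper's machinery) evaluates a single projective measurement, the eigenbasis of $\sigma$. By data processing its classical divergence already sits below Umegaki's quantity, strictly so here since the states do not commute:

        import numpy as np
        from scipy.linalg import logm

        def kl(p, q):
            # Classical Kullback-Leibler divergence, in nats.
            mask = p > 0
            return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

        def umegaki(rho, sigma):
            return np.trace(rho @ (logm(rho) - logm(sigma))).real

        # Two non-commuting qubit states (full rank).
        rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
        sigma = np.array([[0.5, -0.1j], [0.1j, 0.5]], dtype=complex)

        # Measure both states in the eigenbasis of sigma.
        _, U = np.linalg.eigh(sigma)
        p = np.real(np.diag(U.conj().T @ rho @ U))
        q = np.real(np.diag(U.conj().T @ sigma @ U))

        assert kl(p, q) < umegaki(rho, sigma)  # strict for non-commuting states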