
    Relative entropy as a measure of inhomogeneity in general relativity

    We introduce the notion of relative volume entropy for two spacetimes with preferred compact spacelike foliations. This is accomplished by applying the notion of Kullback-Leibler divergence to the volume elements induced on spacelike slices. The resulting quantity gives a lower bound on the number of bits which are necessary to describe one metric given the other. For illustration, we study some examples, in particular gravitational waves, and conclude that the relative volume entropy is a suitable device for quantitative comparison of the inhomogeneity of two spacetimes.

    Comment: 15 pages, 7 figures
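The Kullback-Leibler divergence at the core of the construction can be sketched numerically. This is a minimal illustration, not the paper's method: the two distributions below are hypothetical stand-ins for normalized volume elements sampled on spacelike slices of two spacetimes.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits for discrete
    distributions; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical normalized volume elements on two spacelike slices
vol_homogeneous = [0.25, 0.25, 0.25, 0.25]
vol_inhomogeneous = [0.40, 0.30, 0.20, 0.10]

# The divergence lower-bounds the bits needed to describe one
# volume distribution given the other.
print(kl_divergence(vol_inhomogeneous, vol_homogeneous))
```

Comparing a slice against itself gives zero, so the quantity indeed vanishes exactly when the two volume distributions coincide.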

    Entropy as a Measure of Quality of XML Schema Document

    In this paper, a metric for assessing the structural complexity of eXtensible Markup Language (XML) schema documents is formulated. The proposed metric, ‘Schema Entropy’ (SE), is based on the concept of entropy and is intended to measure the complexity of schema documents written in the W3C XML Schema Language arising from the diversity in the structures of their elements. The SE is useful in evaluating the efficiency of schema designs. A good design reduces maintenance effort; therefore, our metric provides valuable information about the reliability and maintainability of systems. In this respect, the metric is believed to be a valuable contribution toward improving the quality of XML-based systems. It is demonstrated with examples and validated empirically through actual test cases.
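The abstract does not give the Schema Entropy formula, but the underlying idea of measuring structural diversity with entropy can be illustrated by a plain Shannon entropy over the frequencies of distinct element structures. The labels below are hypothetical shorthand for structural shapes of element declarations, not the paper's actual encoding.

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (bits) of the empirical distribution of items."""
    counts = Counter(items)
    n = len(items)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Hypothetical structural shapes of element declarations in a schema
elements = ["sequence", "choice", "sequence", "all", "sequence", "choice"]
print(shannon_entropy(elements))
```

A schema whose elements all share one structure scores zero, while greater structural diversity pushes the entropy up, matching the intuition that diversity drives complexity.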

    Increment entropy as a measure of complexity for time series

    Entropy is a common index for quantifying the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series: each increment is mapped to a word of two letters, one corresponding to the direction of the increment and the other to its magnitude. The Shannon entropy of these words is termed the increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals demonstrate its ability to detect abrupt changes, whether energetic (e.g., spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, so it is applicable to arbitrary real-world data.

    Comment: 12 pages, 7 figures, 2 tables
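The IncrEn construction described above can be sketched directly: map each increment to a (direction, magnitude) word and take the Shannon entropy of the words. The quantization resolution `R` and the normalization by the increments' standard deviation are assumptions of this sketch; the paper's exact quantization scheme may differ.

```python
import math
from collections import Counter

def increment_entropy(x, R=2):
    """IncrEn sketch: each increment becomes a two-letter word
    (sign, magnitude quantized to 0..R); return the Shannon
    entropy of the resulting words, in bits."""
    incs = [b - a for a, b in zip(x, x[1:])]
    # Normalize magnitudes by the RMS of the increments (assumption);
    # fall back to 1.0 for a constant series to avoid division by zero.
    rms = (sum(d * d for d in incs) / len(incs)) ** 0.5 or 1.0
    words = [(1 if d > 0 else -1 if d < 0 else 0,
              min(int(abs(d) * R / rms), R)) for d in incs]
    counts = Counter(words)
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A constant series yields zero entropy, since every increment maps to the same word, while a series with varied increments yields a positive value.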

    On Classical Analogues of Free Entropy Dimension

    We define a classical probability analogue of Voiculescu's free entropy dimension, which we call the classical probability entropy dimension of a probability measure on \mathbb{R}^n. We show that the classical probability entropy dimension of a measure is related to several other notions of dimension. First, it can be viewed as a kind of fractal dimension. Second, if one extends Bochner's inequalities to a measure by requiring that microstates around the measure asymptotically satisfy the classical Bochner inequalities, then the classical probability entropy dimension controls the rate of growth of the optimal constants in Bochner's inequality for the measure regularized by convolution with the Gaussian law, as the regularization is removed. We also introduce a free analogue of the Bochner inequality and study the related free entropy dimension quantity, showing that it is greater than or equal to the non-microstates free entropy dimension.