    A Note on the Shannon Entropy of Short Sequences

    For source sequences of length L symbols, we propose a more realistic value for the usual benchmark of the number of code letters per source letter. Our idea is based on a quantifier of the information fluctuation of a source, F(U), which corresponds to the second central moment of the random variable that measures the information content of a source symbol. This approach additionally provides an alternative interpretation of typical sequences.
    Comment: 3 figures
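The abstract defines F(U) as the second central moment of the per-symbol information content, i.e. the variance of -log2 p(x). A minimal sketch of that quantity (the function name and example distribution are illustrative, not from the paper):

```python
import math

def entropy_and_fluctuation(p):
    """Shannon entropy H(U) and information fluctuation F(U) of a
    distribution p, where F(U) is the second central moment of the
    per-symbol information content -log2 p(x)."""
    h = sum(-pi * math.log2(pi) for pi in p if pi > 0)  # mean information
    f = sum(pi * (-math.log2(pi) - h) ** 2 for pi in p if pi > 0)
    return h, f

# A uniform source has zero fluctuation: every symbol carries the same
# information, so the variance of -log2 p(x) vanishes.
print(entropy_and_fluctuation([0.25] * 4))  # → (2.0, 0.0)
```

For a biased source such as p = (0.9, 0.1), F(U) is strictly positive, which is the regime where the paper argues the usual per-letter benchmark is too optimistic for short sequences.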

    Collision Entropy Estimation in a One-Line Formula

    We address the open question of how best to estimate the collision entropy, also called the quadratic or second-order Rényi entropy. Integer-order Rényi entropies are synthetic indices useful for characterizing probability distributions. In recent decades, numerous studies have sought valid estimates of them from experimental data, so as to derive suitable classification methods for the underlying processes, but optimal solutions have not yet been reached. For the estimation of collision entropy specifically, a one-line formula is presented here. The results of specific Monte Carlo experiments give evidence of the validity of this estimator even for very low densities of data spread in high-dimensional sample spaces. The method's strengths are unbiased consistency, generality, and minimal computational cost.
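The collision entropy is H2 = -log2 of the collision probability, the chance that two independent draws coincide. A standard sample-based sketch of this idea uses the textbook unbiased estimator of the collision probability, Σ n_i(n_i - 1) / (N(N - 1)); this is an illustration of the quantity being estimated, not necessarily the one-line formula of the paper:

```python
import math
import random
from collections import Counter

def collision_entropy_estimate(samples):
    """Estimate the collision (order-2 Rényi) entropy, in bits, from
    i.i.d. samples. The collision probability is estimated without bias
    by sum n_i*(n_i - 1) / (N*(N - 1)) over symbol counts n_i."""
    n = len(samples)
    counts = Counter(samples)
    p_coll = sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
    return -math.log2(p_coll)

# Example: samples from a fair four-sided die; the true H2 is 2 bits,
# and the estimate approaches it as the sample size grows.
random.seed(0)
data = [random.randrange(4) for _ in range(100_000)]
print(collision_entropy_estimate(data))  # close to 2.0
```

Note that taking -log2 of an unbiased collision-probability estimate is not itself unbiased for H2; the paper's contribution concerns exactly this kind of estimation gap.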