
    Bounds on mutual information of mixture data for classification tasks

    The data for many classification problems, such as pattern and speech recognition, follow mixture distributions. To quantify the optimum performance for classification tasks, the Shannon mutual information is a natural information-theoretic metric, as it is directly related to the probability of error. The mutual information between mixture data and the class label has neither an analytical expression nor any efficient computational algorithms. We introduce a variational upper bound, a lower bound, and three estimators, all employing pairwise divergences between mixture components. We compare the new bounds and estimators with Monte Carlo stochastic sampling and bounds derived from entropy bounds. Finally, we evaluate the performance of the bounds and estimators through numerical simulations.
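
    As a rough illustration of the quantity being bounded (not the paper's specific construction), the sketch below estimates I(X; Y) for an assumed two-component univariate Gaussian mixture by Monte Carlo sampling of the mixture entropy h(X), using the closed-form entropy of the Gaussian components for h(X|Y); all parameter values and the mixture_pdf helper are illustrative assumptions.

        # Minimal sketch: Monte Carlo estimate of I(X; Y) for Gaussian-mixture
        # data X with class label Y.  Parameters below are assumed, not from
        # the paper.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        priors = np.array([0.4, 0.6])   # assumed class priors p(y)
        means  = np.array([-1.0, 2.0])  # assumed component means
        stds   = np.array([1.0, 0.5])   # assumed component standard deviations

        n = 100_000
        labels  = rng.choice(len(priors), size=n, p=priors)
        samples = rng.normal(means[labels], stds[labels])

        # Mixture density p(x) = sum_y p(y) p(x | y).
        def mixture_pdf(x):
            return sum(p * norm.pdf(x, m, s) for p, m, s in zip(priors, means, stds))

        # h(X): Monte Carlo estimate of the mixture's differential entropy.
        h_x = -np.mean(np.log(mixture_pdf(samples)))

        # h(X | Y): closed form for Gaussian components, 0.5 * log(2*pi*e*sigma^2).
        h_x_given_y = np.sum(priors * 0.5 * np.log(2 * np.pi * np.e * stds**2))

        # I(X; Y) = h(X) - h(X | Y), in nats.
        print("I(X; Y) estimate:", h_x - h_x_given_y)

    The plain Monte Carlo estimate above is exactly the kind of sampling baseline the new bounds and estimators are compared against: it converges, but needs many samples and gives no guaranteed upper or lower bound.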

    MaxEnt Upper Bounds for the Differential Entropy of Univariate Continuous Distributions

    We present a series of closed-form upper bounds on the differential entropy of univariate continuous distributions based on the maximum entropy principle. We apply those bounds to Gaussian mixture models and study their tightness properties.
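
    For context, one classical MaxEnt bound of this kind (a single instance, not the paper's full series) follows from the fact that, for a fixed variance, the Gaussian maximizes differential entropy, so h(X) <= 0.5 * log(2*pi*e*Var[X]). The sketch below applies this bound to an assumed two-component Gaussian mixture.

        # Minimal sketch: Gaussian (second-moment) MaxEnt upper bound on the
        # differential entropy of a Gaussian mixture.  Parameters are assumed.
        import numpy as np

        priors = np.array([0.4, 0.6])   # assumed mixture weights
        means  = np.array([-1.0, 2.0])  # assumed component means
        stds   = np.array([1.0, 0.5])   # assumed component standard deviations

        # First and second moments of the mixture.
        mix_mean = np.sum(priors * means)
        mix_var  = np.sum(priors * (stds**2 + means**2)) - mix_mean**2

        # MaxEnt upper bound: h(X) <= 0.5 * log(2*pi*e*Var[X]), in nats.
        maxent_bound = 0.5 * np.log(2 * np.pi * np.e * mix_var)
        print("MaxEnt upper bound on h(X):", maxent_bound)

    Bounds of this form are attractive because they are closed-form in the mixture moments; their tightness depends on how far the mixture is from the maximum-entropy reference distribution.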
