Conditional Information Inequalities for Entropic and Almost Entropic Points

Abstract

We study conditional linear information inequalities, i.e., linear inequalities for Shannon entropy that hold for distributions whose entropies satisfy certain linear constraints. We prove that some conditional information inequalities cannot be extended to any unconditional linear inequality. Some of these conditional inequalities hold for almost entropic points, while others do not. We also discuss counterparts of conditional information inequalities for Kolmogorov complexity.

Comment: Submitted to the IEEE Transactions on Information Theory
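To illustrate the notion, a classical example of a conditional information inequality (due to Zhang and Yeung) can be stated as follows; here I(A:B) and I(A:B|C) denote the mutual and conditional mutual information of the corresponding random variables:

```latex
% If two linear constraints on the entropy profile hold, namely
%   I(A:B) = 0   and   I(A:B|C) = 0,
% then the following linear inequality is valid:
\[
  I(A{:}B) = I(A{:}B|C) = 0
  \;\Longrightarrow\;
  I(C{:}D) \le I(C{:}D|A) + I(C{:}D|B).
\]
```

The paper's question is whether such a conditional inequality is merely a shadow of some unconditional linear inequality (e.g., obtained by adding multiples of the constraint terms to the right-hand side), or is essentially conditional.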
