Conditional Information Inequalities for Entropic and Almost Entropic Points
We study conditional linear information inequalities, i.e., linear
inequalities for Shannon entropy that hold for distributions whose entropies
meet some linear constraints. We prove that some conditional information
inequalities cannot be extended to any unconditional linear inequalities. Some
of these conditional inequalities hold for almost entropic points, while others
do not. We also discuss some counterparts of conditional information
inequalities for Kolmogorov complexity.

Comment: Submitted to the IEEE Transactions on Information Theory
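To illustrate the kind of statement in question, a well-known conditional information inequality (due to Zhang and Yeung) is shown below; it is given here only as a representative example of the class of inequalities the abstract refers to, not as a result claimed by this paper:

```latex
% For jointly distributed random variables A, B, C, D:
% if the constraints
%   I(A;B) = 0   and   I(A;B \mid C) = 0
% hold, then the conditional linear inequality
%   I(C;D) \le I(C;D \mid A) + I(C;D \mid B)
% is satisfied, although the unconditional inequality
%   I(C;D) \le I(C;D \mid A) + I(C;D \mid B)
% fails for some distributions.
\[
  I(A;B) = I(A;B \mid C) = 0
  \;\Longrightarrow\;
  I(C;D) \le I(C;D \mid A) + I(C;D \mid B).
\]
```

Inequalities of this conditional form are exactly those for which one asks whether they can be derived from (extended to) unconditional linear inequalities for Shannon entropy.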