Abstract

Information plays an important role in our understanding of the physical world. We therefore propose an entropic measure of information for any physical theory that admits systems, states and measurements. In the quantum and classical world, our measure reduces to the von Neumann and Shannon entropy, respectively. It can even be used in a quantum or classical setting where we are only allowed to perform a limited set of operations. In a world that admits superstrong correlations in the form of non-local boxes, our measure can be used to analyze protocols such as superstrong random access encodings and the violation of `information causality'. However, we also show that in such a world no entropic measure can exhibit all the properties we commonly accept in a quantum setting. For example, there exists no `reasonable' measure of conditional entropy that is subadditive. Finally, we prove a coding theorem for some theories that is analogous to the quantum and classical setting, providing us with an appealing operational interpretation.

Comment: 20 pages, revtex, 7 figures, v2: Coding theorem revised, published version
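
For reference, the two quantities the abstract says the proposed measure reduces to are the standard von Neumann and Shannon entropies; the notation below is the usual textbook convention and is not taken from the paper itself (a density operator \rho for a quantum state, a distribution p(x) for a classical random variable X):

    \begin{align*}
      S(\rho) &= -\operatorname{Tr}\!\bigl(\rho \log \rho\bigr)   && \text{(von Neumann entropy)} \\
      H(X)    &= -\sum_{x} p(x) \log p(x)                         && \text{(Shannon entropy)}
    \end{align*}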
