We present information-theoretic definitions and results for analyzing
symmetric-key encryption schemes beyond the perfect secrecy regime, i.e., when
perfect secrecy is not attained. We adopt two lines of analysis, one based on
lossless source coding, and another akin to rate-distortion theory. We start by
presenting a new information-theoretic metric for security, called symbol
secrecy, and derive associated fundamental bounds. We then introduce
list-source codes (LSCs), a general framework for mapping a key length
(entropy) to the size of the list that an eavesdropper must resolve in order to
recover a secret message. We provide explicit constructions of LSCs, and
demonstrate that, when the source is uniformly distributed, the highest level
of symbol secrecy for a fixed key length can be achieved through a construction
based on maximum distance separable (MDS) codes. Using an analysis related to
rate-distortion theory, we then show how symbol secrecy can be used to
determine the probability that an eavesdropper correctly reconstructs functions
of the original plaintext. We illustrate how these bounds can be applied to
characterize security properties of symmetric-key encryption schemes, and, in
particular, extend security claims based on symbol secrecy to a functional
setting.

Comment: Submitted to IEEE Transactions on Information Theory