List Decoding Tensor Products and Interleaved Codes
We design the first efficient algorithms and prove new combinatorial bounds
for list decoding tensor products of codes and interleaved codes. We show that
for {\em every} code, the ratio of its list decoding radius to its minimum
distance stays unchanged under the tensor product operation (rather than
squaring, as one might expect). This gives the first efficient list decoders
and new combinatorial bounds for some natural codes including multivariate
polynomials where the degree in each variable is bounded. We show that for {\em
every} code, its list decoding radius remains unchanged under $m$-wise
interleaving for any integer $m$. This generalizes a recent result of Dinur et
al.\ \cite{DGKS}, who proved such a result for interleaved Hadamard codes
(equivalently, linear transformations). Using the notion of generalized Hamming
weights, we give better list size bounds for {\em both} tensoring and
interleaving of binary linear codes. By analyzing the weight distribution of
these codes, we reduce the task of bounding the list size to bounding the
number of close-by low-rank codewords. For decoding linear transformations,
using rank-reduction together with other ideas, we obtain list size bounds that
are tight over small fields.

Comment: 32 pages
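The abstract above concerns the tensor product of codes, whose codewords are matrices in which every row and every column is itself a codeword of the component codes. A toy sketch (my own illustration, not from the paper; `tensor_encode` is a hypothetical name) using the binary $[3,2,2]$ parity-check code shows this row/column structure:

```python
import numpy as np

# Toy illustration: the tensor product C (x) C of the [3,2,2] binary
# parity-check code. A codeword of the tensor code is a 3x3 matrix whose
# rows and columns all lie in C, i.e. all have even parity.
G = np.array([[1, 0, 1],
              [0, 1, 1]])  # generator matrix of the [3,2,2] parity code

def tensor_encode(A):
    """Encode a 2x2 message matrix A into the tensor code C (x) C."""
    return (G.T @ A @ G) % 2

A = np.array([[1, 0],
              [1, 1]])
M = tensor_encode(A)
print(M)  # -> [[1 0 1], [1 1 0], [0 1 1]]
# every row and every column of M is a codeword of C (even parity)
assert all(int(row.sum()) % 2 == 0 for row in M)
assert all(int(col.sum()) % 2 == 0 for col in M.T)
```

The minimum distance of the tensor code is the product $d_1 d_2$ of the component distances, which is why one might naively expect the list decoding radius (as a fraction of distance) to square; the paper shows the ratio in fact stays unchanged.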
Some Applications of Coding Theory in Computational Complexity
Error-correcting codes and related combinatorial constructs play an important
role in several recent (and old) results in computational complexity theory. In
this paper we survey results on locally-testable and locally-decodable
error-correcting codes, and their applications to complexity theory and to
cryptography.
Locally decodable codes are error-correcting codes with sub-linear time
error-correcting algorithms. They are related to private information retrieval
(a type of cryptographic protocol), and they are used in average-case
complexity and to construct ``hard-core predicates'' for one-way permutations.
Locally testable codes are error-correcting codes with sub-linear time
error-detection algorithms, and they are the combinatorial core of
probabilistically checkable proofs.
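The canonical example of a locally decodable code mentioned in such surveys is the Hadamard code, where any message bit can be recovered from a corrupted codeword with just two queries. A minimal sketch (my own, not taken from the survey; `local_decode` is a hypothetical name):

```python
import random

# Sketch of local decoding for the Hadamard code. The encoding of an n-bit
# message x lists the inner product <x, a> mod 2 for every a in {0,1}^n;
# a single message bit can be recovered with only two codeword queries.
n = 4
x = [1, 0, 1, 1]

def inner(a, b):
    return sum(ai * bi for ai, bi in zip(a, b)) % 2

# full codeword: one parity bit per a in {0,1}^n (2^n bits total)
codeword = {}
for v in range(2 ** n):
    a = tuple((v >> j) & 1 for j in range(n))
    codeword[a] = inner(a, x)

# adversarially flip two of the 16 positions (a 1/8 fraction, below the
# 1/4 fraction that two-query decoding tolerates)
for a in [(0, 0, 0, 0), (1, 0, 0, 0)]:
    codeword[a] ^= 1

def local_decode(i):
    """Recover x_i with two queries: <x, a> xor <x, a + e_i> = x_i."""
    a = tuple(random.randint(0, 1) for _ in range(n))
    a_flip = tuple(bit ^ (j == i) for j, bit in enumerate(a))
    return codeword[a] ^ codeword[a_flip]

# each trial errs with probability at most 1/4; a majority vote over many
# independent trials recovers x[2] = 1 with overwhelming probability
votes = [local_decode(2) for _ in range(255)]
print(sum(votes) > len(votes) // 2)
```

Note the sub-linear (here constant) query complexity: the decoder reads two of the $2^n$ codeword bits, which is exactly the property that makes such codes useful for private information retrieval and hard-core predicates.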
Lists that are smaller than their parts: A coding approach to tunable secrecy
We present a new information-theoretic definition and associated results,
based on list decoding in a source coding setting. We begin by presenting
list-source codes, which naturally map a key length (entropy) to list size. We
then show that such codes can be analyzed in the context of a novel
information-theoretic metric, \epsilon-symbol secrecy, that encompasses both
the one-time pad and traditional rate-based asymptotic metrics, but, like most
cryptographic constructs, can be applied in non-asymptotic settings. We derive
fundamental bounds for \epsilon-symbol secrecy and demonstrate how these bounds
can be achieved with MDS codes when the source is uniformly distributed. We
discuss applications and implementation issues of our codes.

Comment: Allerton 2012, 8 pages
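A toy illustration of the MDS property the abstract relies on (my own sketch, not from the paper; `encode` and the chosen field are hypothetical): in a Reed-Solomon code over GF(7) with dimension $k = 2$, any single observed codeword symbol is consistent with exactly $q^{k-1}$ uniformly likely messages, so an eavesdropper seeing that symbol is left with a uniform list rather than the source itself.

```python
# Toy MDS (Reed-Solomon) code over GF(7) with k = 2: any one codeword
# symbol leaves an eavesdropper with a uniform list of q^(k-1) = 7
# candidate messages, the mechanism behind symbol-secrecy guarantees.
q, k = 7, 2
points = [1, 2, 3, 4]  # distinct evaluation points -> an MDS code

def encode(msg):
    """Evaluate the degree-<k polynomial a + b*X at each point."""
    a, b = msg
    return [(a + b * x) % q for x in points]

messages = [(a, b) for a in range(q) for b in range(q)]

# fix any observed position and symbol value; count consistent messages
for pos in range(len(points)):
    for val in range(q):
        consistent = [m for m in messages if encode(m)[pos] == val]
        assert len(consistent) == q ** (k - 1)  # always 7, whatever val is
print("each single symbol leaves a uniform list of size", q ** (k - 1))
```

Seeing $t < k$ symbols shrinks the candidate list only to $q^{k-t}$, which is the key-length-to-list-size trade-off that list-source codes make explicit.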