Coding Theory and Algebraic Combinatorics
This chapter introduces and elaborates on the fruitful interplay of coding
theory and algebraic combinatorics, with most of the focus on the interaction
of codes with combinatorial designs, finite geometries, simple groups, sphere
packings, kissing numbers, lattices, and association schemes. In particular,
special interest is devoted to the relationship between codes and combinatorial
designs. We describe and recapitulate important results in the development of
the state of the art. In addition, we give illustrative examples and
constructions, and highlight recent advances. Finally, we provide a collection
of significant open problems and challenges concerning future research.
Comment: 33 pages; handbook chapter, to appear in: "Selected Topics in
Information and Coding Theory", ed. by I. Woungang et al., World Scientific,
Singapore, 201
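As a small illustration of the code/design interplay surveyed in this chapter, the following Python sketch verifies the classical fact that the supports of the weight-3 codewords of the binary [7,4,3] Hamming code form a 2-(7,3,1) design, the Fano plane; the generator matrix below is one standard choice, used purely for illustration:

```python
from itertools import product, combinations

# One standard generator matrix of the binary [7,4,3] Hamming code.
G = [
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]

def codewords():
    # All 2^4 codewords: linear combinations of the rows of G over GF(2).
    for m in product([0, 1], repeat=4):
        yield tuple(sum(m[i] * G[i][j] for i in range(4)) % 2
                    for j in range(7))

# Supports of the weight-3 codewords: these are the blocks of the design.
blocks = [frozenset(j for j, bit in enumerate(c) if bit)
          for c in codewords() if sum(c) == 3]

# 2-(7,3,1) design: every pair of points lies in exactly one block.
assert len(blocks) == 7
assert all(sum(set(pair) <= B for B in blocks) == 1
           for pair in combinations(range(7), 2))
print("Fano plane blocks:", sorted(sorted(B) for B in blocks))
```

The same mechanism, codewords of a fixed weight supporting a design, is what the Assmus-Mattson theorem systematizes.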
Error Correcting Coding for a Non-symmetric Ternary Channel
Ternary channels can be used to model the behavior of some memory devices,
where information is stored at three different levels. In this paper,
error-correcting coding for a ternary channel in which some of the error
transitions are not allowed is considered. The resulting channel is
non-symmetric; therefore, classical linear codes are not optimal for it. We define the
maximum-likelihood (ML) decoding rule for ternary codes over this channel and
show that it is complex to compute, since it depends on the channel error
probability. A simpler alternative decoding rule which depends only on code
properties, called \da-decoding, is then proposed. It is shown that
\da-decoding and ML decoding are equivalent, i.e., \da-decoding is optimal,
under certain conditions. Assuming \da-decoding, we characterize the error
correcting capabilities of ternary codes over the non-symmetric ternary
channel. We also derive an upper bound and a constructive lower bound on the
size of codes, given the code length and the minimum distance. The results
arising from the constructive lower bound are then compared, for short code
lengths, to optimal codes (in terms of code size) found by a clique-based
search. It is shown that the proposed construction method gives good codes, and
that in some cases the codes are optimal.
Comment: Submitted to IEEE Transactions on Information Theory. Part of this
work was presented at the Information Theory and Applications Workshop 200
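The gap between channel-dependent ML decoding and a rule that depends only on code properties can already be seen with plain minimum-distance decoding over a ternary alphabet. The sketch below uses a toy code and ordinary Hamming distance (the paper's \da-decoding rule and its code constructions are not reproduced here); the decoder needs no knowledge of the channel error probability:

```python
def hamming(x, y):
    # Hamming distance between two ternary words of equal length.
    return sum(a != b for a, b in zip(x, y))

def nearest_codeword(received, code):
    # Minimum-distance decoding: depends only on the code, not on the
    # channel's (possibly asymmetric) error probabilities.
    return min(code, key=lambda c: hamming(received, c))

# Toy length-4 ternary code, for illustration only.
code = [(0, 0, 0, 0), (1, 1, 1, 1), (2, 2, 2, 2)]
print(nearest_codeword((0, 0, 1, 0), code))  # -> (0, 0, 0, 0)
```

On a symmetric channel this rule coincides with ML decoding; on a non-symmetric channel it generally does not, which is why the equivalence conditions stated above matter.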
Syntactic Structures and Code Parameters
We assign binary and ternary error-correcting codes to the data of syntactic
structures of world languages and we study the distribution of code points in
the space of code parameters. We show that, while most codes populate the lower
region approximating a superposition of Thomae functions, there is a
substantial presence of codes above the Gilbert-Varshamov bound and even above
the asymptotic bound and the Plotkin bound. We investigate the dynamics induced
on the space of code parameters by spin glass models of language change, and
show that, in the presence of entailment relations between syntactic
parameters, the dynamics can sometimes improve the code. For large sets of
languages and
syntactic data, one can gain information on the spin glass dynamics from the
induced dynamics in the space of code parameters.
Comment: 14 pages, LaTeX, 12 png figures
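The position of a code point relative to the Gilbert-Varshamov curve can be checked numerically. A minimal sketch, using the binary [7,4,3] Hamming code as a stand-in (the linguistic codes studied in the paper are not reproduced here):

```python
from math import log

def Hq(x, q=2):
    # q-ary entropy: H_q(x) = x log_q(q-1) - x log_q(x) - (1-x) log_q(1-x).
    if x == 0:
        return 0.0
    lq = lambda t: log(t) / log(q)
    return x * lq(q - 1) - x * lq(x) - (1 - x) * lq(1 - x)

def gv_rate(delta, q=2):
    # The Gilbert-Varshamov curve: R = 1 - H_q(delta).
    return 1 - Hq(delta, q)

# Code point (delta, R) = (d/n, log_q|C|/n) for the binary [7,4,3] Hamming code.
n, size, d = 7, 2**4, 3
delta, R = d / n, log(size, 2) / n
print(R > gv_rate(delta))  # True: this code point lies above the GV curve
```

A single code above the curve is unremarkable (GV is an existence bound); what the paper tracks is how whole populations of code points sit relative to such curves.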
Principles and Parameters: a coding theory perspective
We propose an approach to Longobardi's parametric comparison method (PCM) via
the theory of error-correcting codes. One associates to a collection of
languages to be analyzed with the PCM a binary (or ternary) code, with one
codeword for each language in the family, each word consisting of the binary
values of the syntactic parameters of the language; the ternary case allows
for an additional parameter state that takes into account phenomena of
entailment of parameters. The code parameters of the resulting code can be
compared with some classical bounds in coding theory: the asymptotic bound, the
Gilbert-Varshamov bound, etc. The position of the code parameters with respect
to some of these bounds provides quantitative information on the variability of
syntactic parameters within and across historical-linguistic families. While
computations carried out for languages belonging to the same family yield codes
below the GV curve, comparisons across different historical families can give
examples of isolated codes lying above the asymptotic bound.
Comment: 11 pages, LaTeX
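Concretely, a PCM-style code can be formed from binary parameter vectors and its code parameters read off directly. A sketch with made-up vectors (the actual syntactic-parameter data is not reproduced here):

```python
from itertools import combinations

# Hypothetical binary syntactic-parameter vectors, one per language.
langs = {
    "A": (1, 0, 1, 1, 0),
    "B": (1, 1, 0, 1, 0),
    "C": (0, 0, 1, 0, 1),
}

code = list(langs.values())
n = len(code[0])                            # code length = number of parameters
d = min(sum(a != b for a, b in zip(u, v))   # minimum Hamming distance
        for u, v in combinations(code, 2))
print(f"[n, |C|, d] = [{n}, {len(code)}, {d}]")  # -> [5, 3, 2]
```

From (n, |C|, d) one obtains the code point (d/n, log_q|C|/n) that is then placed against the GV, Plotkin, or asymptotic curves.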
Entropic bounds on coding for noisy quantum channels
In analogy with its classical counterpart, a noisy quantum channel is
characterized by a loss, a quantity that depends on the channel input and the
quantum operation performed by the channel. The loss reflects the transmission
quality: if the loss is zero, quantum information can be perfectly transmitted
at a rate measured by the quantum source entropy. By using block coding based
on sequences of n entangled symbols, the average loss (defined as the overall
loss of the joint n-symbol channel divided by n, when n tends to infinity) can
be made lower than the loss for a single use of the channel. In this context,
we examine several upper bounds on the rate at which quantum information can be
transmitted reliably via a noisy channel, that is, with an asymptotically
vanishing average loss while the one-symbol loss of the channel is non-zero.
These bounds on the channel capacity rely on the entropic Singleton bound on
quantum error-correcting codes [Phys. Rev. A 56, 1721 (1997)]. Finally, we
analyze the Singleton bounds when the noisy quantum channel is supplemented
with a classical auxiliary channel.
Comment: 20 pages RevTeX, 10 Postscript figures. Expanded Section II, added 1
figure, changed title. To appear in Phys. Rev. A (May 98)
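For an [[n, k, d]] quantum error-correcting code, the Singleton-type bound referenced above takes the form k <= n - 2(d - 1). A one-line check (an elementary sketch of the bound's statement, not the paper's entropic derivation):

```python
def quantum_singleton_ok(n, k, d):
    # Quantum Singleton bound for an [[n, k, d]] code: k <= n - 2*(d - 1).
    return k <= n - 2 * (d - 1)

# The [[5,1,3]] code saturates the bound; [[4,2,3]] parameters would violate it.
print(quantum_singleton_ok(5, 1, 3), quantum_singleton_ok(4, 2, 3))  # True False
```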