An information theoretic necessary condition for perfect reconstruction
This article proposes a new information theoretic necessary condition for
reconstructing a discrete random variable $X$ based on the knowledge of a set
of discrete functions of $X$. The reconstruction condition is derived from
Shannon's Lattice of Information (LoI) \cite{Shannon53} and two entropic
metrics proposed respectively by Shannon and Rajski.
Since this theoretical material is relatively little known and dispersed
across different references, we provide a complete and self-contained
description of the LoI concepts, such as the total, common, and complementary
information, with full proofs. The definitions and properties of the two
entropic metrics are also detailed in full and shown to be compatible with
the LoI structure. A new geometric interpretation of the lattice structure is
then investigated, leading to a new necessary condition for reconstructing
the discrete random variable $X$ given a set $Y_1, \ldots, Y_n$ of elements
of the lattice generated by $X$.
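To convey the flavor of such a condition (the paper's exact statement is not reproduced in this abstract), one natural entropy-based necessary condition for reconstructing $X$ from functions of $X$ is that their joint entropy equal $H(X)$. A minimal Python sketch under that assumption, using the sign/absolute-value example as the illustrative case:

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of `samples`."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Illustrative check (assumed entropy-based form of the condition):
# X uniform on {-2, -1, 1, 2}; its sign and absolute value together
# should carry all of X's information, i.e. H(sign, |X|) = H(X).
X = [-2, -1, 1, 2]  # one sample per outcome = uniform empirical law
h_x = entropy(X)
h_joint = entropy([(x > 0, abs(x)) for x in X])
print(h_x, h_joint)  # both equal 2.0 bits, so reconstruction is not ruled out
```

Here equality of the two entropies is only the necessary direction: if the joint entropy of the functions fell below $H(X)$, reconstruction would be impossible.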
Finally, this condition is instantiated in five specific examples of
reconstruction of $X$ from a set of deterministic functions of $X$: the
reconstruction of a symmetric random variable from the knowledge of its sign
and its absolute value, the reconstruction of a binary word from a set of
binary linear combinations, the reconstruction of an integer from its prime
signature (fundamental theorem of arithmetic) and from its remainders modulo
a set of coprime integers (Chinese remainder theorem), and the reconstruction
of the sorting permutation of a list from a set of pairwise comparisons. In
each case, the necessary condition is shown to be consistent with the
corresponding well-known results.

Comment: 17 pages, 9 figures