On row-by-row coding for 2-D constraints
A constant-rate encoder--decoder pair is presented for a fairly large family
of two-dimensional (2-D) constraints. Encoding and decoding are done in a
row-by-row manner, and the scheme is sliding-block decodable.
Essentially, the 2-D constraint is turned into a set of independent and
relatively simple one-dimensional (1-D) constraints; this is done by dividing
the array into fixed-width vertical strips. Each row in the strip is seen as a
symbol, and a graph presentation of the respective 1-D constraint is
constructed. The maxentropic stationary Markov chain on this graph is next
considered: a perturbed version of the corresponding probability distribution
on the edges of the graph is used in order to build an encoder which operates
in parallel on the strips. This perturbation is found by means of a network
flow, with upper and lower bounds on the flow through the edges.
A key part of the encoder is an enumerative coder for constant-weight binary
words. A fast realization of this coder is shown, using floating-point
arithmetic.
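The enumerative coder for constant-weight words can be sketched with exact integer arithmetic. The ranking below is the standard combinatorial-number-system ranking, shown here as a generic illustration of the technique; the paper's fast realization replaces exact arithmetic with floating-point approximations, and its exact indexing variant may differ:

```python
from math import comb

def rank(bits):
    """Lexicographic index of a constant-weight binary word among all
    words of the same length and weight."""
    idx, w = 0, sum(bits)
    for i, b in enumerate(bits):
        if b:
            # count the words that still have a 0 in position i
            idx += comb(len(bits) - i - 1, w)
            w -= 1
    return idx

def unrank(idx, n, w):
    """Inverse of rank: recover the weight-w word of length n with the
    given lexicographic index."""
    bits = []
    for i in range(n):
        c = comb(n - i - 1, w) if w > 0 else 0
        if w > 0 and idx >= c:
            bits.append(1)
            idx -= c
            w -= 1
        else:
            bits.append(0)
    return bits
```

For example, with n = 4 and w = 2 the six words 0011, 0101, 0110, 1001, 1010, 1100 receive indices 0 through 5, so fixed-length indices can be mapped bijectively to constant-weight rows.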
Cooperative Data Exchange based on MDS Codes
The cooperative data exchange problem is studied for the fully connected
network. In this problem, each node initially only possesses a subset of the
packets making up the file. Nodes make broadcast transmissions that are
received by all other nodes. The goal is for each node to recover the full
file. In this paper, we present a polynomial-time deterministic algorithm to
compute the optimal (i.e., minimal) number of required broadcast transmissions
and to determine the precise transmissions to be made by the nodes. A
particular feature of our approach is that each of the transmissions is a
linear combination of exactly the same fixed number of packets, and we show
how to optimally choose this number. We also show how the coefficients of
these linear combinations can be chosen by leveraging a connection to Maximum
Distance Separable (MDS) codes. Moreover, we show that our method can be used
to solve cooperative data exchange problems with weighted cost, as well as the
so-called successive local omniscience problem.
Comment: 21 pages, 1 figure
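The MDS connection can be illustrated with a Vandermonde construction over a prime field: any k of the n coded transmissions are jointly invertible, so any k of them suffice to recover the k source packets. This is a generic sketch of the MDS idea, not the paper's construction; the field size is an illustrative choice, and for simplicity each combination here involves all k packets rather than the fixed number chosen in the paper:

```python
# Sketch: MDS-coded broadcasts via a Vandermonde matrix over GF(p).
P = 257  # an illustrative prime, larger than any packet symbol

def vandermonde(n, k, p=P):
    """n coefficient vectors of length k; any k of them are independent."""
    return [[pow(a, j, p) for j in range(k)] for a in range(1, n + 1)]

def encode(packets, coeffs, p=P):
    """Each transmission is one linear combination of the packets."""
    return [sum(c * x for c, x in zip(row, packets)) % p for row in coeffs]

def solve(A, b, p=P):
    """Solve A x = b over GF(p) by Gauss-Jordan elimination."""
    k = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(k):
        piv = next(r for r in range(col, k) if M[r][col] % p)
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], p - 2, p)  # inverse via Fermat's little theorem
        M[col] = [x * inv % p for x in M[col]]
        for r in range(k):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(x - f * y) % p for x, y in zip(M[r], M[col])]
    return [M[i][k] for i in range(k)]
```

Because every k x k submatrix of the Vandermonde matrix is invertible, a node holding any k of the n broadcasts can solve for all k packets.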
Successive Refinement with Decoder Cooperation and its Channel Coding Duals
We study cooperation in multi-terminal source coding models involving
successive refinement. Specifically, we study the case of a single encoder and
two decoders, where the encoder provides a common description to both the
decoders and a private description to only one of the decoders. The decoders
cooperate via cribbing, i.e., the decoder with access only to the common
description is allowed to observe, in addition, a deterministic function of the
reconstruction symbols produced by the other. We characterize the fundamental
performance limits in the respective settings of non-causal, strictly-causal
and causal cribbing. We use a new coding scheme, referred to as Forward
Encoding and Block Markov Decoding, which is a variant of one recently used by
Cuff and Zhao for coordination via implicit communication. Finally, we use the
insight gained to introduce and solve some dual channel coding scenarios
involving Multiple Access Channels with cribbing.
Comment: 55 pages, 15 figures, 8 tables, submitted to IEEE Transactions on
Information Theory. A shorter version submitted to ISIT 201
Making recommendations bandwidth aware
This paper asks how much we can gain in terms of bandwidth and user
satisfaction if recommender systems became bandwidth-aware and took into
account not only the user preferences, but also the fact that they may need to
serve these users under bandwidth constraints, as is the case over wireless
networks. We formulate this as a new problem in the context of index coding: we
relax the index coding requirements to capture scenarios where each client has
preferences associated with messages. The client is satisfied to receive any
message she does not already have, with a satisfaction proportional to her
preference for that message. We consistently find, over a number of scenarios
we sample, that although the optimization problems are in general NP-hard,
significant bandwidth savings are possible even when restricted to
polynomial-time algorithms.
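A toy instance of the relaxed index coding problem with preferences: the server broadcasts the XOR of a subset of packets, a client decodes a missing packet only if it already caches every other packet in the subset, and the payoff is the sum of preferences collected. The packet names, caches, and preference weights below are illustrative assumptions, and the brute-force search is a sketch of the objective, not the paper's algorithm:

```python
from itertools import combinations

# Per-client state: cached packets and a preference weight per wanted
# packet (all values illustrative, not from the paper).
packets = ["a", "b", "c"]
clients = {
    "u1": {"has": {"a"}, "pref": {"b": 0.9, "c": 0.2}},
    "u2": {"has": {"b"}, "pref": {"a": 0.7, "c": 0.4}},
    "u3": {"has": {"a", "b"}, "pref": {"c": 0.8}},
}

def satisfaction(coded_set):
    """Total preference collected if the XOR of coded_set is broadcast:
    a client decodes packet m iff it caches all others in the set."""
    total = 0.0
    for st in clients.values():
        for m in coded_set:
            if m not in st["has"] and all(o in st["has"] for o in coded_set - {m}):
                total += st["pref"].get(m, 0.0)
    return total

# Best single coded broadcast, by brute force over packet subsets.
subsets = [set(s) for r in range(1, len(packets) + 1)
           for s in combinations(packets, r)]
best = max(subsets, key=satisfaction)
```

Here the XOR of packets a and b satisfies two clients at once with a single transmission, illustrating how one coded broadcast can beat sending packets individually.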