Novel Lower Bounds on the Entropy Rate of Binary Hidden Markov Processes
Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma,
where the output entropy of a binary symmetric channel is bounded in terms of
the average entropy of the input projected on a random subset of coordinates.
Here, this result is applied to derive novel lower bounds on the entropy
rate of binary hidden Markov processes. For symmetric underlying Markov
processes, our bound improves upon the best known bound in the very noisy
regime. The nonsymmetric case is also considered, and explicit bounds are
derived for Markov processes that satisfy a run-length-limited (RLL)
constraint.
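The quantity being bounded here can also be estimated numerically. The sketch below is a Monte Carlo baseline under our own assumptions, not the paper's method: q denotes the transition probability of the symmetric hidden chain and p the BSC crossover probability, and the entropy rate is estimated via the standard forward recursion on the hidden-state belief.

```python
import math, random

def entropy_rate_estimate(q=0.1, p=0.2, n=200000, seed=0):
    """Monte Carlo estimate of the entropy rate of a binary hidden Markov
    process: a symmetric binary Markov chain with transition probability q,
    observed through a BSC with crossover probability p.  Runs the forward
    (belief) recursion and averages -log2 of the one-step predictive
    probability of each observed symbol."""
    rng = random.Random(seed)
    x = 0                      # hidden state
    belief = 0.5               # P(x_t = 1 | y_1, ..., y_{t-1})
    total = 0.0
    for _ in range(n):
        # evolve the hidden chain, then emit a noisy observation
        if rng.random() < q:
            x ^= 1
        y = x ^ (rng.random() < p)
        # predictive probability of the symbol actually observed
        p_y1 = belief * (1 - p) + (1 - belief) * p
        p_obs = p_y1 if y == 1 else 1 - p_y1
        total += -math.log2(p_obs)
        # posterior on x_t given y_t, then one-step prediction of x_{t+1}
        like1 = belief * ((1 - p) if y == 1 else p)
        like0 = (1 - belief) * (p if y == 1 else (1 - p))
        post = like1 / (like1 + like0)
        belief = post * (1 - q) + (1 - post) * q
    return total / n
```

Averaging -log2 of the one-step predictive probabilities converges to the entropy rate, so analytic lower bounds of the kind derived here can be compared against this estimate.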
Integer-Forcing Source Coding
Integer-Forcing (IF) is a new framework, based on compute-and-forward, for
decoding multiple integer linear combinations from the output of a Gaussian
multiple-input multiple-output channel. This work applies the IF approach to
arrive at a new low-complexity scheme, IF source coding, for distributed lossy
compression of correlated Gaussian sources under a minimum mean squared error
distortion measure. All encoders use the same nested lattice codebook. Each
encoder quantizes its observation using the fine lattice as a quantizer and
reduces the result modulo the coarse lattice, which plays the role of binning.
Rather than directly recovering the individual quantized signals, the decoder
first recovers a full-rank set of judiciously chosen integer linear
combinations of the quantized signals, and then inverts it. In general, the
linear combinations have smaller average powers than the original signals. This
allows increasing the density of the coarse lattice, which in turn translates
into smaller compression rates. We also propose and analyze a one-shot version of
IF source coding, which is simple enough to potentially lead to a new design
principle for analog-to-digital converters that can exploit spatial
correlations between the sampled signals.
Comment: Submitted to the IEEE Transactions on Information Theory.
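The quantize-then-mod binning step can be illustrated in one dimension. The scalar "lattices" q*Z and Q*Z and the step sizes below are illustrative assumptions of ours (the actual scheme uses high-dimensional nested lattices): two strongly correlated sources are reduced modulo the coarse lattice, and the decoder recovers the small-power integer combination v1 - v2 exactly even though neither individual signal could be unwrapped.

```python
import random

def demo_binning(q=0.01, Q=1.0, trials=1000, seed=1):
    """Toy scalar analogue of the IF source coding binning step: quantize
    with the fine lattice q*Z, reduce modulo the coarse lattice Q*Z, and
    let the decoder recover the integer combination v1 - v2 exactly."""
    rng = random.Random(seed)
    encode = lambda x: (round(x / q) * q) % Q   # quantize, then bin
    for _ in range(trials):
        x1 = rng.gauss(0, 10)                    # large-power source
        x2 = x1 + rng.gauss(0, 0.05)             # strongly correlated source
        v1, v2 = round(x1 / q) * q, round(x2 / q) * q
        # decoder: v1 - v2 has small power, so its modulo-Q residue can be
        # undone by mapping into the centered interval [-Q/2, Q/2)
        d = (encode(x1) - encode(x2)) % Q
        if d >= Q / 2:
            d -= Q
        if abs(d - (v1 - v2)) > 1e-9:
            return False
    return True
```

The individual residues encode(x1) and encode(x2) are ambiguous on their own; only combinations whose power fits inside one coarse cell can be recovered, which is exactly why the decoder targets judicious integer combinations first.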
How to Quantize n Outputs of a Binary Symmetric Channel to n-1 Bits?
Suppose that Y^n is obtained by observing a uniform Bernoulli random vector
X^n through a binary symmetric channel with crossover probability alpha.
The "most informative Boolean function" conjecture postulates that the maximal
mutual information between Y^n and any Boolean function b(X^n) is
attained by a dictator function. In this paper, we consider the "complementary"
case in which the Boolean function is replaced by f: {0,1}^n -> {0,1}^{n-1},
namely, an (n-1)-bit quantizer, and show that
I(f(X^n); Y^n) <= (n-1)(1 - h(alpha)) for any such f. Thus, in this case,
the optimal function is of the form f(x^n) = (x_1, ..., x_{n-1}).
Comment: 5 pages, accepted ISIT 2017
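The optimal quantizer described here simply keeps all but one input coordinate. For that choice the mutual information splits into independent per-coordinate BSC terms, each contributing 1 - h(alpha) bits (writing n for the blocklength and alpha for the crossover probability). A brute-force sanity check for n = 3, not the paper's proof:

```python
import itertools, math

def h2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(n=3, alpha=0.1):
    """I(f(X^n); Y^n) for the coordinate projection f(x) = (x_1,...,x_{n-1}),
    with X^n uniform and Y^n the output of a BSC(alpha), by enumeration."""
    joint, py, pu = {}, {}, {}
    for x in itertools.product((0, 1), repeat=n):
        px = 2.0 ** -n
        u = x[:-1]                                  # the quantizer output
        for y in itertools.product((0, 1), repeat=n):
            d = sum(a != b for a, b in zip(x, y))   # number of bit flips
            p = px * alpha ** d * (1 - alpha) ** (n - d)
            joint[u, y] = joint.get((u, y), 0.0) + p
            py[y] = py.get(y, 0.0) + p
            pu[u] = pu.get(u, 0.0) + p
    return sum(p * math.log2(p / (pu[u] * py[y]))
               for (u, y), p in joint.items() if p > 0)
```

Since the n-1 retained coordinates see independent BSCs and the dropped coordinate contributes nothing, the enumeration matches (n-1)*(1 - h2(alpha)) to machine precision.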
Cyclic-Coded Integer-Forcing Equalization
A discrete-time intersymbol interference channel with additive Gaussian noise
is considered, where only the receiver has knowledge of the channel impulse
response. An approach for combining decision-feedback equalization with channel
coding is proposed, where decoding precedes the removal of intersymbol
interference. This is accomplished by combining the recently proposed
integer-forcing equalization approach with cyclic block codes. The channel
impulse response is linearly equalized to an integer-valued response. This is
then utilized by leveraging the property that a cyclic code is closed under
(cyclic) integer-valued convolution. Explicit bounds on the performance of the
proposed scheme are also derived.
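The closure property invoked here is easy to verify directly: a cyclic codeword convolved cyclically with any integer-valued response, with coefficients then reduced mod 2, is again a codeword, because the generator g(x) divides x^n - 1. A small demonstration with the (7,4) cyclic Hamming code (this particular code and the impulse response h are our illustrative choices, not taken from the paper):

```python
import itertools

N = 7
G = [1, 1, 0, 1, 0, 0, 0]   # g(x) = 1 + x + x^3, generator of the (7,4) code

def cyclic_conv(a, b, n=N):
    """Cyclic (circular) convolution of two length-n integer sequences."""
    out = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[(i + j) % n] += ai * bj
    return out

# all 16 codewords: m(x) * g(x) mod (x^7 - 1), coefficients reduced mod 2
codewords = {tuple(c % 2 for c in cyclic_conv(list(m) + [0, 0, 0], G))
             for m in itertools.product((0, 1), repeat=4)}

# an arbitrary integer-valued "equalized" impulse response
h = [2, -1, 3, 0, 0, 0, 0]

# cyclic integer convolution followed by mod-2 reduction stays in the code
closed = all(tuple(c % 2 for c in cyclic_conv(list(cw), h)) in codewords
             for cw in codewords)
```

In polynomial terms, c(x)h(x) mod (x^7 - 1), reduced mod 2, is still a multiple of g(x), which is exactly the property that lets decoding precede the removal of the integer-valued intersymbol interference.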
Precoded Integer-Forcing Universally Achieves the MIMO Capacity to Within a Constant Gap
An open-loop single-user multiple-input multiple-output communication scheme
is considered where a transmitter, equipped with multiple antennas, encodes the
data into independent streams, all taken from the same linear code. The coded
streams are then linearly precoded using the encoding matrix of a perfect
linear dispersion space-time code. At the receiver side, integer-forcing
equalization is applied, followed by standard single-stream decoding. It is
shown that this communication architecture achieves the capacity of any
Gaussian multiple-input multiple-output channel up to a gap that depends only
on the number of transmit antennas.
Comment: To appear in the IEEE Transactions on Information Theory.
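Stripping away the precoding, the integer-forcing receiver itself can be sketched in a few lines: for a small real channel, brute-force the full-rank integer matrix A and compare the resulting rate to capacity. The MMSE-form effective noise variance a^T (I + snr H^T H)^{-1} a and the 2x2 real setting are standard conventions we assume here, not details from this abstract.

```python
import itertools, math

def if_rate_2x2(H, snr):
    """Achievable rate of plain integer-forcing for a 2x2 real channel:
    brute-force the integer matrix A over a small box, keep the best
    full-rank choice.  Effective noise variance of integer combination a
    is a^T (I + snr * H^T H)^{-1} a."""
    # G = I + snr * H^T H, written out for the 2x2 case
    g00 = 1 + snr * (H[0][0]**2 + H[1][0]**2)
    g11 = 1 + snr * (H[0][1]**2 + H[1][1]**2)
    g01 = snr * (H[0][0]*H[0][1] + H[1][0]*H[1][1])
    det = g00 * g11 - g01 * g01
    K = [[g11/det, -g01/det], [-g01/det, g00/det]]    # K = G^{-1}
    quad = lambda a: K[0][0]*a[0]**2 + 2*K[0][1]*a[0]*a[1] + K[1][1]*a[1]**2
    best = 0.0
    for a1 in itertools.product(range(-3, 4), repeat=2):
        for a2 in itertools.product(range(-3, 4), repeat=2):
            if a1[0]*a2[1] - a1[1]*a2[0] == 0:
                continue                              # A must be full rank
            # both streams carry the rate of the worst combination
            r = 2 * min(max(0.0, -0.5 * math.log2(quad(a1))),
                        max(0.0, -0.5 * math.log2(quad(a2))))
            best = max(best, r)
    return best

def capacity_2x2(H, snr):
    """C = 1/2 log2 det(I + snr * H^T H) for a 2x2 real channel."""
    g00 = 1 + snr * (H[0][0]**2 + H[1][0]**2)
    g11 = 1 + snr * (H[0][1]**2 + H[1][1]**2)
    g01 = snr * (H[0][0]*H[0][1] + H[1][0]*H[1][1])
    return 0.5 * math.log2(g00 * g11 - g01 * g01)
```

The brute-force search over A is exponential in general; the point of the sketch is only that some full-rank integer A yields a positive rate bounded by capacity, and the theorem above is what makes the remaining gap universal once precoding is added.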
Successive Integer-Forcing and its Sum-Rate Optimality
Integer-forcing receivers generalize traditional linear receivers for the
multiple-input multiple-output channel by decoding integer-linear combinations
of the transmitted streams, rather than the streams themselves. Previous works
have shown that the additional degree of freedom in choosing the integer
coefficients enables this receiver to approach the performance of
maximum-likelihood decoding in various scenarios. Nonetheless, even for the
optimal choice of integer coefficients, the additive noise at the equalizer's
output is still correlated. In this work we study a variant of integer-forcing,
termed successive integer-forcing, that exploits these noise correlations to
improve performance. This scheme is the integer-forcing counterpart of
successive interference cancellation for traditional linear receivers.
As with the latter, we show that successive integer-forcing is capacity-achieving
when it is possible to optimize the rate allocation across the different
streams. In comparison to standard successive interference cancellation
receivers, the successive integer-forcing receiver offers more possibilities
for capacity-achieving rate tuples, and in particular, ones that are more
balanced.
Comment: A shorter version was submitted to the 51st Allerton Conference.
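The sum-rate claim has a clean linear-algebra core: with effective-noise covariance K = A (I + snr H^T H)^{-1} A^T, assigning stream m the rate -1/2 log2 of the m-th squared Cholesky pivot of K makes the rates sum to -1/2 log2 det K, which equals the capacity 1/2 log2 det(I + snr H^T H) whenever |det A| = 1. A 2x2 sketch under these standard (but here-assumed) conventions:

```python
import math

def successive_if_rates(H, snr, A):
    """Per-stream rates of successive integer-forcing for a 2x2 real
    channel.  Stream m gets -1/2 log2 of the noise variance left after
    predicting its effective noise from already-decoded combinations,
    i.e. the m-th squared Cholesky pivot of K = A G^{-1} A^T."""
    # G = I + snr * H^T H
    g00 = 1 + snr * (H[0][0]**2 + H[1][0]**2)
    g11 = 1 + snr * (H[0][1]**2 + H[1][1]**2)
    g01 = snr * (H[0][0]*H[0][1] + H[1][0]*H[1][1])
    det = g00 * g11 - g01 * g01
    Ginv = [[g11/det, -g01/det], [-g01/det, g00/det]]
    # K = A Ginv A^T via the bilinear form q(a, b) = a^T Ginv b
    q = lambda a, b: sum(a[i] * Ginv[i][j] * b[j]
                         for i in range(2) for j in range(2))
    k00, k01, k11 = q(A[0], A[0]), q(A[0], A[1]), q(A[1], A[1])
    # 2x2 Cholesky factorization K = L L^T
    l00 = math.sqrt(k00)
    l10 = k01 / l00
    l11 = math.sqrt(k11 - l10 * l10)
    return [-math.log2(l00 * l00) / 2, -math.log2(l11 * l11) / 2]
```

Unlike plain integer-forcing, the per-stream rates here are unequal, which is exactly why the scheme needs the freedom to optimize the rate allocation in order to be capacity-achieving.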