Streaming an image through the eye: The retina seen as a dithered scalable image coder
We propose the design of an original scalable image coder/decoder that is
inspired by the mammalian retina. Our coder accounts for the time-dependent
and nondeterministic behavior of the actual retina. The present work
brings two main contributions: As a first step, (i) we design a deterministic
image coder mimicking most of the retinal processing stages and then (ii) we
introduce a retinal noise in the coding process, that we model here as a dither
signal, to gain interesting perceptual features. Regarding our first
contribution, our main source of inspiration will be the biologically plausible
model of the retina called Virtual Retina. The main novelty of this coder is to
show that the time-dependent behavior of the retina cells could ensure, in an
implicit way, scalability and bit allocation. Regarding our second
contribution, we reconsider the inner layers of the retina. We offer a possible
interpretation of the non-determinism that neurophysiologists observe in their
output. To this end, we model the retinal noise that occurs in these layers
by a dither signal. The dithering process that we propose adds several
interesting features to our image coder. The dither noise whitens the
reconstruction error and decorrelates it from the input stimuli. Furthermore,
integrating the dither noise in our coder allows a faster recognition of the
fine details of the image during the decoding process. The goal of the present
paper is twofold. First, we aim at mimicking the retina as closely as possible
in the design of a novel image coder while maintaining encouraging performance.
Second, we bring new insight into the non-deterministic behavior of the retina.
Comment: arXiv admin note: substantial text overlap with arXiv:1104.155
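The perceptual benefit claimed for the dither signal (a reconstruction error that is whitened and decorrelated from the input) is the classic property of subtractive dithered quantization. As a minimal sketch of that general mechanism, not of the paper's retina-inspired coder, the following assumes a simple uniform quantizer and a uniform dither drawn over one quantization step:

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, step, rng):
    """Subtractive dithered uniform quantizer: the dither is added before
    quantization and subtracted at the decoder, which makes the
    reconstruction error uniform and uncorrelated with the input."""
    d = rng.uniform(-step / 2, step / 2, size=x.shape)  # dither signal
    q = step * np.round((x + d) / step)                 # uniform quantizer
    return q - d                                        # decoder subtracts dither

x = rng.normal(size=10_000)   # stand-in for the input stimulus
step = 0.5
x_hat = dithered_quantize(x, step, rng)
err = x_hat - x

print(np.max(np.abs(err)))        # bounded by step / 2
print(np.corrcoef(x, err)[0, 1])  # close to zero: error decorrelated from input
```

Without the dither, the quantization error of a smooth input is strongly correlated with the signal itself; the dither trades that structured error for signal-independent noise, which is what allows fine details to emerge earlier in a progressive decoding.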
Sparse Signal Processing Concepts for Efficient 5G System Design
As it becomes increasingly apparent that 4G will not be able to meet the
emerging demands of future mobile communication systems, the questions of what
could make up a 5G system, what the crucial challenges are, and what the key
drivers will be are the subject of intensive, ongoing discussion. Partly due to
the advent of
compressive sensing, methods that can optimally exploit sparsity in signals
have received tremendous attention in recent years. In this paper we will
describe a variety of scenarios in which signal sparsity arises naturally in 5G
wireless systems. Signal sparsity and the associated rich collection of tools
and algorithms will thus be a viable source for innovation in 5G wireless
system design. We will describe applications of this sparse signal processing
paradigm in MIMO random access, cloud radio access networks, compressive
channel-source network coding, and embedded security. We will also emphasize
important open problems that may arise in 5G system design, in whose solutions
sparsity will potentially play a key role.
Comment: 18 pages, 5 figures, accepted for publication in IEEE Access
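The core tool behind "optimally exploiting sparsity" is sparse recovery from few linear measurements. As a hedged illustration of the idea (not any specific 5G scheme from the paper), the sketch below recovers a synthetic sparse vector, standing in for e.g. a sparse multipath channel, via Orthogonal Matching Pursuit; the dimensions and sparsity level are arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares."""
    m, n = A.shape
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

m, n, k = 60, 120, 3                        # far fewer measurements than unknowns
A = rng.normal(size=(m, n)) / np.sqrt(m)    # random measurement matrix
x = np.zeros(n)
x[[5, 40, 99]] = [2.0, -1.5, 3.0]           # k-sparse unknown (e.g. channel taps)
y = A @ x                                   # noiseless compressive measurements
x_hat = omp(A, y, k)
print(np.linalg.norm(x_hat - x))            # near-exact recovery
```

The point relevant to 5G design is the measurement count: with only m = 60 observations of a length-120 vector, the sparse structure makes exact recovery possible, which is what enables overhead reductions in random access and channel estimation.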
Receive Combining vs. Multi-Stream Multiplexing in Downlink Systems with Multi-Antenna Users
In downlink multi-antenna systems with many users, the multiplexing gain is
strictly limited by the number of transmit antennas and by how these antennas
are used. Assuming that the total number of receive antennas at the
multi-antenna users is much larger than the number of transmit antennas, the
maximal multiplexing gain can
be achieved with many different transmission/reception strategies. For example,
the excess number of receive antennas can be utilized to schedule users with
effective channels that are near-orthogonal, for multi-stream multiplexing to
users with well-conditioned channels, and/or to enable interference-aware
receive combining. In this paper, we address the question of whether the data
streams should be divided among few users (many streams per user) or many users
(few streams per user, enabling receive combining). Analytic results are
derived to show how user selection, spatial correlation, heterogeneous user
conditions, and imperfect channel acquisition (quantization or estimation
errors) affect the performance when sending the maximal number of streams or
one stream per scheduled user---the two extremes in data stream allocation.
While contradicting observations on this topic have been reported in prior
works, we show that selecting many users and allocating one stream per user
(i.e., exploiting receive combining) is the best candidate under realistic
conditions. This is explained by the provably stronger resilience towards
spatial correlation and the larger benefit from multi-user diversity. This
fundamental result has positive implications for the design of downlink systems
as it reduces the hardware requirements at the user devices and simplifies the
throughput optimization.
Comment: Published in IEEE Transactions on Signal Processing, 16 pages, 11
figures. The results can be reproduced using the following Matlab code:
https://github.com/emilbjornson/one-or-multiple-stream
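The paper's published Matlab code reproduces its results; as a rough, self-contained toy version of the two extremes being compared (not the paper's exact model), the sketch below evaluates zero-forcing precoding for (a) few users with many streams each versus (b) many users with one stream each, using each user's dominant left singular vector as a simple stand-in for the receive combiner. All dimensions and the SNR are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
M, Nr = 8, 2     # transmit antennas at the BS, receive antennas per user
snr = 100.0      # total transmit SNR (20 dB), split equally over streams

def zf_sum_rate(Heff, snr):
    """Zero-forcing precoding for an S x M effective channel,
    equal power per stream: sum of log2(1 + per-stream SNR)."""
    S = Heff.shape[0]
    W = Heff.conj().T @ np.linalg.inv(Heff @ Heff.conj().T)  # unnormalized ZF
    gains = (snr / S) / np.sum(np.abs(W) ** 2, axis=0)       # per-stream SNRs
    return float(np.sum(np.log2(1 + gains)))

# i.i.d. Rayleigh channels for 8 candidate users, each with Nr antennas
H = [rng.normal(size=(Nr, M)) + 1j * rng.normal(size=(Nr, M)) for _ in range(8)]

# (a) Few users, many streams: serve 4 users with Nr streams each
rate_multistream = zf_sum_rate(np.vstack(H[:4]), snr)

# (b) Many users, one stream each: every user combines its antennas with
#     its dominant left singular vector before the BS applies ZF
rows = []
for Hk in H:
    u, s, vh = np.linalg.svd(Hk)
    rows.append(u[:, 0].conj() @ Hk)   # effective 1 x M channel after combining
rate_combining = zf_sum_rate(np.vstack(rows), snr)

print(rate_multistream, rate_combining)
```

Both strategies send the maximal M = 8 streams; which one wins in a single random channel draw varies, and the paper's contribution is the analysis, under user selection, spatial correlation, and imperfect channel acquisition, of why one stream per user tends to prevail under realistic conditions.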