Joint Source Channel Coding in Broadcast and Relay Channels: A Non-Asymptotic End-to-End Distortion Approach
The paradigm of separate source-channel coding is inspired by Shannon's separation result, which implies the asymptotic optimality of designing source and channel coding independently of each other. The result exploits the fact that channel error probabilities can be made arbitrarily small, as long as the block length of the channel code can be made arbitrarily large. This is not possible in practice, however, where the block length is either fixed or restricted to a range of finite values. As a result, the optimality of source and channel coding separation is no longer guaranteed, leading researchers to consider joint source-channel coding (JSCC) to further improve the performance of practical systems that must operate in the finite block length regime. With this motivation, this thesis investigates the application of JSCC principles to multimedia communications over point-to-point, broadcast, and relay channels. All analyses are conducted from the perspective of end-to-end distortion (EED), so that the results apply to channel codes with finite block lengths and yield insights into practical design.
The thesis first revisits the fundamental open problem of the separation of source and channel coding in the finite block length regime. Derived formulations and numerical analyses for a source-channel coding system reveal many scenarios where the EED reduction is positive when pairing the channel-optimized source quantizer (COSQ) with an optimal channel code, hence establishing the invalidity of the separation theorem in the finite block length regime. With this, further improvements to JSCC systems are considered by augmenting the COSQ with error detection codes. Closed-form EED expressions for such a system are derived, from which necessary optimality conditions are identified and used in proposed algorithms for system design. Results for both the point-to-point and broadcast channels demonstrate significant reductions in the EED without sacrificing bandwidth when a tradeoff between quantization and error detection coding rates is considered. Lastly, the JSCC system is considered under relay channels, for which a computable measure of the EED is derived for any relay channel conditions with nonzero channel error probabilities. To emphasize the importance of analyzing JSCC systems under finite block lengths, the thesis demonstrates the large performance loss incurred when the power allocation problem is solved according to capacity-based formulations that disregard channel errors, as opposed to formulations based on the EED.
Although this thesis considers only one JSCC setup of many, it is concluded that studying JSCC systems from a non-asymptotic perspective is not only more meaningful, but also reveals more relevant insight into practical system design. This thesis accomplishes this by maintaining the EED as the measure of system performance in each of the considered point-to-point, broadcast, and relay cases.
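As a toy illustration of the end-to-end distortion metric central to the thesis (not taken from the thesis itself), the sketch below estimates the EED of a 2-bit uniform scalar quantizer whose index bits cross a binary symmetric channel; the uniform source and quantizer codepoints are illustrative assumptions:

```python
import numpy as np

def end_to_end_distortion(levels, p, num_samples=200_000, seed=0):
    """Monte Carlo estimate of EED: quantization error plus the extra
    distortion caused by index bits flipping on a BSC with crossover p."""
    rng = np.random.default_rng(seed)
    bits = int(np.log2(len(levels)))
    x = rng.uniform(0.0, 1.0, num_samples)                # uniform source on [0, 1)
    idx = np.minimum((x * len(levels)).astype(int), len(levels) - 1)
    flips = rng.random((num_samples, bits)) < p           # i.i.d. bit flips
    mask = (flips * (2 ** np.arange(bits))).sum(axis=1)   # flip pattern as an integer
    x_hat = levels[idx ^ mask]                            # decode the (noisy) index
    return float(np.mean((x - x_hat) ** 2))

levels = np.array([0.125, 0.375, 0.625, 0.875])  # 2-bit quantizer codepoints
print(end_to_end_distortion(levels, p=0.0))   # pure quantization noise, ~1/192
print(end_to_end_distortion(levels, p=0.05))  # channel errors raise the EED
```

With nonzero crossover probability the EED strictly exceeds the quantization-only distortion, which is the finite-block-length effect the thesis analyzes.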
Distributed Deep Joint Source-Channel Coding over a Multiple Access Channel
We consider distributed image transmission over a noisy multiple access
channel (MAC) using deep joint source-channel coding (DeepJSCC). It is known
that Shannon's separation theorem holds when transmitting independent sources
over a MAC in the asymptotic infinite block length regime. However, we are
interested in the practical finite block length regime, in which case separate
source and channel coding is known to be suboptimal. We introduce a novel joint
image compression and transmission scheme, where the devices send their
compressed image representations in a non-orthogonal manner. While
non-orthogonal multiple access (NOMA) is known to achieve the capacity region,
to the best of our knowledge, a non-orthogonal joint source-channel coding (JSCC)
scheme for practical systems has not been studied before. Through extensive
experiments, we show significant improvements in terms of the quality of the
reconstructed images compared to orthogonal transmission employing current
DeepJSCC approaches, particularly for low bandwidth ratios. We publicly share
source code to facilitate further research and reproducibility.
Comment: To appear in IEEE International Conference on Communications (ICC) 202
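A back-of-the-envelope comparison (not from the paper) of why non-orthogonal access can help: on a two-user Gaussian MAC with per-symbol power $P$ and noise level $N_0$, the sum capacity of superposed (non-orthogonal) transmission exceeds the sum rate of naive equal-slot TDMA. Function names are illustrative:

```python
import math

def tdma_sum_rate(P, N0):
    """Two users, each active in half the channel uses at power P:
    sum rate = 2 * (1/2 slot) * (1/2) log2(1 + P/N0)."""
    return 0.5 * math.log2(1 + P / N0)

def noma_sum_rate(P, N0):
    """Sum capacity of the two-user Gaussian MAC (superposed signals)."""
    return 0.5 * math.log2(1 + 2 * P / N0)

print(tdma_sum_rate(1.0, 1.0))  # 0.5 bit per channel use
print(noma_sum_rate(1.0, 1.0))  # ~0.79 bit per channel use
```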
The Reliability Function of Lossy Source-Channel Coding of Variable-Length Codes with Feedback
We consider transmission of discrete memoryless sources (DMSes) across
discrete memoryless channels (DMCs) using variable-length lossy source-channel
codes with feedback. The reliability function (optimum error exponent) is shown
to be equal to $C_1\,(1 - R(D)/C)$, where $R(D)$ is the rate-distortion
function of the source, $C_1$ is the maximum relative entropy between output
distributions of the DMC, and $C$ is the Shannon capacity of the channel. We
show that, in this setting and in this asymptotic regime, separate
source-channel coding is, in fact, optimal.
Comment: Accepted to IEEE Transactions on Information Theory in Apr. 201
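The exponent of the form $C_1(1 - R(D)/C)$ described in this abstract can be evaluated for a concrete pair, as an illustrative computation (not taken from the paper): an equiprobable binary source under Hamming distortion over a BSC($p$), for which $R(D) = 1 - h(D)$, $C = 1 - h(p)$, and $C_1 = (1-2p)\log_2\frac{1-p}{p}$ (the maximum divergence between the two BSC output laws):

```python
import math

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def exponent(p, D):
    """Evaluate C1 * (1 - R(D)/C) for a binary source over a BSC(p)."""
    R = 1 - h2(D)                               # rate-distortion function, Hamming distortion
    C = 1 - h2(p)                               # Shannon capacity of the BSC
    C1 = (1 - 2 * p) * math.log2((1 - p) / p)   # max relative entropy between output laws
    assert R < C, "transmission must be feasible: R(D) < C"
    return C1 * (1 - R / C)

print(exponent(p=0.1, D=0.2))  # ~1.21
```

Loosening the fidelity requirement (larger $D$) lowers $R(D)$ and so increases the exponent, as expected.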
An Efficient Joint Source-Channel Decoder with Dynamical Block Priors
An efficient joint source-channel (s/c) decoder based on the side information
of the source and on the MN-Gallager algorithm over Galois fields is presented.
The dynamical block priors (DBP) are derived either from a statistical
mechanical approach via calculation of the entropy for the correlated
sequences, or from the Markovian transition matrix. The Markovian joint s/c
decoder has many advantages over the statistical mechanical approach. In
particular, there is no need for the construction and the diagonalization of a
$q \times q$ matrix and for a solution to saddle point equations in $q$ dimensions. Using
parametric estimation, an efficient joint s/c decoder that operates without side
information is discussed. In addition to the variant joint s/c decoders presented, we
also show that the available sets of autocorrelations consist of a convex
volume, and its structure can be found using the Simplex algorithm.
Comment: 13 pages, to appear in "Progress in Theoretical Physics Supplement", May 200
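A minimal sketch (illustrative, not the paper's implementation) of how a block prior can be read off a Markovian transition matrix: the prior of a symbol block is simply the chain probability of its symbols.

```python
def block_prior(block, pi, T):
    """Prior probability of a symbol block under a first-order Markov model:
    pi is the initial (stationary) distribution, T the transition matrix."""
    p = pi[block[0]]
    for a, b in zip(block, block[1:]):
        p *= T[a][b]
    return p

T = [[0.9, 0.1],   # correlated binary source: long runs of each symbol
     [0.2, 0.8]]
pi = [2 / 3, 1 / 3]  # stationary distribution of T
print(block_prior([0, 0, 1], pi, T))  # (2/3) * 0.9 * 0.1 = 0.06
```

Such priors can then weight the messages of an iterative channel decoder, which is the role the dynamical block priors play in the decoder described above.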
Joint Source-Channel Coding with Time-Varying Channel and Side-Information
Transmission of a Gaussian source over a time-varying Gaussian channel is
studied in the presence of time-varying correlated side information at the
receiver. A block fading model is considered for both the channel and the side
information, whose states are assumed to be known only at the receiver. The
optimality of separate source and channel coding in terms of average end-to-end
distortion is shown when the channel is static while the side information state
follows a discrete or a continuous and quasiconcave distribution. When both the
channel and side information states are time-varying, separate source and
channel coding is suboptimal in general. A partially informed encoder lower
bound is studied by providing the channel state information to the encoder.
Several achievable transmission schemes are proposed based on uncoded
transmission, separate source and channel coding, joint decoding as well as
hybrid digital-analog transmission. Uncoded transmission is shown to be optimal
for a class of continuous and quasiconcave side information state
distributions, while the channel gain may have an arbitrary distribution. To
the best of our knowledge, this is the first example in which uncoded
transmission achieves the optimal performance thanks to the time-varying nature
of the states, while it is suboptimal in the static version of the same
problem. Then, the optimal \emph{distortion exponent}, that quantifies the
exponential decay rate of the expected distortion in the high SNR regime, is
characterized for Nakagami distributed channel and side information states, and
it is shown to be achieved by hybrid digital-analog and joint decoding schemes
in certain cases, illustrating the suboptimality of pure digital or analog
transmission in general.
Comment: Submitted to IEEE Transactions on Information Theory
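For reference, the distortion exponent mentioned in this abstract is conventionally defined as the high-SNR decay rate of the expected end-to-end distortion:

```latex
\Delta \;=\; -\lim_{\mathrm{SNR}\to\infty} \frac{\log \mathbb{E}[D]}{\log \mathrm{SNR}}
```

A larger $\Delta$ means the expected distortion vanishes polynomially faster as the SNR grows.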
Joint source-channel coding with feedback
This paper quantifies the fundamental limits of variable-length transmission
of a general (possibly analog) source over a memoryless channel with noiseless
feedback, under a distortion constraint. We consider excess distortion, average
distortion and guaranteed distortion ($d$-semifaithful codes). In contrast to
the asymptotic fundamental limit, a general conclusion is that allowing
variable-length codes and feedback leads to a sizable improvement in the
fundamental delay-distortion tradeoff. In addition, we investigate the minimum
energy required to reproduce source samples with a given fidelity after
transmission over a memoryless Gaussian channel, and we show that the required
minimum energy is reduced with feedback and an average (rather than maximal)
power constraint.
Comment: To appear in IEEE Transactions on Information Theory
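As a point of comparison (a standard wideband benchmark, not a result quoted from this paper): since the AWGN channel delivers at most $1/N_0$ nats per joule in the wideband limit, reproducing a unit-variance Gaussian source within mean-square distortion $D$ requires, per source sample,

```latex
E_{\min}(D) \;\approx\; N_0 \, R(D) \;=\; \frac{N_0}{2}\,\ln\frac{1}{D}, \qquad 0 < D \le 1,
```

in the infinite-delay limit; the paper studies how feedback and an average power constraint affect the non-asymptotic behavior of this quantity.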
On the Separation of Lossy Source-Network Coding and Channel Coding in Wireline Networks
This paper proves the separation between source-network coding and channel
coding in networks of noisy, discrete, memoryless channels. We show that the
set of achievable distortion matrices in delivering a family of dependent
sources across such a network equals the set of achievable distortion matrices
for delivering the same sources across a distinct network which is built by
replacing each channel by a noiseless, point-to-point bit-pipe of the
corresponding capacity. Thus a code that applies source-network coding across
links that are made almost lossless through the application of independent
channel coding across each link asymptotically achieves the optimal performance
across the network as a whole.
Comment: 5 pages, to appear in the proceedings of the 2010 IEEE International Symposium on Information Theory (ISIT)
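The separation result can be paraphrased operationally (an illustrative sketch, not the paper's code): each noisy link is replaced by a noiseless bit-pipe of the same capacity, e.g. $1 - h(p)$ bits per use for a BSC link with crossover probability $p$:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bit_pipe_network(edges):
    """Replace each BSC link (u, v, crossover p) by a noiseless
    point-to-point bit-pipe (u, v, capacity) of the same capacity."""
    return [(u, v, 1.0 - h2(p)) for u, v, p in edges]

noisy = [("s", "a", 0.11), ("a", "t", 0.0)]
print(bit_pipe_network(noisy))  # capacities ~0.5 and 1.0 bits per use
```

The theorem says the achievable distortion matrices over the noisy network and over this bit-pipe network coincide.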