Second-Order and Moderate Deviation Asymptotics for Successive Refinement
We derive the optimal second-order coding region and moderate deviations
constant for successive refinement source coding with a joint excess-distortion
probability constraint. We consider two scenarios: (i) a discrete memoryless
source (DMS) and arbitrary distortion measures at the decoders and (ii) a
Gaussian memoryless source (GMS) and quadratic distortion measures at the
decoders. For a DMS with arbitrary distortion measures, we prove an achievable
second-order coding region, using type covering lemmas by Kanlis and Narayan
and by No, Ingber and Weissman. We prove the converse using the perturbation
approach by Gu and Effros. When the DMS is successively refinable, the
expressions for the second-order coding region and the moderate deviations
constant are simplified and easily computable. For this case, we also obtain
new insights into the second-order behavior compared to the scenario where
separate excess-distortion probabilities are considered. For example, we
describe a DMS, for which the optimal second-order region transitions from
being characterizable by a bivariate Gaussian to a univariate Gaussian, as the
distortion levels are varied. We then consider a GMS with quadratic distortion
measures. To prove the direct part, we make use of the sphere covering theorem
by Verger-Gaugry, together with appropriately defined Gaussian type classes. To
prove the converse, we generalize Kostina and Verd\'u's one-shot converse bound
for point-to-point lossy source coding. We remark that this proof is applicable
to general successively refinable sources. In the proofs of the moderate
deviations results for both scenarios, we follow a strategy similar to that for
the second-order asymptotics and use the moderate deviations principle.
Comment: Part of this paper has been presented at ISIT 2016. Submitted to IEEE
Transactions on Information Theory in Jan. 2016. Revised in Aug. 201
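For context, the point-to-point analogue of such a second-order expansion (due to Kostina and Verd\'u for lossy source coding, which the joint-constraint region refines) takes the familiar form below; here R(D), V(D), and Q^{-1} denote the standard rate-distortion function, dispersion, and inverse Gaussian tail, recalled from the point-to-point literature rather than taken from this paper:

```latex
% Second-order expansion of the minimum achievable rate at blocklength n
% and excess-distortion probability \epsilon (point-to-point benchmark):
R(n, \epsilon) = R(D) + \sqrt{\frac{V(D)}{n}}\, Q^{-1}(\epsilon)
  + O\!\left(\frac{\log n}{n}\right),
\qquad
Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, \mathrm{d}t.
```

Under a joint excess-distortion constraint on two decoders, the scalar term sqrt(V(D)/n) Q^{-1}(\epsilon) is replaced by a region governed in general by a bivariate Gaussian, which is what makes the transition to a univariate Gaussian described above noteworthy.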
Non-Asymptotic Converse Bounds and Refined Asymptotics for Two Lossy Source Coding Problems
In this paper, we revisit two multi-terminal lossy source coding problems:
the lossy source coding problem with side information available at the encoder
and one of the two decoders, which we term the Kaspi problem (Kaspi, 1994),
and the multiple description coding problem with one semi-deterministic
distortion measure, which we refer to as the Fu-Yeung problem (Fu and Yeung,
2002). For the Kaspi problem, we first present the properties of optimal test
channels. Subsequently, we generalize the notion of the distortion-tilted
information density for the lossy source coding problem to the Kaspi problem
and prove a non-asymptotic converse bound using the properties of optimal test
channels and the well-defined distortion-tilted information density. Finally,
for discrete memoryless sources, we derive refined asymptotics, which include
second-order, large deviations, and moderate deviations asymptotics. In the converse
proof of second-order asymptotics, we apply the Berry-Esseen theorem to the
derived non-asymptotic converse bound. The achievability proof follows by first
proving a type-covering lemma tailored to the Kaspi problem, then properly
Taylor expanding the well-defined distortion-tilted information densities and
finally applying the Berry-Esseen theorem. We then generalize the methods used
in the Kaspi problem to the Fu-Yeung problem. As a result, we obtain the
properties of optimal test channels for the minimum sum-rate function, a
non-asymptotic converse bound and refined asymptotics for discrete memoryless
sources. Since the successive refinement problem is a special case of the
Fu-Yeung problem, as a by-product, we obtain a non-asymptotic converse bound
for the successive refinement problem, which is a strict generalization of the
non-asymptotic converse bound for successively refinable sources (Zhou, Tan and
Motani, 2017).
Comment: 34 pages, to be submitted to IEEE Transactions on Information Theory,
extended version of two papers accepted by Globecom 201
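The point-to-point one-shot converse being generalized can be recalled as follows (Kostina and Verd\'u, 2012); \jmath_X(x, d) denotes the d-tilted information, and the statement is reproduced here from the point-to-point literature for context, not from this paper:

```latex
% Any fixed-length lossy code with M codewords, distortion level d, and
% excess-distortion probability \epsilon must satisfy, for every \gamma > 0:
\epsilon \;\ge\; \mathbb{P}\!\left[\, \jmath_X(X, d) \ge \log M + \gamma \,\right] - e^{-\gamma}.
```

Optimizing over \gamma and applying the Berry-Esseen theorem to the resulting probability is the standard route from such a one-shot bound to second-order converse results.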