Rényi Bounds on Information Combining
Bounds on information combining are entropic inequalities that determine how
the information, or entropy, of a set of random variables can change when they
are combined in certain prescribed ways. Such bounds play an important role in
information theory, particularly in coding and Shannon theory. The arguably
most elementary kind of information combining is the addition of two binary
random variables, i.e. a CNOT gate, and the resulting quantities are
fundamental when investigating belief propagation and polar coding. In this
work we generalize the concept to Rényi entropies. We give optimal bounds on
the conditional Rényi entropy after combination, based on a certain convexity
or concavity property, and discuss when this property indeed holds. Since
there is no generally agreed-upon definition of the conditional Rényi entropy,
we consider four different versions from the literature. Finally, we discuss
the application of these bounds to the polarization of Rényi entropies under
polar codes.
Comment: 14 pages, accepted for presentation at ISIT 202
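For orientation, the Shannon (α = 1) special case being generalized here is the classical pair of combining bounds for two binary inputs with side information; the lower bound is Wyner and Ziv's Mrs. Gerber's Lemma (attained by symmetric channels) and the upper bound is attained by erasure channels. It is stated here as a standard reference point, not quoted from the paper:

```latex
% Classical information-combining bounds (alpha = 1 case), where h is the
% binary entropy function and a \star b = a(1-b) + (1-a)b:
h\!\bigl( h^{-1}(H(X_1|Y_1)) \star h^{-1}(H(X_2|Y_2)) \bigr)
  \;\le\; H(X_1 \oplus X_2 \mid Y_1 Y_2)
  \;\le\; H(X_1|Y_1) + H(X_2|Y_2) - H(X_1|Y_1)\,H(X_2|Y_2).
```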
An improved rate region for the classical-quantum broadcast channel
We present a new achievable rate region for the two-user binary-input
classical-quantum broadcast channel. The result is a generalization of the
classical Marton-Gelfand-Pinsker region and is provably larger than the best
previously known rate region for classical-quantum broadcast channels. The
proof of achievability is based on the recently introduced polar coding scheme
and its generalization to quantum network information theory.
Comment: 5 pages, double column, 1 figure, based on a result presented in the Master's thesis arXiv:1501.0373
Bounds on Information Combining With Quantum Side Information
"Bounds on information combining" are entropic inequalities that determine
how the information (entropy) of a set of random variables can change when
these are combined in certain prescribed ways. Such bounds play an important
role in classical information theory, particularly in coding and Shannon
theory; entropy power inequalities are special instances of them. The arguably
most elementary kind of information combining is the addition of two binary
random variables (a CNOT gate), and the resulting quantities play an important
role in Belief propagation and Polar coding. We investigate this problem in the
setting where quantum side information is available, which has been recognized
as a hard setting for entropy power inequalities.
Our main technical result is a non-trivial, and close to optimal, lower bound
on the combined entropy, which can be seen as an almost optimal "quantum Mrs.
Gerber's Lemma". Our proof uses three main ingredients: (1) a new bound on the
concavity of von Neumann entropy, which is tight in the regime of low pairwise
state fidelities; (2) the quantitative improvement of strong subadditivity due
to Fawzi-Renner, in which we manage to handle the minimization over recovery
maps; (3) recent duality results on classical-quantum channels due to Renes et
al. We furthermore present conjectures on the optimal lower and upper bounds
under quantum side information, supported by interesting analytical
observations and strong numerical evidence.
We finally apply our bounds to Polar coding for binary-input
classical-quantum channels, and show the following three results: (A) Even
non-stationary channels polarize under the polar transform. (B) The blocklength
required to approach the symmetric capacity scales at most sub-exponentially in
the gap to capacity. (C) Under the aforementioned lower bound conjecture, a
blocklength polynomial in the gap suffices.
Comment: 23 pages, 6 figures; v2: small correction
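Result (A) concerns the quantum setting, but the mechanics of entropy polarization can be illustrated with a purely classical toy computation (our own sketch, not the paper's analysis): for binary erasure channels the information-combining bounds are tight, so one polar-transform step maps a channel entropy h to the pair 2h − h² (worse channel) and h² (better channel).

```python
def polar_entropies(h0, n):
    """Entropies of the 2**n synthesized channels after n polar-transform
    steps, starting from a binary erasure channel with entropy h0.
    For erasure channels the combining relations hold with equality:
    the minus transform yields 2h - h**2, the plus transform yields h**2."""
    hs = [h0]
    for _ in range(n):
        hs = [t for h in hs for t in (2 * h - h * h, h * h)]
    return hs

def polarized_fraction(hs, eps=1e-2):
    """Fraction of channels that are nearly noiseless (h < eps) or
    nearly useless (h > 1 - eps)."""
    return sum(h < eps or h > 1 - eps for h in hs) / len(hs)
```

Since the mean of the two children equals the parent entropy, the average entropy is preserved at every step (entropy is a martingale under the transform), while the fraction of polarized channels grows with the number of steps.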
Event-Triggered Estimation of Linear Systems: An Iterative Algorithm and Optimality Properties
This report investigates the optimal design of event-triggered estimation for
first-order linear stochastic systems. The problem is posed as a two-player
team problem with a partially nested information pattern. The two players are
given by an estimator and an event-trigger. The event-trigger has full state
information and decides whether the estimator shall obtain the current state
information by transmitting it through a resource-constrained channel. The
objective is to find an optimal trade-off between the mean squared estimation
error and the expected transmission rate. The proposed iterative algorithm
alternates between optimizing one player while fixing the other player. It is
shown that the solution of the algorithm converges to a linear predictor and a
symmetric threshold policy, if the densities of the initial state and the noise
variables are even and radially decreasing functions. The effectiveness of the
approach is illustrated on a numerical example. In the case of a multimodal
distribution of the noise variables, a significant performance improvement can
be achieved compared to a separate design that assumes a linear predictor and
a symmetric threshold policy.
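A minimal simulation of the two policy classes discussed above (a linear predictor at the estimator and a symmetric threshold at the event-trigger) illustrates the trade-off between estimation error and transmission rate; this sketch, including all parameter choices, is our own illustration and not the report's iterative algorithm:

```python
import random

def simulate(a=0.9, sigma=1.0, delta=1.0, steps=20000, seed=0):
    """Event-triggered estimation for a first-order linear system
    x_{k+1} = a*x_k + w_k (illustrative sketch). The trigger transmits
    the state whenever the prediction error exceeds the threshold delta;
    otherwise the estimator runs an open-loop linear prediction.
    Returns (mean squared error, empirical transmission rate)."""
    rng = random.Random(seed)
    x = 0.0       # true state
    xhat = 0.0    # estimator's state
    sq_err = 0.0
    transmissions = 0
    for _ in range(steps):
        x = a * x + rng.gauss(0.0, sigma)  # plant with Gaussian noise
        xhat = a * xhat                    # linear prediction between events
        if abs(x - xhat) > delta:          # symmetric threshold rule
            xhat = x                       # event: send the current state
            transmissions += 1
        sq_err += (x - xhat) ** 2
    return sq_err / steps, transmissions / steps
```

Raising delta lowers the transmission rate while keeping the per-step squared error bounded by delta**2, since the trigger fires whenever the prediction error would exceed the threshold.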
Efficient achievability for quantum protocols using decoupling theorems
Proving achievability of protocols in quantum Shannon theory usually does not
consider the efficiency at which the goal of the protocol can be achieved.
Nevertheless it is known that protocols such as coherent state merging are
efficiently achievable at optimal rate. We aim to investigate this fact further
in a general one-shot setting, by considering certain classes of decoupling
theorems and giving exact rates for these classes. Moreover, we compare results
of general decoupling theorems using Haar-distributed unitaries with those
using smaller sets of operators, in particular approximate 2-designs. We also
observe the behavior of our rates in special cases, such as the approximation
parameter approaching zero and the asymptotic limit.
Comment: 5 pages, double column, v2: added referenc