Partial Information Decomposition via Deficiency for Multivariate Gaussians
Bivariate partial information decompositions (PIDs) characterize how the
information in a "message" random variable is decomposed between two
"constituent" random variables in terms of unique, redundant and synergistic
information components. These components are a function of the joint
distribution of the three variables, and are typically defined using an
optimization over the space of all possible joint distributions. This makes it
computationally challenging to compute PIDs in practice and restricts their use
to low-dimensional random vectors. To ease this burden, we consider the case of
jointly Gaussian random vectors in this paper. This case was previously
examined by Barrett (2015), who showed that certain operationally
well-motivated PIDs reduce to a closed-form expression for scalar messages.
Here, we show that Barrett's result does not extend to vector messages in
general, and we characterize the set of multivariate Gaussian distributions
for which the PID admits a closed form. Then, for all other multivariate Gaussian distributions,
we propose a convex optimization framework for approximately computing a
specific PID definition based on the statistical concept of deficiency. Using
simplifying assumptions specific to the Gaussian case, we provide an efficient
algorithm to approximately compute the bivariate PID for multivariate Gaussian
variables with tens or even hundreds of dimensions. We also theoretically and
empirically justify the goodness of this approximation.

Comment: Presented at ISIT 2022. This version has been updated to reflect the
final conference publication, including appendices. It also corrects
technical errors in Remark 1 and Appendix C, adds a new experiment, and has a
substantially improved presentation as well as additional detail in the
appendix, compared to the previous arXiv version.
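The closed-form scalar-message case referenced above can be sketched numerically. The snippet below is a minimal illustration, assuming the minimum-mutual-information (MMI) style of redundancy that Barrett's result concerns; the example covariance matrix is hypothetical, not from the paper.

```python
import numpy as np

def gaussian_mi(cov, idx_a, idx_b):
    """I(A;B) in nats for jointly Gaussian variables, via log-determinants."""
    a, b = list(idx_a), list(idx_b)
    logdet = lambda ix: np.linalg.slogdet(cov[np.ix_(ix, ix)])[1]
    return 0.5 * (logdet(a) + logdet(b) - logdet(a + b))

# Hypothetical joint covariance of (M, X1, X2), scalar message M.
cov = np.array([[1.0, 0.6, 0.5],
                [0.6, 1.0, 0.2],
                [0.5, 0.2, 1.0]])

i_m_x1  = gaussian_mi(cov, [0], [1])
i_m_x2  = gaussian_mi(cov, [0], [2])
i_m_x12 = gaussian_mi(cov, [0], [1, 2])

red  = min(i_m_x1, i_m_x2)          # redundancy (MMI-style definition)
unq1 = i_m_x1 - red                 # information unique to X1
unq2 = i_m_x2 - red                 # information unique to X2
syn  = i_m_x12 - red - unq1 - unq2  # synergy (remainder of the joint MI)
```

The four atoms sum to the total mutual information I(M; X1, X2) by construction; for vector messages, the abstract's point is that no such simple closed form is available in general.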
Capacity Region of Vector Gaussian Interference Channels with Generally Strong Interference
An interference channel is said to have strong interference if for all input
distributions, the receivers can fully decode the interference. This definition
of strong interference applies to discrete memoryless, scalar and vector
Gaussian interference channels. However, there exist vector Gaussian
interference channels that may not satisfy the strong interference condition
but for which the capacity can still be achieved by jointly decoding the signal
and the interference. This kind of interference is called generally strong
interference. Sufficient conditions for a vector Gaussian interference channel
to have generally strong interference are derived. The sum-rate capacity and
the boundary points of the capacity region are also determined.

Comment: 50 pages, 11 figures, submitted to IEEE Trans. on Information Theory.
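For intuition, the scalar analogue of the strong-interference result can be sketched in a few lines: under strong interference the capacity region is the intersection of the two multiple-access regions at the receivers. This is a minimal sketch for the standard-form two-user scalar Gaussian interference channel (unit noise variances, cross-link power gains `a12`, `a21`), not the vector-channel conditions derived in the paper.

```python
import math

def c(x):
    """Gaussian capacity function, bits per channel use."""
    return 0.5 * math.log2(1 + x)

def strong_interference_region(p1, p2, a12, a21):
    """Rate constraints of the two-user scalar Gaussian IC capacity region
    under strong interference (a12 >= 1 and a21 >= 1): each receiver can
    jointly decode both messages, so the region is the intersection of the
    two multiple-access-channel regions."""
    r1_max = c(p1)
    r2_max = c(p2)
    # Sum rate is limited by the weaker of the two MAC sum constraints.
    sum_max = min(c(p1 + a12 * p2), c(a21 * p1 + p2))
    return r1_max, r2_max, sum_max
```

The "generally strong" regime of the paper covers vector channels where these per-input-distribution decoding conditions fail, yet joint decoding still achieves capacity.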
Noisy-Interference Sum-Rate Capacity for Vector Gaussian Interference Channels
New sufficient conditions for a vector Gaussian interference channel to achieve the sum-rate capacity by treating interference as noise are derived, which generalize and extend the existing results. More concise conditions for the multiple-input single-output and single-input multiple-output scenarios are also obtained.
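The treating-interference-as-noise (TIN) sum rate that these conditions concern is easy to state in the scalar case. Below is a minimal sketch for a two-user scalar Gaussian channel in standard form (unit noise variances, cross-link power gains `a12`, `a21`); it illustrates the achievable quantity, not the paper's vector-channel optimality conditions.

```python
import math

def tin_sum_rate(p1, p2, a12, a21):
    """Achievable sum rate (bits per channel use) for a two-user scalar
    Gaussian interference channel when each receiver treats the
    interfering signal as additional Gaussian noise."""
    r1 = 0.5 * math.log2(1 + p1 / (1 + a12 * p2))
    r2 = 0.5 * math.log2(1 + p2 / (1 + a21 * p1))
    return r1 + r2
```

The paper's contribution is sufficient ("noisy interference") conditions under which this simple strategy is actually sum-rate optimal for vector Gaussian channels.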