
    On Communication through a Gaussian Channel with an MMSE Disturbance Constraint

    This paper considers a Gaussian channel with one transmitter and two receivers. The goal is to maximize the communication rate at the intended (primary) receiver subject to a disturbance constraint at the unintended (secondary) receiver. The disturbance is measured in terms of the minimum mean square error (MMSE) of the interference that the transmission to the primary receiver inflicts on the secondary receiver. The paper presents a new upper bound for the problem of maximizing the mutual information subject to an MMSE constraint. The new bound holds for vector inputs of any length and recovers a previously known limiting expression (obtained as the length of the vector input tends to infinity) from the work of Bustin et al. The key technical novelty is a new upper bound on the MMSE. This bound allows one to bound the MMSE for all signal-to-noise ratio (SNR) values below a certain SNR at which the MMSE is known (which corresponds to the disturbance constraint). It complements the 'single-crossing point property' of the MMSE, which upper bounds the MMSE for all SNR values above a certain value at which the MMSE is known. The MMSE upper bound provides a refined characterization of the phase-transition phenomenon that manifests, in the limit as the length of the vector input goes to infinity, as a discontinuity of the MMSE for the problem at hand. For vector inputs of size n = 1, a matching lower bound, to within an additive gap of order O(log log(1/MMSE)) (where MMSE is the disturbance constraint), is shown by means of the mixed-inputs technique recently introduced by Dytso et al.

    Comment: Submitted to IEEE Transactions on Information Theory
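    As an illustrative sketch (not taken from the paper): the single-crossing point property referenced above says that the MMSE curve of an arbitrary input, as a function of SNR, crosses the MMSE curve of a Gaussian input at most once. The snippet below evaluates the closed-form MMSE of a Gaussian input and the MMSE of a BPSK input (via the standard Gaussian-integral expression, computed numerically) and counts sign changes of their difference; the choice of a variance-0.5 Gaussian reference is an arbitrary assumption made so that the curves actually cross.

    ```python
    import numpy as np

    def mmse_gauss(snr, var=1.0):
        """MMSE of estimating X ~ N(0, var) from sqrt(snr)*X + Z, Z ~ N(0, 1)."""
        return var / (1.0 + snr * var)

    def mmse_bpsk(snr, n=20001):
        """MMSE of a BPSK input X in {+1, -1} over the same channel, using the
        standard integral form  mmse(snr) = 1 - int phi(y) tanh(snr - sqrt(snr) y) dy,
        evaluated here by a simple Riemann sum over y in [-10, 10]."""
        y = np.linspace(-10.0, 10.0, n)
        phi = np.exp(-y**2 / 2.0) / np.sqrt(2.0 * np.pi)
        integrand = phi * np.tanh(snr - np.sqrt(snr) * y)
        return 1.0 - integrand.sum() * (y[1] - y[0])

    # Difference between a lower-power Gaussian reference (var = 0.5) and BPSK:
    # negative at low SNR (BPSK MMSE is larger there), positive at high SNR
    # (BPSK MMSE decays much faster). The single-crossing point property says
    # the sign can change at most once.
    snrs = np.linspace(0.01, 10.0, 100)
    diff = np.array([mmse_gauss(s, var=0.5) - mmse_bpsk(s) for s in snrs])
    crossings = int(np.sum(np.sign(diff[:-1]) != np.sign(diff[1:])))
    ```

    On this grid `crossings` equals one: the two MMSE curves intersect at a single SNR, matching the at-most-one-crossing behavior that the paper's new bound complements from the low-SNR side.
    
    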