
    Source Coding Problems with Conditionally Less Noisy Side Information

    A computable expression for the rate-distortion (RD) function proposed by Heegard and Berger has eluded information theory for nearly three decades. Heegard and Berger's single-letter achievability bound is well known to be optimal for \emph{physically degraded} side information; however, it is not known whether the bound is optimal for arbitrarily correlated side information (general discrete memoryless sources). In this paper, we consider a new setup in which the side information at one receiver is \emph{conditionally less noisy} than the side information at the other. The new setup includes degraded side information as a special case, and it is motivated by the literature on degraded and less noisy broadcast channels. Our key contribution is a converse proving the optimality of Heegard and Berger's achievability bound in this new setting. The converse rests upon a certain \emph{single-letterization} lemma, which we prove using an information-theoretic telescoping identity recently presented by Kramer. We also generalise the above ideas to two different successive-refinement problems.
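
    For orientation, the "less noisy" ordering for broadcast channels compares two observations through mutual informations with an auxiliary random variable; the sketch below writes that ordering and a conditional variant of it in the spirit of the abstract. The conditioning variable W and the Markov constraint on the auxiliary U are assumptions for illustration, not the paper's exact definition.

```latex
% Illustrative sketch only: X is the source, Y_1 and Y_2 are the two side
% informations, W is an assumed conditioning variable, and U ranges over
% auxiliaries satisfying the Markov chain U -- X -- (Y_1, Y_2).
\begin{align*}
  \text{$Y_1$ less noisy than $Y_2$:} \quad
    & I(U; Y_1) \ge I(U; Y_2) \quad \text{for all such } U, \\
  \text{$Y_1$ conditionally less noisy than $Y_2$ given $W$:} \quad
    & I(U; Y_1 \mid W) \ge I(U; Y_2 \mid W) \quad \text{for all such } U.
\end{align*}
```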

    Privacy-Utility Management of Hypothesis Tests

    The trade-off between hypothesis tests on the correlated privacy and utility hypotheses is studied. The error exponent of the Bayesian composite hypothesis test on the privacy or utility hypothesis can be characterized by the corresponding minimal Chernoff information rate. An optimal management scheme protects privacy by minimizing the error exponent of the privacy hypothesis test while guaranteeing the utility hypothesis testing performance through a lower bound on the corresponding minimal Chernoff information rate. The asymptotic minimum error exponent of the privacy hypothesis test is shown to be characterized by the infimum of the corresponding minimal Chernoff information rates subject to the utility guarantees. Comment: accepted in IEEE Information Theory Workshop 201
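
    For orientation, the Chernoff information between two fixed discrete distributions is the single-shot quantity behind the Chernoff information rates used above: it is the best achievable error exponent of a Bayesian binary hypothesis test. A minimal numerical sketch follows; the function name and toy distributions are ours, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chernoff_information(p, q, eps=1e-12):
    """Chernoff information C(p, q) = max_{0 <= lam <= 1} -log sum_x p(x)^lam q(x)^(1-lam)
    for two discrete distributions given as probability vectors (in nats)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()

    def log_chernoff_coefficient(lam):
        # log sum_x p(x)^lam q(x)^(1-lam); minimising this over lam gives the exponent.
        return np.log(np.sum(p ** lam * q ** (1.0 - lam)))

    res = minimize_scalar(log_chernoff_coefficient, bounds=(0.0, 1.0), method="bounded")
    return -res.fun  # best achievable Bayesian error exponent

if __name__ == "__main__":
    # Toy example: two ternary distributions.
    p = [0.7, 0.2, 0.1]
    q = [0.3, 0.4, 0.3]
    print(f"Chernoff information: {chernoff_information(p, q):.4f} nats")
```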

    Lossy Source Coding with Reconstruction Privacy

    We consider the problem of lossy source coding with side information under a privacy constraint: the reconstruction sequence at a decoder should be kept secret to a certain extent from another terminal such as an eavesdropper, a sender, or a helper. We are interested in how the reconstruction privacy constraint at a particular terminal affects the rate-distortion tradeoff. In this work, we allow the decoder to use a random mapping, and give inner and outer bounds on the rate-distortion-equivocation region for the cases where the side information is available non-causally and causally at the decoder. In the special case where each reconstruction symbol depends only on the source description and the current side information symbol, the complete rate-distortion-equivocation region is provided. A binary example is given that illustrates a new tradeoff due to the privacy constraint and a gain from the use of a stochastic decoder. Comment: 22 pages, added proofs, to be presented at ISIT 201
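
    As a back-of-the-envelope illustration (ours, not the paper's binary example) of why a stochastic decoder can help: randomising the reconstruction buys equivocation about the reconstruction sequence at a bounded distortion cost. The sketch below assumes a binary reconstruction, Hamming distortion, and an observer who knows everything the decoder knows except its local randomness.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Toy illustration: a deterministic decoder outputs some reconstruction; a
# stochastic decoder additionally flips each reconstruction bit independently
# with probability r using local randomness. Against an observer who knows
# everything the decoder knows except those flips, this adds h2(r) bits of
# equivocation per reconstruction symbol, while the expected Hamming
# distortion increases by at most r.
for r in [0.0, 0.05, 0.1, 0.2]:
    print(f"flip prob r = {r:.2f}: extra equivocation ~ {h2(r):.3f} bits/symbol, "
          f"extra distortion <= {r:.2f}")
```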

    Stabilization of Linear Systems Over Gaussian Networks

    The problem of remotely stabilizing a noisy linear time-invariant plant over a Gaussian relay network is addressed. The network consists of a sensor node, a group of relay nodes, and a remote controller. The sensor and the relay nodes operate subject to an average transmit power constraint, and they can cooperate to communicate the observations of the plant's state to the remote controller. The communication links between all nodes are modeled as Gaussian channels. Necessary as well as sufficient conditions for mean-square stabilization over various network topologies are derived. The sufficient conditions are in general obtained using delay-free linear policies, and the necessary conditions are obtained using information-theoretic tools. Different settings where linear policies are optimal, asymptotically optimal (in certain parameters of the system), or suboptimal are identified. For the case with noisy multi-dimensional sources controlled over scalar channels, it is shown that linear time-varying policies lead to minimum capacity requirements, meeting the fundamental lower bound. For the case with noiseless sources and parallel channels, non-linear policies which meet the lower bound are identified.
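
    The fundamental lower bound referred to above is of the data-rate-theorem type: stabilization requires more rate (or capacity) than the plant's unstable dynamics generate, i.e. more than the sum of log2 of the unstable eigenvalue magnitudes. As an illustrative check (the plant matrix and SNR are hypothetical, and the paper's exact conditions for relay networks are not reproduced here), the sketch below compares that quantity with a scalar AWGN channel capacity.

```python
import numpy as np

def unstable_log_sum(A):
    """Sum of log2 |lambda_i(A)| over unstable eigenvalues (|lambda_i| > 1).
    This is the classical lower bound on the rate/capacity needed for
    mean-square stabilization, in bits per time step."""
    eigs = np.linalg.eigvals(np.asarray(A, dtype=float))
    return sum(np.log2(abs(lam)) for lam in eigs if abs(lam) > 1.0)

def awgn_capacity(snr):
    """Capacity of a scalar AWGN channel in bits per channel use."""
    return 0.5 * np.log2(1.0 + snr)

if __name__ == "__main__":
    # Hypothetical plant and channel, for illustration only.
    A = [[1.2, 1.0],
         [0.0, 1.5]]
    snr = 10.0
    need = unstable_log_sum(A)
    have = awgn_capacity(snr)
    print(f"required rate > {need:.3f} bits/step, channel capacity = {have:.3f} bits/use")
    print("necessary condition satisfied" if have > need else "necessary condition violated")
```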

    On the Entropy Computation of Large Complex Gaussian Mixture Distributions

    Computing the entropy of Gaussian mixture distributions with a large number of components has prohibitive computational complexity. In this paper, we propose a novel approach exploiting the sphere-decoding concept to bound and approximate such entropy terms with reduced complexity and good accuracy. Moreover, we propose an SNR-region-based enhancement of the approximation method to reduce the complexity even further. Using Monte-Carlo simulations, the proposed methods are numerically demonstrated for the computation of the mutual information, including the entropy term, of various channels with finite-constellation modulations such as binary and quadrature amplitude modulation (QAM) inputs for communication applications. Comment: 14 pages, accepted to IEEE Transactions on Signal Processing
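
    The brute-force alternative that the sphere-decoding approach is designed to avoid is a plain Monte-Carlo estimate, whose per-sample cost grows with the number of mixture components. The sketch below shows that baseline for an equal-weight complex Gaussian mixture with a common variance and a 4-QAM toy constellation (all assumptions on our part); it does not reproduce the proposed bound.

```python
import numpy as np

rng = np.random.default_rng(0)

def gmm_entropy_mc(means, sigma2, n_samples=50_000):
    """Monte-Carlo estimate of the differential entropy (in nats) of an
    equal-weight complex Gaussian mixture with component means `means`
    (1-D complex array) and common per-component variance `sigma2`.
    Cost per sample is O(len(means)), which is what becomes prohibitive
    for large constellations or antenna counts."""
    means = np.asarray(means, dtype=complex)
    M = len(means)
    # Draw samples from the mixture: pick a component, then add CN(0, sigma2) noise.
    idx = rng.integers(M, size=n_samples)
    noise = rng.normal(scale=np.sqrt(sigma2 / 2), size=(n_samples, 2)) @ np.array([1, 1j])
    y = means[idx] + noise
    # Evaluate the mixture density at each sample (the expensive part).
    d2 = np.abs(y[:, None] - means[None, :]) ** 2
    pdf = np.mean(np.exp(-d2 / sigma2) / (np.pi * sigma2), axis=1)
    return -np.mean(np.log(pdf))

if __name__ == "__main__":
    # Toy example: 4-QAM constellation points as mixture means, unit-variance noise.
    qam4 = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
    print(f"h(Y) ~ {gmm_entropy_mc(qam4, sigma2=1.0):.3f} nats")
```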

    Low Complexity Scalable Iterative Algorithms for IEEE 802.11p Receivers

    In this paper, we investigate receivers for Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communications. Vehicular channels are characterized by multiple paths and time variations, which introduce challenges in the design of receivers. We propose an algorithm for IEEE 802.11p-compliant receivers, based on Orthogonal Frequency Division Multiplexing (OFDM). We employ iterative structures in the receiver as a way to estimate the channel despite variations within a frame. The channel estimator is based on factor graphs, which allow the design of soft iterative receivers while keeping an acceptable computational complexity. Throughout this work, we focus on designing a receiver offering a good complexity-performance trade-off. Moreover, we propose a scalable algorithm that can tune this trade-off depending on the channel conditions. Our algorithm allows reliable communications while offering a considerable decrease in computational complexity. In particular, numerical results show the trade-off between complexity, measured in computational time, and performance, measured in BER and FER, for various interpolation lengths used by the estimator, all of which outperform the standard least-squares solution by orders of magnitude. Furthermore, our adaptive algorithm shows a considerable improvement in computational time and complexity over state-of-the-art and classical receivers, whilst maintaining acceptable BER and FER performance.
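
    For context, the standard least-squares baseline mentioned above amounts to per-pilot LS estimates followed by interpolation across the frame. The sketch below is our own toy single-subcarrier version of that baseline (pilot positions, channel drift model, and noise level are all hypothetical); the paper's factor-graph estimator is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def ls_channel_estimate(rx_pilots, tx_pilots):
    """Per-pilot least-squares channel estimate H_hat = Y / X.
    This is the standard LS baseline, not the paper's factor-graph estimator."""
    return rx_pilots / tx_pilots

def interpolate_frame(h_pilot, pilot_positions, n_symbols):
    """Linearly interpolate pilot-position estimates (real-valued part) across
    the OFDM symbols of a frame for one subcarrier. Longer interpolation
    lengths trade complexity for tracking of the time-varying channel."""
    return np.interp(np.arange(n_symbols), pilot_positions, h_pilot)

if __name__ == "__main__":
    # Toy single-subcarrier example: a channel tap that drifts over 40 OFDM symbols.
    n_symbols = 40
    true_h = np.exp(1j * 0.05 * np.arange(n_symbols)) * (1 + 0.01 * np.arange(n_symbols))
    pilot_positions = np.array([0, 13, 26, 39])              # hypothetical pilot symbols
    tx_pilots = np.ones(len(pilot_positions), dtype=complex)  # known unit pilots
    rx_pilots = true_h[pilot_positions] * tx_pilots + 0.05 * (
        rng.standard_normal(len(pilot_positions)) + 1j * rng.standard_normal(len(pilot_positions)))
    h_hat = ls_channel_estimate(rx_pilots, tx_pilots)
    h_track = (interpolate_frame(h_hat.real, pilot_positions, n_symbols)
               + 1j * interpolate_frame(h_hat.imag, pilot_positions, n_symbols))
    print(f"mean tracking error: {np.mean(np.abs(h_track - true_h)):.4f}")
```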

    Massive MIMO Pilot Retransmission Strategies for Robustification against Jamming

    This letter proposes anti-jamming strategies based on pilot retransmission for a single-user uplink massive MIMO system under a jamming attack. The jammer is assumed to attack the system in both the training and data transmission phases. We first derive an achievable rate which enables us to analyze the effect of jamming attacks on the system performance. Counter-attack strategies are then proposed to mitigate this effect under two different scenarios: random and deterministic jamming attacks. Numerical results illustrate our analysis and the benefit of the proposed schemes.
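
    As a toy illustration (not the paper's model, rate analysis, or retransmission scheme) of why jamming in the training phase is damaging: the base station correlates its received training block with the user's pilot, so any jamming energy aligned with that pilot leaks into the channel estimate and biases the maximum-ratio combiner. The antenna count, pilot length, and powers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy pilot-phase jamming in a single-user massive MIMO uplink.
M, tau = 100, 16                  # hypothetical antenna count and pilot length
h = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)   # user channel
g = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)   # jammer channel
pilot = np.exp(2j * np.pi * rng.random(tau))   # unit-modulus user pilot sequence

for jam_aligned in (False, True):
    # The jammer either reuses the user's pilot or sends an independent sequence.
    jam_seq = pilot if jam_aligned else np.exp(2j * np.pi * rng.random(tau))
    noise = 0.1 * (rng.standard_normal((M, tau)) + 1j * rng.standard_normal((M, tau)))
    Y = np.outer(h, pilot) + np.outer(g, jam_seq) + noise   # received training block
    h_hat = Y @ pilot.conj() / tau                          # correlate with the pilot
    # Cosine similarity between the contaminated combiner h_hat and the true channel h.
    align = abs(h_hat.conj() @ h) / (np.linalg.norm(h_hat) * np.linalg.norm(h))
    print(f"jammer {'aligned with' if jam_aligned else 'independent of'} pilot: "
          f"combiner alignment = {align:.3f}")
```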