
    Correlated Sources In Distributed Networks - Data Transmission, Common Information Characterization and Inferencing

    Correlation is often present among observations in a distributed system. This thesis deals with various design issues that arise when correlated data are observed at distributed terminals, including: communicating correlated sources over interference channels, characterizing the common information among dependent random variables, and testing for the presence of dependence among observations. It is well known that separate source and channel coding is optimal for point-to-point communication. However, this is not the case for multi-terminal communications. In this thesis, we study the problem of communicating correlated sources over interference channels (ICs), for both the lossless and the lossy case. For the lossless case, a sufficient condition is found using the techniques of random source partition and correlation-preserving codeword generation. The sufficient condition reduces to the Han-Kobayashi achievable rate region for ICs with independent observations. Moreover, the proposed coding scheme is optimal for transmitting a special class of correlated sources over a class of deterministic interference channels. We then study the general case of lossy transmission of two correlated sources over a two-user discrete memoryless interference channel (DMIC). An achievable distortion region is obtained and Gaussian examples are studied. The second topic is the generalization of Wyner's definition of the common information of a pair of random variables to that of N random variables. Coding theorems are obtained to show that the same operational meanings for the common information of two random variables apply to that of N random variables. We establish a monotone property of Wyner's common information, which is in contrast to other notions of common information, specifically Shannon's mutual information and Gács and Körner's common randomness. We then extend Wyner's common information to continuous random variables and provide an operational meaning using the Gray-Wyner network with lossy source coding. We show that Wyner's common information equals the smallest common message rate when the total rate is arbitrarily close to the rate-distortion function with joint decoding. Finally, we consider the problem of distributed testing of statistical independence under communication constraints. Focusing on the Gaussian case because of its tractability, we characterize optimal scalar quantizers for the distributed test of independence, where optimality is considered both in the finite-sample regime and in the asymptotic regime.
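    For reference, Wyner's common information and its N-variable extension can be sketched as follows; this is the standard formulation, and the precise constraint set used in the thesis may differ in detail.

        \[
          C(X_1;\dots;X_N) \;=\; \inf_{W}\; I(X_1,\dots,X_N;W)
          \quad\text{s.t.}\quad p(x_1,\dots,x_N \mid w) \;=\; \prod_{i=1}^{N} p(x_i \mid w).
        \]

    For N = 2 this reduces to Wyner's original definition C(X;Y) = \min_{X - W - Y} I(X,Y;W), and the three notions mentioned above are ordered as K(X;Y) \le I(X;Y) \le C(X;Y), where K denotes the Gács-Körner common randomness.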

    A new achievable rate region for interference channels with common information

    In this paper, a new achievable rate region for general interference channels with common information is presented. Our result improves upon [1] by applying simultaneous superposition coding instead of sequential superposition coding. A detailed computation and comparison of the achievable rate regions for the Gaussian case is conducted. The proposed achievable rate region is shown to coincide with the capacity region in the strong interference case [2].
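    For the Gaussian comparison, a sketch of the standard two-user model (an assumed parameterization, not necessarily the one used in the paper) is

        \[
          Y_1 = X_1 + a\,X_2 + Z_1, \qquad Y_2 = b\,X_1 + X_2 + Z_2,
          \qquad Z_1, Z_2 \sim \mathcal{N}(0,1), \quad \mathbb{E}[X_i^2] \le P_i,
        \]

    where, in the classical setting without common information, strong interference corresponds to a^2 \ge 1 and b^2 \ge 1.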

    Accessible Capacity of Secondary Users

    A new problem formulation is presented for the Gaussian interference channel (GIFC) with two pairs of users, which are distinguished as primary users and secondary users, respectively. The primary users employ a pair of encoder and decoder that were originally designed to satisfy a given error performance requirement under the assumption that no interference exists from other users. In the scenario where the secondary users attempt to access the same medium, we are interested in the maximum transmission rate (defined as the accessible capacity) at which the secondary users can communicate reliably without violating the error performance requirement of the primary users, under the constraint that the primary encoder (not the decoder) is kept unchanged. By modeling the primary encoder as a generalized trellis code (GTC), we are then able to treat the secondary link and the cross link from the secondary transmitter to the primary receiver as finite-state channels (FSCs). Based on this, upper and lower bounds on the accessible capacity are derived. The impact of the primary users' error performance requirement on the accessible capacity is analyzed using the concept of interference margin. In the case of a non-trivial interference margin, the secondary message is split into common and private parts and then encoded by superposition coding, which yields a lower bound on the accessible capacity. For some special cases, these bounds can be computed numerically using the BCJR algorithm. Numerical results are also provided to gain insight into the impact of the GTC and the error performance requirement on the accessible capacity.
    Comment: 42 pages, 12 figures, 2 tables; submitted to IEEE Transactions on Information Theory in December 2010, revised in November 201
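    As a rough illustration of the BCJR-based computation mentioned above, the sketch below (in Python, with an assumed state/output parameterization `trans` that is not taken from the paper) shows the forward (alpha) pass used to evaluate quantities of the form p(y^n) for a finite-state channel; it is a minimal sketch, not the authors' implementation.

        import numpy as np

        def forward_log_prob(y, init, trans):
            """Forward (alpha) pass of the BCJR recursion for a finite-state channel.

            y     : observed output symbols, as integer indices
            init  : init[s] = P(S_0 = s), shape (num_states,)
            trans : trans[s, s_next, out] = P(S_{t+1} = s_next, Y_t = out | S_t = s),
                    with the channel input already averaged over its distribution
                    (an assumed parameterization, for illustration only)

            Returns log p(y_1, ..., y_n), with per-step normalization to avoid
            numerical underflow on long sequences.
            """
            alpha = np.asarray(init, dtype=float)
            log_p = 0.0
            for yt in y:
                alpha = alpha @ trans[:, :, yt]   # sum over the current state
                scale = alpha.sum()               # P(Y_t = yt | y_1, ..., y_{t-1})
                log_p += np.log(scale)
                alpha /= scale
            return log_p

    Estimates such as -(1/n) log p(y^n) over long simulated output sequences are the usual route to numerical information-rate bounds for finite-state channels.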

    Secret-key Agreement with Channel State Information at the Transmitter

    We study the capacity of secret-key agreement over a wiretap channel with state parameters. The transmitter communicates to the legitimate receiver and the eavesdropper over a discrete memoryless wiretap channel with a memoryless state sequence. The transmitter and the legitimate receiver generate a shared secret key that remains secret from the eavesdropper. No public discussion channel is available. The state sequence is known noncausally to the transmitter. We derive lower and upper bounds on the secret-key capacity. The lower bound involves constructing a common state-reconstruction sequence at the legitimate terminals and binning the set of reconstruction sequences to obtain the secret key. For the special case of Gaussian channels with additive interference (secret keys from a dirty paper channel), our bounds differ by 0.5 bit/symbol and coincide in the high signal-to-noise-ratio and high interference-to-noise-ratio regimes. For the case when the state sequence is also revealed to the legitimate receiver, we establish that our lower bound achieves the secret-key capacity. In addition, for this special case, we also propose another scheme that attains the capacity and requires only causal side information at the transmitter and the receiver.
    Comment: 10 pages; submitted to IEEE Transactions on Information Forensics and Security, Special Issue on Using the Physical Layer for Securing the Next Generation of Communication System
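    For the Gaussian special case above (secret keys from a dirty paper channel), one common formulation of the model, sketched here under assumed notation, is

        \[
          Y = X + S + N_y, \qquad Z = X + S + N_z,
        \]

    where S is the interference (state) sequence known noncausally to the transmitter, N_y and N_z are Gaussian noises, and X satisfies an average power constraint \mathbb{E}[X^2] \le P; the 0.5 bit/symbol gap between the bounds closes as both the signal-to-noise and interference-to-noise ratios grow large.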