
    Performance Evaluation of Multiterminal Backhaul Compression for Cloud Radio Access Networks

    In cloud radio access networks (C-RANs), the baseband processing of the available macro- or pico/femto-base stations (BSs) is migrated to control units, each of which manages a subset of BS antennas. The centralized information processing at the control units enables effective interference management. The main roadblock to the implementation of C-RANs is the effective integration of the radio units, i.e., the BSs, with the backhaul network. This work first reviews, in a unified way, recent results on the application of advanced multiterminal, as opposed to standard point-to-point, backhaul compression techniques. The gains provided by multiterminal backhaul compression are then confirmed via extensive simulations based on standard cellular models. As an example, it is observed that multiterminal compression strategies provide performance gains of more than 60% for both the uplink and the downlink in terms of cell-edge throughput.
    Comment: A shorter version of the paper has been submitted to CISS 201
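
    To give a rough feel for where the multiterminal gains come from, the sketch below compares, for a toy jointly Gaussian model that is not taken from the paper, the backhaul rate needed to convey one BS observation with and without exploiting the correlated observation already available at the control unit. It uses the standard Gaussian rate-distortion and Wyner-Ziv formulas; the correlation rho and the distortion target D are illustrative assumptions.

        import numpy as np

        # Toy model (illustrative, not the paper's setup): two BS antennas observe
        # correlated unit-variance Gaussian signals Y1, Y2 with correlation rho.
        # The control unit already holds Y2 and needs Y1 within squared-error
        # distortion D over the backhaul link.
        rho, D = 0.8, 0.05

        var_y1 = 1.0
        var_y1_given_y2 = 1.0 - rho**2          # conditional variance of Y1 given Y2

        # Standard Gaussian rate-distortion (point-to-point) vs. Wyner-Ziv rates.
        r_p2p = max(0.0, 0.5 * np.log2(var_y1 / D))
        r_wz = max(0.0, 0.5 * np.log2(var_y1_given_y2 / D))

        print(f"point-to-point rate  : {r_p2p:.2f} bits/sample")
        print(f"distributed (WZ) rate: {r_wz:.2f} bits/sample")
        print(f"backhaul saving      : {100 * (1 - r_wz / r_p2p):.0f}%")

    The saving grows with the correlation, which is the qualitative effect the multiterminal schemes reviewed in the paper exploit jointly across all backhaul links.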

    Information Masking and Amplification: The Source Coding Setting

    The complementary problems of masking and amplifying channel state information in the Gel'fand-Pinsker channel have recently been solved by Merhav and Shamai, and by Kim et al., respectively. In this paper, we study a related source coding problem. Specifically, we consider the two-encoder source coding setting where one source is to be amplified, while the other source is to be masked. In general, there is a tension between these two objectives, which is characterized by the amplification-masking tradeoff. We give a single-letter description of this tradeoff. We apply this result, together with a recent theorem by Courtade and Weissman on multiterminal source coding, to solve a fundamental entropy characterization problem.
    Comment: 6 pages, 1 figure, to appear at the IEEE 2012 International Symposium on Information Theory (ISIT 2012).
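
    The tension between the two objectives can be seen in a toy binary example that is my own illustration, not the paper's model: S1 is the source to be amplified, S2 = S1 xor Bern(0.1) is the source to be masked, and the encoder releases a noisy copy of S1. Adding more noise lowers the leakage I(W;S2) about the masked source only at the price of lowering the amplification I(W;S1).

        import numpy as np

        def h2(p):
            """Binary entropy in bits."""
            return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

        def conv(a, b):
            """Crossover probability of two cascaded binary symmetric channels."""
            return a * (1 - b) + b * (1 - a)

        # Toy setup: S1 ~ Bern(1/2) is to be amplified, S2 = S1 xor Bern(eps) is to
        # be masked.  The encoder releases W = S1 xor Bern(delta).
        eps = 0.1
        for delta in [0.0, 0.1, 0.25, 0.5]:
            amplification = 1 - h2(delta)           # I(W; S1)
            leakage = 1 - h2(conv(delta, eps))      # I(W; S2)
            print(f"delta={delta:.2f}  I(W;S1)={amplification:.3f}  I(W;S2)={leakage:.3f}")

    The single-letter tradeoff in the paper characterizes exactly which (amplification, masking) pairs are achievable; the curve traced here comes from just one simple strategy.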

    The CEO Problem with Secrecy Constraints

    We study a lossy source coding problem with secrecy constraints in which a remote information source should be transmitted to a single destination via multiple agents in the presence of a passive eavesdropper. The agents observe noisy versions of the source and independently encode and transmit their observations to the destination via noiseless rate-limited links. The destination should estimate the remote source based on the information received from the agents within a certain mean distortion threshold. The eavesdropper, with access to side information correlated with the source, is able to listen in on one of the links from the agents to the destination in order to obtain as much information as possible about the source. This problem can be viewed as the so-called CEO problem with additional secrecy constraints. We establish inner and outer bounds on the rate-distortion-equivocation region of this problem. We also obtain the region in special cases where the bounds are tight. Furthermore, we study the quadratic Gaussian case and provide the optimal rate-distortion-equivocation region when the eavesdropper has no side information, as well as an achievable region for a more general setup with side information at the eavesdropper.
    Comment: Accepted for publication in IEEE Transactions on Information Forensics and Security, 17 pages, 4 figures.
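
    As a crude illustration of the estimation side of this setup only, the sketch below fuses the agents' noisy Gaussian observations at the destination with a linear MMSE estimator. It ignores the rate limits on the links and the secrecy constraint, which are the actual subject of the paper, and the source and noise variances are assumed values.

        import numpy as np

        # Simplified CEO-style estimation (assumptions, not the paper's full model):
        # X ~ N(0, var_x) is the remote source, agent i observes Y_i = X + N_i with
        # N_i ~ N(0, var_n), and the destination sees the observations directly.
        rng = np.random.default_rng(0)
        var_x, var_n, n_samples = 1.0, 0.5, 200_000

        for num_agents in [1, 2, 4, 8]:
            x = rng.normal(0.0, np.sqrt(var_x), n_samples)
            y = x + rng.normal(0.0, np.sqrt(var_n), (num_agents, n_samples))
            # Linear MMSE fusion of the agents' observations.
            y_bar = y.mean(axis=0)
            x_hat = (var_x / (var_x + var_n / num_agents)) * y_bar
            empirical_mse = np.mean((x - x_hat) ** 2)
            theoretical_mse = 1.0 / (1.0 / var_x + num_agents / var_n)
            print(f"L={num_agents}: empirical D={empirical_mse:.4f}, "
                  f"theory D={theoretical_mse:.4f}")

    More agents drive the distortion down; the paper's bounds quantify how much of this gain survives once the links are rate-limited and one of them is wiretapped.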

    Compressed Secret Key Agreement: Maximizing Multivariate Mutual Information Per Bit

    The multiterminal secret key agreement problem by public discussion is formulated with an additional source compression step where, prior to the public discussion phase, users independently compress their private sources to filter out strongly correlated components for generating a common secret key. The objective is to maximize the achievable key rate as a function of the joint entropy of the compressed sources. Since the maximum achievable key rate captures the total amount of information mutual to the compressed sources, an optimal compression scheme essentially maximizes the multivariate mutual information per bit of randomness of the private sources, and can therefore be viewed more generally as a dimension reduction technique. Single-letter lower and upper bounds on the maximum achievable key rate are derived for the general source model, and an explicit polynomial-time computable formula is obtained for the pairwise independent network model. In particular, the converse results and the upper bounds are obtained from those of the related secret key agreement problem with rate-limited discussion. A precise duality is shown for the two-user case with one-way discussion, and this duality is extended to obtain the desired converse results in the multi-user case. In addition to posing new challenges in information processing and dimension reduction, the compressed secret key agreement problem helps shed new light on resolving the difficult problem of secret key agreement with rate-limited discussion, by offering a more structured achievability scheme and some simpler conjectures to prove.
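
    For intuition in the two-user case with unlimited public discussion and no wiretapper side information, the maximum key rate is the mutual information between the (compressed) sources, so the objective amounts to maximizing I(Z1;Z2)/H(Z1,Z2). The sketch below uses a joint distribution I chose purely for illustration: each user holds one shared fair bit plus one private fair bit, and discarding the private bits raises the key rate per bit of joint randomness from 1/3 to 1.

        import itertools
        import numpy as np

        def entropy(pmf):
            """Shannon entropy (bits) of a pmf given as a dict of probabilities."""
            return -sum(p * np.log2(p) for p in pmf.values() if p > 0)

        def marginal(joint, idx):
            pmf = {}
            for (x1, x2), p in joint.items():
                key = (x1, x2)[idx]
                pmf[key] = pmf.get(key, 0) + p
            return pmf

        def mutual_info(joint):
            return entropy(marginal(joint, 0)) + entropy(marginal(joint, 1)) - entropy(joint)

        # Toy two-user example: user 1 observes X1 = (A, B), user 2 observes
        # X2 = (A, C), with A, B, C independent fair bits.  Only A is shared.
        joint = {}
        for a, b, c in itertools.product([0, 1], repeat=3):
            x1, x2 = (a, b), (a, c)
            joint[(x1, x2)] = joint.get((x1, x2), 0) + 1 / 8

        print(f"raw sources       : key rate {mutual_info(joint):.2f} "
              f"per {entropy(joint):.2f} bits of joint entropy")

        # Compressed sources: each user keeps only the shared component A.
        compressed = {}
        for (x1, x2), p in joint.items():
            key = (x1[0], x2[0])
            compressed[key] = compressed.get(key, 0) + p
        print(f"compressed sources: key rate {mutual_info(compressed):.2f} "
              f"per {entropy(compressed):.2f} bits of joint entropy")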

    Distributed Hypothesis Testing with Privacy Constraints

    We revisit the distributed hypothesis testing (or hypothesis testing with communication constraints) problem from the viewpoint of privacy. Instead of observing the raw data directly, the transmitter observes a sanitized or randomized version of it. We impose an upper bound on the mutual information between the raw and randomized data. In this scenario, the receiver, which is also provided with side information, is required to make a decision on whether the null or alternative hypothesis is in effect. We first provide a general lower bound on the type-II exponent for an arbitrary pair of hypotheses. Next, we show that if the distribution under the alternative hypothesis is the product of the marginals of the distribution under the null (i.e., testing against independence), then the exponent is known exactly. Moreover, we show that the strong converse property holds. Using ideas from Euclidean information theory, we also provide an approximate expression for the exponent when the communication rate is low and the privacy level is high. Finally, we illustrate our results with a binary and a Gaussian example.
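
    A simplified binary instance, chosen by me for illustration and ignoring the communication rate limit, makes the privacy/utility tension concrete: when the receiver effectively sees the whole sanitized sequence together with its side information, Stein's lemma gives a type-II exponent of I(Z;Y) for testing against independence, while the privacy leakage is I(X;Z). Stronger sanitization lowers both.

        import numpy as np

        def mi(pxy):
            """Mutual information (bits) of a 2-D joint pmf given as an array."""
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            mask = pxy > 0
            return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

        # Toy binary setup: X ~ Bern(1/2); under the null, Y = X xor Bern(eps);
        # under the alternative, Y is independent of X with the same marginal.
        # The transmitter sees only the sanitized Z = X xor Bern(delta).
        eps = 0.1
        for delta in [0.0, 0.05, 0.15, 0.3]:
            p_xz = 0.5 * np.array([[1 - delta, delta], [delta, 1 - delta]])
            gamma = delta * (1 - eps) + eps * (1 - delta)   # effective Z-to-Y crossover
            p_zy = 0.5 * np.array([[1 - gamma, gamma], [gamma, 1 - gamma]])
            print(f"delta={delta:.2f}  leakage I(X;Z)={mi(p_xz):.3f}  "
                  f"exponent I(Z;Y)={mi(p_zy):.3f}")

    The paper's results characterize this tradeoff in the presence of an actual rate constraint on the transmitter's message, where the exponent is no longer simply I(Z;Y).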

    Network Information Flow with Correlated Sources

    In this paper, we consider a network communications problem in which multiple correlated sources must be delivered to a single data collector node, over a network of noisy independent point-to-point channels. We prove that perfect reconstruction of all the sources at the sink is possible if and only if, for all partitions of the network nodes into two subsets S and S^c such that the sink is always in S^c, we have that H(U_S|U_{S^c}) < \sum_{i\in S, j\in S^c} C_{ij}. Our main finding is that in this setup a general source/channel separation theorem holds, and that Shannon information behaves as a classical network flow, identical in nature to the flow of water in pipes. At first glance, it might seem surprising that separation holds in a fairly general network situation like the one we study. A closer look, however, reveals that the reason for this is that our model allows only for independent point-to-point channels between pairs of nodes, and not multiple-access and/or broadcast channels, for which separation is well known not to hold. This "information as flow" view provides an algorithmic interpretation for our results, among which perhaps the most important one is the optimality of implementing codes using a layered protocol stack.
    Comment: Final version, to appear in the IEEE Transactions on Information Theory; contains (very) minor changes based on the last round of review.
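
    The cut condition is easy to check mechanically on a small instance. In the sketch below, the three-node network, the link capacities, and the source statistics are all illustrative assumptions: nodes 1 and 2 hold correlated bits, node 0 is the sink, and each cut separating some sources from the sink must have enough capacity to carry the conditional entropy of the sources it isolates.

        from itertools import combinations
        import numpy as np

        def h2(p):
            """Binary entropy in bits."""
            return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

        # Toy instance: U1 ~ Bern(1/2) at node 1, U2 = U1 xor Bern(p) at node 2,
        # sink at node 0.  Conditional entropies H(U_S | U_{S^c}) for each subset S
        # of source nodes.
        p = 0.2
        cond_entropy = {
            frozenset({1}): h2(p),            # H(U1 | U2)
            frozenset({2}): h2(p),            # H(U2 | U1)
            frozenset({1, 2}): 1 + h2(p),     # H(U1, U2)
        }

        # Point-to-point link capacities in bits per channel use.
        capacity = {(1, 0): 1.0, (2, 0): 0.9, (1, 2): 0.2, (2, 1): 0.2}

        # Check H(U_S | U_{S^c}) < sum_{i in S, j in S^c} C_ij for every cut that
        # keeps the sink on the S^c side.
        nodes = {0, 1, 2}
        for r in (1, 2):
            for cut in combinations({1, 2}, r):
                S, Sc = set(cut), nodes - set(cut)
                cut_cap = sum(c for (i, j), c in capacity.items() if i in S and j in Sc)
                need = cond_entropy[frozenset(S)]
                verdict = "OK" if need < cut_cap else "violated"
                print(f"S={sorted(S)}: H(U_S|U_S^c)={need:.3f}  "
                      f"cut capacity={cut_cap:.3f}  {verdict}")

    When every cut passes, the condition stated in the abstract says perfect reconstruction of both sources at the sink is possible.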