
    Minimum Cost Distributed Source Coding Over a Network

    This paper considers the problem of transmitting multiple compressible sources over a network at minimum cost. The aim is to find the optimal rates at which the sources should be compressed and the network flows over which they should be transmitted so that the total transmission cost is minimized. We consider networks with capacity constraints and linear cost functions. The problem is complicated by the fact that the description of the feasible rate region of distributed source coding problems typically has a number of constraints that is exponential in the number of sources, which renders general-purpose solvers inefficient. We present a framework in which these problems can be solved efficiently by exploiting the structure of the feasible rate regions, coupled with dual decomposition and optimization techniques such as the subgradient method and the proximal bundle method.
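    To make the approach concrete, the following is a minimal sketch of dual decomposition with a projected subgradient method on a two-source instance; the entropy values, costs, and the box bound R_max are assumed placeholders for illustration, not data or code from the paper. Dualizing the coupling sum-rate constraint makes the per-source subproblems separable.

        # Minimal sketch: dual decomposition + projected subgradient method for
        #   minimize  c1*R1 + c2*R2
        #   s.t.      R1 >= H(X1|X2),  R2 >= H(X2|X1),  R1 + R2 >= H(X1,X2).
        # All numbers below are assumed for illustration.
        H1_given_2, H2_given_1, H12 = 0.5, 0.7, 1.6   # example entropies
        c = [1.0, 2.0]                                # per-unit rate costs
        R_max = 10.0                                  # artificial box bound keeping subproblems bounded

        lam = 0.0                                     # multiplier for R1 + R2 >= H12
        for k in range(1, 2001):
            # Separable subproblems: minimize (c_i - lam) * R_i over the box.
            R1 = H1_given_2 if c[0] - lam >= 0 else R_max
            R2 = H2_given_1 if c[1] - lam >= 0 else R_max
            g = H12 - (R1 + R2)                       # subgradient of the dual function at lam
            lam = max(0.0, lam + g / k)               # projected ascent with diminishing steps

        print(lam)  # converges to the optimal multiplier (1.0 for these numbers);
                    # primal-optimal rates are typically recovered by averaging the iterates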

    Cores of Cooperative Games in Information Theory

    Cores of cooperative games are ubiquitous in information theory, and arise most frequently in the characterization of fundamental limits in various scenarios involving multiple users. Examples include classical settings in network information theory such as Slepian-Wolf source coding and multiple access channels, classical settings in statistics such as robust hypothesis testing, and new settings at the intersection of networking and statistics such as distributed estimation problems for sensor networks. Cooperative game theory allows one to understand aspects of all of these problems from a fresh and unifying perspective that treats users as players in a game, sometimes leading to new insights. At the heart of these analyses are fundamental dualities that have long been studied in the context of cooperative games; for information-theoretic purposes, these are dualities between information inequalities on the one hand and properties of rate, capacity, or other resource allocation regions on the other. Comment: 12 pages, published in EURASIP Journal on Wireless Communications and Networking, Special Issue on "Theory and Applications in Multiuser/Multiterminal Communications", April 2008, at http://www.hindawi.com/GetArticle.aspx?doi=10.1155/2008/318704
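    As a pointer to the duality at play, recall the standard definitions (notation assumed here, not drawn verbatim from the paper): for sources X_1, ..., X_n with ground set N = {1, ..., n}, consider the game v(S) = H(X_S | X_{S^c}). Its core is

    \[
    \mathrm{core}(v) = \Bigl\{ R \in \mathbb{R}^n \;:\; \sum_{i \in N} R_i = H(X_N), \ \ \sum_{i \in S} R_i \ge H(X_S \mid X_{S^c}) \ \ \forall\, S \subseteq N \Bigr\},
    \]

    which is precisely the sum-rate-optimal face of the Slepian-Wolf rate region; non-emptiness of this core is the game-theoretic counterpart of the achievability of the minimal sum rate H(X_1, ..., X_n).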

    Minimum cost mirror sites using network coding: Replication vs. coding at the source nodes

    Content distribution over networks is often achieved by using mirror sites that hold copies of files, or portions thereof, to avoid the congestion and delay issues arising from excessive demands on a single location. Accordingly, there are distributed storage solutions that divide the file into pieces and place copies of the pieces (replication) or coded versions of the pieces (coding) at multiple source nodes. We consider a network that uses network coding for multicasting the file. There is a set of source nodes that contains either subsets or coded versions of the pieces of the file. The cost of a given storage solution is defined as the sum of the storage cost and the cost of the flows required to support the multicast. Our interest is in finding the storage capacities and flows of minimum combined cost. We formulate the corresponding optimization problems using the theory of information measures. In particular, we show that when there are two source nodes, there is no loss in considering subset sources. For three source nodes, we derive a tight upper bound on the cost gap between the coded and uncoded cases. We also present algorithms for determining the content of the source nodes. Comment: IEEE Trans. on Information Theory (to appear), 201
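    A generic formulation consistent with this cost definition (a sketch only; the file size F, storage prices d_i, edge costs c_e, storage variables z_i, shared capacities x_e, and per-terminal flows f^(t) are assumed notation, not the paper's) is:

    \[
    \begin{aligned}
    \min_{x,\,z}\quad & \sum_{i \in S} d_i z_i + \sum_{e \in E} c_e x_e \\
    \text{s.t.}\quad & f^{(t)} \text{ is a flow of value } F \text{ from the source nodes to terminal } t, \\
    & 0 \le f^{(t)}_e \le x_e \quad \forall\, e \in E \text{ and all terminals } t, \\
    & \text{the flow leaving source node } i \text{ is at most its stored amount } z_i,
    \end{aligned}
    \]

    where the second constraint reflects that, under network coding, flows to different terminals can share edge capacity. The subset-versus-coded comparison then enters through which storage vectors z are realizable by each placement strategy.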

    Selfish Distributed Compression Over Networks: Correlation Induces Anarchy

    We consider the min-cost multicast problem (under network coding) with multiple correlated sources, where each terminal wants to losslessly reconstruct all the sources. We study the inefficiency brought about by the selfish behavior of the terminals in this scenario by modeling it as a noncooperative game among the terminals. The degradation in performance due to the lack of regulation is measured by the Price of Anarchy (POA), defined as the ratio between the cost of the worst possible Wardrop equilibrium and the socially optimal cost. Our main result is that, in contrast with the case of independent sources, the presence of source correlations can significantly increase the price of anarchy. Toward establishing this result, we first characterize the socially optimal flow and rate allocation in terms of four intuitive conditions. Next, we show that the Wardrop equilibrium is a socially optimal solution for a different set of (related) cost functions. Using this, we construct explicit examples demonstrating that the POA > 1, and we determine near-tight upper bounds on the POA as well. The main techniques in our analysis are Lagrangian duality theory and the supermodularity of conditional entropy.
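    In symbols, writing \mathcal{W} for the set of Wardrop equilibria and C(\cdot) for the social (total) cost, the definition above reads

    \[
    \mathrm{POA} = \frac{\max_{f \in \mathcal{W}} C(f)}{\min_{f \;\text{feasible}} C(f)} \;\ge\; 1,
    \]

    so "correlation induces anarchy" is the statement that this ratio can be bounded away from one once the sources are correlated.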

    Achievable schemes for cost/performance trade-offs in networks

    A common pattern in communication networks (both wired and wireless) is the collection of distributed state information from various network elements. This network state is needed for both analytics and operator policy, and its collection consumes network resources, both to measure the relevant state and to transmit the measurements back to the data sink. The design of simple achievable schemes is considered with the goal of minimizing the overhead from data collection and/or trading off performance for overhead. Where possible, these schemes are compared with the optimal trade-off curve. The optimal transmission of distributed correlated discrete memoryless sources across a network with capacity constraints is considered first. Previously unreported properties of jointly optimal compression rates and transmission schemes are established. Additionally, an explicit relationship is given between the conditional independence relations of the distributed sources and the number of vertices of the Slepian-Wolf rate region. Motivated by recent work applying rate-distortion theory to computing the optimal performance-overhead trade-off, the use of distributed scalar quantization is investigated for lossy encoding of state, where a central estimation officer (CEO) wishes to compute an extremization function of a collection of sources. The superiority of a simple heterogeneous (across users) quantizer design over the optimal homogeneous quantizer design is proven. Interactive communication enables an alternative framework in which communicating parties can send messages back and forth over multiple rounds. This back-and-forth messaging can reduce the rate required to compute an extremum/extrema of the sources, at the cost of increased delay. Again, scalar quantization followed by entropy encoding is considered as an achievable scheme for a collection of distributed users talking to a CEO in the context of interactive communication. The design of optimal quantizers is formulated as the solution of a minimum-cost dynamic program (sketched below). It is established that, asymptotically, the costs for the CEO to compute the different extremization functions are equal. The existence of a simpler search space, which is asymptotically sufficient for minimizing the cost of computing the selected extremization functions, is proven. Ph.D., Electrical Engineering, Drexel University, 201
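    As one concrete instance of the dynamic-programming viewpoint mentioned above (a minimal sketch, not the dissertation's actual program: the squared-error cell cost, the contiguous-cell search space, and all names are assumptions), the following designs a K-cell scalar quantizer by DP over contiguous partitions of the sorted samples:

        import numpy as np

        def design_quantizer(samples, K):
            """Split sorted samples into K contiguous cells at minimum total cost."""
            x = np.sort(np.asarray(samples, dtype=float))
            n = len(x)
            prefix = np.concatenate([[0.0], np.cumsum(x)])
            prefix2 = np.concatenate([[0.0], np.cumsum(x ** 2)])

            def cell_cost(i, j):  # squared error of cell x[i:j] about its mean
                s, s2, m = prefix[j] - prefix[i], prefix2[j] - prefix2[i], j - i
                return s2 - s * s / m

            INF = float("inf")
            dp = [[INF] * (n + 1) for _ in range(K + 1)]  # dp[k][j]: best cost of x[:j] in k cells
            parent = [[0] * (n + 1) for _ in range(K + 1)]
            dp[0][0] = 0.0
            for k in range(1, K + 1):
                for j in range(k, n + 1):
                    for i in range(k - 1, j):             # last cell is x[i:j]
                        cost = dp[k - 1][i] + cell_cost(i, j)
                        if cost < dp[k][j]:
                            dp[k][j], parent[k][j] = cost, i
            # Walk parent pointers back to recover the K-1 interior boundaries.
            cuts, j = [], n
            for k in range(K, 0, -1):
                j = parent[k][j]
                cuts.append(j)
            return dp[K][n], sorted(cuts)[1:]             # drop the leading 0

        rng = np.random.default_rng(0)
        cost, cuts = design_quantizer(rng.normal(size=200), K=4)
        print(cost, cuts)

    The same skeleton accommodates other additive per-cell costs (for example, distortion plus an entropy penalty) by swapping out cell_cost.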

    Coded caching: Information theoretic bounds and asynchronism

    Caching is often used in content delivery networks as a mechanism for reducing network traffic. Recently, the technique of coded caching was introduced, whereby coded content placement in the caches and coded transmission signals from the central server are considered. Prior results in this area demonstrate that carefully designing the placement of content in the caches and designing appropriate coded delivery signals from the server allow for a system whose delivery rates can be significantly smaller than those of conventional schemes. However, matching upper and lower bounds on the transmission rate have not yet been obtained. In the first part of this thesis we derive tighter lower bounds on the coded caching rate than were previously known. We demonstrate that this problem can equivalently be posed as a combinatorial problem of optimally labeling the leaves of a directed tree. Our proposed labeling algorithm allows for significantly improved lower bounds on the coded caching rate. Furthermore, we study certain structural properties of our algorithm that allow us to analytically quantify improvements on the rate lower bound for general values of the problem parameters. This allows us to obtain a multiplicative gap of at most four between the achievable rate and our lower bound. The original formulation of the coded caching problem assumes that the file requests from the users are synchronized, i.e., they arrive at the server at the same time. Several subsequent contributions work under the same assumption. Furthermore, the majority of prior work does not consider a scenario where users have deadlines. In the second part of this thesis we formulate the asynchronous coded caching problem, where user requests arrive at different times and the users have specified deadlines. We propose a linear program for obtaining its optimal solution. However, the size of the LP (the number of constraints and variables) grows rather quickly with the number of users and cache sizes. To deal with this problem, we explore a dual-decomposition-based approach for solving the LP under consideration. We demonstrate that the dual function can be evaluated by equivalently solving a number of minimum-cost network flow problems. Moreover, we consider the asynchronous setting where the file requests are revealed to the server in an online fashion. We propose a novel online algorithm for this problem, building on our prior work for the offline setting (where the server knows the request arrival times and deadlines in advance). Our simulation results demonstrate that our proposed online algorithm allows for a natural tradeoff between the feasibility of the schedule and the rate gains of coded caching.
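    For context on the factor-of-four claim (standard background rather than a result of the thesis itself): the centralized coded caching scheme of Maddah-Ali and Niesen serves N files to K users, each with a cache of size M files, at delivery rate

    \[
    R(M) = K \Bigl( 1 - \frac{M}{N} \Bigr) \cdot \frac{1}{1 + KM/N}
    \]

    (for KM/N an integer), and the multiplicative gap above bounds the ratio between such an achievable rate and the derived lower bound.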

    Minimum cost distributed source coding over a network

    This work considers the problem of transmitting multiple compressible sources over a network at minimum cost. The problem is complicated by the fact that the description of the feasible rate region of distributed source coding problems typically has a number of constraints that is exponential in the number of sources, which renders general-purpose solvers inefficient. We present a framework in which these problems can be solved efficiently by exploiting the structure of the feasible rate regions, coupled with dual decomposition and subgradient methods. This is a manuscript of a proceeding from the IEEE International Symposium on Information Theory (2007): 1761, doi:10.1109/ISIT.2007.4557476. Posted with permission.