
    Caching and Coded Multicasting: Multiple Groupcast Index Coding

    The capacity of caching networks has received considerable attention in the past few years. A particularly well-studied setting is the case of a single server (e.g., a base station) and multiple users, each of which caches segments of files from a finite library. Each user requests one (whole) file in the library, and the server sends a common coded multicast message to satisfy all users at once. The problem consists of finding the smallest possible codeword length that satisfies such requests. In this paper we consider the generalization to the case where each user places L ≥ 1 requests. The obvious naive scheme consists of applying the order-optimal single-request scheme L times, so that the multicast codeword length scales linearly in L. We propose a new achievable scheme based on multiple groupcast index coding that achieves a significant gain over the naive scheme. Furthermore, through an information-theoretic converse we find that the proposed scheme is approximately optimal within a constant factor of (at most) 18. Comment: 5 pages, 1 figure, to appear in GlobalSIP14, Dec. 2014.
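    A toy sketch (Python) of the coded-multicast idea underlying this setting, not the paper's multiple-groupcast scheme: two users with complementary cached segments are both satisfied by a single XOR transmission, whereas uncoded delivery would need two transmissions.

        import os

        segment_size = 4  # bytes per segment (illustrative)
        A = os.urandom(segment_size)  # requested by user 1, cached by user 2
        B = os.urandom(segment_size)  # requested by user 2, cached by user 1

        coded = bytes(a ^ b for a, b in zip(A, B))  # one multicast codeword

        # Each user XORs the codeword with its cached segment to recover its request.
        assert bytes(c ^ b for c, b in zip(coded, B)) == A  # user 1 decodes A
        assert bytes(c ^ a for c, a in zip(coded, A)) == B  # user 2 decodes B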

    Linear Codes are Optimal for Index-Coding Instances with Five or Fewer Receivers

    We study zero-error unicast index-coding instances, where each receiver must perfectly decode its requested message set, and the message sets requested by any two receivers do not overlap. We show that for all such instances with up to five receivers, linear index codes are optimal. Although this class contains 9847 non-isomorphic instances, by using our recent results and by properly categorizing the instances based on their graphical representations, we need to consider only 13 non-trivial instances to solve the entire class. This work complements the result by Arbabjolfaei et al. (ISIT 2013), who derived the capacity region of all unicast index-coding problems with up to five receivers in the diminishing-error setup. They employed random-coding arguments, which require infinitely long messages. We consider the zero-error setup; our approach uses graph theory and combinatorics, and does not require long messages. Comment: submitted to the 2014 IEEE International Symposium on Information Theory (ISIT).
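    A minimal example (Python) of a linear XOR index code being optimal on a small unicast instance; this is the standard three-receiver cyclic-side-information example, not one of the paper's 13 non-trivial instances. Receiver i requests bit x[i] and knows x[(i+1) % 3]; two XOR transmissions suffice, and the maximum-acyclic-induced-subgraph bound of 2 shows no shorter zero-error code exists.

        import itertools

        for x in itertools.product([0, 1], repeat=3):
            t1 = x[0] ^ x[1]               # first coded transmission
            t2 = x[1] ^ x[2]               # second coded transmission
            assert t1 ^ x[1] == x[0]       # receiver 0 decodes with known x[1]
            assert t2 ^ x[2] == x[1]       # receiver 1 decodes with known x[2]
            assert t1 ^ x[0] ^ t2 == x[2]  # receiver 2 decodes with known x[0]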

    A New Class of Index Coding Instances Where Linear Coding is Optimal

    We study index-coding problems (one sender broadcasting messages to multiple receivers) where each message is requested by exactly one receiver, and each receiver may know some messages a priori. This type of index-coding problem can be fully described by a directed graph. The aim is to find the minimum codelength that the sender needs to transmit in order to simultaneously satisfy all receivers' requests. For any directed graph, we show that if a maximum acyclic induced subgraph (MAIS) is obtained by removing two or fewer vertices from the graph, then the minimum codelength (i.e., the solution to the index-coding problem) equals the number of vertices in the MAIS, and linear codes are optimal for the problem. Our result enlarges the set of index-coding problems for which linear index codes are proven to be optimal. Comment: accepted and to be presented at the 2014 International Symposium on Network Coding (NetCod).
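    A brute-force MAIS computation (Python with networkx) on a made-up directed graph, not an instance from the paper. Here the MAIS is obtained by removing two vertices, one from each of the two vertex-disjoint cycles, so by the result above the minimum codelength equals the MAIS size (3) and a linear code achieves it.

        import itertools
        import networkx as nx

        # Two vertex-disjoint cycles: 0 <-> 1 and 2 -> 3 -> 4 -> 2.
        g = nx.DiGraph([(0, 1), (1, 0), (2, 3), (3, 4), (4, 2)])

        def mais_size(graph):
            """Largest number of vertices inducing an acyclic subgraph."""
            for k in range(graph.number_of_nodes(), 0, -1):
                for subset in itertools.combinations(graph.nodes, k):
                    if nx.is_directed_acyclic_graph(graph.subgraph(subset)):
                        return k
            return 0

        print(mais_size(g))  # 3, e.g. the induced subgraph on {1, 3, 4}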

    The Single-Uniprior Index-Coding Problem: The Single-Sender Case and The Multi-Sender Extension

    Index coding studies multiterminal source-coding problems in which a set of receivers must decode multiple (possibly different) messages from a common broadcast, and each receiver knows some messages a priori. In this paper, at the receiver end, we consider a special setting where each receiver knows only one message a priori, and each message is known to only one receiver. At the broadcasting end, we consider a generalized setting in which there can be multiple senders, each knowing a subset of the messages; the senders collaborate to transmit an index code. This work looks at minimizing the total number of coded bits the senders are required to transmit. When there is only one sender, we propose a pruning algorithm to find a lower bound on the optimal (i.e., shortest) index codelength, and show that it is achievable by linear index codes. When there are two or more senders, we propose an appending technique, used in conjunction with the pruning technique, to give a lower bound on the optimal index codelength; we also derive an upper bound based on cyclic codes. While the two bounds do not match in general, for the special case where no two distinct senders know any message in common, the bounds match, giving the optimal index codelength. The results are expressed in terms of strongly connected components in the directed graphs that represent the index-coding problems. Comment: Author final manuscript.
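    Since the bounds above are expressed in terms of strongly connected components of the directed graph describing the problem, here is a minimal sketch (Python with networkx) of extracting those components; the graph below is a made-up example rather than an instance from the paper.

        import networkx as nx

        g = nx.DiGraph([(1, 2), (2, 3), (3, 1), (3, 4), (4, 5), (5, 4), (5, 6)])
        sccs = [sorted(c) for c in nx.strongly_connected_components(g)]
        print(sccs)  # [[1, 2, 3], [4, 5], [6]], up to ordering of the components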

    Duality Codes and the Integrality Gap Bound for Index Coding

    This paper considers a base station that delivers packets to multiple receivers through a sequence of coded transmissions. All receivers overhear the same transmissions. Each receiver may already have some of the packets as side information and requests another subset of the packets. This problem is known as the index coding problem and can be represented by a bipartite digraph. An integer linear program is developed that provides a lower bound on the minimum number of transmissions required by any coding algorithm. Conversely, its linear programming relaxation is shown to provide an upper bound that is achievable by a simple form of vector linear coding. Thus, the information-theoretic optimum is bounded by the integrality gap between the integer program and its linear relaxation. In the special case when the digraph has a planar structure, the integrality gap is shown to be zero, so that exact optimality is achieved. This work illuminates the relationship between index coding, duality, and integrality gaps between integer programs and their linear relaxations.
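    The integrality-gap idea can be illustrated on a textbook problem unrelated to index coding (vertex cover on a triangle; this is not the paper's formulation): the LP relaxation attains 1.5 while any integral cover needs 2 vertices. A sketch using SciPy (1.9 or later, for the integrality option of linprog):

        from scipy.optimize import linprog

        # minimise x0 + x1 + x2  subject to  x_i + x_j >= 1 for every triangle edge
        c = [1, 1, 1]
        A_ub = [[-1, -1, 0], [0, -1, -1], [-1, 0, -1]]  # constraints rewritten as -(x_i + x_j) <= -1
        b_ub = [-1, -1, -1]
        bounds = [(0, 1)] * 3

        lp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)                          # LP relaxation
        ilp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, integrality=[1, 1, 1])  # integer program
        print(lp.fun, ilp.fun)  # 1.5 vs 2.0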