
    An Efficient Coded Multicasting Scheme Preserving the Multiplicative Caching Gain

    Coded multicasting has been shown to be a promising approach to significantly improve the caching performance of content delivery networks with multiple caches downstream of a common multicast link. However, achievable schemes proposed to date have been shown to achieve the proved order-optimal performance only in the asymptotic regime in which the number of packets per requested item goes to infinity. In this paper, we first extend the asymptotic analysis of the achievable scheme in [1], [2] to the case of heterogeneous cache sizes and demand distributions, providing the best known upper bound on the fundamental limiting performance when the number of packets goes to infinity. We then show that the scheme achieving this upper bound quickly loses its multiplicative caching gain for finite content packetization. To overcome this limitation, we design a novel polynomial-time algorithm based on random greedy graph-coloring that, while keeping the same finite content packetization, recovers a significant part of the multiplicative caching gain. Our results show that the order-optimal coded multicasting schemes proposed to date, while useful in quantifying the fundamental limiting performance, must be properly designed for practical regimes of finite packetization. Comment: 6 pages, 7 figures, Published in Infocom CNTCV 201
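    To make the graph-coloring idea above concrete, the sketch below shows a generic randomized greedy coloring of a packet-level conflict graph, where each color class corresponds to one coded (XOR) multicast transmission. This is only an illustration of the general technique named in the abstract, not the paper's algorithm; the graph construction and function names are assumptions made for the example.

```python
import random
from collections import defaultdict

def random_greedy_coloring(vertices, edges, trials=10):
    """Randomized greedy coloring of a packet conflict graph.

    Each vertex is a requested packet; an edge joins two packets that cannot
    be XORed into the same coded multicast transmission. Fewer colors means
    fewer transmissions. Illustrative sketch only, not the paper's algorithm.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    best = None
    for _ in range(trials):
        order = list(vertices)
        random.shuffle(order)                  # the "random" part of random greedy
        coloring = {}
        for v in order:
            used = {coloring[u] for u in adj[v] if u in coloring}
            c = 0
            while c in used:                   # smallest color unused by neighbors
                c += 1
            coloring[v] = c
        if best is None or len(set(coloring.values())) < len(set(best.values())):
            best = coloring
    return best  # packets sharing a color are combined into one transmission

# Toy usage: 4 packets whose conflicts form a path, so 2 transmissions suffice.
print(random_greedy_coloring(["p1", "p2", "p3", "p4"],
                             [("p1", "p2"), ("p2", "p3"), ("p3", "p4")]))
```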

    Dynamic Edge Caching with Popularity Drifting

    Caching at network edge devices such as wireless caching stations (WCSs) is a key technology in the 5G network. The spatial-temporal diversity of content popularity requires different content to be cached in different WCSs and periodically updated to adapt to temporal changes. In this paper, we study how the popularity drifting speed affects the number of broadcast transmissions required by the macro base station (MBS), and then design coded transmission schemes by leveraging the broadcast advantage under the index coding framework. The key idea is that files already cached in WCSs, although they may be currently unpopular, can serve as side information to facilitate coded broadcast transmissions for cache updating. Our algorithm extends existing index coding-based schemes from a single-request scenario to a multiple-request scenario via a "dynamic coloring" approach. Simulation results indicate that a significant bandwidth saving can be achieved by adopting our scheme.
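    As a toy illustration of the side-information idea in this abstract (the basic index-coding trick, not the paper's multi-request scheme), the snippet below shows two caching stations being updated with a single XOR broadcast because each already holds the file the other one needs. File names and payloads are made up for the example.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two caching stations each already hold one (possibly unpopular) file
# that the other station now needs for its cache update.
file_a = b"payload-A"   # cached at station 2, needed by station 1
file_b = b"payload-B"   # cached at station 1, needed by station 2

coded = xor_bytes(file_a, file_b)   # one coded broadcast instead of two unicasts

# Each station decodes using its cached file as side information.
assert xor_bytes(coded, file_b) == file_a   # station 1 recovers file_a
assert xor_bytes(coded, file_a) == file_b   # station 2 recovers file_b
```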

    Cache-Aided Coded Multicast for Correlated Sources

    The combination of edge caching and coded multicasting is a promising approach to improve the efficiency of content delivery over cache-aided networks. The global caching gain resulting from content overlap distributed across the network in current solutions is limited due to the increasingly personalized nature of the content consumed by users. In this paper, the cache-aided coded multicast problem is generalized to account for the correlation among the network content by formulating a source compression problem with distributed side information. A correlation-aware achievable scheme is proposed and an upper bound on its performance is derived. It is shown that considerable load reductions can be achieved, compared to state-of-the-art correlation-unaware schemes, when the caching and delivery phases specifically account for the correlation among the content files. Comment: In Proceedings of the IEEE International Symposium on Turbo Codes and Iterative Information Processing (ISTC), 201
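    As a one-line information-theoretic illustration of why correlation helps (a generic fact, not the paper's specific bound): if two requested files A and B are correlated, delivering A and then only a conditional description of B given A costs

```latex
\[
  H(A) + H(B \mid A) \;=\; H(A,B) \;\le\; H(A) + H(B),
\]
% with strict inequality whenever A and B are correlated, so a
% correlation-aware scheme can transmit strictly less than one that
% treats the files as independent.
```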

    Finite Length Analysis of Caching-Aided Coded Multicasting

    In this work, we study a noiseless broadcast link serving K users whose requests arise from a library of N files. Every user is equipped with a cache of size M files. It has been shown that by splitting all the files into packets and placing individual packets in a random independent manner across all the caches, at most N/M file transmissions are required for any set of demands from the library. The achievable delivery scheme involves linearly combining packets of different files following a greedy clique cover solution to the underlying index coding problem. This remarkable multiplicative gain of random placement and coded delivery has been established in the asymptotic regime when the number of packets per file F scales to infinity. In this work, we initiate the finite-length analysis of random caching schemes when the number of packets F is a function of the system parameters M, N, K. Specifically, we show that existing random placement and clique cover delivery schemes that achieve optimality in the asymptotic regime can have at most a multiplicative gain of 2 if the number of packets is sub-exponential. Further, for any clique cover based coded delivery and a large class of random caching schemes, including the existing ones, we show that the number of packets required to get a multiplicative gain of (4/3)g is at least O((N/M)^g). We exhibit a random placement and an efficient clique cover based coded delivery scheme that approximately achieves this lower bound. We also provide tight concentration results showing that the average (over the random caching involved) number of transmissions concentrates well, requiring only a polynomial number of packets in the remaining system parameters. Comment: A shorter version appeared in the 52nd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 201
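    A quick back-of-the-envelope reading of the packetization bound above, with illustrative numbers that are not taken from the paper, shows why finite packetization is restrictive:

```latex
% Illustrative numbers only: target multiplicative gain (4/3)g with g = 3,
% i.e. a gain of 4, and a library-to-cache ratio N/M = 100.
% The stated requirement F = Omega((N/M)^g) then gives
\[
  F \;\gtrsim\; \left(\frac{N}{M}\right)^{g} \;=\; 100^{3} \;=\; 10^{6}
\]
% packets per file (up to constants), which is why schemes that are
% order-optimal as F -> infinity can lose most of their gain in practice.
```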

    Speeding up Future Video Distribution via Channel-Aware Caching-Aided Coded Multicast

    Future Internet usage will be dominated by the consumption of a rich variety of online multimedia services accessed from an exponentially growing number of multimedia-capable mobile devices. As such, future Internet designs will be challenged to provide solutions that can deliver bandwidth-intensive, delay-sensitive, on-demand video-based services over increasingly crowded, bandwidth-limited wireless access networks. One of the main reasons for the bandwidth stress facing wireless network operators is the difficulty of exploiting the multicast nature of the wireless medium when wireless users or access points rarely experience the same channel conditions or access the same content at the same time. In this paper, we present and analyze a novel wireless video delivery paradigm based on the combined use of channel-aware caching and coded multicasting that allows simultaneously serving multiple cache-enabled receivers that may be requesting different content and experiencing different channel conditions. To this end, we reformulate the caching-aided coded multicast problem as a joint source-channel coding problem and design an achievable scheme that preserves the cache-enabled multiplicative throughput gains of the error-free scenario, by guaranteeing per-receiver rates unaffected by the presence of receivers with worse channel conditions. Comment: 11 pages, 6 figures, to appear in IEEE JSAC Special Issue on Video Distribution over Future Internet

    Distortion-Memory Tradeoffs in Cache-Aided Wireless Video Delivery

    Mobile network operators are considering caching as one of the strategies to keep up with the increasing demand for high-definition wireless video streaming. By prefetching popular content into memory at wireless access points or end user devices, requests can be served locally, relieving strain on expensive backhaul. In addition, using network coding allows the simultaneous serving of distinct cache misses via common coded multicast transmissions, resulting in significantly larger load reductions compared to those achieved with conventional delivery schemes. However, prior work does not exploit the properties of video and simply treats content as fixed-size files that users would like to fully download. Our work is motivated by the fact that video can be coded in a scalable fashion and that the decoded video quality depends on the number of layers a user is able to receive. Using a Gaussian source model, caching and coded delivery methods are designed to minimize the squared error distortion at end user devices. Our work is general enough to consider heterogeneous cache sizes and video popularity distributions. Comment: To appear in Allerton 2015, Proceedings of the 53rd Annual Allerton Conference on Communication, Control, and Computing
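    For context on the Gaussian source model mentioned above, the standard rate-distortion relation (a textbook fact, not a result specific to this paper) links the rate a user decodes across scalable layers to the end-user distortion:

```latex
% A Gaussian source with variance sigma^2, described at total rate R bits
% per sample (summed over the scalable layers a user is able to decode),
% can be reconstructed with mean squared error
\[
  D(R) \;=\; \sigma^{2}\, 2^{-2R},
\]
% so each additional decoded layer of rate r multiplies the distortion by
% 2^{-2r}; Gaussian sources are successively refinable, which is what makes
% layered (scalable) descriptions a natural fit for this model.
```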