197 research outputs found

    Cache-Enabled Broadcast Packet Erasure Channels with State Feedback

    We consider a cache-enabled K-user broadcast erasure packet channel in which a server with a library of N files wishes to deliver a requested file to each user, each equipped with a cache of finite memory M. Assuming that the transmitter has state feedback and that user caches can be filled reliably during off-peak hours by decentralized cache placement, we characterize the optimal rate region as a function of the memory size and the erasure probability. The proposed delivery scheme, based on the scheme of Gatzianas et al., exploits the receiver side information established during the placement phase. Our results quantify the net benefits of decentralized coded caching in the presence of erasures. State feedback is found to be especially useful when the erasure probability is large and/or the normalized memory size is small. Comment: 8 pages, 4 figures, to be presented at the 53rd Annual Allerton Conference on Communication, Control, and Computing, IL, US
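
    As background for the placement phase assumed above, the sketch below (a generic illustration, not taken from the paper) shows decentralized cache placement in Python: each user independently caches a random fraction M/N of the file bits, which is the kind of receiver side information the delivery scheme can later exploit. The function name and parameters are illustrative only.

```python
import random

def decentralized_placement(num_users, file_bits, cache_fraction, seed=0):
    """Toy decentralized placement: each user independently caches each bit
    of a file with probability cache_fraction (= M/N), building its random
    side information. Returns one set of cached bit indices per user."""
    rng = random.Random(seed)
    caches = []
    for _ in range(num_users):
        cached = {i for i in range(file_bits) if rng.random() < cache_fraction}
        caches.append(cached)
    return caches

# Example: K = 3 users, a 1000-bit file, normalized memory M/N = 0.4.
caches = decentralized_placement(num_users=3, file_bits=1000, cache_fraction=0.4)
for k, cache in enumerate(caches):
    print(f"user {k} caches {len(cache)} of 1000 bits")
```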

    Content Delivery in Erasure Broadcast Channels with Cache and Feedback

    We study a content delivery problem in a K-user erasure broadcast channel in which a content-providing server wishes to deliver requested files to users, each equipped with a cache of finite memory. Assuming that the transmitter has state feedback and that user caches can be filled reliably during off-peak hours by decentralized content placement, we characterize the achievable rate region as a function of the memory sizes and the erasure probabilities. The proposed delivery scheme, based on the broadcasting scheme by Wang and Gatzianas et al., exploits the receiver side information established during the placement phase. Our results can be extended to centralized content placement as well as to multi-antenna broadcast channels with state feedback. Comment: 29 pages, 7 figures. A short version has been submitted to ISIT 201
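
    To make concrete how state feedback and receiver side information create coding opportunities, here is a toy two-user Python illustration (a generic XOR retransmission in the spirit of the cited broadcasting schemes, not the paper's actual scheme). It assumes feedback reveals that each user's packet was erased at its intended receiver but overheard by the other, so a single coded retransmission serves both.

```python
# Toy illustration (assumed setup, not the paper's scheme): erasure feedback
# turns overheard packets into coding opportunities in a 2-user erasure BC.

def xor_bytes(x: bytes, y: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(x, y))

packet_a = b"\x10\x22\x33\x44"   # wanted by user 1
packet_b = b"\xa0\xb1\xc2\xd3"   # wanted by user 2

# Phase 1: uncoded transmissions. State feedback (assumed) reports:
#   - packet_a erased at user 1, but received (overheard) by user 2
#   - packet_b erased at user 2, but received (overheard) by user 1
side_info_user1 = {"b": packet_b}
side_info_user2 = {"a": packet_a}

# Phase 2: one coded retransmission is simultaneously useful to both users.
coded = xor_bytes(packet_a, packet_b)

recovered_a = xor_bytes(coded, side_info_user1["b"])  # user 1 cancels b
recovered_b = xor_bytes(coded, side_info_user2["a"])  # user 2 cancels a
assert recovered_a == packet_a and recovered_b == packet_b
```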

    Speeding up Future Video Distribution via Channel-Aware Caching-Aided Coded Multicast

    Future Internet usage will be dominated by the consumption of a rich variety of online multimedia services accessed from an exponentially growing number of multimedia-capable mobile devices. As such, future Internet designs will be challenged to provide solutions that can deliver bandwidth-intensive, delay-sensitive, on-demand video-based services over increasingly crowded, bandwidth-limited wireless access networks. One of the main reasons for the bandwidth stress facing wireless network operators is the difficulty of exploiting the multicast nature of the wireless medium when wireless users or access points rarely experience the same channel conditions or access the same content at the same time. In this paper, we present and analyze a novel wireless video delivery paradigm based on the combined use of channel-aware caching and coded multicasting that allows simultaneously serving multiple cache-enabled receivers that may be requesting different content and experiencing different channel conditions. To this end, we reformulate the caching-aided coded multicast problem as a joint source-channel coding problem and design an achievable scheme that preserves the cache-enabled multiplicative throughput gains of the error-free scenario, by guaranteeing per-receiver rates unaffected by the presence of receivers with worse channel conditions. Comment: 11 pages, 6 figures, to appear in IEEE JSAC Special Issue on Video Distribution over Future Internet
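
    For reference, the "multiplicative throughput gains of the error-free scenario" mentioned above can be quantified with the standard centralized coded-caching delivery load of Maddah-Ali and Niesen; the short Python sketch below is an assumed baseline for comparison with uncoded delivery, not the channel-aware scheme proposed in this paper.

```python
def coded_caching_load(K: int, N: int, M: float) -> float:
    """Error-free delivery load (in file units) for K users, N files, and a
    cache of M files per user, under the standard centralized coded-caching
    scheme: K * (1 - M/N) / (1 + K*M/N)."""
    return K * (1 - M / N) / (1 + K * M / N)

def uncoded_load(K: int, N: int, M: float) -> float:
    """Baseline: each user separately fetches its missing (1 - M/N) fraction."""
    return K * (1 - M / N)

K, N, M = 20, 100, 25
coded, uncoded = coded_caching_load(K, N, M), uncoded_load(K, N, M)
print(f"coded load = {coded:.2f} files, uncoded load = {uncoded:.2f} files, "
      f"multiplicative gain = {uncoded / coded:.1f}x")
```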

    Cache-aided content delivery over erasure broadcast channels

    A cache-aided broadcast network is studied, in which a server delivers contents to a group of receivers over a packet erasure broadcast channel (BC). The receivers are divided into two sets with regard to their channel qualities: the weak and strong receivers, where all the weak receivers have statistically worse channel qualities than all the strong receivers. The weak receivers, in order to compensate for the high erasure probability they encounter over the channel, are equipped with cache memories of equal size, while the receivers in the strong set have no caches. Data can be pre-delivered to the weak receivers’ caches over the off-peak traffic period before the receivers reveal their demands. Allowing arbitrary erasure probabilities for the weak and strong receivers, a joint caching and channel coding scheme is proposed, which divides each file into several subfiles and applies a different caching and delivery scheme to each subfile. It is shown that all the receivers, even those without any cache memory, benefit from the presence of caches across the network. An information-theoretic trade-off between the cache size and the achievable rate is formulated. It is shown that the proposed scheme improves upon the state of the art in terms of the achievable trade-off.
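
    As a hedged point of reference for the cache size versus rate trade-off discussed above (an assumed standard baseline, not a result of this paper), the no-feedback, no-cache packet erasure BC admits the time-sharing region below; caches at the weak receivers reduce the traffic those receivers must still draw from the channel.

```latex
% Assumed baseline (not from the paper): capacity region of a memoryless
% K-user packet erasure BC without feedback and without caches, where
% receiver k has erasure probability \delta_k (rates in packets per use):
\[
  \sum_{k=1}^{K} \frac{R_k}{1-\delta_k} \;\le\; 1 .
\]
% With a normalized cache size m_k at receiver k, only the uncached part of
% its request must be carried over the channel, so R_k shrinks to roughly
% (1 - m_k) R_k -- which is how weak (large-\delta_k) receivers compensate
% for their poor channels in the setting described by the abstract.
```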