420 research outputs found

    K Users Caching Two Files: An Improved Achievable Rate

    Caching is an approach to smooth out the variability of traffic over time. Recently it has been proved that the local memories at the users can be exploited to reduce the peak traffic far more efficiently than previously believed. In this work we improve upon the existing results and introduce a novel caching strategy that takes advantage of simultaneous coded placement and coded delivery in order to decrease the worst-case achievable rate with 2 files and K users. We show that for any cache size 1/K < M < 1 our scheme outperforms the state of the art.
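
    As a rough illustration of the regime this abstract refers to, the sketch below compares the uncoded worst-case delivery rate with the classical Maddah-Ali–Niesen coded-caching rate for N = 2 files and K users. It is not the paper's improved scheme; the choice of K, the memory points, and the memory-sharing interpolation are assumptions made only for this example.

# Illustrative sketch (not the paper's scheme): worst-case delivery rate for
# N = 2 files and K users under (i) uncoded caching and (ii) the classical
# Maddah-Ali--Niesen (MAN) coded-caching scheme. The small-memory regime
# 1/K < M < 1 mentioned in the abstract is where improved schemes with
# coded placement are known to help.

N = 2  # number of files in the library

def uncoded_rate(M: float, K: int) -> float:
    """Worst-case load when each user caches an uncoded fraction M/N of every file."""
    return min(K, N) * (1.0 - M / N)

def man_rate(M: float, K: int) -> float:
    """MAN achievable rate: exact at the corner points M = t*N/K (t = 0..K),
    linearly interpolated (memory sharing) in between."""
    t = K * M / N                      # cache size measured in multiples of N/K
    lo, hi = int(t), min(int(t) + 1, K)

    def corner(tt: int) -> float:
        return (K - tt) / (tt + 1)     # rate at the integer corner point t = tt

    if lo == hi:
        return corner(lo)
    frac = t - lo
    return (1 - frac) * corner(lo) + frac * corner(hi)

if __name__ == "__main__":
    K = 5  # hypothetical number of users, chosen only for the printout
    for M in (1 / K, 0.3, 0.5, 0.8, 1.0):
        print(f"M = {M:.2f}: uncoded {uncoded_rate(M, K):.3f}, coded (MAN) {man_rate(M, K):.3f}")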

    Fundamental Limits on Latency in Transceiver Cache-Aided HetNets

    Stringent mobile usage characteristics force wireless networks to undergo a paradigm shift from conventional connection-centric to content-centric deployment. With respect to 5G, caching and heterogeneous networks (HetNets) are key technologies that will facilitate the evolution of highly content-centric networks by providing unified quality of service in terms of low-latency communication. In this paper, we study the impact of transceiver caching on the latency of a HetNet consisting of a single user, a receiver and one cache-assisted transceiver. We define an information-theoretic metric, the delivery time per bit (DTB), that captures the delivery latency. We establish coinciding lower and upper bounds on the DTB as a function of cache size and wireless channel parameters, thus enabling a complete characterization of the DTB optimality of the network under study. As a result, we identify channel regimes in which caching is beneficial and regimes in which it is not. Comment: 5 pages, ISIT 201
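
    To make the metric concrete, the toy sketch below computes a delivery-time-per-bit quantity under a deliberately simplified model (a cached fraction of the file served over the access link, the remainder fetched over a slower effective path). The model, the link rates, and the function name are assumptions made only for illustration; they are not the bounds derived in the paper.

# Toy illustration of a delivery-time-per-bit (DTB) style metric.
# Assumed model (not the paper's): a fraction mu of the requested file is
# already at the cache-aided transceiver and is sent over the access link,
# while the remaining (1 - mu) fraction must be fetched over a slower
# effective path before reaching the user.

def delivery_time_per_bit(mu: float, r_access: float, r_backhaul: float) -> float:
    """Time to deliver one bit of the requested file, in seconds per bit.

    mu         -- fraction of the file already cached at the transceiver (0..1)
    r_access   -- rate of the wireless access link, bits per second
    r_backhaul -- effective end-to-end rate for the uncached part, bits per second
    """
    cached_time = mu / r_access              # cached bits travel the access link only
    uncached_time = (1.0 - mu) / r_backhaul  # uncached bits bottlenecked by the slower path
    return cached_time + uncached_time

if __name__ == "__main__":
    # Hypothetical link rates, chosen only to show how the DTB falls as cache size grows.
    for mu in (0.0, 0.25, 0.5, 1.0):
        print(f"mu = {mu:.2f}: DTB = {delivery_time_per_bit(mu, 1e6, 2e5):.2e} s/bit")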