
    Optimal Error Correcting Delivery Scheme for Coded Caching with Symmetric Batch Prefetching

    Coded caching is used to reduce network congestion during peak hours. A single server is connected to a set of users through a bottleneck link, which is generally assumed to be error-free. During non-peak hours, all users have full access to the files and fill their local caches with portions of the available files. During the delivery phase, each user requests a file and the server makes coded transmissions to meet the demands, taking the users' cache contents into consideration. In this paper we assume that the shared link is error prone. A new delivery scheme is required to meet the demands of each user even when a finite number of transmissions are received in error. We characterize the minimum average rate and minimum peak rate for this problem. We find closed-form expressions for these rates for a particular caching scheme, namely \textit{symmetric batch prefetching}. We also propose an optimal error correcting delivery scheme for the coded caching problem with symmetric batch prefetching.
    Comment: 9 pages, 4 figures
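The error-free setting described in this abstract is the classical coded caching model of Maddah-Ali and Niesen. The paper's error-correcting rates are not reproduced here, but the well-known error-free peak rate illustrates the caching gain the delivery scheme starts from. A minimal sketch, assuming the standard centralized scheme with N files, K users, uniform cache size M, and integer t = KM/N (this formula is from the coded caching literature, not from this paper):

```python
from fractions import Fraction

def centralized_peak_rate(N: int, K: int, M: Fraction) -> Fraction:
    """Peak delivery rate R(M) = K * (1 - M/N) / (1 + K*M/N) of the
    classical centralized coded caching scheme over an error-free link.
    Assumes t = K*M/N is an integer (a valid memory point)."""
    t = K * M / Fraction(N)
    assert t.denominator == 1, "t = K*M/N must be an integer"
    return K * (1 - M / Fraction(N)) / (1 + t)

# Example: N = K = 2, M = 1 -> each user caches half of each file, and a
# single coded transmission of half a file size serves both users.
print(centralized_peak_rate(2, 2, Fraction(1)))  # 1/2
```

For comparison, uncoded delivery at the same memory point would cost K(1 - M/N) = 1 file transmission; the coded scheme halves this.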

    Optimal Error Correcting Delivery Scheme for an Optimal Coded Caching Scheme with Small Buffers

    An optimal delivery scheme for coded caching problems with small buffer sizes, where the number of users is no less than the number of files in the server, was proposed by Chen, Fan and Letaief ["Fundamental limits of caching: improved bounds for users with small buffers," IET Communications, 2016]. This scheme is referred to as the CFL scheme. In this paper, the link between the server and the users is assumed to be error prone only during the delivery phase. Closed-form expressions for the average rate and peak rate of an error correcting delivery scheme for the CFL prefetching scheme are obtained. An optimal error correcting delivery scheme for caching problems employing CFL prefetching is proposed.
    Comment: 7 pages

    Multi-access Coded Caching with Decentralized Prefetching

    An extension of coded caching, referred to as multi-access coded caching, in which each user can access multiple caches and each cache can serve multiple users, is considered in this paper. Most of the literature on multi-access coded caching focuses on cyclic wrap-around cache access, where each user is allowed to access only an exclusive set of consecutive caches. In this paper, a more general framework for the multi-access caching problem is considered, in which each user is allowed to randomly connect to a specific number of caches and multiple users may access the same set of caches. For the proposed system model with decentralized prefetching, a new delivery scheme is proposed and an expression for the per-user delivery rate is obtained. A lower bound on the delivery rate is derived using techniques from index coding. The proposed scheme is shown to be optimal among all linear schemes under certain conditions. An improved delivery rate and a lower bound for the decentralized multi-access coded caching scheme with cyclic wrap-around cache access are obtained as a special case. By assigning specific values to certain parameters, the results of the decentralized shared caching scheme and of the conventional decentralized caching scheme can be recovered.
    Comment: 26 pages, 6 figures, 6 tables. Submitted to IEEE Transactions on Communications
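Decentralized prefetching here follows the Maddah-Ali–Niesen sense: each user independently caches a random M/N fraction of every file's bits. The multi-access per-user rates derived in the paper are not reproduced here, but the classical single-cache decentralized delivery rate that such schemes generalize can be sketched as follows (the formula R = K(1 - M/N) · (N/(KM)) · (1 - (1 - M/N)^K) is from the decentralized coded caching literature, not from this paper):

```python
def decentralized_rate(N: int, K: int, M: float) -> float:
    """Expected delivery rate of the classical decentralized coded caching
    scheme: K users, N files, and each user independently caching a random
    M/N fraction of each file's bits (0 < M <= N)."""
    p = M / N                     # fraction of each file cached per user
    if p == 1.0:
        return 0.0                # full files cached; nothing to deliver
    return K * (1 - p) * (1 / (K * p)) * (1 - (1 - p) ** K)

# Example: K = 2 users, N = 2 files, cache size M = 1 file.
print(decentralized_rate(2, 2, 1.0))  # 0.75
```

Note the 0.75 here versus 0.5 for the centralized scheme at the same memory point; decentralized placement trades a small rate penalty for not needing coordinated prefetching.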