
    A Literature Survey of Cooperative Caching in Content Distribution Networks

    Content distribution networks (CDNs), which deliver web objects (e.g., documents, applications, music, and video), have seen tremendous growth since their emergence. To minimize the retrieval delay experienced by a user requesting a web object, caching strategies are often applied: contents are replicated at edges of the network, closer to the user, so that the network distance between the user and the object is reduced. This literature survey studies the evolution of caching. A recent research paper [15] in the field of large-scale caching for CDNs was chosen as the anchor paper and serves as a guide to the topic. Research studies published after, and relevant to, the anchor paper are also analyzed to better evaluate the anchor paper's statements and results and, more importantly, to obtain an unbiased view of large-scale collaborative caching systems as a whole.
    Comment: 5 pages, 5 figures
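    As a rough illustration of the edge-replication idea described in this abstract, the sketch below shows a toy edge cache that serves a requested object locally on a hit and falls back to the origin server on a miss. The names (EdgeCache, origin_fetch) and the LRU eviction policy are illustrative assumptions, not details taken from the surveyed papers.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU cache placed at a network edge node (illustrative only)."""

    def __init__(self, capacity, origin_fetch):
        self.capacity = capacity          # max number of objects held at the edge
        self.origin_fetch = origin_fetch  # callable that retrieves an object from the origin
        self.store = OrderedDict()        # object_id -> content, kept in LRU order

    def get(self, object_id):
        if object_id in self.store:
            # Cache hit: serve locally and mark as most recently used.
            self.store.move_to_end(object_id)
            return self.store[object_id]
        # Cache miss: fetch from the (distant) origin, then replicate at the edge.
        content = self.origin_fetch(object_id)
        self.store[object_id] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used object
        return content

# Example: an edge node caching up to 2 objects from a slow origin.
cache = EdgeCache(capacity=2, origin_fetch=lambda oid: f"content of {oid}")
cache.get("video-1")   # miss: fetched from origin, now replicated at the edge
cache.get("video-1")   # hit: served locally, avoiding the origin round trip
```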

    MADServer: An Architecture for Opportunistic Mobile Advanced Delivery

    Rapid increases in cellular data traffic demand creative alternative delivery vectors for data. Despite the conceptual attractiveness of mobile data offloading, no concrete web server architectures integrate intelligent offloading in a production-ready and easily deployable manner without relying on vast infrastructural changes to carriers' networks. Delay-tolerant networking technology offers the means to do just this. We introduce MADServer, a novel DTN-based architecture for mobile data offloading that splits web content among multiple independent delivery vectors based on user and data context. It enables intelligent data offloading, caching, and querying solutions that can be incorporated in a manner that still satisfies user expectations for timely delivery. At the same time, it allows users who have poor or expensive connections to the cellular network to leverage multi-hop opportunistic routing to send and receive data. We also present a preliminary implementation of MADServer and provide real-world performance evaluations.
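    The abstract describes splitting web content among multiple independent delivery vectors based on user and data context. The minimal sketch below shows what such a dispatch decision could look like; the context fields, vector names, and thresholds are hypothetical assumptions for illustration and are not taken from the MADServer design.

```python
from dataclasses import dataclass

@dataclass
class RequestContext:
    """Hypothetical user/data context used to pick a delivery vector."""
    size_bytes: int           # size of the requested object
    delay_tolerant: bool      # whether the user can wait for opportunistic delivery
    cellular_cost_high: bool  # whether the cellular link is expensive for this user

def choose_delivery_vector(ctx: RequestContext) -> str:
    """Pick one of several delivery vectors based on context (illustrative policy)."""
    if ctx.delay_tolerant and ctx.size_bytes > 10 * 1024 * 1024:
        # Large, non-urgent objects can be offloaded to opportunistic DTN hops.
        return "dtn-opportunistic"
    if ctx.cellular_cost_high and ctx.delay_tolerant:
        # Expensive cellular link: prefer multi-hop opportunistic routing.
        return "dtn-opportunistic"
    if ctx.size_bytes <= 64 * 1024:
        # Small objects (e.g., the page skeleton) go over cellular for timely delivery.
        return "cellular-direct"
    return "wifi-offload"

print(choose_delivery_vector(RequestContext(20_000_000, True, False)))  # dtn-opportunistic
```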

    On the Benefits of Edge Caching for MIMO Interference Alignment

    In this contribution, we jointly investigate the benefits of caching and interference alignment (IA) in the multiple-input multiple-output (MIMO) interference channel under limited backhaul capacity. In particular, the total average transmission rate is derived as a function of various system parameters such as backhaul link capacity, cache size, the number of active transmitter-receiver pairs, and the number of quantization bits for channel state information (CSI). Given that base stations are equipped with both caching and IA capabilities and have knowledge of the content popularity profile, we then characterize an operational regime where caching is beneficial. Subsequently, we find the optimal number of transmitter-receiver pairs that maximizes the total average transmission rate. When the popularity profile of requested contents falls into the operational regime, it turns out that caching substantially improves throughput, as it mitigates backhaul usage and allows IA methods to take advantage of the limited backhaul.
    Comment: 20 pages, 5 figures. A shorter version is to be presented at the 16th IEEE International Workshop on Signal Processing Advances in Wireless Communications (SPAWC'2015), Stockholm, Sweden.
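    The abstract refers to a total average transmission rate expressed as a function of backhaul capacity, cache size, number of active transmitter-receiver pairs, and CSI quantization bits, and to choosing the number of pairs that maximizes it. The sketch below shows only the generic optimization step over a placeholder rate function; toy_rate is an invented stand-in for the paper's derived expression, which is not reproduced here.

```python
def optimal_num_pairs(rate, k_max, params):
    """Grid-search the number of active transmitter-receiver pairs K that
    maximizes a given total average rate function (illustrative only)."""
    best_k, best_rate = 1, float("-inf")
    for k in range(1, k_max + 1):
        r = rate(k, params)
        if r > best_rate:
            best_k, best_rate = k, r
    return best_k, best_rate

# Placeholder rate model: more pairs add throughput but shrink the per-pair
# backhaul share; cached contents bypass the backhaul. The real expression
# derived in the paper is different.
def toy_rate(k, p):
    per_pair = min(p["backhaul"] / k, p["access"]) * (1 - 2 ** (-p["csi_bits"]))
    cached_fraction = min(1.0, p["cache_size"] / p["library_size"])
    return k * (cached_fraction * p["access"] + (1 - cached_fraction) * per_pair)

params = {"backhaul": 10.0, "access": 4.0, "csi_bits": 6,
          "cache_size": 200, "library_size": 1000}
print(optimal_num_pairs(toy_rate, k_max=8, params=params))
```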