
    Proactive edge caching in content-centric networks with massive dynamic content requests

    Edge computing is a promising infrastructure evolution to reduce traffic loads and support low-latency communications. Furthermore, content-centric networks provide a natural way to cache content at edge nodes. However, it is challenging for edge nodes to handle massive and highly dynamic content requests from users: without an efficient content caching strategy, the edge nodes incur high traffic load and latency due to increasing retrievals from content providers. This paper formulates a proactive edge caching problem to minimize the content retrieval cost at edge nodes. We exploit the inherent content caching and request aggregation mechanisms of content-centric networks to jointly minimize the traffic load and content retrieval delay costs generated by massive and dynamic content requests. We develop a Q-learning algorithm as an online optimal caching strategy that adapts to dynamic content popularity and content request intensity, and derive the long-term minimization of the content retrieval cost. Simulation results illustrate that the proposed algorithm achieves a lower content retrieval cost than several baseline caching schemes.
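    A minimal sketch of how a tabular Q-learning cache controller of this kind could look, assuming a simplified state of (popularity level, request-intensity level) and a binary cache/no-cache action; the paper's exact state space, action set, and cost model are not reproduced here.

```python
# Hedged sketch: tabular Q-learning for an edge caching decision.
# State, actions, and cost model below are illustrative assumptions.
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration
ACTIONS = (0, 1)                         # 0 = fetch from provider, 1 = cache at edge

Q = defaultdict(float)                   # Q[(state, action)] -> value estimate

def choose_action(state):
    """Epsilon-greedy selection over the two caching actions."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, cost, next_state):
    """One Q-learning step; the reward is the negative retrieval cost."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    target = -cost + GAMMA * best_next
    Q[(state, action)] += ALPHA * (target - Q[(state, action)])
```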

    Fundamental Limits of Caching with Secure Delivery

    Caching is emerging as a vital tool for alleviating the severe capacity crunch in modern content-centric wireless networks. The main idea behind caching is to store parts of popular content in end-users' memory and leverage the locally stored content to reduce peak data rates. By jointly designing content placement and delivery mechanisms, recent works have shown order-wise reductions in transmission rates compared with traditional methods. In this work, we consider the secure caching problem with the additional goal of minimizing information leakage to an external wiretapper. The fundamental cache memory vs. transmission rate trade-off for the secure caching problem is characterized. Rather surprisingly, these results show that security can be introduced at a negligible cost, particularly for a large number of files and users. It is also shown that the rate achieved by the proposed caching scheme with secure delivery is within a constant multiplicative factor of the information-theoretic optimal rate for almost all parameter values of practical interest.
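    For context on the memory vs. rate trade-off the paper builds on, a small sketch below evaluates the classical non-secure coded caching rate from the coded caching literature; the secure-delivery rate characterized in this paper is not reproduced, and this baseline only illustrates the shape of the trade-off.

```python
# Non-secure coded caching baseline: R(M) = K * (1 - M/N) / (1 + K*M/N)
# for K users, N files, and per-user cache size M (in files).
# This is a reference point only, not the paper's secure-delivery rate.
def coded_caching_rate(K: int, N: int, M: float) -> float:
    if M >= N:                 # whole library fits in the cache: nothing to deliver
        return 0.0
    return K * (1 - M / N) / (1 + K * M / N)

if __name__ == "__main__":
    K, N = 10, 20
    for M in (0, 2, 5, 10, 20):
        print(f"M = {M:>2} files  ->  delivery rate = {coded_caching_rate(K, N, M):.2f} files")
```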

    Asymptotically-Optimal Incentive-Based En-Route Caching Scheme

    Content caching at intermediate nodes is a very effective way to optimize the operation of computer networks, so that future requests can be served without going back to the origin of the content. Several caching techniques have been proposed since the emergence of the concept, including techniques that require major changes to the Internet architecture, such as Content Centric Networking. Few of these techniques consider providing caching incentives for the nodes or quality of service guarantees for content owners. In this work, we present a low-complexity, distributed, and online algorithm for making caching decisions based on content popularity, while taking the aforementioned issues into account. Our algorithm performs en-route caching and can therefore be integrated with the current TCP/IP model. In order to measure the performance of any online caching algorithm, we define the competitive ratio as the ratio of the performance of the online algorithm, in terms of traffic savings, to the performance of the optimal offline algorithm that has complete knowledge of the future. We show that under our settings, no online algorithm can achieve a better competitive ratio than Ω(log n), where n is the number of nodes in the network. Furthermore, we show that under realistic scenarios, our algorithm has an asymptotically optimal competitive ratio in terms of the number of nodes in the network. We also study an extension to the basic algorithm and show its effectiveness through extensive simulations.
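    An illustrative sketch of an online, popularity-driven en-route caching rule in the spirit described above: a node caches a content item once its accumulated traffic savings cross a threshold. The threshold form, incentive handling, and QoS constraints of the paper's algorithm are not reproduced; the class, names, and parameters below are hypothetical.

```python
# Hedged sketch: online en-route caching driven by accumulated savings.
from collections import defaultdict

class EnRouteCache:
    def __init__(self, capacity: int, threshold: float):
        self.capacity = capacity
        self.threshold = threshold
        self.savings = defaultdict(float)    # accumulated potential savings per content
        self.store = set()

    def on_request(self, content: str, hops_to_origin: int) -> bool:
        """Return True if served locally; otherwise decide whether to start caching."""
        if content in self.store:
            return True
        # Every miss that had to travel toward the origin adds potential savings.
        self.savings[content] += hops_to_origin
        if self.savings[content] >= self.threshold and len(self.store) < self.capacity:
            self.store.add(content)
        return False
```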

    Cooperative Caching and Transmission Design in Cluster-Centric Small Cell Networks

    Wireless content caching in small cell networks (SCNs) has recently been considered an efficient way to reduce the traffic and energy consumption of the backhaul in emerging heterogeneous cellular networks (HetNets). In this paper, we consider a cluster-centric SCN with a combined design of cooperative caching and transmission policy. Small base stations (SBSs) are grouped into disjoint clusters, in which the in-cluster cache space is utilized as a single entity. We propose a combined caching scheme where part of the available cache space is reserved for caching the most popular content in every SBS, while the remaining space is used to cooperatively cache different partitions of the less popular content in different SBSs, as a means to increase local content diversity. Depending on the availability and placement of the requested content, a coordinated multipoint (CoMP) technique with either joint transmission (JT) or parallel transmission (PT) is used to deliver content to the served user. Using a Poisson point process (PPP) for the SBS location distribution and a hexagonal grid model for the clusters, we provide analytical results on the successful content delivery probability of both transmission schemes for a user located at the cluster center. Our analysis shows an inherent tradeoff between transmission diversity and content diversity in our combined caching-transmission design. We also study the optimal cache space assignment for two objective functions: maximization of the cache service performance and of the energy efficiency. Simulation results show that the proposed scheme achieves performance gains by leveraging cache-level and signal-level cooperation and adapting to the network environment and user QoS requirements. (Comment: 13 pages, 10 figures, submitted for possible journal publication.)
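    A small sketch of the combined placement idea: a fraction of each SBS cache holds the globally most popular files (replicated cluster-wide, enabling joint transmission), while the rest holds a distinct slice of the next-most-popular files per SBS (increasing in-cluster content diversity). The split fraction and ranking below are illustrative assumptions, not the paper's optimized assignment.

```python
# Hedged sketch: split each SBS cache between replicated popular content
# and per-SBS partitions of less popular content within one cluster.
def place_content(ranked_files, cache_size, cluster_size, popular_fraction=0.5):
    """Return per-SBS cache contents for one cluster of `cluster_size` SBSs."""
    n_popular = int(cache_size * popular_fraction)
    popular = ranked_files[:n_popular]                       # cached in every SBS
    n_diverse = cache_size - n_popular
    diverse_pool = ranked_files[n_popular:n_popular + n_diverse * cluster_size]
    caches = []
    for i in range(cluster_size):
        slice_i = diverse_pool[i::cluster_size]              # disjoint partitions
        caches.append(popular + slice_i[:n_diverse])
    return caches

# Example: 4 SBSs, 6-file caches, files ranked by popularity "f0" (most) .. "f29".
caches = place_content([f"f{i}" for i in range(30)], cache_size=6, cluster_size=4)
```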

    Online algorithms for content caching: an economic perspective

    Content caching at intermediate nodes, so that future requests can be served without going back to the origin of the content, is an effective way to optimize the operation of computer networks. Content caching thereby reduces the delivery delay and improves the users' Quality of Experience (QoE). The current literature either proposes offline algorithms that have complete knowledge of the request profile a priori, or proposes heuristics without provable performance. In this dissertation, online algorithms are presented for content caching in three different network settings: the current Internet, collaborative multi-cell coordinated networks, and future Content Centric Networks (CCN). Due to the difficulty of obtaining prior knowledge of contents' popularities in real scenarios, an algorithm has to decide whether to cache a content item at the moment it is requested, without knowledge of any future requests. The performance of the online algorithms is measured through competitive ratio analysis, comparing the performance of the online algorithm to that of an omniscient optimal offline algorithm. Through theoretical analyses, it is shown that the proposed online algorithms achieve either the optimal or a close-to-optimal competitive ratio. Moreover, the algorithms have low complexity and can be implemented in a distributed way. The theoretical analyses are complemented with simulation-based experiments, which show that the online algorithms outperform state-of-the-art caching schemes.
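    A toy sketch of how such a competitive-ratio comparison can be checked empirically: run an online policy and an offline policy with full knowledge of the trace over the same request sequence, then compare achieved traffic savings. The two placeholder policies below are not the dissertation's algorithms; they only illustrate the evaluation setup.

```python
# Hedged sketch: empirical competitive-ratio check on a toy request trace.
def traffic_savings(policy, trace, capacity):
    """Count requests served from the cache under a given admission policy."""
    cache, hits = set(), 0
    for i, item in enumerate(trace):
        if item in cache:
            hits += 1
        elif policy(item, i, trace, cache, capacity):
            if len(cache) >= capacity:
                cache.pop()              # evict an arbitrary item in this toy model
            cache.add(item)
    return hits

def online_policy(item, i, trace, cache, capacity):
    return True                          # toy online rule: cache on every miss

def offline_policy(item, i, trace, cache, capacity):
    return item in trace[i + 1:]         # cache only items that will be requested again

trace = ["a", "b", "a", "c", "a", "b", "d", "a"]
online = traffic_savings(online_policy, trace, capacity=2)
offline = traffic_savings(offline_policy, trace, capacity=2)
ratio = online / max(1, offline)         # ratio of online to offline traffic savings
```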