102 research outputs found

    A review on green caching strategies for next generation communication networks

    © 2020 IEEE. In recent years, the ever-increasing demand for networking resources and energy, fueled by the unprecedented upsurge in Internet traffic, has been a cause for concern for many service providers. Content caching, which serves user requests locally, is deemed an enabling technology for addressing the challenges posed by this phenomenal growth in Internet traffic. Conventionally, content caching has been considered a viable solution for alleviating backhaul pressure; recently, however, many studies have also reported energy cost reductions contributed by content caching in cache-equipped networks. The underlying hypothesis is that caching shortens the content delivery distance and thereby achieves a significant reduction in transmission energy consumption. Motivated by these findings, this article provides a comprehensive survey of state-of-the-art green caching techniques. The review extensively discusses the contributions of existing studies on green caching and explores the different cache-equipped network types, solution methods, and application scenarios. We show that optimal selection of caching nodes, smart resource management, popular content selection, and renewable energy integration can substantially improve the energy efficiency of cache-equipped systems. Based on this comprehensive analysis, we also highlight potential research directions relevant to green content caching.
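    To make the energy argument in the abstract above concrete, the following back-of-the-envelope sketch compares the expected transmission energy per request with and without an edge cache. The hit ratio, per-bit energy figures, and content size are illustrative assumptions, not values reported in the survey.

    # Back-of-the-envelope illustration: a cache hit skips the backhaul/core hops,
    # so the expected transmission energy per request drops with the hit ratio.
    # All figures below are assumptions for illustration only.
    EDGE_ENERGY_PER_BIT = 0.2e-6   # J/bit over the access/edge link (assumed)
    CORE_ENERGY_PER_BIT = 1.0e-6   # J/bit over backhaul and core transport (assumed)
    CONTENT_SIZE_BITS = 8e6        # one 1 MB object (assumed)

    def expected_energy_per_request(hit_ratio):
        """Expected transmission energy for a single request, in joules."""
        hit_cost = EDGE_ENERGY_PER_BIT * CONTENT_SIZE_BITS
        miss_cost = (EDGE_ENERGY_PER_BIT + CORE_ENERGY_PER_BIT) * CONTENT_SIZE_BITS
        return hit_ratio * hit_cost + (1.0 - hit_ratio) * miss_cost

    baseline = expected_energy_per_request(0.0)   # no cache: every request crosses the core
    cached = expected_energy_per_request(0.4)     # assumed 40% edge hit ratio
    print(f"energy saving from caching: {100 * (1 - cached / baseline):.1f}%")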

    Exploiting Caching and Multicast for 5G Wireless Networks

    The landscape toward 5G wireless communication is currently unclear, and, despite the efforts of academia and industry in evolving traditional cellular networks, the enabling technology for 5G is still obscure. This paper puts forward a network paradigm for next-generation cellular networks, aiming to satisfy the explosive demand for mobile data while minimizing energy expenditures. The paradigm builds on two principles, namely caching and multicast. On the one hand, caching policies disperse popular content files at the wireless edge, e.g., pico-cells and femto-cells, hence shortening the distance between content and requester. On the other hand, owing to the broadcast nature of the wireless medium, requests for identical files occurring at nearby times are aggregated and served through a common multicast stream. To better exploit the available cache space, caching policies are optimized with multicast transmissions in mind. We show that the multicast-aware caching problem is NP-hard and develop solutions with performance guarantees using randomized-rounding techniques. Trace-driven numerical results show that, in the presence of massive demand for delay-tolerant content, combining caching and multicast can indeed reduce energy costs. The gains over existing caching schemes are 19% when users tolerate a delay of three minutes, and they increase further with the steepness of the content access pattern.
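    The randomized-rounding approach mentioned above can be illustrated with a generic sketch: start from a fractional cache-placement vector (e.g., obtained from an LP relaxation, assumed to be computed elsewhere), cache each file with probability equal to its fractional value, then repair any capacity violation. This is only an illustration of the general technique, not the paper's exact algorithm or its performance-guarantee analysis; the files, placements, and values below are made up.

    import random

    def round_placement(x, value, capacity, seed=0):
        """x: file -> fractional placement in [0, 1]; value: file -> caching benefit."""
        rng = random.Random(seed)
        # Independently cache each file with probability equal to its fractional value.
        cached = {f for f, frac in x.items() if rng.random() < frac}
        # Repair step: enforce the cache-size constraint (unit-size files assumed),
        # evicting the least valuable files first.
        while len(cached) > capacity:
            cached.remove(min(cached, key=lambda f: value[f]))
        return cached

    # Hypothetical example: three files, fractional placements from a relaxation,
    # and assumed multicast-aware caching values.
    x = {"A": 0.9, "B": 0.6, "C": 0.3}
    value = {"A": 5.0, "B": 3.0, "C": 1.0}
    print(round_placement(x, value, capacity=2))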

    The Role of Caching in Future Communication Systems and Networks

    This paper has the following ambitious goal: to convince the reader that content caching is an exciting research topic for future communication systems and networks. Caching has been studied for more than 40 years and has recently received increased attention from industry and academia. Novel caching techniques promise to push network performance to unprecedented limits, but also pose significant technical challenges. This tutorial provides a brief overview of existing caching solutions, discusses seminal papers that open new directions in caching, and presents the contributions of this special issue. We analyze the challenges that caching needs to address today, also considering an industry perspective, and identify bottleneck issues that must be resolved to unleash the full potential of this promising technique.

    Content delivery over multi-antenna wireless networks

    The past few decades have witnessed unprecedented advances in information technology, which have significantly shaped the way we acquire and process information in our daily lives. Wireless communication has become the main means of accessing data through mobile devices, resulting in continuous exponential growth in wireless data traffic, driven mainly by the demand for high-quality content. Researchers have proposed various technologies to tackle this growth in 5G and beyond, including the use of an increasing number of antenna elements, integrated point-to-multipoint delivery, and caching, which constitute the core of this thesis. In particular, we study non-orthogonal content delivery in multiuser multiple-input single-output (MISO) systems. First, a joint beamforming strategy for simultaneous delivery of broadcast and unicast services is investigated, based on layered division multiplexing (LDM) as a means of superposition coding. The system performance, in terms of the minimum power required under prescribed quality-of-service (QoS) requirements, is examined in comparison with time division multiplexing (TDM). Simulations demonstrate that the non-orthogonal delivery strategy based on LDM significantly outperforms the orthogonal strategy based on TDM in terms of system throughput and reliability. To facilitate efficient implementation of the LDM-based beamforming design, we further propose a dual decomposition-based distributed approach. Next, we study an efficient multicast beamforming design in cache-aided multiuser MISO systems, exploiting proactive content placement and coded delivery. We observe that the complexity of this problem grows exponentially with the number of subfiles delivered to each user in each time slot, which itself grows exponentially with the number of users in the system. We therefore propose a low-complexity alternative based on time-sharing that limits the number of subfiles a user can receive in each time slot. Moreover, a joint design of content delivery and multicast beamforming is proposed to further enhance system performance, under a constraint on the maximum number of subfiles each user can decode in each time slot. Finally, conclusions are drawn in Chapter 5, followed by an outlook on future work.
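    The exponential growth in the number of subfiles noted in the abstract can be illustrated under the standard symmetric uncoded placement of coded caching; this placement is assumed here only for concreteness, and the thesis's exact scheme may differ.

    % Subfile counts under the symmetric uncoded placement with K users, a
    % library of N files, and a cache of M files per user (t = KM/N assumed
    % to be an integer):
    \[
        t = \frac{KM}{N}, \qquad
        \text{subfiles per file} = \binom{K}{t}, \qquad
        \text{subfiles a user must decode per request} = \binom{K-1}{t}.
    \]
    % For example, K = 20 users with t = 10 already give
    \[
        \binom{20}{10} = 184{,}756
    \]
    % subfiles per file, which is why bounding the number of subfiles a user can
    % receive in each time slot (via the time-sharing scheme described above)
    % keeps the joint delivery and beamforming design tractable.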

    Quality-driven management of video streaming services in segment-based cache networks


    Implementation and Evaluation of Mobile-Edge Computing Cooperative Caching

    The recent rapid growth in the number of mobile device users accessing cloud services leads to resource challenges in the Mobile Network Operator's (MNO) network. This imposes significant additional costs on MNOs and also results in poor user experience. Studies show that a large share of the traffic in an MNO's network originates from similar user requests for the same popular Internet content, so such networks end up delivering the same content multiple times through their gateways to the Internet backhaul. In content delivery networks (CDN), meanwhile, the delay caused by network latency is one of the biggest obstacles to efficient delivery and a desirable user experience. Cooperative caching is one way to handle the extra traffic created by repeated requests for popular content in the MNO's network. Furthermore, Mobile-Edge Computing (MEC) offers a resource-rich environment and data locality to cloud applications, which helps reduce network latency for CDN services. This thesis therefore considers combining cooperative caching with the MEC concept. It presents the design, implementation, and evaluation of a Mobile-Edge Computing cooperative caching system for delivering content to mobile users. The design is failure resilient and scalable, and uses a light-weight synchronization method. The system is implemented and deployed on Nokia Networks Radio Application Cloud Servers (Nokia Networks RACS) acting as intelligent MEC base stations, and the resulting bandwidth savings, CDN delay, and user experience are evaluated.
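    The cooperative caching flow described above can be sketched as follows: an edge node first checks its local cache, then queries peer MEC nodes, and only on a complete miss fetches the content from the Internet origin. The class and helper names here (EdgeCache, fetch_from_origin) are hypothetical illustrations of the general idea, not the thesis's RACS implementation.

    class EdgeCache:
        def __init__(self, name):
            self.name = name
            self.store = {}   # content id -> payload (bytes)
            self.peers = []   # cooperating MEC edge nodes

        def local_get(self, content_id):
            return self.store.get(content_id)

        def get(self, content_id):
            data = self.local_get(content_id)          # 1. local hit: no backhaul use
            if data is None:
                for peer in self.peers:                # 2. peer hit: traffic stays at the edge
                    data = peer.local_get(content_id)
                    if data is not None:
                        break
            if data is None:                           # 3. full miss: single origin fetch
                data = fetch_from_origin(content_id)
            self.store[content_id] = data              # cache for subsequent requests
            return data

    def fetch_from_origin(content_id):
        # Placeholder for the (costly) backhaul/CDN fetch.
        return f"payload-of-{content_id}".encode()

    # Example: two cooperating edge nodes.
    a, b = EdgeCache("ran-a"), EdgeCache("ran-b")
    a.peers, b.peers = [b], [a]
    b.get("video-42")          # misses everywhere -> origin fetch, cached at b
    print(a.get("video-42"))   # local miss, peer hit at b -> no origin fetch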