
    A review on green caching strategies for next generation communication networks

    © 2020 IEEE. In recent years, the ever-increasing demand for networking resources and energy, fueled by the unprecedented upsurge in Internet traffic, has been a cause for concern for many service providers. Content caching, which serves user requests locally, is deemed an enabling technology for addressing the challenges posed by this phenomenal traffic growth. Conventionally, content caching is considered a viable solution to alleviate backhaul pressure. Recently, however, many studies have reported energy cost reductions contributed by content caching in cache-equipped networks. The hypothesis is that caching shortens the content delivery distance and thereby achieves a significant reduction in transmission energy consumption. Motivated by this, this article provides a comprehensive survey of state-of-the-art green caching techniques. The review extensively discusses the contributions of existing studies on green caching and explores different cache-equipped network types, solution methods, and application scenarios. We show that optimal selection of caching nodes, smart resource management, popular content selection, and renewable energy integration can substantially improve the energy efficiency of cache-equipped systems. Based on this comprehensive analysis, we also highlight potential research directions relevant to green content caching.
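
    The hypothesis above, that caching shortens the delivery path and thus cuts transmission energy, can be illustrated with a back-of-the-envelope estimate. The Python sketch below uses a simple energy-per-bit-per-hop model; all figures (content size, hop counts, hit ratio, per-hop energy) are assumed for illustration and are not taken from the surveyed studies.

        # Illustrative estimate (all figures are assumptions, not values from the
        # surveyed studies): transmission energy is modeled as energy per bit per
        # hop, so serving a request from a nearby cache saves the avoided hops.

        def transmission_energy_joules(bits, hops, energy_per_bit_per_hop):
            """Simple linear model: energy grows with content size and hop count."""
            return bits * hops * energy_per_bit_per_hop

        CONTENT_BITS = 8e9          # one 1 GB video object
        E_PER_BIT_PER_HOP = 1e-7    # J/bit/hop, assumed figure
        HOPS_TO_ORIGIN = 10         # path length to the origin server
        HOPS_TO_EDGE_CACHE = 2      # path length to a nearby cache
        HIT_RATIO = 0.4             # fraction of requests served by the cache

        miss_energy = transmission_energy_joules(CONTENT_BITS, HOPS_TO_ORIGIN, E_PER_BIT_PER_HOP)
        hit_energy = transmission_energy_joules(CONTENT_BITS, HOPS_TO_EDGE_CACHE, E_PER_BIT_PER_HOP)

        # Expected transmission energy per request without and with caching.
        without_cache = miss_energy
        with_cache = HIT_RATIO * hit_energy + (1 - HIT_RATIO) * miss_energy
        saving = 1 - with_cache / without_cache
        print(f"Estimated transmission-energy saving: {saving:.0%}")  # ~32% with these assumptions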

    Addressing Application Latency Requirements through Edge Scheduling

    Latency-sensitive and data-intensive applications, such as IoT or mobile services, leverage Edge computing, which extends the cloud ecosystem with distributed computational resources in proximity to data providers and consumers. This brings significant benefits in terms of lower latency and higher bandwidth. However, by definition, edge computing has limited resources compared to its cloud counterpart; thus, there exists a trade-off between proximity to users and resource utilization. Moreover, service availability is a significant concern at the edge of the network, where the extensive support systems of cloud data centers are not usually present. To overcome these limitations, we propose a score-based edge service scheduling algorithm that evaluates the network, compute, and reliability capabilities of edge nodes. The algorithm outputs the maximum-scoring mapping between resources and services with regard to four critical aspects of service quality. Our simulation-based experiments on live video streaming services demonstrate significant improvements in both network delay and service time. Moreover, we compare edge computing with cloud computing and content delivery networks within the context of latency-sensitive and data-intensive applications. The results suggest that our edge-based scheduling algorithm is a viable solution for achieving high service quality and responsiveness when deploying such applications.
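
    The paper's exact scoring function is not reproduced here; the Python sketch below only illustrates the general score-and-assign idea under assumptions: each edge node receives a weighted score over normalized network, compute, and reliability metrics, and every service is greedily placed on the highest-scoring node with spare capacity. The node names, weights, and metrics are hypothetical.

        # Minimal sketch of score-based service-to-node scheduling (not the paper's
        # exact algorithm): nodes are scored over normalized network, compute, and
        # reliability metrics, and each service goes to the best feasible node.

        from dataclasses import dataclass

        @dataclass
        class EdgeNode:
            name: str
            bandwidth_mbps: float   # available network capacity
            cpu_cores: float        # available compute
            reliability: float      # availability estimate in [0, 1]

        WEIGHTS = {"network": 0.4, "compute": 0.4, "reliability": 0.2}  # assumed weights

        def score(node, nodes):
            """Normalize each metric by the fleet maximum and combine with weights."""
            max_bw = max(n.bandwidth_mbps for n in nodes)
            max_cpu = max(n.cpu_cores for n in nodes)
            return (WEIGHTS["network"] * node.bandwidth_mbps / max_bw
                    + WEIGHTS["compute"] * node.cpu_cores / max_cpu
                    + WEIGHTS["reliability"] * node.reliability)

        def schedule(services, nodes):
            """Greedy max-score placement; services is a list of (name, cpu_demand)."""
            placement = {}
            for svc_name, cpu_demand in services:
                candidates = [n for n in nodes if n.cpu_cores >= cpu_demand]
                if not candidates:
                    placement[svc_name] = None      # no feasible edge node
                    continue
                best = max(candidates, key=lambda n: score(n, nodes))
                best.cpu_cores -= cpu_demand        # reserve the compute
                placement[svc_name] = best.name
            return placement

        nodes = [EdgeNode("edge-a", 800, 8, 0.99), EdgeNode("edge-b", 300, 16, 0.95)]
        print(schedule([("video-ingest", 4), ("transcoder", 10)], nodes))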

    A Survey on Mobile Edge Computing for Video Streaming: Opportunities and Challenges

    5G communication brings substantial improvements in the quality of service provided to various applications by achieving higher throughput and lower latency. However, interactive multimedia applications (e.g., ultra-high-definition video conferencing, 3D and multiview video streaming, crowd-sourced video streaming, cloud gaming, virtual and augmented reality) are becoming more ambitious, with high-volume and low-latency video streams putting strict demands on already congested networks. Mobile Edge Computing (MEC) is an emerging paradigm that extends cloud computing capabilities to the edge of the network, i.e., to the base station level. To meet latency requirements and avoid end-to-end communication with remote cloud data centers, MEC allows video content to be stored and processed (e.g., cached, transcoded, pre-processed) at the base stations. Both video-on-demand and live video streaming can utilize MEC to improve existing services and develop novel use cases, such as video analytics and targeted advertisements. MEC is expected to reshape the future of video streaming by providing ultra-reliable and low-latency streaming (e.g., in augmented reality, virtual reality, and autonomous vehicles), pervasive computing (e.g., in real-time video analytics), and blockchain-enabled architectures for secure live streaming. This paper presents a comprehensive survey of recent developments in MEC-enabled video streaming that bring unprecedented improvements and enable novel use cases. A detailed review of the state of the art is presented, covering novel caching schemes, optimal computation offloading, cooperative caching and offloading, and the use of artificial intelligence (i.e., machine learning, deep learning, and reinforcement learning) in MEC-assisted video streaming services.
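
    As one concrete illustration of edge caching mentioned in the survey, the Python sketch below implements a naive popularity-based cache at a base station: it keeps the most-requested videos that fit into a storage budget. The class, names, and numbers are illustrative assumptions, not a scheme from the surveyed literature.

        # Minimal sketch of popularity-based caching at a base station (illustrative
        # only; the surveyed schemes are far more elaborate): keep the most-requested
        # videos that fit into the edge storage budget, evicting the least popular.

        from collections import Counter

        class EdgeVideoCache:
            def __init__(self, capacity_gb):
                self.capacity_gb = capacity_gb
                self.requests = Counter()   # request counts per video id
                self.sizes_gb = {}          # known sizes per video id
                self.cached = set()

            def on_request(self, video_id, size_gb):
                """Return True if served locally, then re-rank contents by popularity."""
                hit = video_id in self.cached
                self.requests[video_id] += 1
                self.sizes_gb[video_id] = size_gb
                self._refresh()
                return hit

            def _refresh(self):
                """Greedily fill the budget with the currently most popular videos."""
                self.cached = set()
                used = 0.0
                for video_id, _ in self.requests.most_common():
                    size = self.sizes_gb[video_id]
                    if used + size <= self.capacity_gb:
                        self.cached.add(video_id)
                        used += size

        cache = EdgeVideoCache(capacity_gb=3)
        for vid in ["a", "a", "b", "c", "a", "b"]:
            cache.on_request(vid, size_gb=1.5)
        print(cache.cached)   # the two most popular videos, e.g. {'a', 'b'}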

    Video Caching, Analytics and Delivery at the Wireless Edge: A Survey and Future Directions

    Future wireless networks will provide high-bandwidth, low-latency, and ultra-reliable Internet connectivity to meet the requirements of different applications, ranging from mobile broadband to the Internet of Things. To this end, mobile edge caching, computing, and communication (edge-C3) have emerged to bring network resources (i.e., bandwidth, storage, and computing) closer to end users. Edge-C3 improves network resource utilization as well as the quality of experience (QoE) of end users. Recently, several video-oriented mobile applications (e.g., live content sharing, gaming, and augmented reality) have leveraged edge-C3 in diverse scenarios involving video streaming in both the downlink and the uplink. Hence, a large number of recent works have studied the implications of video analysis and streaming through edge-C3. This article presents an in-depth survey of video edge-C3 challenges and state-of-the-art solutions in next-generation wireless and mobile networks. Specifically, it includes: a tutorial on video streaming in mobile networks (e.g., video encoding and adaptive bitrate streaming); an overview of mobile network architectures, enabling technologies, and applications for video edge-C3; video edge computing and analytics in uplink scenarios (e.g., architectures, analytics, and applications); and video edge caching, computing, and communication methods in downlink scenarios (e.g., collaborative, popularity-based, and context-aware). A new taxonomy for video edge-C3 is proposed, and the major contributions of recent studies are first highlighted and then systematically compared. Finally, several open problems and key challenges for future research are outlined.
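
    The survey's tutorial part covers adaptive bitrate (ABR) streaming; as a minimal illustration of that mechanism, the Python sketch below performs throughput-based bitrate selection with an assumed encoding ladder, smoothing factor, and safety margin. Real players combine this with buffer-based logic.

        # Minimal sketch of throughput-based adaptive bitrate (ABR) selection
        # (illustrative; real DASH-style players add buffer-based logic): smooth the
        # measured download throughput and pick the highest ladder rung that fits
        # under it with a safety margin.

        BITRATE_LADDER_KBPS = [500, 1200, 2500, 5000, 8000]  # assumed encoding ladder
        SAFETY_MARGIN = 0.8    # use at most 80% of the throughput estimate
        SMOOTHING = 0.7        # EWMA weight on the previous estimate

        def update_estimate(previous_kbps, measured_kbps):
            """Exponentially weighted moving average of per-segment throughput."""
            if previous_kbps is None:
                return measured_kbps
            return SMOOTHING * previous_kbps + (1 - SMOOTHING) * measured_kbps

        def select_bitrate(throughput_estimate_kbps):
            """Highest ladder rung below the discounted throughput estimate."""
            budget = SAFETY_MARGIN * throughput_estimate_kbps
            feasible = [b for b in BITRATE_LADDER_KBPS if b <= budget]
            return max(feasible) if feasible else BITRATE_LADDER_KBPS[0]

        estimate = None
        for measured in [3000, 6500, 7000, 2000]:   # per-segment measurements, kbps
            estimate = update_estimate(estimate, measured)
            print(f"throughput ~{estimate:.0f} kbps -> play {select_bitrate(estimate)} kbps")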

    Life-cycle management and placement of service function chains in MEC-enabled 5G networks

    Recent advancements in mobile communication technology have led to the fifth generation of mobile cellular networks (5G), driven by the growth in data traffic demand, stringent latency requirements, and the desire for a fully connected world. This transformation calls for novel technology solutions such as Multi-access Edge Computing (MEC) and Network Function Virtualization (NFV) to satisfy service requirements while providing dynamic and instant service deployment. MEC and NFV are two principal and complementary enablers of 5G networks whose co-existence can lead to numerous benefits. Despite the advantages MEC offers, physical resources at the edge are extremely scarce and require efficient utilization. In this doctoral dissertation, we first attempt to optimize resource utilization at the network edge for the scenario of live video streaming. We specifically utilize the real-time Radio Access Network (RAN) information available at the MEC servers to develop a machine learning-based prediction solution that anticipates user requests. Consequently, Integer Linear Programming (ILP) models are used to prefetch/cache video contents from a centralized video server. Regarding the advantages of NFV technology for the deployment of network functions, the second problem that this dissertation addresses is the proper association of users to gNBs along with the efficient placement of Service Function Chains (SFCs) on the substrate network. Our primary purpose is to find a proper embedding of the SFC in a hierarchical 5G network. The problem is formulated as a Mixed Integer Linear Programming (MILP) model with the objective of minimizing service provisioning cost, link utilization, and the effect of VNF migration on users' perceived quality of experience. After rigorously analyzing the proposed SFC placement and considering the dynamicity of mobile networks, our next goal is to develop an ILP-based model that dynamically embeds and scales SFCs so that resource provisioning cost is minimized while user requirements are met.
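
    The dissertation's MILP also models user association, chaining order, link utilization, and VNF migration; none of that is reproduced here. The Python sketch below only shows a stripped-down, cost-minimizing VNF placement core using the open-source PuLP modeler, with hypothetical CPU demands, node capacities, and per-CPU costs (the cloud is priced higher here to stand in for a backhaul penalty, which is an assumption).

        # Stripped-down sketch of the placement core only: binary variables place each
        # VNF of a chain on one node, subject to node capacity, minimizing provisioning
        # cost. All data below are hypothetical.

        import pulp

        vnfs = {"firewall": 2, "transcoder": 6, "cache": 4}                    # CPU demand per VNF
        nodes = {"edge-1": (8, 1.0), "edge-2": (10, 1.2), "cloud": (64, 3.0)}  # (capacity, cost per CPU)

        prob = pulp.LpProblem("sfc_placement", pulp.LpMinimize)
        x = pulp.LpVariable.dicts(
            "place", [(v, n) for v in vnfs for n in nodes], cat="Binary")

        # Objective: total provisioning cost of the allocated CPU.
        prob += pulp.lpSum(vnfs[v] * nodes[n][1] * x[(v, n)] for v in vnfs for n in nodes)

        # Each VNF is placed on exactly one node.
        for v in vnfs:
            prob += pulp.lpSum(x[(v, n)] for n in nodes) == 1

        # Node capacity must not be exceeded.
        for n in nodes:
            prob += pulp.lpSum(vnfs[v] * x[(v, n)] for v in vnfs) <= nodes[n][0]

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        for (v, n), var in x.items():
            if var.value() and var.value() > 0.5:
                print(f"{v} -> {n}")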

    Service placement and request routing in MEC networks with storage, computation, and communication constraints

    The proliferation of innovative mobile services such as augmented reality, networked gaming, and autonomous driving has spurred a growing need for low-latency access to computing resources that cannot be met solely by existing centralized cloud systems. Mobile Edge Computing (MEC) is expected to be an effective solution to meet the demand for low-latency services by enabling the execution of computing tasks at the network edge, in proximity to the end users. While a number of recent studies have addressed the problem of determining the execution of service tasks and the routing of user requests to corresponding edge servers, the focus has primarily been on the efficient utilization of computing resources, neglecting the fact that non-trivial amounts of data need to be pre-stored to enable service execution and that many emerging services exhibit asymmetric bandwidth requirements. To fill this gap, we study the joint optimization of service placement and request routing in dense MEC networks with multidimensional constraints. We show that this problem generalizes several well-known placement and routing problems and propose an algorithm that achieves close-to-optimal performance using a randomized rounding technique. Evaluation results demonstrate that our approach can effectively utilize the available storage, computation, and communication resources to maximize the number of requests served by low-latency edge cloud servers.
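
    The paper's algorithm itself is not reproduced here; the Python sketch below only illustrates the randomized-rounding step it builds on: given a fractional solution of the LP relaxation, each service's placement variables are treated as a probability distribution over nodes and one hosting node is sampled per service. The fractional values and service names are hypothetical.

        # Minimal sketch of the randomized-rounding step only (the paper's algorithm
        # also handles request routing and multidimensional capacities).

        import random

        # x_lp[service][node]: fractional placement from the LP relaxation (sums to 1);
        # the values below are hypothetical.
        x_lp = {
            "ar-renderer": {"edge-1": 0.7, "edge-2": 0.3, "cloud": 0.0},
            "game-server": {"edge-1": 0.2, "edge-2": 0.5, "cloud": 0.3},
        }

        def round_placement(x_lp, seed=None):
            """Sample one hosting node per service, proportionally to the LP values."""
            rng = random.Random(seed)
            placement = {}
            for service, fractions in x_lp.items():
                nodes = list(fractions)
                weights = [fractions[n] for n in nodes]
                placement[service] = rng.choices(nodes, weights=weights, k=1)[0]
            return placement

        # In expectation the rounded solution preserves the LP objective; any capacity
        # violations are typically small and can be repaired afterwards.
        print(round_placement(x_lp, seed=42))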