
    Softwarization in Future Mobile Networks and Energy Efficient Networks

    The data growth generated by pervasive mobile devices and the Internet of Things at the network edge (i.e., closer to mobile users), coupled with the demand for ultra-low latency, requires high computation resources that are not available at the end-user device. This calls for a new mobile network (MN) design paradigm to handle user demands. As a remedy, a new MN design paradigm has emerged, called Mobile Edge Computing (MEC), which enables low-latency and location-aware data processing at the network edge. MEC is based on network function virtualization (NFV) technology, where mobile network functions (NFs) that formerly resided in the evolved packet core (EPC) are moved to the access network [i.e., they are deployed on local cloud platforms in proximity to the base stations (BSs)]. To reap the full benefits of the virtualized infrastructure, NFV technology should be combined with intelligent mechanisms for handling network resources. Despite the potential benefits of MEC, energy consumption remains a challenge due to the foreseen dense deployment of BSs empowered with computation capabilities. In the effort to build greener 5G MNs, we advocate the integration of energy harvesting (EH) into future edge systems.
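
    The abstract advocates integrating energy harvesting into MEC nodes but stays at the conceptual level; the sketch below is a minimal, hypothetical illustration of the idea. An EH-powered edge node accumulates harvested energy into a battery and serves a compute task locally only if its energy budget allows, otherwise deferring to a remote cloud. The class `EdgeNode`, its fields, and all numeric figures are illustrative assumptions, not values from the paper.

```python
from dataclasses import dataclass

@dataclass
class EdgeNode:
    """Hypothetical EH-powered MEC node: a battery fed by harvested energy."""
    battery_j: float                 # current stored energy (joules)
    capacity_j: float                # battery capacity (joules)
    harvest_rate_w: float            # average harvested power (watts)
    cpu_energy_j_per_gcycle: float   # energy per 10^9 CPU cycles

    def harvest(self, seconds: float) -> None:
        """Accumulate harvested energy over one time slot."""
        self.battery_j = min(self.capacity_j,
                             self.battery_j + self.harvest_rate_w * seconds)

    def serve_or_offload(self, task_gcycles: float, reserve_j: float = 50.0) -> str:
        """Serve locally only if the task fits the energy budget minus a safety reserve."""
        cost = task_gcycles * self.cpu_energy_j_per_gcycle
        if self.battery_j - cost >= reserve_j:
            self.battery_j -= cost
            return "served at edge"
        return "offloaded to remote cloud"

# Example: a solar-powered small-cell site over a few one-minute slots.
node = EdgeNode(battery_j=200.0, capacity_j=1000.0, harvest_rate_w=5.0,
                cpu_energy_j_per_gcycle=2.0)
for slot, task in enumerate([40.0, 90.0, 30.0]):   # task sizes in giga-cycles
    node.harvest(seconds=60.0)
    print(f"slot {slot}: {node.serve_or_offload(task)} (battery={node.battery_j:.0f} J)")
```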

    A review on green caching strategies for next generation communication networks

    © 2020 IEEE. In recent years, the ever-increasing demand for networking resources and energy, fueled by the unprecedented upsurge in Internet traffic, has been a cause for concern for many service providers. Content caching, which serves user requests locally, is deemed to be an enabling technology for addressing the challenges posed by the phenomenal growth in Internet traffic. Conventionally, content caching is considered a viable solution to alleviate backhaul pressure. However, many recent studies have reported energy cost reductions contributed by content caching in cache-equipped networks. The hypothesis is that caching shortens the content delivery distance and thereby achieves a significant reduction in transmission energy consumption. This has motivated us to conduct this study, and in this article a comprehensive survey of state-of-the-art green caching techniques is provided. This review extensively discusses the contributions of existing studies on green caching. In addition, the study explores different cache-equipped network types, solution methods, and application scenarios. We show that the optimal selection of caching nodes, smart resource management, popular content selection, and renewable energy integration can substantially improve the energy efficiency of cache-equipped systems. Based on this comprehensive analysis, we also highlight potential research directions relevant to green content caching.
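
    The central hypothesis above, that caching shortens the delivery path and therefore saves transmission energy, can be made concrete with a back-of-the-envelope model. The sketch below is a rough illustration under assumed per-bit energy figures; the function `net_saving_j` and every numeric value are hypothetical and not taken from the paper. It simply expresses that a cache saves energy only when the backhaul energy avoided on cache hits exceeds the static energy spent keeping the cache powered.

```python
def net_saving_j(requests: int, content_bits: float, hit_rate: float,
                 e_backhaul_j_per_bit: float, cache_power_w: float,
                 period_s: float) -> float:
    """Net energy saved (J) over one period by a single cache node.

    Cache hits avoid backhaul transmission energy; the cache itself draws
    a constant static power while it is switched on.
    """
    saved = requests * hit_rate * content_bits * e_backhaul_j_per_bit
    spent = cache_power_w * period_s
    return saved - spent

# Assumed (purely illustrative) figures: 10^6 requests/day for 8 Mbit objects,
# 40% hit rate, 2 uJ/bit backhaul energy, 20 W cache storage power.
saving = net_saving_j(requests=1_000_000, content_bits=8e6, hit_rate=0.4,
                      e_backhaul_j_per_bit=2e-6, cache_power_w=20.0,
                      period_s=24 * 3600)
print(f"net saving over one day: {saving / 1e6:.1f} MJ")
```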

    A Survey of Deep Learning for Data Caching in Edge Network

    The concept of edge caching in emerging 5G and beyond mobile networks is a promising method to deal with the traffic congestion problem in the core network as well as to reduce the latency of accessing popular content. In that respect, end-user demand for popular content can be satisfied by proactively caching it at the network edge, i.e., in close proximity to the users. In addition to model-based caching schemes, learning-based edge caching optimization has recently attracted significant attention, and the aim hereafter is to capture these recent advances for both model-based and data-driven techniques in the area of proactive caching. This paper summarizes the utilization of deep learning for data caching in edge networks. We first outline the typical research topics in content caching and formulate a taxonomy based on the network's hierarchical structure. Then, a number of key deep learning algorithms are presented, ranging from supervised and unsupervised learning to reinforcement learning. Furthermore, a comparison of the state-of-the-art literature is provided from the aspects of caching topics and deep learning methods. Finally, we discuss research challenges and future directions of applying deep learning for caching in edge networks.
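
    Since the abstract stays at survey level, the following is a minimal, hypothetical sketch of the proactive-caching loop it describes: a per-content popularity predictor drives which items are prefetched to the edge cache at the start of each time slot. Plain exponential smoothing stands in here for the deep learning predictors the paper reviews, and the class name `ProactiveEdgeCache` and all parameters are illustrative assumptions.

```python
from collections import defaultdict

class ProactiveEdgeCache:
    """Toy proactive cache: predicts per-content popularity with exponential
    smoothing (a stand-in for learned popularity models) and prefetches the
    top-k items at the start of each time slot."""

    def __init__(self, capacity: int, alpha: float = 0.3):
        self.capacity = capacity          # number of items the edge cache holds
        self.alpha = alpha                # smoothing factor of the predictor
        self.score = defaultdict(float)   # predicted popularity per content id
        self.cached = set()

    def observe_slot(self, requests: list[str]) -> float:
        """Serve one slot of requests, then update predictions and re-cache."""
        hits = sum(1 for c in requests if c in self.cached)
        counts = defaultdict(int)
        for c in requests:
            counts[c] += 1
        for c in set(self.score) | set(counts):
            self.score[c] = (1 - self.alpha) * self.score[c] + self.alpha * counts[c]
        # Proactively cache the items predicted to be most popular next slot.
        top = sorted(self.score, key=self.score.get, reverse=True)[: self.capacity]
        self.cached = set(top)
        return hits / max(1, len(requests))

# Example: popularity drifts from content "a" toward content "c" over slots.
cache = ProactiveEdgeCache(capacity=2)
slots = [["a"] * 6 + ["b"] * 3 + ["c"],
         ["a"] * 3 + ["b"] * 3 + ["c"] * 4,
         ["c"] * 6 + ["b"] * 2 + ["a"] * 2]
for t, reqs in enumerate(slots):
    print(f"slot {t}: hit rate = {cache.observe_slot(reqs):.2f}")
```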