
    Optimizing Resource Allocation with Energy Efficiency and Backhaul Challenges

    Future wireless mobile communication aims to increase data rates, coverage and reliability while reducing energy consumption and latency, and must also cope with explosive mobile traffic growth that places high demands on the backhaul for massive content delivery. Developing green communication and reducing backhaul requirements have therefore become two significant trends. One promising technique for green communication is wireless power transfer (WPT), which enables energy-efficient architectures such as simultaneous wireless information and power transfer (SWIPT). Edge caching, on the other hand, brings content closer to users by storing popular content in caches installed at the network edge, reducing peak-time traffic, backhaul cost and latency. In this thesis, we focus on resource allocation for emerging network architectures, namely SWIPT-enabled multiple-antenna systems and cache-enabled cellular systems, to tackle the challenges of limited resources such as insufficient energy supply and backhaul capacity. We start with the joint design of beamforming and power transfer ratios for SWIPT in MISO broadcast channels and MIMO relay systems, respectively, aiming to maximize energy efficiency subject to both Quality of Service (QoS) constraints and energy harvesting constraints. We then move to content placement optimization for cache-enabled heterogeneous small cell networks so as to minimize the backhaul requirements. In particular, we enable multicast content delivery and cooperative content sharing using maximum distance separable (MDS) codes to provide further caching gains. Both analysis and simulation results are provided throughout the thesis to demonstrate the benefits of the proposed algorithms over state-of-the-art methods.
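    The MDS-coded placement idea can be illustrated with a small numerical sketch. The model below is an assumed, simplified version (Zipf popularity, a Poisson number of covering small cells, greedy allocation of coded cache fractions); it is not the thesis's actual formulation or algorithm, but it shows how coded placement trades cache space for backhaul load.

```python
# Hypothetical sketch of MDS-coded content placement for cache-enabled small cells.
# Assumptions (not from the thesis): Zipf popularity, Poisson number of covering
# SBSs, and greedy allocation of coded cache fractions in small increments.
import numpy as np
from scipy.stats import poisson

F = 50            # library size (assumed)
C = 10.0          # cache budget per SBS, in units of files (assumed)
zipf_s = 0.8      # Zipf exponent (assumed)
mean_cover = 2.5  # mean number of SBSs covering a user (assumed)

pop = np.arange(1, F + 1) ** (-zipf_s)
pop /= pop.sum()                         # request probabilities p_f
m_vals = np.arange(0, 15)
m_pmf = poisson.pmf(m_vals, mean_cover)  # P(user covered by m SBSs)

def backhaul_frac(x_f):
    """Expected fraction of a file fetched over the backhaul when each covering
    SBS stores an MDS-coded fraction x_f: E_m[max(0, 1 - m * x_f)]."""
    return float(np.sum(m_pmf * np.maximum(0.0, 1.0 - m_vals * x_f)))

# Greedy placement: repeatedly give a small cache increment to the file
# whose expected backhaul load drops the most.
step = 0.01
x = np.zeros(F)
budget = C
while budget > step / 2:
    gains = np.array([
        pop[f] * (backhaul_frac(x[f]) - backhaul_frac(min(1.0, x[f] + step)))
        for f in range(F)
    ])
    best = int(np.argmax(gains))
    if gains[best] <= 0:
        break
    x[best] = min(1.0, x[best] + step)
    budget -= step

load = sum(pop[f] * backhaul_frac(x[f]) for f in range(F))
print(f"expected backhaul load per request: {load:.3f} of a file")
```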

    A review on green caching strategies for next generation communication networks

    © 2020 IEEE. In recent years, the ever-increasing demand for networking resources and energy, fueled by the unprecedented upsurge in Internet traffic, has been a cause for concern for many service providers. Content caching, which serves user requests locally, is deemed to be an enabling technology for addressing the challenges posed by the phenomenal growth in Internet traffic. Conventionally, content caching is considered a viable solution for alleviating backhaul pressure. Recently, however, many studies have reported energy cost reductions attributable to content caching in cache-equipped networks. The hypothesis is that caching shortens the content delivery distance and thereby achieves a significant reduction in transmission energy consumption. This has motivated us to conduct this study, and in this article a comprehensive survey of state-of-the-art green caching techniques is provided. This review extensively discusses the contributions of existing studies on green caching. In addition, the study explores different cache-equipped network types, solution methods, and application scenarios. We show that optimal selection of caching nodes, smart resource management, popular content selection, and renewable energy integration can substantially improve the energy efficiency of cache-equipped systems. Finally, based on the comprehensive analysis, we highlight some potential research ideas relevant to green content caching.
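    The hypothesis that caching shortens the delivery distance and cuts transmission energy can be made concrete with a back-of-the-envelope sketch. All numbers below (library size, cache size, Zipf exponent, relative energy per delivered bit at the edge versus over the core/backhaul) are assumptions for illustration, not figures from the survey.

```python
# Minimal back-of-the-envelope model (assumptions, not from the survey):
# cache the N most popular files under Zipf demand and compare the
# transmission energy of edge delivery vs. fetching over the backhaul.
import numpy as np

F, N = 10_000, 500            # library size and cache size (assumed)
zipf_s = 0.9                  # Zipf exponent (assumed)
e_edge, e_core = 0.2, 1.0     # relative energy per delivered bit (assumed)

pop = np.arange(1, F + 1) ** (-zipf_s)
pop /= pop.sum()
hit_ratio = pop[:N].sum()     # most-popular placement

energy_no_cache = e_core
energy_with_cache = hit_ratio * e_edge + (1 - hit_ratio) * e_core
print(f"hit ratio: {hit_ratio:.2%}, "
      f"energy saving: {1 - energy_with_cache / energy_no_cache:.1%}")
```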

    Cooperative Local Caching under Heterogeneous File Preferences

    Local caching is an effective scheme that leverages the memory of mobile terminals (MTs) and short-range communications to save bandwidth and reduce download delay in cellular communication systems. Specifically, MTs first cache files in their local memories during off-peak hours and then exchange requested files with each other in their vicinity during peak hours. However, prior works largely overlook MTs' heterogeneity in file preferences and their selfish behaviours. In this paper, we categorize the MTs into different interest groups according to their file preferences. Each group of MTs aims to increase the probability of successfully discovering a requested file at neighbouring MTs (from the same or different groups). Hence, we define each group's utility as the probability of successfully discovering the file at neighbouring MTs, which is to be maximized by deciding the caching strategies of the different groups. By modelling MTs' mobility as homogeneous Poisson point processes (HPPPs), we analytically characterize the MTs' utilities in closed form. We first consider the fully cooperative case, where a centralizer helps all groups make caching decisions. We formulate the problem as a weighted-sum utility maximization problem, through which the maximum utility trade-offs of the different groups are characterized. Next, we study two benchmark cases under selfish caching, namely partial cooperation and no cooperation, with and without inter-group file sharing, respectively. The optimal caching distributions for these two cases are derived. Finally, numerical examples are presented to compare the utilities under the different cases and show the effectiveness of fully cooperative local caching compared with the two benchmark cases.
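    A minimal sketch of the kind of utility being maximized is given below. It assumes a simplified model: each MT caches a single file drawn from a common distribution q, the number of neighbouring helpers carrying file f is Poisson with mean mu * q_f (the void probability of an HPPP), and two interest groups with randomly generated preferences are combined in a weighted sum. This is an illustration of the weighted-sum utility maximization idea, not the paper's closed-form solution.

```python
# Illustrative sketch (assumed model, not the paper's exact derivation):
# neighbours of an MT form a Poisson field, so the number of helpers holding
# file f is Poisson with mean mu * q_f when each MT caches one file drawn
# from distribution q. Group utility = sum_f p_f * (1 - exp(-mu * q_f)).
import numpy as np
from scipy.optimize import minimize

F = 20                     # file library size (assumed)
mu = 3.0                   # mean number of neighbouring MTs in range (assumed)
w = np.array([0.6, 0.4])   # weights of the two interest groups (assumed)

rng = np.random.default_rng(0)
prefs = rng.dirichlet(np.ones(F), size=2)   # heterogeneous file preferences

def neg_weighted_utility(q):
    succ = 1.0 - np.exp(-mu * q)            # P(file found at >= 1 neighbour)
    utils = prefs @ succ                    # per-group discovery probability
    return -float(w @ utils)

res = minimize(
    neg_weighted_utility,
    x0=np.full(F, 1.0 / F),
    bounds=[(0.0, 1.0)] * F,
    constraints=[{"type": "eq", "fun": lambda q: q.sum() - 1.0}],
    method="SLSQP",
)
q_opt = res.x
print("weighted utility:", -res.fun)
print("top cached files:", np.argsort(q_opt)[::-1][:5])
```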

    COCAM: a cooperative video edge caching and multicasting approach based on multi-agent deep reinforcement learning in multi-clouds environment

    The evolution of Internet of Things (IoT) technology has driven a drastic increase in network traffic demand. Caching and multicasting in the multi-clouds scenario are effective approaches to alleviate the backhaul burden of networks and reduce service latency. However, existing works do not jointly exploit the advantages of these two approaches. In this paper, we propose COCAM, a cooperative video edge caching and multicasting approach based on multi-agent deep reinforcement learning that minimizes the number of transmissions in the multi-clouds scenario with limited storage capacity at each edge cloud. Specifically, by integrating a cooperative transmission model with the caching model, we provide a concrete formulation of the joint problem. We then cast this decision-making problem as a multi-agent extension of the Markov decision process and propose a multi-agent actor-critic algorithm in which each agent learns a local caching strategy and additionally incorporates the observations of neighboring agents into its overall state. Finally, to validate the COCAM algorithm, we conduct extensive experiments on a real-world dataset. The results show that our proposed algorithm outperforms other baseline algorithms in terms of the number of video transmissions.
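    The sketch below illustrates only the observation/reward plumbing of such a multi-agent formulation: each edge-cloud agent's state concatenates its own observation with those of its neighbours, and the per-slot reward is the negative number of transmissions when one multicast serves every edge cloud missing the same video. The topology, sizes and random requests are assumptions for illustration; the actor-critic networks of COCAM itself are omitted.

```python
# Hypothetical sketch of the observation/reward plumbing for multi-agent
# edge caching with multicast delivery; names and numbers are illustrative,
# not COCAM's actual setup.
import numpy as np

N_EDGE, N_VIDEO, CACHE = 4, 30, 5   # edge clouds, catalogue size, cache slots (assumed)
rng = np.random.default_rng(1)
# Ring topology between edge clouds (assumed)
neighbours = {i: [(i - 1) % N_EDGE, (i + 1) % N_EDGE] for i in range(N_EDGE)}

caches = [set(rng.choice(N_VIDEO, CACHE, replace=False)) for _ in range(N_EDGE)]

def local_obs(i, requests):
    """One-hot cache content plus request histogram of edge cloud i."""
    cache_vec = np.isin(np.arange(N_VIDEO), list(caches[i])).astype(float)
    req_vec = np.bincount(requests[i], minlength=N_VIDEO).astype(float)
    return np.concatenate([cache_vec, req_vec])

def agent_state(i, requests):
    """Own observation concatenated with neighbouring agents' observations."""
    parts = [local_obs(i, requests)] + [local_obs(j, requests) for j in neighbours[i]]
    return np.concatenate(parts)

def step_reward(requests):
    """Negative number of transmissions: a cache hit costs nothing, and one
    multicast from the cloud serves every edge that misses the same video."""
    missed = {v for i in range(N_EDGE) for v in requests[i] if v not in caches[i]}
    return -len(missed)

requests = [rng.integers(0, N_VIDEO, size=8) for _ in range(N_EDGE)]
print("state dim per agent:", agent_state(0, requests).shape[0])
print("reward this slot:", step_reward(requests))
```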

    When Exploiting Individual User Preference Is Beneficial for Caching at Base Stations

    Most prior works optimize caching policies based on the following assumptions: 1) every user initiates requests according to content popularity, 2) all users have the same activity level, and 3) users are uniformly located in the considered region. In practice, these assumptions are often not true. In this paper, we explore the benefit of optimizing caching policies for base stations by exploiting user preference while accounting for the spatial locality and different activity levels of users. We obtain optimal caching policies that, respectively, minimize the download delay averaged over all file requests and user locations in the network (namely, the network average delay), minimize the maximal weighted download delay averaged over the file requests and location of each user (namely, the maximal weighted user average delay), and minimize the weighted sum of both. The analysis and simulation results show that exploiting heterogeneous user preferences and activity levels can improve user fairness, and can also improve network performance when users exhibit spatial locality.
    Comment: Accepted by IEEE ICC 2018 Workshop on Information-Centric Edge Computing and Caching for Future Network
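    A toy sketch of the effect described above: one base station with a small cache, users with heterogeneous preferences and activity levels, and the network average delay under popularity-based caching versus preference-aware caching. The system model and all numbers are assumptions for illustration, not the paper's setup.

```python
# Toy comparison (assumed setup, not the paper's exact system model): cache the
# N files with the largest activity-weighted local demand vs. the N globally
# most popular files, and compare the resulting network average delay.
import numpy as np

U, F, N = 10, 100, 10                 # users, files, cache size (assumed)
d_hit, d_miss = 1.0, 5.0              # download delays, arbitrary units (assumed)
rng = np.random.default_rng(2)

prefs = rng.dirichlet(np.full(F, 0.3), size=U)   # per-user file preferences
activity = rng.exponential(size=U)
activity /= activity.sum()                       # heterogeneous activity levels

def avg_delay(cached):
    hit = prefs[:, cached].sum(axis=1)           # per-user cache hit probability
    return float(activity @ (hit * d_hit + (1 - hit) * d_miss))

local_demand = activity @ prefs                  # activity-weighted local demand
cache_pref = np.argsort(local_demand)[::-1][:N]
cache_pop = np.argsort(prefs.mean(axis=0))[::-1][:N]   # equal-activity popularity

print(f"network average delay, popularity caching: {avg_delay(cache_pop):.3f}")
print(f"network average delay, preference-aware:   {avg_delay(cache_pref):.3f}")
```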