
    Green OFDMA Resource Allocation in Cache-Enabled CRAN

    Cloud radio access network (CRAN), in which remote radio heads (RRHs) are deployed to serve users in a target area and are connected to a central processor (CP) via limited-capacity links termed the fronthaul, is a promising candidate for next-generation wireless communication systems. Due to the content-centric nature of future wireless communications, it is desirable to cache popular contents beforehand at the RRHs, both to reduce the burden on the fronthaul and to achieve energy savings through cooperative transmission. This motivates our study in this paper of energy-efficient transmission in an orthogonal frequency division multiple access (OFDMA)-based CRAN with multiple RRHs and users, where the RRHs can prefetch popular contents. We consider a joint optimization of the user-subcarrier (SC) assignment, RRH selection, and transmit power allocation over all the SCs to minimize the total transmit power of the RRHs, subject to the RRHs' individual fronthaul capacity constraints and the users' minimum rate constraints, while taking into account the caching status at the RRHs. Although the problem is non-convex, we propose a Lagrange duality based solution, which can be computed efficiently with good accuracy. By simulations, we compare the minimum transmit power required by the proposed algorithm under different caching strategies against the case without caching, and the results show the significant energy saving achieved with caching.
    Comment: Presented at the IEEE Online Conference on Green Communications (Online GreenComm), Nov. 2016 (Invited Paper)
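    For fixed dual variables, a Lagrange duality approach of the kind described above decomposes into independent per-subcarrier subproblems. The sketch below illustrates that structure on a deliberately stripped-down version of the problem (a single transmitter; no RRH selection, fronthaul, or caching constraints): each user's rate constraint is priced by a dual variable, every SC is assigned to the user with the smallest per-SC Lagrangian cost under a water-filling power level, and the duals are updated by a subgradient step. All dimensions, channel gains, and step sizes are hypothetical, and this is only an illustration of the general dual-decomposition idea, not the paper's algorithm.

```python
import numpy as np

# Hypothetical dimensions and channel conditions (not from the paper).
rng = np.random.default_rng(0)
K, N = 4, 32                       # users, subcarriers (SCs)
g = rng.exponential(1.0, (K, N))   # effective channel-gain-to-noise ratios
R_min = np.full(K, 8.0)            # per-user minimum rates (bits/OFDM symbol)

lam = np.ones(K)                   # dual variables pricing the rate constraints
step = 0.05                        # subgradient step size

for _ in range(500):
    # With the duals fixed, the Lagrangian separates across SCs:
    # each user's best power on an SC is a water-filling level, and the SC
    # goes to the user with the smallest per-SC Lagrangian cost p - lam * r.
    p_opt = np.maximum(lam[:, None] / np.log(2) - 1.0 / g, 0.0)   # K x N
    r_opt = np.log2(1.0 + g * p_opt)                               # K x N
    cost = p_opt - lam[:, None] * r_opt                            # K x N
    assign = np.argmin(cost, axis=0)                               # best user per SC

    rates = np.zeros(K)
    total_power = 0.0
    for n in range(N):
        k = assign[n]
        rates[k] += r_opt[k, n]
        total_power += p_opt[k, n]

    # Subgradient ascent on the dual: raise the price of unmet rate targets.
    lam = np.maximum(lam + step * (R_min - rates), 0.0)

print("per-user rates:", np.round(rates, 2), " total power:", round(total_power, 2))
```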

    Echo State Networks for Proactive Caching in Cloud-Based Radio Access Networks with Mobile Users

    In this paper, the problem of proactive caching is studied for cloud radio access networks (CRANs). In the studied model, the baseband units (BBUs) can predict the content request distribution and mobility pattern of each user and determine which content to cache at the remote radio heads and the BBUs. This problem is formulated as an optimization problem that jointly incorporates the backhaul and fronthaul loads and content caching. To solve this problem, an algorithm that combines the machine learning framework of echo state networks (ESNs) with sublinear algorithms is proposed. Using ESNs, the BBUs can predict each user's content request distribution and mobility pattern while having only limited information on the network's and the users' states. In order to predict each user's periodic mobility pattern with minimal complexity, the memory capacity of the corresponding ESN is derived for a periodic input. This memory capacity is shown to be able to record the maximum amount of user information for the proposed ESN model. Then, a sublinear algorithm is proposed to determine which content to cache while using a limited number of content request distribution samples. Simulation results using real data from Youku and the Beijing University of Posts and Telecommunications show that the proposed approach yields significant gains in sum effective capacity, reaching up to 27.8% and 30.7% compared to random caching with clustering and random caching without clustering, respectively.
    Comment: Accepted in the IEEE Transactions on Wireless Communications
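    As a rough illustration of the ESN-based prediction ingredient, the sketch below trains a minimal echo state network (fixed random reservoir, ridge-regression readout) for one-step-ahead prediction of a synthetic periodic signal standing in for a user's daily/weekly request or mobility pattern. The reservoir size, spectral radius, regularization, and the signal itself are hypothetical choices for illustration; the paper's ESN model, memory-capacity analysis, and sublinear caching algorithm are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical periodic signal: one sample per time slot, with daily (24)
# and weekly (168) components, standing in for a user's request/mobility pattern.
T = 600
t = np.arange(T)
u = 0.5 * np.sin(2 * np.pi * t / 24) + 0.3 * np.sin(2 * np.pi * t / 168)

# Reservoir: fixed random input and recurrent weights, recurrent matrix
# rescaled to a target spectral radius below 1 (echo state property).
n_res = 200
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with the input and collect its states.
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for i in range(T):
    x = np.tanh(W_in * u[i] + W @ x)
    states[i] = x

# Ridge-regression readout trained to predict the next input sample,
# discarding an initial washout period.
washout, reg = 50, 1e-6
X = states[washout:-1]
y = u[washout + 1:]
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)

pred = states[:-1] @ W_out
mse = np.mean((pred[washout:] - u[washout + 1:]) ** 2)
print("one-step-ahead prediction MSE:", mse)
```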

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges, including compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads, and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks, namely the tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review the various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, the various SARC proposals address the four main challenges in network densification, namely energy efficiency, system-level capacity maximization, interference management, and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted in legacy networks at wide scale and thus remain a hallmark of 5G, i.e., coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves both as a tutorial and as an up-to-date survey on SARC, CoMP and D2D. Most importantly, the article provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    Edge-Caching Wireless Networks: Performance Analysis and Optimization

    Edge caching has received much attention as an efficient technique to reduce delivery latency and network congestion during peak-traffic times by bringing data closer to end users. Existing works usually design caching algorithms separately from the physical-layer design. In this paper, we analyse edge-caching wireless networks by taking the caching capability into account when designing the signal transmission. In particular, we investigate multi-layer caching, where both the base station (BS) and the users are capable of storing content data in their local caches, and analyse the performance of edge-caching wireless networks under two notable caching strategies, uncoded and coded caching. Firstly, we propose a coded caching strategy that applies to arbitrary values of the cache size. The required backhaul and access rates are derived as functions of the BS and user cache sizes. Secondly, closed-form expressions for the system energy efficiency (EE) corresponding to the two caching methods are derived. Based on the derived formulas, the system EE is maximized via the design and optimization of the precoding vectors while satisfying a predefined user request rate. Thirdly, two optimization problems are formulated to minimize the content delivery time for the two caching strategies. Finally, numerical results are presented to verify the effectiveness of the two caching methods.
    Comment: to appear in IEEE Trans. Wireless Commun.
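    For intuition on why coded caching reduces the required delivery load as the cache size grows, the sketch below evaluates the classic single-layer coded-caching delivery rate of Maddah-Ali and Niesen, extended to arbitrary cache sizes by memory sharing between its integer corner points, and compares it with uncoded caching. This uses only the textbook single-layer formula under stated assumptions (K users, N equally popular unit-size files, a user cache of M files); the paper's multi-layer BS/user scheme and its backhaul and access-rate expressions differ.

```python
import numpy as np

def uncoded_rate(K, N, M):
    """Delivery load with conventional (uncoded) caching: each of the K users
    still needs the uncached fraction (1 - M/N) of its requested file."""
    return K * (1.0 - M / N)

def coded_rate(K, N, M):
    """Maddah-Ali--Niesen coded-caching delivery load for arbitrary cache size M,
    via memory sharing between the integer corner points t = K*M/N."""
    corners_M = np.array([N * t / K for t in range(K + 1)])
    corners_R = np.array([(K - t) / (1 + t) for t in range(K + 1)])
    return np.interp(M, corners_M, corners_R)

# Hypothetical setting: 10 users, a library of 100 equally popular files.
K, N = 10, 100
for M in (0, 10, 25, 50, 100):     # user cache size in files
    print(f"M={M:3d}  uncoded load={uncoded_rate(K, N, M):5.2f}  "
          f"coded load={coded_rate(K, N, M):5.2f}")
```

    As the printout shows, both loads vanish when the whole library fits in the cache, but the coded scheme's multicast gain makes its load fall much faster at intermediate cache sizes.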