
    Energy efficient hybrid satellite terrestrial 5G networks with software defined features

    In order to improve the manageability and adaptability of future 5G wireless networks, the software orchestration mechanism named software defined networking (SDN) with Control and User plane (C/U-plane) decoupling has become one of the most promising key techniques. Based on these features, the hybrid satellite terrestrial network is expected to support flexible and customized resource scheduling for both massive machine-type communication (MTC) and high-quality multimedia requests, while achieving broader global coverage, larger capacity and lower power consumption. In this paper, an end-to-end hybrid satellite terrestrial network is proposed and the performance metrics, e.g., coverage probability, spectral and energy efficiency (SE and EE), are analysed in both sparse networks and ultra-dense networks. The fundamental relationship between SE and EE is investigated, considering the overhead costs, fronthaul of the gateway (GW), density of small cells (SCs) and multiple quality-of-service (QoS) requirements. Numerical results show that, compared with current LTE networks, the hybrid system with C/U split can achieve approximately 40% and 80% EE improvement in sparse and ultra-dense networks, respectively, and greatly enhance the coverage. Various resource management schemes, bandwidth allocation methods and on-off approaches are compared, and applications of the satellite in future 5G networks with software defined features are proposed.
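
    As a rough illustration of the SE-EE relationship this abstract analyses, the toy Python sketch below maps area spectral efficiency to area energy efficiency under an assumed linear cell power model; every function name, parameter and number is an illustrative assumption rather than a value from the paper.

        # Toy SE-to-EE mapping under an assumed linear power model (all values illustrative).
        def area_energy_efficiency(se_bps_per_hz, bandwidth_hz, sc_density_per_km2,
                                   activity_factor=1.0, p_static_w=6.8, p_slope=4.0,
                                   p_tx_w=1.0, gw_fronthaul_w_per_km2=50.0):
            """EE in bit/Joule: area throughput divided by area power consumption."""
            area_throughput = se_bps_per_hz * bandwidth_hz * sc_density_per_km2      # bit/s per km^2
            per_sc_power = activity_factor * (p_static_w + p_slope * p_tx_w)          # W per small cell
            area_power = gw_fronthaul_w_per_km2 + per_sc_power * sc_density_per_km2   # W per km^2
            return area_throughput / area_power

        # With a C/U split, lightly loaded small cells can sleep (lower activity factor),
        # which raises EE for the same SE, especially in dense deployments.
        always_on = area_energy_efficiency(2.0, 20e6, 50, activity_factor=1.0)
        with_split = area_energy_efficiency(2.0, 20e6, 50, activity_factor=0.4)
        print(f"illustrative EE gain: {with_split / always_on:.2f}x")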

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks need to soon migrate towards ultra-dense networks. However, network densification comes with a host of challenges that include compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks, i.e., tight coupling between the control and data planes regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review the various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges in network densification, namely energy efficiency, system-level capacity maximization, interference management and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted in legacy networks at wide scale and thus remain a hallmark of 5G, i.e., coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey of SARC, CoMP and D2D. Most importantly, it provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies. Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201
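
    As a hedged illustration of the control/data plane separation this survey builds on (not the authors' own algorithm), the short Python sketch below anchors each UE's control plane at an always-on macro cell and its data plane at whichever cell offers the strongest link; the cell names and RSRP values are made up.

        # Toy C/D-plane association rule: signalling stays on the macro layer,
        # user data rides the best available cell. All inputs are illustrative.
        def associate(macro_rsrp_dbm, small_cell_rsrp_dbm):
            """Return (control_anchor, data_anchor) for one UE under a best-signal rule."""
            control_anchor = max(macro_rsrp_dbm, key=macro_rsrp_dbm.get)
            candidates = {**macro_rsrp_dbm, **small_cell_rsrp_dbm}
            data_anchor = max(candidates, key=candidates.get)
            return control_anchor, data_anchor

        # A small cell that attracts no data users under this rule could be put to sleep.
        print(associate({"Macro1": -95.0}, {"SC1": -80.0, "SC2": -102.0}))  # ('Macro1', 'SC1')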

    Leveraging intelligence from network CDR data for interference aware energy consumption minimization

    Cell densification is being perceived as the panacea for the imminent capacity crunch. However, high aggregated energy consumption and increased inter-cell interference (ICI) caused by densification remain two long-standing problems. We propose a novel network orchestration solution for simultaneously minimizing energy consumption and ICI in ultra-dense 5G networks. The proposed solution builds on a big data analysis of over 10 million CDRs from a real network, which shows that there exists strong spatio-temporal predictability in real network traffic patterns. Leveraging this, we develop a novel scheme to pro-actively schedule radio resources and small cell sleep cycles, yielding substantial energy savings and reduced ICI without compromising the users' QoS. This scheme is derived by formulating a joint energy consumption and ICI minimization problem and solving it through a combination of linear binary integer programming and a progressive-analysis-based heuristic algorithm. Evaluations using 1) a HetNet deployment designed for Milan, where big data analytics on real CDR data from the Telecom Italia network are used to model traffic patterns, and 2) NS-3-based Monte Carlo simulations with synthetic Poisson traffic show that, compared with a full-frequency-reuse, always-on approach, the proposed scheme can in the best case reduce energy consumption in HetNets to one eighth while providing the same or better QoS.
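
    To make the sleep-scheduling idea concrete, the sketch below is a greedy, assumption-laden stand-in for the paper's binary-integer-programming plus heuristic formulation: given CDR-derived load predictions, it switches off the most lightly loaded small cells as long as a macro cell can absorb their traffic. Cell names, loads and capacities are invented for illustration.

        # Greedy traffic-aware sleep scheduling (illustrative stand-in, not the paper's algorithm).
        def schedule_sleep(predicted_load_mbps, macro_spare_capacity_mbps):
            """Put the least-loaded small cells to sleep while the macro can absorb
            their predicted traffic; each switched-off cell also removes an ICI source."""
            active, sleeping = set(predicted_load_mbps), []
            for cell in sorted(predicted_load_mbps, key=predicted_load_mbps.get):
                load = predicted_load_mbps[cell]
                if load <= macro_spare_capacity_mbps:
                    macro_spare_capacity_mbps -= load
                    active.discard(cell)
                    sleeping.append(cell)
            return active, sleeping

        # Example: predicted per-cell demand (Mbps) for the next scheduling window.
        prediction = {"SC1": 5.0, "SC2": 42.0, "SC3": 1.5, "SC4": 18.0}
        active, sleeping = schedule_sleep(prediction, macro_spare_capacity_mbps=25.0)
        print("sleep:", sleeping, "keep on:", sorted(active))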

    Achieving Ultra-Low Latency in 5G Millimeter Wave Cellular Networks

    The IMT 2020 requirements of a 20 Gbps peak data rate and 1 millisecond latency present significant engineering challenges for the design of 5G cellular systems. Use of the millimeter wave (mmWave) bands above 10 GHz, where vast quantities of spectrum are available, is a promising 5G candidate that may be able to rise to the occasion. However, while the mmWave bands can support massive peak data rates, delivering these data rates for end-to-end services while maintaining reliability and ultra-low latency will require rethinking all layers of the protocol stack. This paper surveys some of the challenges and possible solutions for delivering end-to-end, reliable, ultra-low latency services in mmWave cellular systems in terms of the Medium Access Control (MAC) layer, congestion control and core network architecture.
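
    A quick back-of-the-envelope check (assumptions only, not taken from the paper) shows why the latency challenge is not about raw transmission speed: at mmWave peak rates the serialization delay of a typical packet is negligible, so almost the entire 1 ms budget is left to scheduling, retransmissions and the core network that the survey discusses.

        # Serialization delay of one packet at the IMT-2020 peak rate (illustrative).
        packet_bits = 1500 * 8           # a typical IP MTU, in bits
        peak_rate_bps = 20e9             # IMT-2020 peak data rate target
        latency_budget_s = 1e-3          # IMT-2020 latency target

        serialization_s = packet_bits / peak_rate_bps
        print(f"serialization delay: {serialization_s * 1e6:.2f} us "
              f"({100 * serialization_s / latency_budget_s:.3f}% of the 1 ms budget)")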