
    Leveraging intelligence from network CDR data for interference aware energy consumption minimization

    Cell densification is perceived as the panacea for the imminent capacity crunch. However, the high aggregated energy consumption and increased inter-cell interference (ICI) caused by densification remain two long-standing problems. We propose a novel network orchestration solution for simultaneously minimizing energy consumption and ICI in ultra-dense 5G networks. The proposed solution builds on a big data analysis of over 10 million CDRs from a real network, which shows that real network traffic patterns exhibit strong spatio-temporal predictability. Leveraging this, we develop a novel scheme to pro-actively schedule radio resources and small cell sleep cycles, yielding substantial energy savings and reduced ICI without compromising users' QoS. The scheme is derived by formulating a joint energy consumption and ICI minimization problem and solving it through a combination of linear binary integer programming and a progressive-analysis-based heuristic algorithm. Evaluations using 1) a HetNet deployment designed for the city of Milan, where big data analytics are applied to real CDR data from the Telecom Italia network to model traffic patterns, and 2) NS-3-based Monte Carlo simulations with synthetic Poisson traffic show that, compared to a full-frequency-reuse and always-on approach, in the best case the proposed scheme can reduce energy consumption in HetNets to one eighth while providing the same or better QoS.
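
    The joint minimization the abstract describes lends itself to a compact binary integer program. Below is a minimal, hypothetical sketch of one scheduling slot using Python and PuLP; the cell names, power figures, capacities, demand value, and the ICI linearization are illustrative assumptions, not the paper's actual formulation.

```python
# Sketch: joint energy/ICI minimization for one slot as a binary integer
# program (PuLP). All numbers below are invented for illustration.
import pulp

cells = ["macro", "sc1", "sc2", "sc3"]          # one macro + small cells
neighbors = [("sc1", "sc2"), ("sc2", "sc3")]    # interfering cell pairs
power = {"macro": 130.0, "sc1": 6.8, "sc2": 6.8, "sc3": 6.8}      # watts
capacity = {"macro": 100.0, "sc1": 30.0, "sc2": 30.0, "sc3": 30.0}
demand = 120.0      # predicted aggregate traffic for this slot (from CDRs)
ici_weight = 10.0   # trade-off between energy and interference terms

prob = pulp.LpProblem("energy_ici_min", pulp.LpMinimize)

# x[c] = 1 if cell c stays on in this slot, 0 if it sleeps.
x = {c: pulp.LpVariable(f"on_{c}", cat="Binary") for c in cells}
# y[i,j] linearizes the product x[i]*x[j] (both neighbors on => ICI).
y = {p: pulp.LpVariable(f"ici_{p[0]}_{p[1]}", cat="Binary") for p in neighbors}

# Objective: total power of active cells plus a penalty for interfering pairs.
prob += (pulp.lpSum(power[c] * x[c] for c in cells)
         + ici_weight * pulp.lpSum(y[p] for p in neighbors))

# QoS constraint: active cells must jointly cover the predicted demand.
prob += pulp.lpSum(capacity[c] * x[c] for c in cells) >= demand

# The macro cell stays on for blanket coverage.
prob += x["macro"] == 1

# Linearization: y >= x_i + x_j - 1 forces y = 1 when both cells are on.
for i, j in neighbors:
    prob += y[(i, j)] >= x[i] + x[j] - 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({c: int(x[c].value()) for c in cells})
```

    Running the sketch keeps the macro on and wakes a single small cell, which is enough to cover the assumed demand while avoiding the interference penalty; the paper's heuristic would repeat such a decision pro-actively per slot using the predicted traffic.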

    Energy efficient hybrid satellite terrestrial 5G networks with software defined features

    To improve the manageability and adaptability of future 5G wireless networks, the software orchestration mechanism known as software-defined networking (SDN) with control and user plane (C/U-plane) decoupling has become one of the most promising key techniques. Based on these features, the hybrid satellite-terrestrial network is expected to support flexible and customized resource scheduling for both massive machine-type communication (MTC) and high-quality multimedia requests, while achieving broader global coverage, larger capacity and lower power consumption. In this paper, an end-to-end hybrid satellite-terrestrial network is proposed and its performance metrics, e.g., coverage probability and spectral and energy efficiency (SE and EE), are analysed in both sparse and ultra-dense networks. The fundamental relationship between SE and EE is investigated, considering overhead costs, the fronthaul of the gateway (GW), the density of small cells (SCs) and multiple quality-of-service (QoS) requirements. Numerical results show that, compared with current LTE networks, the hybrid system with C/U split can achieve approximately 40% and 80% EE improvements in sparse and ultra-dense networks, respectively, and greatly enhance coverage. Various resource management schemes, bandwidth allocation methods and on-off approaches are compared, and applications of the satellite in future 5G networks with software-defined features are proposed.
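
    The fundamental SE-EE relationship the abstract investigates can be illustrated with a standard textbook link model, a sketch under assumptions of my own (Shannon-rate transmit power plus a fixed circuit/overhead term), not the paper's exact formulation: EE first rises with SE while the overhead dominates, then falls once transmit power does.

```python
# Sketch of the classic SE-EE trade-off; all parameter values are
# illustrative assumptions, not taken from the paper.
import numpy as np

def energy_efficiency(se, bandwidth_hz, noise_w, p_overhead_w):
    # Shannon: se = log2(1 + p_tx/noise)  =>  p_tx = (2**se - 1) * noise
    p_tx = (2.0 ** se - 1.0) * noise_w
    # EE (bit/J) = throughput / total consumed power
    return se * bandwidth_hz / (p_tx + p_overhead_w)

se = np.linspace(0.1, 12.0, 240)                 # bit/s/Hz
ee = energy_efficiency(se, 20e6, 1e-3, 5.0)      # 20 MHz, 5 W overhead
print(f"EE peaks at SE = {se[np.argmax(ee)]:.1f} bit/s/Hz")
```

    With a nonzero overhead term the EE curve is unimodal in SE, which is why schemes like C/U splitting and SC on-off control, by cutting the overhead, shift the whole curve upward rather than just its peak.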

    Integration of heterogeneous devices and communication models via the cloud in the constrained internet of things

    As the Internet of Things continues to expand in the coming years, the need for services that span multiple IoT application domains will continue to increase in order to realize the efficiency gains promised by the IoT. Today, however, service developers looking to add value on top of existing IoT systems are faced with very heterogeneous devices and systems. These systems implement a wide variety of network connectivity options, protocols (proprietary or standards-based) and communication methods, all of which are unknown to a service developer who is new to the IoT. Even within one IoT standard, a device typically has multiple options for communicating with others. To relieve service developers of these concerns, this paper presents a cloud-based platform for integrating heterogeneous constrained IoT devices and communication models into services. Our evaluation shows that the impact of our approach on the operation of constrained devices is minimal, while providing a tangible benefit in service integration of low-resource IoT devices. A proof of concept demonstrates the latter by means of a control and management dashboard for constrained devices, implemented on top of the presented platform. The results of our work enable service developers to more easily implement and deploy services that span a wide variety of IoT application domains.
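
    To make the integration layer concrete, here is a minimal, hypothetical Python sketch of the adapter pattern such a cloud platform could use to hide protocol heterogeneity from service developers; every class and method name is invented for illustration, and the actual CoAP/MQTT transports are stubbed out.

```python
# Sketch: per-protocol adapters behind one uniform device interface.
from abc import ABC, abstractmethod

class DeviceAdapter(ABC):
    """Uniform facade the platform exposes to service developers."""
    @abstractmethod
    def read(self, resource: str) -> bytes: ...
    @abstractmethod
    def write(self, resource: str, payload: bytes) -> None: ...

class CoapAdapter(DeviceAdapter):
    def __init__(self, host: str):
        self.host = host
    def read(self, resource):
        # Would issue a CoAP GET to the device (e.g., via aiocoap); stubbed.
        return b"23.5"
    def write(self, resource, payload):
        pass  # would issue a CoAP PUT

class MqttAdapter(DeviceAdapter):
    def __init__(self, broker: str, device_id: str):
        self.broker, self.device_id = broker, device_id
    def read(self, resource):
        # Would subscribe to the device's topic and await one message; stubbed.
        return b"23.5"
    def write(self, resource, payload):
        pass  # would publish to the device's command topic

# A dashboard iterates over devices without caring about the transport:
devices = [CoapAdapter("sensor-1.local"), MqttAdapter("broker.local", "sensor-2")]
readings = {type(d).__name__: d.read("temperature") for d in devices}
print(readings)
```

    A control and management dashboard like the paper's proof of concept would then be written once against the uniform interface, while new device protocols only require a new adapter.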

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks.
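
    As a small taste of the reinforcement-learning family the survey covers, below is an illustrative tabular Q-learning agent choosing among channels in a toy cognitive-radio setting; the scenario and all numbers are invented for illustration and are not taken from the article.

```python
# Sketch: stateless Q-learning for dynamic channel selection.
import random

n_channels = 4
busy_prob = [0.8, 0.5, 0.3, 0.6]        # hidden primary-user activity
q = [0.0] * n_channels                   # single state, so Q is per action
alpha, epsilon, episodes = 0.1, 0.1, 5000

for _ in range(episodes):
    # epsilon-greedy channel selection
    if random.random() < epsilon:
        a = random.randrange(n_channels)
    else:
        a = max(range(n_channels), key=q.__getitem__)
    # reward 1 if the chosen channel happens to be idle, else 0
    reward = 0.0 if random.random() < busy_prob[a] else 1.0
    q[a] += alpha * (reward - q[a])      # incremental Q-value update

print(max(range(n_channels), key=q.__getitem__))  # converges to channel 2
```

    The agent learns the least-occupied channel purely from interaction, which is the same trial-and-error principle that scales, via deep reinforcement learning, to the HetNet and CR applications the article surveys.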