
    Energy efficient hybrid satellite terrestrial 5G networks with software defined features

    In order to improve the manageability and adaptability of future 5G wireless networks, the software orchestration mechanism named software-defined networking (SDN), with control- and user-plane (C/U-plane) decoupling, has become one of the most promising key techniques. Based on these features, the hybrid satellite terrestrial network is expected to support flexible and customized resource scheduling for both massive machine-type communication (MTC) and high-quality multimedia requests, while achieving broader global coverage, larger capacity and lower power consumption. In this paper, an end-to-end hybrid satellite terrestrial network is proposed and its performance metrics, e.g., coverage probability, spectral efficiency (SE) and energy efficiency (EE), are analysed in both sparse and ultra-dense networks. The fundamental relationship between SE and EE is investigated, considering the overhead costs, the fronthaul of the gateway (GW), the density of small cells (SCs) and multiple quality-of-service (QoS) requirements. Numerical results show that, compared with current LTE networks, the hybrid system with C/U split can achieve approximately 40% and 80% EE improvement in sparse and ultra-dense networks respectively, and greatly enhance the coverage. Various resource management schemes, bandwidth allocation methods and on-off approaches are compared, and applications of the satellite in future 5G networks with software defined features are proposed.
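    The coverage/SE/EE metrics discussed in this abstract can be illustrated with a minimal Monte Carlo sketch. This is not the paper's model: the window size, power figures, path-loss exponent and nearest-cell association below are all hypothetical placeholders, and EE is reduced to per-cell throughput over total cell power.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mean_se_and_ee(n_cells, p_tx_w=1.0, p_static_w=10.0, bw_hz=20e6,
                       noise_w=1e-13, alpha=3.5, n_trials=2000):
        """Monte Carlo estimate of mean spectral efficiency (bit/s/Hz) and a
        simple per-cell energy efficiency (bit/J) for a user at the centre of
        a 1 km x 1 km window, served by the nearest small cell while the
        remaining cells interfere (Rayleigh fading, path-loss exponent alpha).
        All parameter defaults are illustrative, not taken from the paper."""
        se = np.empty(n_trials)
        for i in range(n_trials):
            pos = rng.uniform(-500.0, 500.0, size=(n_cells, 2))   # metres
            d = np.hypot(pos[:, 0], pos[:, 1]) + 1.0              # avoid d = 0
            fade = rng.exponential(size=n_cells)                  # Rayleigh power fading
            rx = p_tx_w * fade * d ** (-alpha)
            s = np.argmin(d)                                      # nearest-cell association
            sinr = rx[s] / (rx.sum() - rx[s] + noise_w)
            se[i] = np.log2(1.0 + sinr)
        mean_se = se.mean()
        # crude EE: link throughput over transmit plus static circuit power
        ee = bw_hz * mean_se / (p_tx_w + p_static_w)
        return mean_se, ee

    se_sparse, ee_sparse = mean_se_and_ee(n_cells=4)
    se_dense, ee_dense = mean_se_and_ee(n_cells=40)
    print(f"sparse: SE = {se_sparse:.2f} bit/s/Hz, EE = {ee_sparse:.3e} bit/J")
    print(f"dense:  SE = {se_dense:.2f} bit/s/Hz, EE = {ee_dense:.3e} bit/J")
    ```

    Running the two densities side by side shows the interference-limited regime the abstract contrasts with the sparse one; the paper's C/U-split and on-off mechanisms would enter this sketch as changes to the power and association terms.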

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential in terms of supporting a broad range of complex and compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. (46 pages, 22 figures)

    Approximations of the aggregated interference statistics for outage analysis in massive MTC

    This paper presents several analytic closed-form approximations of the aggregated interference statistics within the framework of uplink massive machine-type communications (mMTC), taking into account the random activity of the sensors. Given its discrete nature and the large number of devices involved, a continuous approximation based on the Gram–Charlier series expansion of a truncated Gaussian kernel is proposed. We use this approximation to derive an analytic closed-form expression for the outage probability, corresponding to the event of the signal-to-interference-and-noise ratio (SINR) falling below a detection threshold. This metric is useful for evaluating the performance of mMTC systems. As an illustrative application of the approximation, we analyse a scenario with several multi-antenna collector nodes, each equipped with a set of predefined spatial beams. We consider two setups, namely single- and multiple-resource, in reference to the number of resources allocated to each beam. A graph-based approach that minimizes the average outage probability, based on the statistics approximation, is used as the allocation strategy. Finally, we describe an access protocol in which the resource identifiers are broadcast (distributed) through the beams. Numerical simulations confirm the accuracy of the approximations and the benefits of the allocation strategy.
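    The flavour of this approximation can be sketched with a generic Gram–Charlier A-series (not the paper's truncated-kernel variant): match the first four cumulants of an aggregated interference term with Bernoulli-active, exponential-power interferers, then evaluate the outage probability P(SINR < θ) for a fixed received signal power. The interferer count, activity probability, signal power and threshold below are all hypothetical.

    ```python
    import numpy as np
    from math import erf, exp, pi, sqrt

    def gc_cdf(x, mu, var, skew, ex_kurt):
        """Gram-Charlier A-series CDF built from mean, variance, skewness and
        excess kurtosis, using probabilists' Hermite polynomials He2, He3."""
        z = (x - mu) / sqrt(var)
        phi = exp(-0.5 * z * z) / sqrt(2.0 * pi)
        Phi = 0.5 * (1.0 + erf(z / sqrt(2.0)))
        he2 = z * z - 1.0
        he3 = z ** 3 - 3.0 * z
        return Phi - phi * (skew / 6.0 * he2 + ex_kurt / 24.0 * he3)

    # K interferers, each active with probability p; active power ~ Exp(mean m).
    K, p, m = 50, 0.1, 1.0
    # Raw moments of one Bernoulli(p) * Exp(m) term: E[G^n] = n! m^n.
    mu1, mu2, mu3, mu4 = p * m, p * 2 * m**2, p * 6 * m**3, p * 24 * m**4
    k1 = mu1
    k2 = mu2 - mu1**2
    k3 = mu3 - 3 * mu1 * mu2 + 2 * mu1**3
    k4 = mu4 - 4 * mu1 * mu3 - 3 * mu2**2 + 12 * mu1**2 * mu2 - 6 * mu1**4
    # Cumulants add over the K independent interferers.
    mu, var = K * k1, K * k2
    skew = K * k3 / (K * k2) ** 1.5
    ex_kurt = K * k4 / (K * k2) ** 2

    S, N, theta = 20.0, 0.1, 2.0      # fixed signal power, noise, SINR threshold
    x = S / theta - N                 # outage iff aggregated interference I > x
    p_out_gc = 1.0 - gc_cdf(x, mu, var, skew, ex_kurt)

    # Monte Carlo reference for the same interference model
    rng = np.random.default_rng(1)
    trials = 100_000
    I = (rng.random((trials, K)) < p) * rng.exponential(m, size=(trials, K))
    p_out_mc = float(np.mean(I.sum(axis=1) > x))
    print(f"Gram-Charlier outage: {p_out_gc:.4f}, Monte Carlo: {p_out_mc:.4f}")
    ```

    With these placeholder numbers the two estimates agree to within a couple of percentage points, which is the kind of accuracy the closed-form expression trades for avoiding simulation; the paper's beam-resource allocation then only needs the cheap analytic CDF.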