
    Multi-Service Radio Resource Management for 5G Networks


    Compressive Sensing-Based Grant-Free Massive Access for 6G Massive Communication

    The advent of the sixth-generation (6G) of wireless communications has given rise to the necessity of connecting vast quantities of heterogeneous wireless devices, which requires advanced system capabilities far beyond existing network architectures. In particular, such massive communication has been recognized as a prime driver that can empower the 6G vision of future ubiquitous connectivity, supporting the Internet of Human-Machine-Things, for which massive access is critical. This paper surveys the most recent advances toward massive access in both the academic and industry communities, focusing primarily on the promising compressive sensing-based grant-free massive access paradigm. We first specify the limitations of existing random access schemes and reveal that the practical implementation of massive communication relies on a random access paradigm dramatically different from the current ones, which were mainly designed for human-centric communications. Then, a compressive sensing-based grant-free massive access roadmap is presented, detailing the evolution from single-antenna to large-scale antenna array-based base stations, from single-station to cooperative massive multiple-input multiple-output systems, and from unsourced to sourced random access scenarios. Finally, we discuss the key challenges and open issues to shed light on potential future research directions for grant-free massive access. Comment: Accepted by the IEEE IoT Journal.
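At the heart of compressive sensing-based grant-free access is sparse activity detection: only a few of the many registered devices are active at once, so the base station can recover both the active set and their channel gains from a short, non-orthogonal pilot observation. The sketch below illustrates this with orthogonal matching pursuit (OMP) on a toy noiseless model; the dimensions, the Gaussian pilot matrix, and the channel values are all illustrative assumptions, not a scheme from the paper.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = A @ x."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # pick the pilot column most correlated with the residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # least-squares refit on the support found so far
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = x_s
    return x_hat, sorted(support)

rng = np.random.default_rng(0)
N, M, K = 100, 50, 5                       # devices, pilot length, active devices
A = rng.normal(size=(M, N)) / np.sqrt(M)   # non-orthogonal pilot (sensing) matrix
active = rng.choice(N, size=K, replace=False)
x = np.zeros(N)
x[active] = rng.uniform(1.0, 2.0, size=K)  # channel gains of active devices
y = A @ x                                  # received pilot signal (noiseless)
x_hat, detected = omp(A, y, K)
print("detected devices:", detected)
```

The point of the toy example is the dimension count: M = 50 pilot symbols suffice to identify K = 5 active devices out of N = 100, far fewer than the N orthogonal pilots a grant-based scheme would need.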

    Tactful Networking: Humans in the Communication Loop

    This survey discusses the human perspective in networking through the Tactful Networking paradigm, whose goal is to add perceptive senses to the network by assigning it human-like capabilities of observation, interpretation, and reaction to daily-life features and associated entities. To achieve this, knowledge extracted from inherent human behavior in terms of routines, personality, interactions, and more is leveraged, empowering the learning and prediction of user needs to improve QoE and system performance while respecting privacy and fostering new applications and services. Tactful Networking groups solutions from the literature with innovative interdisciplinary human aspects studied in other areas. The paradigm is motivated by mobile devices' pervasiveness and increasing presence as sensors in our daily social activities. With the human element in the foreground, it is essential: (i) to center big data analytics around individuals; (ii) to create suitable incentive mechanisms for user participation; (iii) to design and evaluate both human-aware and system-aware networking solutions; and (iv) to apply prior and innovative techniques to deal with human-behavior sensing and learning. This survey reviews the human aspect in networking solutions over more than a decade, followed by a discussion of Tactful Networking's impact through the literature on behavior analysis and representative examples. This paper also discusses a framework comprising data management, analytics, and privacy for enhancing human raw data to assist Tactful Networking solutions. Finally, challenges and opportunities for future research are presented.

    Enabling AI in Future Wireless Networks: A Data Life Cycle Perspective

    Recent years have seen rapid deployment of mobile computing and Internet of Things (IoT) networks, which can be mostly attributed to the increasing communication and sensing capabilities of wireless systems. Big data analysis, pervasive computing, and eventually artificial intelligence (AI) are envisaged to be deployed on top of the IoT and create a new world featured by data-driven AI. In this context, a novel paradigm of merging AI and wireless communications, called Wireless AI, that pushes AI frontiers to the network edge, is widely regarded as a key enabler for future intelligent network evolution. To this end, we present a comprehensive survey of the latest studies in wireless AI from the data-driven perspective. Specifically, we first propose a novel Wireless AI architecture that covers five key data-driven AI themes in wireless networks: Sensing AI, Network Device AI, Access AI, User Device AI, and Data-provenance AI. Then, for each data-driven AI theme, we present an overview of the use of AI approaches to solve the emerging data-related problems and show how AI can empower wireless network functionalities. In particular, compared to other related survey papers, we provide an in-depth discussion of Wireless AI applications in various data-driven domains wherein AI proves extremely useful for wireless network design and optimization. Finally, research challenges and future visions are also discussed to spur further research in this promising area. Comment: Accepted at the IEEE Communications Surveys & Tutorials, 42 pages.

    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    The next wave of wireless technologies is proliferating, connecting things to one another as well as to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near-real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey of existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, and non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond-5G networks, we investigate solutions and standards that enable the energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies like artificial intelligence, non-terrestrial networks, and new spectra is elaborated. Finally, future research directions toward beyond-5G IoT networks are pointed out. Comment: Submitted for review to IEEE CS&T.
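One of the scalability enablers this survey covers, non-orthogonal multiple access (NOMA), serves several users on the same resource block by superposing their signals at different power levels and separating them with successive interference cancellation (SIC). The following minimal two-user, noiseless BPSK sketch illustrates the mechanism; the power split, symbols, and channel gain are illustrative assumptions.

```python
import numpy as np

# Power-domain NOMA downlink sketch (two users, illustrative values):
# the base station superposes two BPSK symbols with unequal power; the
# near (strong) user decodes the far user's high-power symbol first,
# cancels it via SIC, then decodes its own low-power symbol.
p_far, p_near = 0.8, 0.2            # assumed power split (sums to 1)
s_far, s_near = +1, -1              # BPSK symbols intended for each user
tx = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near  # superposed signal

h_near = 1.0                        # assumed channel gain at the near user
rx = h_near * tx                    # noiseless reception for clarity

s_far_hat = int(np.sign(rx))        # step 1: decode the high-power symbol
rx_sic = rx - h_near * np.sqrt(p_far) * s_far_hat  # step 2: cancel it
s_near_hat = int(np.sign(rx_sic))   # step 3: decode own symbol
print(s_far_hat, s_near_hat)        # -> 1 -1
```

Because both symbols share one resource block, NOMA trades receiver complexity (SIC) for connection density, which is exactly the scalability axis the survey highlights for massive IoT.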

    Machine Learning Meets Communication Networks: Current Trends and Future Challenges

    The growing network density and unprecedented increase in network traffic, caused by the massively expanding number of connected devices and online services, require intelligent network operations. Machine Learning (ML) has been applied in this regard in different types of networks and networking technologies to meet the requirements of future communicating devices and services. In this article, we provide a detailed account of current research on the application of ML in communication networks and shed light on future research challenges. Research on the application of ML in communication networks is described in: i) the three layers, i.e., the physical, access, and network layers; and ii) novel computing and networking concepts such as Multi-access Edge Computing (MEC), Software Defined Networking (SDN), and Network Functions Virtualization (NFV), along with a brief overview of ML-based network security. Important future research challenges are identified and presented to help spur further research in key areas in this direction.

    Multi-TRxPs for Industrial Automation with 5G URLLC Requirements

    Fifth Generation (5G) Ultra Reliable Low Latency Communication (URLLC) is envisioned to be one of the most promising drivers for many emerging use cases, including industrial automation. In this study, a factory scenario with mobile robots connected via a 5G network with two indoor cells is analyzed. The aim is to analyze how URLLC requirements can be met with the aid of multi-Transmission Reception Points (TRxPs) in an interference-limited scenario. By means of simulations, it is shown that availability and reliability can be significantly improved by using multi-TRxPs, especially as the network becomes more loaded. In fact, optimized usage of multi-TRxPs can allow the factory to support a higher capacity while still meeting URLLC requirements. The results indicate that the number of TRxPs that simultaneously transmit to a UE, and the locations of the TRxPs around the factory, are of high importance; a poor choice could worsen interference and lower reliability. The general conclusion is that it is best to deploy many TRxPs but have the UE receive data from only one, or at most two, at a time. Additionally, the TRxPs should be distributed widely enough in the factory to properly improve the received signal, but far enough from the TRxPs of the other cell to limit the additional interference caused.
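The trade-off the study describes, where adding serving TRxPs boosts the useful signal but every non-serving TRxP contributes interference, can be seen in a back-of-the-envelope SINR calculation. The sketch below is a simplified illustration, not the study's simulator: received powers are assumed values, serving TRxPs are modeled as adding power non-coherently, and all remaining TRxPs are treated as full-load interferers.

```python
import numpy as np

def sinr_db(gains, serving, noise=1e-12):
    """SINR (dB) at a UE when the TRxPs in `serving` jointly transmit
    to it and every other TRxP is an interferer (assumed full load)."""
    gains = np.asarray(gains, dtype=float)
    signal = gains[list(serving)].sum()
    interference = gains.sum() - signal
    return 10 * np.log10(signal / (interference + noise))

# assumed linear received powers from 4 TRxPs (arbitrary units)
g = [1.0, 0.6, 0.2, 0.1]
print(round(sinr_db(g, [0]), 2))        # serve from the strongest TRxP only
print(round(sinr_db(g, [0, 1]), 2))     # joint transmission from two
print(round(sinr_db(g, [0, 1, 2]), 2))  # joint transmission from three
```

With these assumed gains, each added serving TRxP raises the SINR, but the gain per TRxP shrinks as the remaining interferers get weaker, which is consistent with the study's conclusion that one or two simultaneous TRxPs, well placed, capture most of the benefit.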

    Intelligent and Efficient Ultra-Dense Heterogeneous Networks for 5G and Beyond

    The ultra-dense heterogeneous network (HetNet), in which densified small cells overlay the conventional macro-cells, is a promising technique for the fifth-generation (5G) mobile network. The dense, multi-tier network architecture is able to support extensive data traffic and diverse quality of service (QoS), but meanwhile raises several challenges, especially in interference coordination and resource management. In this thesis, three novel network schemes are proposed to achieve intelligent and efficient operation based on deep learning-enabled network awareness. Both optimization and deep learning methods are developed to achieve intelligent and efficient resource allocation in these proposed network schemes. To improve the cost and energy efficiency of ultra-dense HetNets, a hotspot-prediction-based virtual small cell (VSC) network is proposed. A VSC is formed only when the traffic volume and user density are extremely high. We leverage the feature-extraction capabilities of deep learning techniques and exploit a long short-term memory (LSTM) neural network to predict potential hotspots and form VSCs. Large-scale antenna array-enabled hybrid beamforming is also adaptively adjusted for highly directional transmission to cover these VSCs. Within each VSC, one user equipment (UE) is selected as a cell head (CH), which collects the intra-cell traffic using the unlicensed band and relays the aggregated traffic to the macro-cell base station (MBS) in the licensed band. Inter-cell interference can thus be reduced, and spectrum efficiency can be improved. Numerical results show that the proposed VSCs can reduce power consumption by 55% in comparison with traditional small cells. In addition to smart VSC deployment, a novel multi-dimensional intelligent multiple access (MD-IMA) scheme is also proposed to achieve the stringent and diverse QoS of emerging 5G applications with disparate resource constraints. Multiple access (MA) schemes in multi-dimensional resources are adaptively scheduled to accommodate dynamic QoS requirements and network states. The MD-IMA learns the integrated quality of system experience (I-QoSE) by monitoring and predicting QoS through the LSTM neural network. Resource allocation in the MD-IMA scheme is formulated as an optimization problem that maximizes the I-QoSE while minimizing non-orthogonality (NO) in view of implementation constraints. To solve this problem, both model-based optimization algorithms and model-free deep reinforcement learning (DRL) approaches are utilized. Simulation results demonstrate that the achievable I-QoSE gain of MD-IMA over traditional MA is 15% - 18%. In the final part of the thesis, a Software-Defined Networking (SDN)-enabled 5G vehicular ad hoc network (VANET) is designed to support the growing vehicle-generated data traffic. In this integrated architecture, to reduce signaling overhead, vehicles are clustered under the coordination of the SDN, and one vehicle in each cluster is selected as a gateway to aggregate intra-cluster traffic. To ensure the capacity of the trunk link between the gateway and the macro base station, a Non-orthogonal Multiplexed Modulation (NOMM) scheme is proposed to split the aggregated data stream into multiple layers and use sparse spreading codes to partially superpose the modulated symbols on several resource blocks. Simulation results show that the energy efficiency of the proposed NOMM is around 1.5-2 times that of the typical orthogonal transmission scheme.
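The VSC scheme's trigger logic, form a virtual cell only when the predicted load crosses a hotspot threshold, can be sketched with a trivial forecaster. The thesis uses an LSTM for the traffic prediction; here a plain exponentially weighted moving average (EWMA) stands in for it, purely to illustrate the prediction-then-threshold control flow. The threshold, traffic values, and smoothing factor are all illustrative assumptions.

```python
# Hotspot-triggered VSC formation sketch. An EWMA stands in for the
# thesis's LSTM traffic predictor (illustration only, not the method).

def ewma_forecast(history, alpha=0.5):
    """One-step traffic forecast via exponential smoothing."""
    f = history[0]
    for v in history[1:]:
        f = alpha * v + (1 - alpha) * f
    return f

def form_vsc(traffic_history, threshold=10.0):
    """Form a virtual small cell only when predicted load is a hotspot."""
    return ewma_forecast(traffic_history) > threshold

print(form_vsc([2, 3, 4, 5]))     # light, stable load  -> no VSC
print(form_vsc([8, 12, 15, 20]))  # rising hotspot      -> VSC formed
```

The design point carried over from the thesis is that densification is conditional: small-cell resources (and their power cost) are committed only where the predictor expects a hotspot, which is what drives the reported power savings over always-on small cells.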