54 research outputs found

    Resource Allocation in Next Generation Mobile Networks

    The increasing heterogeneity of the mobile network infrastructure, together with the explosively growing demand for bandwidth-hungry services with diverse quality of service (QoS) requirements, degrades the performance of traditional networks. To address this issue in next-generation mobile networks (NGMN), technologies such as software-defined networking (SDN), network function virtualization (NFV), mobile edge/cloud computing (MEC/MCC), non-terrestrial networks (NTN), and edge ML are essential. In this direction, the optimal allocation and management of heterogeneous network resources to achieve the required low latency, energy efficiency, high reliability, and enhanced coverage and connectivity is a key challenge that must be solved urgently. In this dissertation, we address four critical and challenging resource allocation problems in NGMN and propose efficient solutions to tackle them. In the first part, we address the network slice resource provisioning problem in NGMN for delivering the wide range of services promised by 5G systems and beyond, including enhanced mobile broadband (eMBB), ultra-reliable low-latency communication (URLLC), and massive machine-type communication (mMTC). Network slicing is one of the major solutions for meeting the differentiated service requirements of NGMN under one common network infrastructure. Towards robust mobile network slicing, we propose a novel approach for end-to-end (E2E) resource allocation in a realistic scenario with uncertainty in slices' demands, using stochastic programming. The effectiveness of our proposed methodology is validated through simulations.
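As a toy illustration of provisioning under demand uncertainty with stochastic programming, the sketch below reserves capacity for one slice against discrete demand scenarios; the cost model, scenario values, and probabilities are invented for illustration and are not the dissertation's formulation:

```python
def provision_slice(demand_scenarios, probs, over_cost=1.0, under_cost=5.0):
    """Choose a single capacity reservation x minimizing the expected cost
    E[over_cost * max(0, x - d) + under_cost * max(0, d - x)] over discrete
    demand scenarios d (a toy two-stage stochastic program). Because the
    expected cost is piecewise linear in x, a minimizer lies at one of the
    scenario demands, so we only search those breakpoints."""
    candidates = sorted(set(demand_scenarios))
    best_x, best_cost = None, float("inf")
    for x in candidates:
        cost = sum(p * (over_cost * max(0, x - d) + under_cost * max(0, d - x))
                   for d, p in zip(demand_scenarios, probs))
        if cost < best_cost:
            best_x, best_cost = x, cost
    return best_x, best_cost

# Uncertain bandwidth demand (Gbps) for one slice, with scenario probabilities.
x, cost = provision_slice([10, 20, 40], [0.5, 0.3, 0.2])
```

With under-provisioning penalized five times more heavily than over-provisioning, the optimum here reserves for the high-demand scenario.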
Despite the significant benefits that network slicing brings to the management and performance of NGMN, the real-time response required by many emerging delay-sensitive applications, such as autonomous driving, remote health, and smart manufacturing, necessitates the integration of multi-access edge computing (MEC) into network slicing for 5G networks and beyond. To this end, we discuss a novel collaborative cloud-edge-local computation offloading scheme in the next two parts of this dissertation. The first of these parts studies the problem from the perspective of the infrastructure provider and shows the effectiveness of the proposed approach in serving the rising number of latency-sensitive services and improving energy efficiency, which has become a primary concern in NGMN. The second takes the perspective of the application (higher layer): we propose a novel framework for the optimal reservation of resources by applications, resulting in significant resource savings and reduced cost. The proposed method utilizes application-specific resource coupling relationships modeled using linear regression analysis. We further improve this approach by using reinforcement learning to automatically derive resource coupling functions in dynamic environments. Enhanced connectivity and coverage are other key objectives of NGMN. In this regard, unmanned aerial vehicles (UAVs) have been extensively utilized to provide wireless connectivity in rural and underdeveloped areas, enhance network capacity, and support peaks or unexpected surges in user demand. The popularity of UAVs in such scenarios is mainly owing to their fast deployment, cost-efficiency, and superior communication performance resulting from line-of-sight (LoS)-dominated wireless channels.
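The resource-coupling idea, modeling how one resource an application needs tracks another, can be sketched as a one-regressor least-squares fit; the CPU-to-bandwidth pairing, function names, and headroom factor are illustrative assumptions, not the dissertation's actual model:

```python
def fit_coupling(cpu_samples, bw_samples):
    """Fit a linear coupling bw = a * cpu + b between two resources of an
    application from joint usage samples (ordinary least squares with a
    single regressor)."""
    n = len(cpu_samples)
    mx = sum(cpu_samples) / n
    my = sum(bw_samples) / n
    sxx = sum((x - mx) ** 2 for x in cpu_samples)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cpu_samples, bw_samples))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def reserve(cpu_plan, a, b, headroom=1.1):
    """Reserve the bandwidth implied by a planned CPU allocation, with a
    safety headroom; avoids reserving both resources independently at
    their worst-case peaks."""
    return headroom * (a * cpu_plan + b)
```

Reserving along the fitted coupling, rather than per-resource worst cases, is where the resource savings described above would come from.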
In the fifth part of this dissertation, we formulate the problem of aerial platform resource allocation and traffic routing in multi-UAV relaying systems wherein UAVs are deployed as flying base stations. Our proposed solution is shown to improve the supported traffic with minimum deployment cost. Moreover, the new breed of intelligent devices and applications, such as UAVs, AR/VR, remote health, and autonomous vehicles, requires a paradigm shift from traditional cloud-based learning to distributed, low-latency, and reliable ML at the network edge. To this end, Federated Learning (FL) has been proposed as a new learning scheme that enables devices to collaboratively learn a shared model while keeping the training data local. However, the performance of FL is significantly affected by security threats such as data and model poisoning attacks. Towards reliable edge learning, in the last part of this dissertation, we propose trust as a metric to measure the trustworthiness of the FL agents and thereby enhance the reliability of FL.
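A minimal sketch of trust-based aggregation in FL, assuming a scalar trust score per client and a fixed cut-off (both hypothetical; the dissertation's actual trust metric is not specified here):

```python
def trust_weighted_aggregate(updates, trust, cutoff=0.5):
    """Aggregate client model updates (flat parameter vectors), weighting
    each client by a trust score in [0, 1] and excluding clients below a
    trust cut-off. This limits the influence a poisoned update can have
    on the shared model. Assumes at least one client passes the cut-off."""
    kept = [(u, t) for u, t in zip(updates, trust) if t >= cutoff]
    total = sum(t for _, t in kept)
    dim = len(updates[0])
    return [sum(u[i] * t for u, t in kept) / total for i in range(dim)]
```

In the call below, a low-trust client pushing an extreme update is filtered out, so the aggregate stays near the honest clients' average.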

    A Gentle Introduction to Reinforcement Learning and its Application in Different Fields

    Due to recent progress in deep neural networks, Reinforcement Learning (RL) has become one of the most important and useful technologies. It is a learning method in which a software agent interacts with an unknown environment, selects actions, and progressively discovers the environment's dynamics. RL has been effectively applied in many important areas of real life. This article provides an in-depth introduction to the Markov Decision Process, RL, and its algorithms. Moreover, we present a literature review of the application of RL to a variety of fields, including robotics and autonomous control, communication and networking, natural language processing, games and self-organized systems, scheduling management and configuration of resources, and computer vision.
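A minimal tabular Q-learning agent on a toy Markov Decision Process illustrates the loop this abstract describes (the agent acts, observes a transition, and updates value estimates without knowing the dynamics); the environment and all parameters are illustrative:

```python
import random

def q_learning(n_states, n_actions, step, episodes=2000,
               alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on an episodic MDP. Q is initialised
    optimistically (1.0) so the greedy policy still explores early on;
    the agent never sees the transition model, only sampled
    (next_state, reward, done) outcomes."""
    rng = random.Random(seed)
    Q = [[1.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if rng.random() < eps:                       # explore
                a = rng.randrange(n_actions)
            else:                                        # exploit
                a = max(range(n_actions), key=lambda x: Q[s][x])
            s2, r, done = step(s, a)
            target = r + (0.0 if done else gamma * max(Q[s2]))
            Q[s][a] += alpha * (target - Q[s][a])
            s = s2
    return Q

# Toy 4-state chain: action 1 moves right and earns reward 1 from the last
# state; action 0 gives up (reward 0, episode ends). The optimal policy
# picks action 1 everywhere.
def chain_step(s, a):
    if a == 0:
        return s, 0.0, True
    if s == 3:
        return s, 1.0, True
    return s + 1, 0.0, False
```

After training, the greedy action at every state is 1, i.e. the agent has discovered the chain's delayed-reward structure purely from interaction.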

    Learning and Reasoning Strategies for User Association in Ultra-dense Small Cell Vehicular Networks

    Recent vehicular ad hoc network research has been focusing on providing intelligent transportation services by employing information and communication technologies on road transport. It has been understood that the advanced demands of these services, such as reliable connectivity, high user throughput, and ultra-low latency, cannot be met using traditional communication technologies. Consequently, this thesis reports on the application of artificial intelligence to user association as a technology enabler in ultra-dense small cell vehicular networks. In particular, the work focuses on mitigating mobility-related concerns and networking issues at different mobility levels by employing diverse heuristic as well as reinforcement learning (RL) methods. Firstly, driven by rapid fluctuations in the network topology and the radio environment, a conventional, three-step-sequence user association policy is designed to highlight and explore the impact of vehicle speed and different performance indicators on network quality of service (QoS) and user experience. Secondly, inspired by control-theoretic models and dynamic programming, a real-time controlled-feedback user association approach is proposed. The algorithm adapts to the changing vehicular environment by employing derived network performance information as a heuristic, resulting in improved network performance. Thirdly, a sequence of novel RL-based user association algorithms is developed that employs a variable learning rate, variable reward functions, and an adaptation of the control feedback framework to improve the initial and steady-state learning performance. Furthermore, to accelerate the learning process and enhance the adaptability and robustness of the developed RL algorithms, heuristically accelerated RL and case-based transfer learning methods are employed.
A comprehensive, two-tier, event-based, system-level simulator integrating a dynamic vehicular network, a highway, and an ultra-dense small cell network is developed. The model has enabled the analysis of user mobility effects on network performance across different mobility levels and has served as a firm foundation for the evaluation of the empirical properties of the investigated approaches.
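The variable-learning-rate idea can be sketched as a Q-update whose step size decays with per-state-action visit counts, a common scheme for fast initial learning with a stable steady state; this is an illustration, not the thesis's exact algorithm:

```python
from collections import defaultdict

class VariableRateAgent:
    """Q-learning update with a per-(state, action) decaying learning rate
    alpha = alpha0 / (1 + visits): large steps while an association choice
    is new, small steps once its value estimate has settled."""
    def __init__(self, alpha0=0.5, gamma=0.9):
        self.alpha0, self.gamma = alpha0, gamma
        self.Q = defaultdict(float)        # (state, action) -> value
        self.visits = defaultdict(int)

    def update(self, s, a, r, s2, actions):
        self.visits[(s, a)] += 1
        alpha = self.alpha0 / (1 + self.visits[(s, a)])
        target = r + self.gamma * max(self.Q[(s2, b)] for b in actions)
        self.Q[(s, a)] += alpha * (target - self.Q[(s, a)])
```

Here the state could encode vehicle position/speed and the action a candidate small cell; those names are hypothetical placeholders.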

    Recent Advances in Cellular D2D Communications

    Device-to-device (D2D) communication has attracted a great deal of attention from researchers in recent years. It is a promising technique for offloading local traffic from cellular base stations by allowing devices in physical proximity to communicate directly with each other. Furthermore, through relaying, D2D is also a promising approach to enhancing service coverage at cell edges or in black spots. However, there are many challenges to realizing the full benefits of D2D. For one, minimizing the interference between legacy cellular and D2D users operating in underlay mode is still an active research issue. With 5th generation (5G) communication systems expected to be the main data carrier for the Internet-of-Things (IoT) paradigm, the potential role of D2D and its scalability to support massive IoT devices and their machine-centric (as opposed to human-centric) communications need to be investigated. New challenges have also arisen from new enabling technologies for D2D communications, such as non-orthogonal multiple access (NOMA) and blockchain technologies, which call for new solutions. This edited book presents a collection of ten chapters, including one review and nine original research works, addressing many of the aforementioned challenges and beyond.

    Internet of Things 2.0: Concepts, Applications, and Future Directions

    Applications and technologies of the Internet of Things are in high demand with the increasing number of networked devices. With the development of technologies such as 5G, machine learning, edge computing, and Industry 4.0, the Internet of Things has evolved. This survey article discusses the evolution of the Internet of Things and presents the vision for Internet of Things 2.0. The development of Internet of Things 2.0 is discussed across seven major fields: machine learning intelligence, mission-critical communication, scalability, energy harvesting-based energy sustainability, interoperability, user-friendly IoT, and security. Beyond these major fields, the architectural development of the Internet of Things and the major types of applications are also reviewed. Finally, the article ends with the vision and current limitations of the Internet of Things in future network environments.

    Adaptive Power Management for Computers and Mobile Devices

    Power consumption has become a major concern in the design of computing systems today. High power consumption increases cooling cost, degrades system reliability, and reduces battery life in portable devices. Modern computing/communication devices support multiple power modes that enable a power-performance tradeoff. Dynamic power management (DPM), dynamic voltage and frequency scaling (DVFS), and dynamic task migration for workload consolidation are system-level power reduction techniques widely used at runtime. In the first part of the dissertation, we concentrate on the dynamic power management of personal computer and server platforms, where the DPM, DVFS, and task migration techniques have proven to be highly effective. A hierarchical energy management framework is assumed, where task migration is applied at the upper level to improve server utilization and energy efficiency, and DPM/DVFS is applied at the lower level to manage the power mode of individual processors. This work focuses on estimating the performance impact of workload consolidation and searching for an optimal DPM/DVFS policy that adapts to the changing workload. Machine learning based modeling and reinforcement learning based policy optimization techniques are investigated. Mobile computing has been woven into everyday life to a great extent in recent years. Compared to the traditional personal computer and server environment, the mobile computing environment is far more context-rich, and the usage of a mobile computing device is clearly imprinted with its user's personal signature. The ability to learn such a signature enables immense potential in workload prediction and energy or battery life management. In the second part of the dissertation, we present two mobile device power management techniques that take advantage of the context-rich characteristics of the mobile platform and make adaptive energy management decisions based on different user behaviors.
We first investigate user battery usage behavior modeling and apply the model directly to battery energy management. The first technique aims at maximizing the quality of service (QoS) while keeping the risk of battery depletion below a given threshold. The second technique is a user-aware streaming strategy for energy-efficient smartphone video playback applications (e.g., YouTube) that minimizes the sleep and wake penalty of the cellular module and at the same time avoids the energy waste from excessive downloading. Runtime power and thermal management has attracted substantial interest in multi-core distributed embedded systems, and fast performance evaluation is an essential step in the research of distributed power and thermal management. In the last part of the dissertation, we present an FPGA-based emulator of a multi-core distributed embedded system designed to support research in runtime power/thermal management. Hardware and software support is provided to carry out basic power/thermal management actions, including inter-core or inter-FPGA communication, runtime temperature monitoring, and dynamic frequency scaling.
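The first technique's risk-constrained QoS idea can be sketched as choosing the highest quality level whose empirical battery-depletion risk, estimated from historical usage samples, stays below a threshold; all names, units, and the usage model here are illustrative assumptions, not the dissertation's actual policy:

```python
def pick_quality(levels, drain_mw, battery_mwh, horizon_h,
                 usage_samples_mw, risk_threshold=0.05):
    """Return the highest-QoS level whose estimated probability of draining
    the battery within `horizon_h` hours is at most `risk_threshold`.
    `usage_samples_mw` are historical background drains (mW) standing in
    for a learned model of the user's behavior. Levels are assumed sorted
    from lowest to highest QoS, with drain increasing accordingly."""
    for level, drain in sorted(zip(levels, drain_mw), key=lambda t: -t[1]):
        depleted = sum(1 for bg in usage_samples_mw
                       if (drain + bg) * horizon_h > battery_mwh)
        if depleted / len(usage_samples_mw) <= risk_threshold:
            return level
    return levels[0]  # fall back to the lowest QoS level
```

Scanning from the most power-hungry level downward returns the first level that satisfies the risk constraint, i.e. the QoS-maximizing feasible choice.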

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios.
Finally, we discuss some open research challenges and promising future research directions. (37 pages, 8 figures, 7 tables; submitted for possible future publication in IEEE Communications Surveys and Tutorials.)
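The low-complexity Q-learning direction can be illustrated with a stateless multi-agent sketch in which MTC devices learn to spread over random-access slots to avoid collisions; the reward shaping, exploration schedule, and all parameters are invented for illustration:

```python
import random

def ra_slot_learning(n_devices=5, n_slots=10, rounds=2000, seed=1):
    """Each device keeps a Q-value per RACH slot and, every round, picks a
    slot eps-greedily: reward +1 if it transmits alone in that slot, -1 on
    a collision (stateless Q-update). With enough slots, devices tend to
    settle into distinct slots, easing RAN congestion. Returns the
    fraction of devices that transmitted without collision in the final
    round."""
    rng = random.Random(seed)
    Q = [[0.0] * n_slots for _ in range(n_devices)]
    success_rate = 0.0
    for t in range(rounds):
        eps = max(0.02, 1.0 - 2.0 * t / rounds)      # decaying exploration
        picks = [rng.randrange(n_slots) if rng.random() < eps
                 else max(range(n_slots), key=lambda s: Q[d][s])
                 for d in range(n_devices)]
        counts = [0] * n_slots
        for s in picks:
            counts[s] += 1
        for d, s in enumerate(picks):
            reward = 1.0 if counts[s] == 1 else -1.0
            Q[d][s] += 0.1 * (reward - Q[d][s])      # stateless Q-update
        success_rate = sum(counts[s] == 1 for s in picks) / n_devices
    return success_rate
```

Each device learns independently from only its own collision feedback, which is what keeps the scheme low-complexity for constrained MTC hardware.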