
    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential in terms of supporting a broad range of complex and compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. Comment: 46 pages, 22 figures
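    As an illustration of the reinforcement-learning thread surveyed above, the following minimal Python sketch (not taken from the article) shows an epsilon-greedy value-learning loop in which a cognitive-radio secondary user learns which channel is most often idle. The channel idle probabilities and hyper-parameters are assumptions chosen purely for illustration.

```python
import random

# Minimal epsilon-greedy value-learning sketch: a secondary user picks one of
# N channels each slot and receives reward 1 if the channel is idle, 0 otherwise.
# The per-channel idle probabilities below are invented for illustration only.
N_CHANNELS = 4
IDLE_PROB = [0.9, 0.6, 0.3, 0.1]       # assumed idle probabilities per channel
ALPHA, EPSILON, EPISODES = 0.1, 0.1, 5000

q = [0.0] * N_CHANNELS                 # one learned value per channel (stateless case)

for _ in range(EPISODES):
    # epsilon-greedy action selection: explore occasionally, otherwise exploit
    if random.random() < EPSILON:
        a = random.randrange(N_CHANNELS)
    else:
        a = max(range(N_CHANNELS), key=lambda c: q[c])
    reward = 1.0 if random.random() < IDLE_PROB[a] else 0.0
    q[a] += ALPHA * (reward - q[a])    # incremental value update

print("Learned channel values:", [round(v, 2) for v in q])
```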

    Reduced complexity optimal resource allocation for enhanced video quality in a heterogeneous network environment

    The latest Heterogeneous Network (HetNet) environments, supported by 5th generation (5G) network solutions, include small cells deployed to increase the traditional macrocell network performance. In HetNet environments, before data transmission starts, there is a user association (UA) process with a specific base station (BS). Additionally, during data transmission, diverse resource allocation (RA) schemes are employed. UA-RA solutions play a critical role in improving network load balancing, spectral performance, and energy efficiency. Although several studies have examined the joint UA-RA problem, there is no optimal strategy that addresses it with low complexity while also reducing the time overhead. We propose two versions of simulated annealing (SA), Reduced Search Space SA (RS3A) and Performance-Improved Reduced Search Space SA (PIRS3A), for solving the UA-RA problem in HetNets. First, the UA-RA problem is formulated as a multiple knapsack problem (MKP) with constraints on the maximum BS capacity and transport block size (TBS) index. Second, the proposed RS3A and PIRS3A are used to solve the formulated MKP. Simulation results show that the proposed PIRS3A scheme outperforms RS3A and other existing schemes such as Default Simulated Annealing (DSA) and the Default Genetic Algorithm (DGA) in terms of variability, and outperforms DSA and RS3A in terms of Quality of Service (QoS) metrics, including throughput, packet loss ratio (PLR), delay and jitter. Simulation results also show that PIRS3A generates solutions that are very close to the optimal solution.
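    To make the formulation above concrete, the Python sketch below shows a generic simulated-annealing loop applied to a multiple-knapsack-style user-association instance. It is not the proposed RS3A or PIRS3A algorithm, only the plain SA skeleton such schemes refine; the numbers of users and base stations, capacities, rates, penalty weight and cooling schedule are all assumptions chosen for illustration.

```python
import math, random

# Toy simulated annealing for user association cast as a multiple knapsack problem:
# assign each user to one BS, maximizing total rate subject to per-BS capacity.
# All instance data below is invented for illustration.
random.seed(0)
N_USERS, N_BS = 20, 3
capacity = [8, 8, 8]                                   # assumed resource units per BS
demand = [random.randint(1, 3) for _ in range(N_USERS)]
rate = [[random.uniform(1, 10) for _ in range(N_BS)] for _ in range(N_USERS)]

def objective(assign):
    """Total rate, with a large penalty for exceeding any BS capacity."""
    load = [0] * N_BS
    total = 0.0
    for u, b in enumerate(assign):
        load[b] += demand[u]
        total += rate[u][b]
    penalty = sum(max(0, load[b] - capacity[b]) for b in range(N_BS))
    return total - 100.0 * penalty

assign = [random.randrange(N_BS) for _ in range(N_USERS)]   # random initial association
best, best_val = assign[:], objective(assign)
T = 10.0
while T > 0.01:
    u = random.randrange(N_USERS)                      # neighbour: move one user to another BS
    cand = assign[:]
    cand[u] = random.randrange(N_BS)
    delta = objective(cand) - objective(assign)
    # accept improvements always, worse moves with probability exp(delta / T)
    if delta > 0 or random.random() < math.exp(delta / T):
        assign = cand
        if objective(assign) > best_val:
            best, best_val = assign[:], objective(assign)
    T *= 0.999                                          # geometric cooling
print("Best total rate found:", round(best_val, 2))
```

    The accept/reject rule is what distinguishes SA from greedy local search: occasionally accepting worse associations at high temperature lets the search escape poor local optima before the cooling schedule freezes the assignment.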

    Intelligence in 5G networks

    Over the past decade, Artificial Intelligence (AI) has become an important part of our daily lives; however, its application to communication networks has been partial and unsystematic, with uncoordinated efforts that often conflict with each other. Providing a framework that integrates the existing studies and actually builds an intelligent network is a top research priority. In fact, one of the objectives of 5G is to manage all communications under a single overarching paradigm, and the staggering complexity of this task is beyond the scope of human-designed algorithms and control systems. This thesis presents an overview of all the components necessary to integrate intelligence into this complex environment, from a user-centric perspective: network optimization should always have the end goal of improving the experience of the user. Each step is described with the aid of one or more case studies involving various network functions and elements. Starting from perception and prediction of the surrounding environment, the first core requirements of an intelligent system, this work gradually builds its way up to examples of fully autonomous network agents that learn from experience without any human intervention or pre-defined behavior, discussing the possible application of each aspect of intelligence in future networks.

    A survey of multi-access edge computing in 5G and beyond : fundamentals, technology integration, and state-of-the-art

    Driven by the emergence of new compute-intensive applications and the vision of the Internet of Things (IoT), it is foreseen that the emerging 5G network will face an unprecedented increase in traffic volume and computation demands. However, end users mostly have limited storage capacities and finite processing capabilities, so how to run compute-intensive applications on resource-constrained user devices has recently become a natural concern. Mobile edge computing (MEC), a key technology in the emerging fifth generation (5G) network, can optimize mobile resources by hosting compute-intensive applications, process large data before sending it to the cloud, provide cloud-computing capabilities within the radio access network (RAN) in close proximity to mobile users, and offer context-aware services with the help of RAN information. Therefore, MEC enables a wide variety of applications where real-time response is strictly required, e.g., driverless vehicles, augmented reality, robotics, and immersive media. Indeed, the paradigm shift from 4G to 5G could become a reality with the advent of new technological concepts. The successful realization of MEC in the 5G network is still in its infancy and demands constant effort from both the academic and industry communities. In this survey, we first provide a holistic overview of MEC technology and its potential use cases and applications. Then, we outline up-to-date research on the integration of MEC with the new technologies that will be deployed in 5G and beyond. We also summarize testbeds, experimental evaluations, and open-source activities for edge computing. We further summarize lessons learned from state-of-the-art research works, and discuss challenges and potential future directions for MEC research.
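    As a toy illustration of the offloading trade-off that motivates MEC, the sketch below compares local execution latency with uplink-transfer-plus-edge execution latency. The task size, CPU speeds and uplink rate are assumed values, and real offloading decisions also account for energy consumption, server queueing and radio dynamics, which are omitted here.

```python
# Back-of-the-envelope offloading check: run a task locally or on a MEC server?
# All parameters below are illustrative assumptions, not measurements.

def local_latency(cycles, cpu_hz):
    """Execution time if the task stays on the device."""
    return cycles / cpu_hz

def edge_latency(cycles, input_bits, uplink_bps, edge_cpu_hz):
    """Uplink transfer time plus execution on the edge server (result download ignored)."""
    return input_bits / uplink_bps + cycles / edge_cpu_hz

cycles = 2e9            # CPU cycles the task needs (assumed)
input_bits = 4e6        # 0.5 MB of input data (assumed)
device_cpu = 1e9        # 1 GHz device CPU (assumed)
edge_cpu = 8e9          # 8 GHz effective edge CPU share (assumed)
uplink = 20e6           # 20 Mbit/s uplink (assumed)

t_local = local_latency(cycles, device_cpu)
t_edge = edge_latency(cycles, input_bits, uplink, edge_cpu)
print(f"local: {t_local:.2f}s, edge: {t_edge:.2f}s ->",
      "offload" if t_edge < t_local else "run locally")
```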