
    A Comprehensive Survey of Potential Game Approaches to Wireless Networks

    Potential games form a class of non-cooperative games where unilateral improvement dynamics are guaranteed to converge in many practical cases. The potential game approach has been applied to a wide range of wireless network problems, particularly to a variety of channel assignment problems. In this paper, the properties of potential games are introduced, and games in wireless networks that have been proven to be potential games are comprehensively discussed. Comment: 44 pages, 6 figures, to appear in IEICE Transactions on Communications, vol. E98-B, no. 9, Sept. 201
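
    To make the convergence claim concrete, the following toy sketch (an illustrative example of an exact potential game, not code from the survey) runs asynchronous best-response dynamics for a channel selection game in which each player's cost is the number of co-channel players and the potential is the number of co-channel pairs; every unilateral improvement strictly decreases the potential, so the dynamics terminate at a pure Nash equilibrium. The player count, channel count and cost function are assumptions chosen for illustration.

        # Toy illustration (assumed example, not from the survey): best-response
        # dynamics in an exact potential game for channel selection.
        import random

        N, C = 8, 3                      # players and channels (assumed)
        random.seed(0)
        choice = [random.randrange(C) for _ in range(N)]

        def cost(i, ch, choice):
            """Interference seen by player i if it uses channel ch."""
            return sum(1 for j in range(N) if j != i and choice[j] == ch)

        def potential(choice):
            """Exact potential: number of same-channel (interfering) pairs."""
            return sum(1 for i in range(N) for j in range(i + 1, N)
                       if choice[i] == choice[j])

        improved = True
        while improved:                  # asynchronous best-response dynamics
            improved = False
            for i in range(N):
                best = min(range(C), key=lambda ch: cost(i, ch, choice))
                if cost(i, best, choice) < cost(i, choice[i], choice):
                    choice[i] = best     # unilateral improvement step
                    improved = True
                    print(f"player {i} -> channel {best}, "
                          f"potential = {potential(choice)}")

        print("Nash equilibrium channel assignment:", choice)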

    Intelligent Wireless Communications Enabled by Cognitive Radio and Machine Learning

    The ability to intelligently utilize resources to meet the needs of growing diversity in services and user behavior marks the future of wireless communication systems. Intelligent wireless communications aims at enabling the system to perceive and assess the available resources, to autonomously learn to adapt to the perceived wireless environment, and to reconfigure its operating mode to maximize the utility of the available resources. The perception capability and reconfigurability are the essential features of cognitive radio, while modern machine learning techniques show great potential in system adaptation. In this paper, we discuss the development of cognitive radio technology and machine learning techniques and emphasize their roles in improving the spectrum and energy utility of wireless communication systems. We describe the state of the art of relevant techniques, covering spectrum sensing and access approaches and powerful machine learning algorithms that enable spectrum- and energy-efficient communications in dynamic wireless environments. We also present practical applications of these techniques and identify further research challenges in cognitive radio and machine learning as applied to existing and future wireless communication systems.
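
    As one concrete instance of the spectrum sensing techniques referred to above, the sketch below implements classical energy detection with an empirically calibrated threshold; it is a simplified, assumed setup (AWGN, Gaussian primary signal, the listed SNR and false-alarm target), not code or parameters from the paper.

        # Hedged sketch of energy-detection spectrum sensing (illustrative only).
        import numpy as np

        rng = np.random.default_rng(1)
        n_samples = 256          # sensing window length (assumed)
        noise_var = 1.0
        snr_db = -5.0            # primary-user SNR at the sensor (assumed)
        target_pfa = 0.05        # desired false-alarm probability (assumed)

        def energy(x):
            return np.sum(np.abs(x) ** 2)

        # Calibrate the detection threshold empirically from noise-only trials.
        noise_energies = [energy(rng.normal(0, np.sqrt(noise_var), n_samples))
                          for _ in range(5000)]
        threshold = np.quantile(noise_energies, 1 - target_pfa)

        # Monte Carlo estimate of the detection probability when a primary user
        # transmits a Gaussian signal at the given SNR.
        signal_var = noise_var * 10 ** (snr_db / 10)
        trials, detections = 5000, 0
        for _ in range(trials):
            x = (rng.normal(0, np.sqrt(signal_var), n_samples)
                 + rng.normal(0, np.sqrt(noise_var), n_samples))
            detections += energy(x) > threshold
        print(f"threshold = {threshold:.1f}, "
              f"estimated P_d = {detections / trials:.3f} at {snr_db} dB SNR")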

    Application of Machine Learning in Wireless Networks: Key Techniques and Open Issues

    As a key technique for enabling artificial intelligence, machine learning (ML) is capable of solving complex problems without explicit programming. Motivated by its successful applications to many practical tasks such as image recognition, both industry and the research community have advocated the application of ML in wireless communication. This paper comprehensively surveys recent advances in the applications of ML in wireless communication, which are classified as: resource management in the MAC layer, networking and mobility management in the network layer, and localization in the application layer. The applications in resource management further include power control, spectrum management, backhaul management, cache management, beamformer design and computation resource management, while ML-based networking focuses on applications in clustering, base station switching control, user association and routing. Moreover, the literature on each aspect is organized according to the adopted ML techniques. In addition, several conditions for applying ML to wireless communication are identified to help readers decide whether to use ML and which kind of ML techniques to use; traditional approaches are also summarized together with a performance comparison with ML-based approaches, based on which the motivations of the surveyed works to adopt ML are clarified. Given the extensiveness of the research area, challenges and unresolved issues are presented to facilitate future studies, including ML-based network slicing, infrastructure updates to support ML-based paradigms, open data sets and platforms for researchers, and theoretical guidance for ML implementation. Comment: 34 pages, 8 figures
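
    To give a flavor of the ML-based resource management surveyed, the following hedged toy sketch trains a tabular Q-learning agent to pick a discrete transmit power that trades throughput against an energy price; the state model, reward weights and power levels are assumptions for illustration and do not come from any of the surveyed works.

        # Toy tabular Q-learning agent for discrete power control (assumed example).
        import numpy as np

        rng = np.random.default_rng(0)
        powers = np.array([0.1, 0.5, 1.0, 2.0])   # candidate transmit powers (W)
        n_states = 4                               # quantized channel-gain states
        gains = np.array([0.2, 0.5, 1.0, 2.0])     # channel gain per state
        noise = 0.1
        q = np.zeros((n_states, len(powers)))

        def reward(state, action):
            """Throughput minus an energy price (illustrative weights)."""
            snr = gains[state] * powers[action] / noise
            return np.log2(1 + snr) - 1.5 * powers[action]

        alpha, gamma, eps = 0.1, 0.9, 0.1
        state = rng.integers(n_states)
        for step in range(20000):
            action = (rng.integers(len(powers)) if rng.random() < eps
                      else int(np.argmax(q[state])))
            r = reward(state, action)
            next_state = rng.integers(n_states)    # i.i.d. block-fading channel
            q[state, action] += alpha * (r + gamma * q[next_state].max()
                                         - q[state, action])
            state = next_state

        for s in range(n_states):
            print(f"gain state {s}: learned power = "
                  f"{powers[int(np.argmax(q[s]))]} W")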

    Complex Systems Science meets 5G and IoT

    We propose a new paradigm for telecommunications and develop a framework drawing on concepts from information theory (i.e., different metrics of complexity) and computational theory (i.e., agent-based modeling), adapted from complex systems science. We proceed in a systematic fashion by dividing network complexity understanding and analysis into different layers. The modelling layer forms the foundation of the proposed framework, supporting the analysis and tuning layers. The modelling layer aims at capturing the significant attributes of networks and the interactions that shape them, through the application of tools such as agent-based modelling and graph-theoretical abstractions, to derive new metrics that holistically describe a network. The analysis phase completes the core functionality of the framework by linking our new metrics to the overall network performance. The tuning layer augments this core with algorithms that aim at automatically guiding networks toward desired conditions. In order to maximize the impact of our ideas, the proposed approach is rooted in relevant, near-future architectures and use cases in 5G networks, i.e., the Internet of Things (IoT) and self-organizing cellular networks.
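
    The sketch below illustrates the kind of graph-theoretical metric the modelling layer might compute: the Shannon entropy of a topology's degree distribution, evaluated on two assumed random topologies (a random geometric graph and a scale-free graph). The metric choice, graph models and sizes are illustrative assumptions, not the framework's actual definitions.

        # Hedged sketch: one possible "holistic" network metric -- the Shannon
        # entropy of the degree distribution of an IoT-like random topology.
        import math
        import networkx as nx

        def degree_entropy(g):
            """Shannon entropy (bits) of the empirical degree distribution."""
            degrees = [d for _, d in g.degree()]
            n = len(degrees)
            counts = {}
            for d in degrees:
                counts[d] = counts.get(d, 0) + 1
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        # Compare a random geometric graph (dense sensor deployment) with a
        # scale-free graph (hub-dominated topology).
        rgg = nx.random_geometric_graph(200, radius=0.15, seed=1)
        ba = nx.barabasi_albert_graph(200, m=2, seed=1)
        print(f"random geometric graph: {degree_entropy(rgg):.2f} bits")
        print(f"Barabasi-Albert graph:  {degree_entropy(ba):.2f} bits")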

    Energy efficient D2D communications in dynamic TDD systems

    Network-assisted device-to-device communication is a promising technology for improving the performance of proximity-based services. This paper demonstrates how the integration of device-to-device communications and dynamic time-division duplex can improve the energy efficiency of future cellular networks, leading to greener system operation and a prolonged battery lifetime of mobile devices. We jointly optimize the mode selection, transmission period and power allocation to minimize the energy consumption (from both a system and a device perspective) while satisfying a certain rate requirement. The radio resource management problems are formulated as mixed-integer nonlinear programming problems. Although they are known to be NP-hard in general, we exploit the problem structure to design efficient algorithms that optimally solve several problem cases. For the remaining cases, a heuristic algorithm that computes near-optimal solutions while respecting practical constraints on execution times and signaling overhead is also proposed. Simulation results confirm that the combination of device-to-device and flexible time-division duplex technologies can significantly enhance the spectrum and energy efficiency of next-generation cellular systems. Comment: Submitted to IEEE Journal of Selected Areas in Communication
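
    The following hedged toy captures the flavor of the joint mode selection and power allocation problem for a single D2D pair: each mode (direct D2D versus two-hop cellular) is assigned the minimum transmit power that meets a rate target on a Shannon-capacity link, and the mode with the lower energy is selected. The link gains, bandwidth, noise level and rate target are assumptions; the paper's actual formulation is a richer mixed-integer program.

        # Hedged toy of mode selection with minimum-power rate satisfaction.
        import math

        bandwidth = 1e6          # Hz (assumed)
        noise_psd = 4e-21        # W/Hz, about -174 dBm/Hz
        rate_target = 2e6        # bit/s (assumed)
        slot = 1e-3              # s, transmission period (assumed)

        def min_power(rate, gain):
            """Smallest transmit power meeting `rate` on a Shannon link."""
            return (2 ** (rate / bandwidth) - 1) * noise_psd * bandwidth / gain

        def energy_d2d(gain_dd):
            return slot * min_power(rate_target, gain_dd)

        def energy_cellular(gain_ul, gain_dl):
            # Two hops (UL then DL), each must carry the full rate.
            return slot * (min_power(rate_target, gain_ul)
                           + min_power(rate_target, gain_dl))

        g_dd, g_ul, g_dl = 1e-7, 1e-9, 1e-8   # assumed link gains
        e_d2d, e_cell = energy_d2d(g_dd), energy_cellular(g_ul, g_dl)
        mode = "D2D" if e_d2d < e_cell else "cellular"
        print(f"D2D energy = {e_d2d:.3e} J, cellular energy = {e_cell:.3e} J "
              f"-> select {mode} mode")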

    The Convergence of Machine Learning and Communications

    The areas of machine learning and communication technology are converging. Today's communication systems generate a huge amount of traffic data, which can help to significantly enhance the design and management of networks and communication components when combined with advanced machine learning methods. Furthermore, recently developed end-to-end training procedures offer new ways to jointly optimize the components of a communication system. Machine learning methods are also of central importance in many emerging application fields of communication technology, e.g., smart cities or the internet of things. This paper gives an overview of the use of machine learning in different areas of communications and discusses two example applications in wireless networking. Furthermore, it identifies promising future research topics and discusses their potential impact. Comment: 8 pages, 4 figures
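
    As a sketch of the end-to-end training idea mentioned above, the following hedged example (assumed sizes, SNR and architecture, written with PyTorch) trains a small autoencoder whose encoder maps one of M messages to n real channel uses, passes them through an AWGN channel, and whose decoder recovers the message; it illustrates the general technique, not a model from the paper.

        # Hedged sketch of end-to-end learning of a communication system.
        import torch
        import torch.nn as nn

        M, n = 16, 8                  # 16 messages over 8 real channel uses
        snr_db = 7.0
        noise_std = (10 ** (-snr_db / 10)) ** 0.5

        encoder = nn.Sequential(nn.Linear(M, 32), nn.ReLU(), nn.Linear(32, n))
        decoder = nn.Sequential(nn.Linear(n, 32), nn.ReLU(), nn.Linear(32, M))
        opt = torch.optim.Adam(list(encoder.parameters())
                               + list(decoder.parameters()), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()

        for step in range(3000):
            msgs = torch.randint(0, M, (256,))
            x = encoder(nn.functional.one_hot(msgs, M).float())
            x = x / x.norm(dim=1, keepdim=True) * n ** 0.5   # power constraint
            y = x + noise_std * torch.randn_like(x)          # AWGN channel
            loss = loss_fn(decoder(y), msgs)
            opt.zero_grad()
            loss.backward()
            opt.step()

        with torch.no_grad():
            msgs = torch.randint(0, M, (10000,))
            x = encoder(nn.functional.one_hot(msgs, M).float())
            x = x / x.norm(dim=1, keepdim=True) * n ** 0.5
            y = x + noise_std * torch.randn_like(x)
            err = (decoder(y).argmax(dim=1) != msgs).float().mean()
        print(f"block error rate at {snr_db} dB: {err:.4f}")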

    D2D Enhanced Heterogeneous Cellular Networks with Dynamic TDD

    Over the last decade, the growing amount of UL and DL mobile data traffic has been characterized by substantial asymmetry and time variations. Dynamic time-division duplex (TDD) can accommodate this traffic asymmetry by adapting the UL/DL configuration to the current traffic demands. In this work, we study a two-tier heterogeneous cellular network (HCN) where the macro tier and small cell tier operate according to a dynamic TDD scheme on orthogonal frequency bands. To offload the network infrastructure, mobile users in proximity can engage in D2D communications, whose activity is governed by a carrier sensing multiple access (CSMA) scheme that protects the ongoing infrastructure-based and D2D transmissions. We present an analytical framework to evaluate the network performance in terms of load-aware coverage probability and network throughput. The proposed framework makes it possible to quantify the effect on the coverage probability of the most important TDD system parameters, such as the UL/DL configuration, the base station density, and the bias factor. In addition, we evaluate how the bandwidth partition and the D2D network access scheme affect the total network throughput. Through the study of the tradeoff between coverage probability and D2D user activity, we provide guidelines for the optimal design of D2D network access. Comment: 15 pages; 9 figures; submitted to IEEE Transactions on Wireless Communication
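
    The hedged sketch below gives a Monte Carlo counterpart of the kind of coverage analysis described: downlink SIR coverage probability for a single PPP-deployed tier with nearest-BS association and Rayleigh fading. It deliberately omits the dynamic TDD, load-awareness and D2D aspects of the paper's framework, and the density, path-loss exponent and SIR threshold are assumptions.

        # Hedged Monte Carlo sketch of SIR coverage for one PPP-deployed tier.
        import numpy as np

        rng = np.random.default_rng(0)
        bs_density = 5e-6        # base stations per m^2 (assumed)
        alpha = 4.0              # path-loss exponent (assumed)
        sir_threshold_db = 0.0
        region = 3000.0          # BSs dropped in [-R, R]^2, user at the origin
        threshold = 10 ** (sir_threshold_db / 10)

        def one_trial():
            n_bs = rng.poisson(bs_density * (2 * region) ** 2)
            if n_bs < 2:
                return False
            xy = rng.uniform(-region, region, size=(n_bs, 2))
            d = np.hypot(xy[:, 0], xy[:, 1])
            fading = rng.exponential(1.0, size=n_bs)   # Rayleigh power fading
            rx = fading * d ** (-alpha)
            serving = int(np.argmin(d))                # nearest-BS association
            sir = rx[serving] / (rx.sum() - rx[serving])
            return sir > threshold

        trials = 10000
        coverage = sum(one_trial() for _ in range(trials)) / trials
        # Known closed form for Rayleigh fading, alpha = 4, threshold = 1:
        # P_cov = 1 / (1 + pi/4) ~ 0.56, a useful sanity check.
        print(f"simulated coverage probability: {coverage:.3f}")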

    NOMA based resource allocation and mobility enhancement framework for IoT in next generation cellular networks

    With the unprecedented technological advances witnessed in the last two decades, more devices are connected to the internet, forming what is called the internet of things (IoT). IoT devices with heterogeneous characteristics and quality of experience (QoE) requirements may engage in a dynamic spectrum market due to the scarcity of radio resources. We propose a framework to efficiently quantify and supply radio resources to IoT devices by developing intelligent systems. The primary goal of the paper is to study the characteristics of the next generation of cellular networks with non-orthogonal multiple access (NOMA) to enable connectivity to clustered IoT devices. First, we demonstrate how the distribution and QoE requirements of IoT devices impact the required number of radio resources in real time. Second, we prove that using an extended auction algorithm, implemented through a series of complementary functions, enhances the radio resource utilization efficiency. The results show a substantial reduction in the number of sub-carriers required when compared to conventional orthogonal multiple access (OMA), and the intelligent clustering is scalable and adaptable to the cellular environment. The ability to move spectrum usage from one cluster to others after borrowing, when a cluster has fewer users or users move out of its boundary, is another feature that contributes to the reported radio resource utilization efficiency. Moreover, the proposed framework provides IoT service providers with cost estimates to control their spectrum acquisition and achieve the required quality of service (QoS) with guaranteed bit rate (GBR) and non-guaranteed bit rate (non-GBR).
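
    The following hedged sketch illustrates why NOMA can reduce the number of sub-carriers needed: two users with unequal channel gains share one sub-carrier through power-domain superposition and successive interference cancellation, and their rates are compared with an OMA baseline that splits the sub-carrier in time. The gains, noise level and power split are assumptions, not values from the paper.

        # Hedged sketch of two-user power-domain NOMA on a single sub-carrier.
        import math

        p_total = 1.0            # total transmit power on the sub-carrier (W)
        noise = 1e-2             # noise power (W), assumed
        g_near, g_far = 1.0, 0.1 # channel power gains, assumed
        beta = 0.2               # power fraction for the near (strong) user

        def rate(snr):
            return math.log2(1 + snr)

        # NOMA: the far user decodes its signal treating the near user's as
        # noise; the near user cancels the far user's signal (SIC) first.
        r_far_noma = rate(g_far * (1 - beta) * p_total
                          / (g_far * beta * p_total + noise))
        r_near_noma = rate(g_near * beta * p_total / noise)

        # OMA baseline: each user gets the sub-carrier for half the time.
        r_far_oma = 0.5 * rate(g_far * p_total / noise)
        r_near_oma = 0.5 * rate(g_near * p_total / noise)

        print(f"NOMA rates (near, far): {r_near_noma:.2f}, {r_far_noma:.2f}")
        print(f"OMA  rates (near, far): {r_near_oma:.2f}, {r_far_oma:.2f}")
        print(f"sum rate: NOMA {r_near_noma + r_far_noma:.2f} vs "
              f"OMA {r_near_oma + r_far_oma:.2f} bit/s/Hz")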

    A baseband wireless spectrum hypervisor for multiplexing concurrent OFDM signals

    The next generation of wireless and mobile networks will have to handle a significant increase in traffic load compared to the current ones. This situation calls for novel ways to increase the spectral efficiency. Therefore, in this paper, we propose a wireless spectrum hypervisor architecture that abstracts a radio frequency (RF) front-end into a configurable number of virtual RF front-ends. The proposed architecture has the ability to enable flexible spectrum access in existing wireless and mobile networks, which is a challenging task due to the limited spectrum programmability, i.e., the capability a system has to change the spectral properties of a given signal to fit an arbitrary frequency allocation. The proposed architecture is a non-intrusive and highly optimized wireless hypervisor that multiplexes the signals of several different and concurrent multi-carrier-based radio access technologies with numerologies that are integer multiples of one another, which we also refer to as radio access technologies with correlated numerology. For example, the proposed architecture can multiplex the signals of several Wi-Fi access points, several LTE base stations, several WiMAX base stations, etc. As it is able to multiplex the signals of radio access technologies with correlated numerology, it can, for instance, multiplex the signals of LTE, 5G-NR and NB-IoT base stations. This abstraction makes it possible for such different technologies to share the same RF front-end, consequently reducing costs and increasing the spectral efficiency, either through densification, since several networks share the same infrastructure, or by dynamically accessing free chunks of spectrum. Therefore, the main goal of the proposed approach is to improve spectral efficiency by efficiently using vacant gaps in congested spectrum bands or by adopting network densification through infrastructure sharing. We demonstrate mathematically how our proposed approach works and present several simulation results demonstrating its functionality and efficiency. Additionally, we designed and implemented a free and open-source proof-of-concept prototype of the proposed architecture, which can be used by researchers and developers to run experiments or extend the concept to other applications. We present several experimental results used to validate the proposed prototype and demonstrate that it can easily handle up to 12 concurrent physical layers.
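
    The sketch below illustrates the core frequency-domain multiplexing idea (a simplified, assumption-based toy, not the prototype's implementation): each virtual RF front-end supplies its own frequency-domain OFDM symbol, the hypervisor maps those sub-carriers into disjoint chunks of one wide IFFT driven by the single physical front-end, and a receiver recovers its chunk with the matching FFT. The FFT sizes, chunk offsets and QPSK payloads are assumptions.

        # Hedged sketch of frequency-domain multiplexing of virtual front-ends.
        import numpy as np

        rng = np.random.default_rng(0)
        phy_fft = 2048                       # physical front-end IFFT size
        vfes = {"vfe_lte_like": (64, 100),   # name: (virtual size, start bin)
                "vfe_nbiot_like": (16, 600),
                "vfe_wifi_like": (64, 1200)} # all names/offsets are assumed

        def qpsk(n):
            return (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)

        # Place each virtual front-end's sub-carriers at its assigned offset in
        # the wide frequency grid, then take a single IFFT for transmission.
        grid = np.zeros(phy_fft, dtype=complex)
        for name, (size, start) in vfes.items():
            grid[start:start + size] = qpsk(size)
        tx_time = np.fft.ifft(grid) * np.sqrt(phy_fft)

        # A receiver tuned to one virtual front-end takes the wide FFT and
        # reads back its own chunk of bins.
        rx_grid = np.fft.fft(tx_time) / np.sqrt(phy_fft)
        size, start = vfes["vfe_nbiot_like"]
        recovered = rx_grid[start:start + size]
        print("max reconstruction error:",
              np.max(np.abs(recovered - grid[start:start + size])))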

    Dynamic Computation Offloading for Mobile-Edge Computing with Energy Harvesting Devices

    Mobile-edge computing (MEC) is an emerging paradigm to meet the ever-increasing computation demands from mobile applications. By offloading computationally intensive workloads to the MEC server, the quality of the computation experience, e.g., the execution latency, can be greatly improved. Nevertheless, as on-device battery capacities are limited, computation would be interrupted when the battery energy runs out. To provide satisfactory computation performance as well as to achieve green computing, it is of significant importance to seek renewable energy sources to power mobile devices via energy harvesting (EH) technologies. In this paper, we investigate a green MEC system with EH devices and develop an effective computation offloading strategy. The execution cost, which addresses both the execution latency and task failure, is adopted as the performance metric. A low-complexity online algorithm, namely the Lyapunov optimization-based dynamic computation offloading (LODCO) algorithm, is proposed, which jointly decides the offloading decision, the CPU-cycle frequencies for mobile execution, and the transmit power for computation offloading. A unique advantage of this algorithm is that its decisions depend only on instantaneous side information, without requiring distribution information of the computation task requests, the wireless channel, or the EH processes. The implementation of the algorithm only requires solving a deterministic problem in each time slot, for which the optimal solution can be obtained either in closed form or by bisection search. Moreover, the proposed algorithm is shown to be asymptotically optimal via rigorous analysis. Sample simulation results are presented to verify the theoretical analysis as well as to validate the effectiveness of the proposed algorithm. Comment: 33 pages, 11 figures, submitted to IEEE Journal on Selected Areas in Communication
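
    To illustrate the "closed form or bisection" flavor of the per-slot problem, the hedged sketch below minimizes a weighted energy-plus-delay cost for offloading a fixed number of bits over a Shannon link by bisecting on the sign of the cost's derivative, which is valid because that cost is unimodal in the transmit power. The cost weights, channel gain and constants are illustrative assumptions and do not reproduce LODCO's exact per-slot problem.

        # Hedged sketch: per-slot offloading power chosen by bisection on the
        # derivative of a unimodal weighted energy-plus-delay cost.
        import math

        L = 1e5            # bits to offload this slot (assumed)
        W = 1e6            # Hz
        gain = 1e-7        # channel power gain (assumed)
        noise = 4e-15      # noise power over W (W)
        V = 50.0           # Lyapunov-style weight on energy vs. delay (assumed)
        p_max = 1.0        # W

        def cost(p):
            rate = W * math.log2(1 + gain * p / noise)
            t = L / rate                   # transmission delay (s)
            return V * p * t + t           # weighted energy + delay

        def dcost(p, h=1e-6):
            return (cost(p + h) - cost(p - h)) / (2 * h)

        # The derivative is negative for small p and crosses zero at most once,
        # so bisection on its sign finds the minimizer (or the boundary).
        lo, hi = 1e-6, p_max
        if dcost(hi) <= 0:
            p_star = p_max                 # cost still decreasing: use max power
        else:
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if dcost(mid) < 0:
                    lo = mid
                else:
                    hi = mid
            p_star = 0.5 * (lo + hi)
        print(f"per-slot offloading power: {p_star:.4f} W, "
              f"cost = {cost(p_star):.4f}")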