
    QoS enhancement with deep learning-based interference prediction in mobile IoT

    With the acceleration in mobile broadband, wireless infrastructure plays a significant role in the Internet of Things (IoT) by ensuring ubiquitous connectivity in mobile environments, making mobile IoT (mIoT) a center of attraction. Intelligent systems are typically realised through mIoT, which in turn increases data traffic demands. To meet the ever-increasing demands of mobile users, the integration of small cells is a promising solution: for mIoT, small cells provide enhanced Quality-of-Service (QoS) with improved data rates. In this paper, an mIoT small-cell network in a vehicular environment, focusing on a city bus transit system, is presented. Integrating small cells into vehicles for mIoT, however, makes resource allocation challenging because of the dynamic interference between small cells, which may negatively impact cellular coverage and capacity. This article proposes a Threshold Percentage Dependent Interference Graph (TPDIG) based resource allocation algorithm using deep learning for city buses mounted with moving small cells (mSCs). Long Short-Term Memory (LSTM) neural networks are used to predict city bus locations for interference determination between mSCs. A comparative analysis of resource allocation using TPDIG, Time Interval Dependent Interference Graph (TIDIG), and Global Positioning System Dependent Interference Graph (GPSDIG) is presented in terms of Resource Block (RB) usage and the average achievable data rate of the mIoT-mSC network.
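    The abstract does not detail the LSTM architecture; as a hedged illustration of the general idea--predicting a bus-mounted mSC's next position from a window of past GPS samples so that an interference graph can be built from predicted proximity--a minimal sketch (assuming PyTorch, with hypothetical shapes and hyperparameters) might look like:

        # Minimal sketch of an LSTM location predictor, assuming PyTorch.
        # Shapes and hyperparameters are illustrative, not from the paper.
        import torch
        import torch.nn as nn

        class BusLocationLSTM(nn.Module):
            def __init__(self, hidden_size=64):
                super().__init__()
                self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
                self.head = nn.Linear(hidden_size, 2)    # predict next (x, y)

            def forward(self, past_positions):           # (batch, window, 2)
                out, _ = self.lstm(past_positions)
                return self.head(out[:, -1, :])          # (batch, 2)

        model = BusLocationLSTM()
        window = torch.randn(8, 10, 2)                   # 8 buses, 10 past GPS samples each
        next_xy = model(window)                          # predicted next positions
        # Predicted positions would then feed the interference graph: two mSCs
        # are connected (i.e. must not share RBs) when their predicted distance
        # falls below a chosen threshold.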

    Cognitive networking for next generation of cellular communication systems

    This thesis presents a comprehensive study of cognitive networking for cellular networks, with contributions that enable them to be more dynamic, agile, and efficient. To achieve this, machine learning (ML) algorithms, a subset of artificial intelligence, are employed to bring such cognition to cellular networks. More specifically, three major branches of ML, namely supervised, unsupervised, and reinforcement learning (RL), are utilised for various purposes: unsupervised learning is used for data clustering, while supervised learning is employed to predict the future behaviour of networks and users. RL, on the other hand, is utilised for optimisation purposes owing to its inherent adaptability and minimal need for knowledge of the environment. Energy optimisation, capacity enhancement, and spectrum access are identified as primary design challenges for cellular networks, given that they are envisioned to play crucial roles in 5G and beyond due to the increased demand in terms of both the number of connected devices and data rates. Each design challenge and its corresponding proposed solution are discussed thoroughly in separate chapters.

    Regarding energy optimisation, user-side energy consumption is investigated by considering Internet of things (IoT) networks. An RL based intelligent model, which jointly optimises the wireless connection type and the data processing entity, is proposed. In particular, a Q-learning algorithm is developed through which the energy consumption of an IoT device is minimised while keeping the application requirements--in terms of response time and security--satisfied. The proposed methodology achieves a 0% normalised joint cost--where all the considered metrics are combined--while the benchmarks average 54.84%. Next, the energy consumption of radio access networks (RANs) is targeted, and a traffic-aware cell switching algorithm is designed to reduce the energy consumption of a RAN without compromising user quality-of-service (QoS). The proposed technique employs a SARSA algorithm with value function approximation, since conventional RL methods struggle with problems that have huge state spaces. The results reveal that energy savings of up to 52% are achieved with the proposed technique, and the gain is observed to shrink as the scenario becomes more realistic.

    Capacity enhancement, in turn, is studied from two different perspectives, namely mobility management and unmanned aerial vehicle (UAV) assistance. Towards that end, a predictive handover (HO) mechanism is designed for mobility management in cellular networks by identifying two major issues of Markov chain based HO prediction. First, revisits--defined as a user visiting the same cell more than once within the same day--are diagnosed as causing similar transition probabilities, which in turn increases the likelihood of incorrect predictions. This problem is addressed with a structural change: rather than storing a 2-D transition matrix, a 3-D one that also includes HO orders is stored. The obtained results show that the 3-D transition matrix is capable of reducing the HO signalling cost by up to 25.37%, a gain that drops with increasing randomness in the data set. Second, making HO predictions with insufficient criteria is identified as another issue with conventional Markov chain based predictors.
    Thus, a prediction confidence level is derived to provide a lower bound for performing HO predictions, which are not always advantageous owing to the HO signalling cost incurred by incorrect predictions. The simulation outcomes confirm that the derived confidence level mechanism improves the prediction accuracy by up to 8.23%. Furthermore, still considering capacity enhancement, UAV assisted cellular networking is considered, and an unsupervised learning-based UAV positioning algorithm is presented. A comprehensive analysis is conducted on the impact of the overlapping footprints of multiple UAVs, which are controlled through their altitudes. The developed k-means clustering based UAV positioning approach is shown to reduce the number of users in outage by up to 80.47% compared to the benchmark symmetric deployment.

    Lastly, a QoS-aware dynamic spectrum access approach, in which all the aforementioned types of ML are employed, is developed to tackle the challenges related to spectrum access. More specifically, by leveraging future traffic load predictions of radio access technologies (RATs) and a Q-learning algorithm, a novel proactive spectrum sensing technique is introduced. Two different sensing strategies are developed: the first focuses solely on reducing sensing latency, while the second jointly optimises sensing latency and user requirements. In particular, the proposed Q-learning algorithm takes the future load predictions of the RATs and the requirements of secondary users--in terms of mobility and bandwidth--as inputs and directs the users to the spectrum of the optimum RAT for sensing. The strategy to employ can be selected based on the needs of the application: if latency is the only concern, the first strategy should be selected, since the second is computationally more demanding; by employing the second strategy, however, sensing latency is reduced while the other user requirements are also satisfied. The simulation results demonstrate that, compared to random sensing, the first strategy reduces the sensing latency by 85.25%, while the second strategy enhances the full-satisfaction rate--where both the mobility and bandwidth requirements of the user are simultaneously satisfied--by 95.7%.

    In summary, three key design challenges of the next generation of cellular networks are identified and addressed via the concept of cognitive networking, providing a utilitarian tool for mobile network operators to plug into their systems. The proposed solutions can be generalised to various network scenarios owing to the sophisticated ML implementations, which renders them both practical and sustainable.
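    None of the exact state, action, or reward definitions are given in the abstract; purely as an illustration of the tabular Q-learning machinery that underpins both the IoT energy optimisation and the proactive spectrum sensing contributions, a minimal sketch with hypothetical states, actions and cost is:

        # Minimal tabular Q-learning sketch; states, actions and reward are
        # hypothetical placeholders, not the thesis's actual formulation.
        import random
        from collections import defaultdict

        ACTIONS = ["wifi_local", "wifi_edge", "cellular_edge", "cellular_cloud"]
        ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

        Q = defaultdict(lambda: {a: 0.0 for a in ACTIONS})

        def choose_action(state):
            # epsilon-greedy exploration
            if random.random() < EPSILON:
                return random.choice(ACTIONS)
            return max(Q[state], key=Q[state].get)

        def update(state, action, reward, next_state):
            best_next = max(Q[next_state].values())
            Q[state][action] += ALPHA * (reward + GAMMA * best_next - Q[state][action])

        # One hypothetical interaction: the agent picks a connection/processing
        # option, observes a joint cost combining energy, response time and
        # security, and uses its negative as the reward.
        state = ("app=video", "battery=low")
        action = choose_action(state)
        joint_cost = 0.37          # would come from the environment/simulator
        update(state, action, -joint_cost, ("app=video", "battery=mid"))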

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine-Type Communications (mMTC) and Ultra-Reliable Low-Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead, and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the technical issues involved, review recent advances, highlight potential solutions, and propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions to the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. (Comment: 37 pages, 8 figures, 7 tables; submitted for possible publication in IEEE Communications Surveys and Tutorials.)
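    The survey covers Q-learning at a conceptual level; one common low-complexity formulation in mMTC random access--each device learning, from collision feedback, which RA slot to use--can be sketched as follows (device and slot counts, learning rate and rewards are illustrative assumptions, not a scheme prescribed by the paper):

        # Sketch of per-device, stateless Q-learning for RA slot selection in mMTC.
        import random

        NUM_SLOTS, NUM_DEVICES, ALPHA = 5, 20, 0.1
        q = [[0.0] * NUM_SLOTS for _ in range(NUM_DEVICES)]

        def pick_slot(dev, eps=0.1):
            # epsilon-greedy choice over the device's own Q-values
            if random.random() < eps:
                return random.randrange(NUM_SLOTS)
            return max(range(NUM_SLOTS), key=lambda s: q[dev][s])

        for frame in range(200):
            choices = [pick_slot(d) for d in range(NUM_DEVICES)]
            for d, s in enumerate(choices):
                # success only if no other device picked the same slot
                reward = 1.0 if choices.count(s) == 1 else -1.0
                q[d][s] += ALPHA * (reward - q[d][s])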

    Five Facets of 6G: Research Challenges and Opportunities

    Whilst fifth-generation (5G) systems are being rolled out across the globe, researchers have turned their attention to the exploration of radical next-generation solutions. At this early evolutionary stage we survey five main research facets of this field, namely Facet 1: next-generation architectures, spectrum and services; Facet 2: next-generation networking; Facet 3: Internet of Things (IoT); Facet 4: wireless positioning and sensing; and Facet 5: applications of deep learning in 6G networks. In this paper, we have provided a critical appraisal of the literature on promising techniques, ranging from the associated architectures and networking to applications and designs. We have portrayed a plethora of heterogeneous architectures relying on cooperative hybrid networks supported by diverse access and transmission mechanisms. The vulnerabilities of these techniques are also addressed and carefully considered in order to highlight the most promising future research directions. Additionally, we have listed a rich suite of learning-driven optimization techniques. We conclude by observing the evolutionary paradigm shift that has taken place from pure single-component bandwidth-efficiency, power-efficiency or delay optimization towards multi-component designs, as exemplified by the twin-component ultra-reliable low-latency mode of the 5G system. We advocate a further evolutionary step towards multi-component Pareto optimization, which requires the exploration of the entire Pareto front of optimal solutions, where none of the components of the objective function may be improved without degrading at least one of the other components.
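    To make the closing notion concrete: a design is Pareto-optimal when no other candidate improves one objective without degrading another, and the Pareto front is the set of all such non-dominated designs. A minimal sketch of extracting that front from candidate (delay, power) pairs, both to be minimised (values are illustrative), is:

        # Extract the Pareto front from candidate designs, each scored on several
        # objectives that are all to be minimised (e.g. delay, power). Illustrative only.
        def dominates(a, b):
            """a dominates b if a is no worse in every objective and better in at least one."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(points):
            return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

        candidates = [(1.0, 9.0), (2.0, 4.0), (3.0, 5.0), (4.0, 1.0)]   # (delay, power)
        print(pareto_front(candidates))   # [(1.0, 9.0), (2.0, 4.0), (4.0, 1.0)]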

    Enhancing quality of service in IoT through deep learning techniques

    When evaluating an Internet of Things (IoT) platform, it is crucial to consider quality of service (QoS) as a key criterion. With critical devices relying on IoT technology for both personal and business use, ensuring their security is paramount. However, the vast amount of data generated by IoT devices makes it challenging to manage QoS using conventional techniques, particularly when attempting to extract valuable characteristics from the data. To address this issue, we propose a dynamic-progressive deep reinforcement learning (DPDRL) technique to enhance QoS in IoT. Our approach involves collecting and preprocessing data samples before storing them in the IoT cloud and monitoring user access. We evaluate our framework using metrics such as packet loss, throughput, processing delay, and overall system data rate. Our results show that the developed framework achieved a maximum throughput of 94%, indicating its effectiveness in improving QoS. We believe that our deep learning optimization approach can be further utilized in the future to enhance QoS in IoT platforms.
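    The abstract names the evaluation metrics but not the DPDRL reward design; a minimal, purely illustrative sketch of how such QoS measurements could be folded into a single reward signal for a reinforcement-learning agent (field names, targets and weights are assumptions, not the paper's formulation) is:

        # Illustrative QoS reward shaping; weights, targets and field names are
        # assumptions, not the paper's DPDRL formulation.
        from dataclasses import dataclass

        @dataclass
        class QoSSample:
            packets_sent: int
            packets_lost: int
            throughput_mbps: float
            processing_delay_ms: float

        def qos_reward(s: QoSSample, target_mbps=50.0, max_delay_ms=100.0,
                       w_loss=0.4, w_tput=0.4, w_delay=0.2):
            loss_ratio = s.packets_lost / max(s.packets_sent, 1)
            tput_score = min(s.throughput_mbps / target_mbps, 1.0)
            delay_score = max(1.0 - s.processing_delay_ms / max_delay_ms, 0.0)
            # Higher reward for low loss, high throughput and low delay.
            return w_loss * (1.0 - loss_ratio) + w_tput * tput_score + w_delay * delay_score

        print(qos_reward(QoSSample(1000, 20, 42.0, 35.0)))   # ~0.86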

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential in terms of supporting a broad range of complex and compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. (Comment: 46 pages, 22 figures.)