
    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, and it is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the technical issues involved, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. After highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions for addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for applying emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Among several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables; submitted for possible future publication in IEEE Communications Surveys and Tutorials.
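
    The survey's emphasis on a low-complexity Q-learning approach suggests how an MTC device could learn its contention behaviour online. The sketch below is a minimal, illustrative Q-learning agent that picks a random-access slot; the slot count, the epsilon-greedy policy and the collision-based reward are assumptions made for this example, not the exact formulation used in the paper.

        import random

        # Minimal, illustrative Q-learning agent for random-access (RA) slot
        # selection by a single MTC device. The state/action space and reward
        # are assumptions for this sketch, not the surveyed paper's formulation.

        NUM_SLOTS = 10                 # candidate RA slots per frame (assumed)
        ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

        Q = [0.0] * NUM_SLOTS          # single-state Q-table: one value per slot

        def choose_slot():
            """Epsilon-greedy slot selection."""
            if random.random() < EPS:
                return random.randrange(NUM_SLOTS)
            return max(range(NUM_SLOTS), key=lambda s: Q[s])

        def update(slot, collided):
            """Reward +1 for a collision-free attempt, -1 for a collision."""
            reward = -1.0 if collided else 1.0
            Q[slot] += ALPHA * (reward + GAMMA * max(Q) - Q[slot])

        # Toy environment: collisions occur at random with probability 0.3.
        for frame in range(1000):
            slot = choose_slot()
            update(slot, collided=(random.random() < 0.3))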

    Optimization of Inter-Cell Interference Management and Mobile Attachment in LTE Cellular Networks

    Driven by exponential growth in mobile broadband-enabled devices and a continued increase in individual data consumption, mobile data traffic has grown 4,000-fold over the past 10 years and almost 400-million-fold over the past 15 years. Homogeneous cellular networks have been facing limitations in handling this soaring mobile data traffic and in meeting the growing end-user demand for more bandwidth and better quality of experience. These limitations are mainly related to the available spectrum and to network capacity. The telecommunications industry has to address these challenges and meet exploding demand; at the same time, it has to guarantee a healthy economic model while reducing the carbon footprint caused by mobile communications. Heterogeneous Networks (HetNets), composed of macro base stations and low-power base stations of different types, are seen as the key solution to improve spectral efficiency per unit area and to eliminate coverage holes. In such networks, intelligent user association and interference management schemes are needed to achieve performance gains. Due to the large imbalance in transmission power between macro and small cells, user association based on the strongest received signal is not suitable in HetNets, as only a few users would attach to low-power nodes. A technique based on a Cell Individual Offset (CIO) is therefore required to perform load balancing and to favor Small Cell (SC) attraction over the Macro Cell (MC). This offset is added to the users' Reference Signal Received Power (RSRP) measurements and hence induces handovers towards different eNodeBs. As Long Term Evolution (LTE) cellular networks reuse the same frequency sub-bands, mobile users may experience strong inter-cell interference, especially at the cell edge. There is therefore a need to coordinate resource allocation among cells and to minimize inter-cell interference. To mitigate strong inter-cell interference, resources in the time, frequency and power domains should be allocated efficiently; a pattern is computed for each dimension so that cell-edge users in particular benefit from higher throughput and better quality of experience. Optimizing all of these parameters can also yield gains in energy use. In this thesis, we propose a concrete, versatile, dynamic solution that jointly optimizes user association and resource allocation in LTE cellular networks, maximizing a network utility function that can be chosen as appropriate. Our solution, based on game theory, computes the Cell Individual Offset and a transmission power pattern over the frequency and time domains for each cell. We present numerical simulations to illustrate the important performance gain brought by this optimization, obtaining significant benefits in average throughput and in cell-edge user throughput, with gains of 40% and 55% respectively, as well as a meaningful improvement in energy efficiency. This work addresses industrial research challenges and, as such, a prototype acting on emulated HetNets traffic has been implemented.
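
    To make the CIO mechanism concrete, the following minimal sketch applies the biased association rule described above: a user attaches to the cell with the largest RSRP plus per-cell offset. The cell names, RSRP values and offsets are illustrative assumptions, not figures from the thesis.

        # Illustrative CIO-biased user association: the UE attaches to the cell
        # that maximises measured RSRP (dBm) plus the Cell Individual Offset (dB).
        # The values below are made-up examples, not measurements from the thesis.

        def serving_cell(rsrp_dbm, cio_db):
            """Return the cell id with the largest biased RSRP."""
            return max(rsrp_dbm, key=lambda c: rsrp_dbm[c] + cio_db.get(c, 0.0))

        rsrp_dbm = {"macro": -85.0, "small_cell": -95.0}   # measured RSRP (assumed)
        cio_db = {"macro": 0.0, "small_cell": 12.0}        # offset favouring the SC

        print(serving_cell(rsrp_dbm, cio_db))              # -> small_cell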

    Spectrum Sharing, Latency, and Security in 5G Networks with Application to IoT and Smart Grid

    The surge of mobile devices, such as smartphones and tablets, demands additional capacity. At the same time, the Internet of Things (IoT) and the smart grid, which connect numerous sensors, devices, and machines, require ubiquitous connectivity and data security. Additionally, some use cases, such as automated manufacturing processes, automated transportation, and the smart grid, require latency as low as 1 ms and reliability as high as 99.99%. To enhance throughput and support massive connectivity, sharing of the unlicensed spectrum (3.5 GHz, 5 GHz, and mmWave) is a potential solution, while addressing latency requires drastic changes in the network architecture. Fifth-generation (5G) cellular networks will embrace spectrum sharing and network architecture modifications to address throughput enhancement, massive connectivity, and low latency. To utilize the unlicensed spectrum, we propose a fixed-duty-cycle-based coexistence of LTE and WiFi, in which the duty cycle of LTE transmission can be adjusted based on the amount of data. In the second approach, a multi-armed-bandit-learning-based coexistence of LTE and WiFi is developed, in which the transmission duty cycle and downlink power are adapted through exploration and exploitation. This approach improves the aggregated capacity by 33%, along with enhancements in cell-edge performance and energy efficiency. We also investigate the performance of LTE and ZigBee coexistence, using the smart grid as a scenario. For low latency, we summarize the existing work in the context of 5G networks into three domains: core, radio, and caching networks. Along with this, fundamental constraints for achieving low latency are identified, followed by a general overview of exemplary 5G networks. Besides that, a loop-free, low-latency, local-decision-based routing protocol is derived in the context of the smart grid; this approach ensures low-latency and reliable data communication for stationary devices. To address data security in wireless communication, we introduce geo-location-based data encryption, along with node authentication using the k-nearest-neighbor algorithm. In a second approach, node authentication using a support vector machine, along with public-private key management, is proposed. Both approaches ensure data security without increasing packet overhead compared to existing approaches.
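
    As an illustration of the learning-based coexistence described above, the sketch below runs an epsilon-greedy multi-armed bandit over a discrete set of LTE duty cycles, using observed throughput as the reward. The candidate duty cycles, the epsilon value and the synthetic throughput feedback are assumptions for this example; the actual learner described above also adapts downlink power.

        import random

        # Illustrative epsilon-greedy multi-armed bandit choosing an LTE duty
        # cycle for coexistence with WiFi in unlicensed spectrum. The arms,
        # epsilon and the synthetic throughput feedback are assumptions.

        DUTY_CYCLES = [0.2, 0.4, 0.6, 0.8]   # fraction of time LTE holds the channel
        EPS = 0.1

        counts = [0] * len(DUTY_CYCLES)
        means = [0.0] * len(DUTY_CYCLES)     # running mean throughput per arm

        def select_arm():
            """Explore with probability EPS, otherwise exploit the best arm."""
            if random.random() < EPS:
                return random.randrange(len(DUTY_CYCLES))
            return max(range(len(DUTY_CYCLES)), key=lambda a: means[a])

        def observe(arm, throughput):
            """Incrementally update the running mean reward of the chosen arm."""
            counts[arm] += 1
            means[arm] += (throughput - means[arm]) / counts[arm]

        # Toy usage with a synthetic aggregated-throughput measurement (Mbps).
        for step in range(500):
            arm = select_arm()
            measured = random.gauss(20.0 * DUTY_CYCLES[arm] * (1.2 - DUTY_CYCLES[arm]), 1.0)
            observe(arm, measured)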