
    Performance Study and Enhancement of Access Barring for Massive Machine-Type Communications

    Full text link
    [EN] Machine-type communications (MTC) is an emerging technology that boosts the development of the Internet of Things by providing ubiquitous connectivity and services. Cellular networks are an excellent choice for providing such hyper-connectivity thanks to their widely deployed infrastructure, among other features. However, dealing with a large number of connection requests is a primary challenge in cellular-based MTC. Severe congestion episodes can occur when a large number of devices try to access the network almost simultaneously. Extended access barring (EAB) is a congestion control mechanism for MTC proposed by the 3GPP. In this paper, we carry out a thorough performance analysis of EAB and show the limitations of its current specification. To overcome these limitations, we propose two enhanced EAB schemes: the combined use of EAB and access class barring (ACB), and the introduction of a congestion avoidance backoff after the barring status of a UE is switched to unbarred. Extensive simulations show that our proposed solutions improve the key performance indicators: a high successful access probability can be achieved even in heavily congested scenarios, the access delay is shortened, and, most importantly, the number of required preamble retransmissions is reduced, which results in significant energy savings. Furthermore, we present an accurate congestion estimation method that relies solely on the information available at the base station, and we show that this method permits a realistic and effective implementation of EAB.
    This work was supported in part by the Ministerio de Ciencia, Innovación y Universidades (MCIU), Agencia Estatal de Investigación (AEI) y Fondo Europeo de Desarrollo Regional (FEDER), UE, under Grant PGC2018-094151-B-I00, and in part by the ITACA Institute under Grant Ayudas ITACA 2019.
    Vidal Catalá, JR.; Tello-Oquendo, L.; Pla, V.; Guijarro, L. (2019). Performance Study and Enhancement of Access Barring for Massive Machine-Type Communications. IEEE Access, 7:63745-63759. https://doi.org/10.1109/ACCESS.2019.2917618
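    The combined EAB/ACB check and the post-unbarring backoff described above can be pictured with a short, hedged sketch. The Python snippet below is a minimal model assuming a standard 10-bit EAB barring bitmap and an ACB-style barring factor; names such as acb_factor and backoff_window_ms are illustrative and not taken from the paper.

        import random

        def eab_barred(device_access_class: int, eab_bitmap: list) -> bool:
            """EAB check: the eNB broadcasts a 10-bit bitmap, one bit per access
            class 0-9; a set bit means devices of that class are barred."""
            return eab_bitmap[device_access_class]

        def acb_allows(acb_factor: float) -> bool:
            """ACB check: draw a uniform random number and pass only if the draw
            falls below the broadcast barring factor."""
            return random.random() < acb_factor

        def attempt_access(device_access_class, eab_bitmap, acb_factor, backoff_window_ms):
            """Combined EAB + ACB scheme (illustrative): a device that is not
            EAB-barred must still pass the ACB draw, and a freshly unbarred device
            waits a random congestion-avoidance backoff before its first preamble."""
            if eab_barred(device_access_class, eab_bitmap):
                return None                      # stay barred, retry next EAB cycle
            if not acb_allows(acb_factor):
                return None                      # failed the ACB draw
            return random.uniform(0, backoff_window_ms)  # backoff after unbarring

        # Toy usage: access class 3, no class barred by EAB, ACB factor 0.7.
        delay = attempt_access(3, eab_bitmap=[False] * 10, acb_factor=0.7,
                               backoff_window_ms=200)
        print("first preamble after (ms):", delay)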

    Contention Resolution Queues for Massive Machine Type Communications in LTE

    Get PDF
    In this paper, we address the challenge of high device density performing simultaneous transmissions by proposing and evaluating a solution to efficiently handle the initial access contention for highly dense LTE networks. We present the implementation of a tree-splitting algorithm in the access procedure of LTE, which is capable of coping with a high number of simultaneous arrivals. Based on simulations, we show a feasible implementation capable of achieving, under certain network configuration conditions, up to an 85% reduction in average access delay and a 40% reduction in average energy consumption, while maintaining a consistently low blocking probability regardless of the number of initial simultaneous access attempts.
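    As a rough illustration of the tree-splitting idea, the following Python sketch resolves a batch of simultaneous arrivals with a q-ary splitting tree and counts the contention slots spent; the branching factor and the slot-count metric are assumptions for illustration, not the paper's exact LTE mapping.

        import random

        def tree_splitting(n_devices: int, branching: int = 2, seed: int = 0) -> int:
            """Resolve a batch of n_devices that collided in the same access
            opportunity using q-ary tree splitting: every collided group is split
            uniformly at random into `branching` subgroups, which are served in
            later contention slots.  Returns the number of contention slots used
            (a rough proxy for access delay)."""
            rng = random.Random(seed)
            stack = [n_devices]          # groups still to be resolved
            slots = 0
            while stack:
                group = stack.pop()
                slots += 1
                if group <= 1:
                    continue             # idle or successful slot
                counts = [0] * branching # collision: split and resolve later
                for _ in range(group):
                    counts[rng.randrange(branching)] += 1
                stack.extend(counts)
            return slots

        print(tree_splitting(1000))      # slots needed for 1000 simultaneous arrivals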

    D2D-Based Grouped Random Access to Mitigate Mobile Access Congestion in 5G Sensor Networks

    Full text link
    The Fifth Generation (5G) wireless service of sensor networks involves significant challenges when dealing with the coordination of an ever-increasing number of devices accessing shared resources. This has drawn major interest from the research community, and many existing works focus on radio access network congestion control to efficiently manage resources in the context of device-to-device (D2D) interaction in huge sensor networks. In this context, this paper pioneers a study on the impact of D2D link reliability in group-assisted random access protocols, shedding light on the performance benefits and potential limitations of approaches of this kind against tunable parameters such as group size, number of sensors, and reliability of D2D links. Additionally, we leverage the association with a Geolocation Database (GDB) capability to assist the grouping decisions, drawing parallels with recent regulatory-driven initiatives around GDBs and arguing the benefits of the suggested proposal. Finally, an exhaustive simulation campaign shows that the proposed method significantly reduces the delay over random access channels.
    Comment: First submission to IEEE Communications Magazine on Oct.28.2017. Accepted on Aug.18.2019. This is the camera-ready version.
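    A minimal sketch of the group-assisted access idea, assuming sensors are partitioned into fixed-size groups whose leaders contend on the RACH on behalf of their members, while members whose D2D link fails fall back to direct access; the grouping rule and all parameter values are illustrative assumptions, not the paper's protocol.

        import random

        def grouped_access_load(n_sensors: int, group_size: int,
                                d2d_reliability: float, seed: int = 0) -> int:
            """Illustrative model of group-assisted random access: sensors are
            partitioned into groups of `group_size`; members forward their data
            to the group leader over a D2D link that succeeds with probability
            `d2d_reliability`, and only leaders (plus members whose D2D link
            failed) contend on the cellular random access channel."""
            rng = random.Random(seed)
            n_groups = -(-n_sensors // group_size)     # ceiling division
            direct_attempts = 0                        # members falling back to RACH
            for sensor in range(n_sensors):
                is_leader = (sensor % group_size == 0)
                if not is_leader and rng.random() > d2d_reliability:
                    direct_attempts += 1
            return n_groups + direct_attempts          # total RACH attempts

        print(grouped_access_load(n_sensors=5000, group_size=20, d2d_reliability=0.95))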

    Cellular networks for smart grid communication

    Get PDF
    The next-generation electric power system, known as smart grid, relies on a robust and reliable underlying communication infrastructure to improve the efficiency of electricity distribution. Cellular networks, e.g., LTE/LTE-A systems, appear as a promising technology to facilitate the smart grid evolution. Their inherent performance characteristics and well-established ecosystem could potentially unlock unprecedented use cases, enabling real-time and autonomous distribution grid operations. However, cellular technology was not originally intended for smart grid communication, which is associated with highly reliable message exchange and massive device connectivity requirements. The fundamental differences between smart grid and human-type communication challenge the classical design of cellular networks and introduce important research questions that have not been sufficiently addressed so far. Motivated by these challenges, this doctoral thesis investigates novel radio access network (RAN) design principles and performance analysis for the seamless integration of smart grid traffic in future cellular networks. Specifically, we focus on addressing the fundamental RAN problems of network scalability in massive smart grid deployments and radio resource management for smart grid and human-type traffic. The main objective of the thesis lies in the design, analysis and performance evaluation of RAN mechanisms that would render cellular networks the key enabler for emerging smart grid applications.
    The first part of the thesis addresses the radio access limitations in LTE-based networks for reliable and scalable smart grid communication. We first identify the congestion problem in LTE random access that arises in large-scale smart grid deployments. To overcome this, a novel random access mechanism is proposed that can efficiently support real-time distribution automation services with negligible impact on the background traffic. Motivated by the stringent reliability requirements of various smart grid operations, we then develop an analytical model of the LTE random access procedure that allows us to assess the performance of event-based monitoring traffic under various load conditions and network configurations. We further extend our analysis to include the relation between the cell size and the availability of orthogonal random access resources, and we identify an additional challenge for reliable smart grid connectivity. To this end, we devise an interference- and load-aware cell planning mechanism that enhances reliability in substation automation services. Finally, we couple the problem of state estimation in wide-area monitoring systems with the reliability challenges in information acquisition. Using our developed analytical framework, we quantify the impact of imperfect communication reliability on the state estimation accuracy and provide useful insights for the design of reliability-aware state estimators.
    The second part of the thesis builds on the previous one and focuses on the RAN problem of resource scheduling and sharing for smart grid and human-type traffic. We introduce a novel scheduler that achieves low latency for distribution automation traffic while resource allocation is performed in a way that keeps the degradation of cellular users at a minimum level. In addition, we investigate the benefits of Device-to-Device (D2D) transmission mode for event-based message exchange in substation automation scenarios.
    We design a joint mode selection and resource allocation mechanism that results in higher data rates with respect to the conventional transmission mode via the base station. An orthogonal resource partition scheme between cellular and D2D links is further proposed to prevent the underutilization of the scarce cellular spectrum. The research findings of this thesis aim to deliver novel solutions to important RAN performance issues that arise when cellular networks support smart grid communication.
    Postprint (published version)
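    As a hedged illustration of the joint mode selection idea, the sketch below picks the D2D mode whenever the Shannon rate of the direct link exceeds the end-to-end rate of the conventional two-hop path through the base station; the channel model, bandwidth and SNR values are assumptions, not the thesis' actual mechanism.

        import math

        def shannon_rate(bandwidth_hz: float, snr_linear: float) -> float:
            """Achievable rate of a link (bit/s) under the Shannon capacity formula."""
            return bandwidth_hz * math.log2(1 + snr_linear)

        def select_mode(snr_d2d: float, snr_uplink: float, snr_downlink: float,
                        bandwidth_hz: float = 180e3) -> str:
            """Illustrative mode selection: use the direct D2D link if its rate
            beats the end-to-end rate of the two-hop path through the base
            station (limited by the weaker of uplink and downlink)."""
            rate_d2d = shannon_rate(bandwidth_hz, snr_d2d)
            rate_cellular = min(shannon_rate(bandwidth_hz, snr_uplink),
                                shannon_rate(bandwidth_hz, snr_downlink))
            return "D2D" if rate_d2d > rate_cellular else "cellular"

        print(select_mode(snr_d2d=50.0, snr_uplink=10.0, snr_downlink=30.0))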

    Goodbye, ALOHA!

    Get PDF
    ©2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
    The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to their high impact on the overall performance of wireless communications. The majority of research activities in this field deal with different variations of protocols somehow based on ALOHA, either with or without listen before talk, i.e., carrier sense multiple access. These protocols operate well under low traffic loads and with a low number of simultaneous devices. However, they suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of the existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by those, we identify a family of simple algorithms based on distributed queueing (DQ), which can operate for an infinite number of devices generating any traffic load and pattern. A description of the DQ mechanism is provided, and the most relevant existing studies of DQ applied in different scenarios are described in this paper. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, a description of the very first demo of DQ for its use in the IoT is also included in this paper.
    Peer Reviewed. Postprint (author's final draft).
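    A minimal sketch of the collision-resolution core of distributed queueing: colliding devices are organized as groups in a collision resolution queue (CRQ), and in each frame the head-of-queue group re-contends over a few access minislots. The Python model below is an assumption-laden toy (three minislots, a single initial batch), not the full DQ protocol with its data transmission queue.

        import random
        from collections import deque

        def dq_resolution(n_devices: int, n_minislots: int = 3, seed: int = 0) -> int:
            """All devices start as one group; in every frame the group at the head
            of the collision resolution queue (CRQ) picks one of `n_minislots`
            access minislots at random, subgroups that collide re-enter the CRQ,
            and subgroups of size one succeed.  Returns the number of frames
            needed until every device has been resolved."""
            rng = random.Random(seed)
            crq = deque([n_devices])
            frames = 0
            while crq:
                frames += 1
                group = crq.popleft()
                counts = [0] * n_minislots
                for _ in range(group):
                    counts[rng.randrange(n_minislots)] += 1
                for c in counts:
                    if c > 1:
                        crq.append(c)     # collided subgroup waits in the CRQ
            return frames

        print(dq_resolution(10000))       # frames to resolve 10000 simultaneous devices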

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    Get PDF
    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
    Comment: 37 pages, 8 figures, 7 tables, submitted for possible future publication in IEEE Communications Surveys and Tutorials.
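    To make the low-complexity Q-learning idea concrete, the following sketch assumes each device learns a Q-value per RA slot and reinforces slots where it transmitted without collision; the reward scheme, epsilon-greedy policy and parameter values are illustrative assumptions rather than a specific scheme from the surveyed literature.

        import random

        class QLearningRach:
            """Stateless Q-learning for random-access slot selection: each device
            keeps one Q-value per RA slot, picks a slot epsilon-greedily, and
            reinforces it with +1 on success (no collision) or -1 on collision."""

            def __init__(self, n_slots: int, alpha: float = 0.1, epsilon: float = 0.1):
                self.q = [0.0] * n_slots
                self.alpha = alpha
                self.epsilon = epsilon

            def pick_slot(self) -> int:
                if random.random() < self.epsilon:
                    return random.randrange(len(self.q))
                return max(range(len(self.q)), key=lambda s: self.q[s])

            def update(self, slot: int, success: bool) -> None:
                reward = 1.0 if success else -1.0
                self.q[slot] += self.alpha * (reward - self.q[slot])

        # Toy usage: two devices learn to occupy different slots over repeated frames.
        devices = [QLearningRach(n_slots=4) for _ in range(2)]
        for _ in range(200):
            choices = [d.pick_slot() for d in devices]
            for d, slot in zip(devices, choices):
                d.update(slot, success=(choices.count(slot) == 1))
        print([d.pick_slot() for d in devices])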

    Extended Access Barring for Handling Massive Machine-Type Communication (mMTC) Deployments

    Full text link
    [EN] Massive machine-type communication (mMTC) presents a promising opportunity to generate powerful and ubiquitous connections while facing plenty of new challenges. Cellular networks are the potential solution owing to their extensive infrastructure deployment, reliability, security, and efficiency. In cellular-based mMTC networks, the random access channel is used to establish the connection between MTC devices and base stations (eNBs), where scalable and efficient connectivity for a tremendous number of devices is the primary challenge. To deal with this, the Third Generation Partnership Project (3GPP) has suggested extended access barring (EAB) as a mechanism for congestion control. The eNBs activate or deactivate EAB using a congestion coefficient. In this paper, an approach to implementing the congestion coefficient is presented so that EAB can operate and thus handle congestion episodes in mMTC scenarios. Moreover, the performance of EAB is examined under different MTC traffic loads and paging cycle configurations in terms of network key performance indicators (KPIs). Numerical results demonstrate the effectiveness of the proposed method in detecting congestion episodes. It is also shown that increasing the paging cycle configuration influences the network behavior under EAB.
    This work was supported in part by the Ministry of Economy and Competitiveness of Spain under Grants TIN2013-47272-C2-1-R and TEC2015-71932-REDT.
    Tello-Oquendo, L.; Vidal Catalá, JR.; Pla, V.; Martínez Bauset, J. (2018). Extended Access Barring for Handling Massive Machine Type Communication (mMTC) Deployments. NOVASINERGIA, 1(2):38-44. http://hdl.handle.net/10251/121752
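    The congestion coefficient idea can be sketched with information the eNB can actually observe, e.g. how many preambles in a random access opportunity were idle, successful or collided; the specific formula and the hysteresis thresholds below are assumptions for illustration, not the paper's exact estimator.

        def congestion_coefficient(n_idle: int, n_success: int, n_collided: int) -> float:
            """Illustrative congestion coefficient computed only from what the eNB
            can observe in one RA opportunity: the fraction of preambles that
            ended in collision."""
            total = n_idle + n_success + n_collided
            return n_collided / total if total else 0.0

        def update_eab(coefficient: float, eab_active: bool,
                       on_threshold: float = 0.5, off_threshold: float = 0.2) -> bool:
            """Hysteresis-based EAB activation: turn EAB on when congestion is
            high and off only after it has clearly subsided."""
            if not eab_active and coefficient > on_threshold:
                return True
            if eab_active and coefficient < off_threshold:
                return False
            return eab_active

        # Example: 54 preambles, 10 idle, 14 successful, 30 collided -> EAB switched on.
        coeff = congestion_coefficient(10, 14, 30)
        print(coeff, update_eab(coeff, eab_active=False))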

    Mobile network connectivity analysis for device to device communication in 5G network

    Get PDF
    Since Long Term Evolution Release 14 (LTE R14), device-to-device (D2D) communications have become a promising technology for in-band or out-band mobile communication networks. In addition, D2D communications constitute an essential component of the fifth-generation mobile network (5G), used, for example, to improve communication capability, reduce power dissipation, reduce latency within the network, and enable new applications and services. However, reducing congestion in D2D communications and improving mobile network connectivity are essential prerequisites for such new applications and services. This paper presents new solutions to reduce the congestion of devices around a base station and improve the performance of the D2D network in terms of the number of connected devices or user equipment (UE). The simulation results show that our proposed solution can improve the network capacity by doubling the number of connected devices (or UEs) and reducing congestion. Consequently, our proposal reduces the financial cost by reducing the amount of equipment to deploy: for example, instead of using two base stations, only one station is needed to connect the same number of devices.
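    The claimed capacity gain can be pictured with a back-of-the-envelope Python sketch in which every directly connected UE may relay one additional device over a D2D link, so one base station serves roughly twice as many devices; the relay limit and the capacity figure are purely illustrative assumptions, not the paper's model.

        def served_devices(bs_capacity: int, n_devices: int, relays_per_ue: int = 1) -> int:
            """A base station can hold `bs_capacity` direct connections, and every
            directly connected UE may additionally relay up to `relays_per_ue`
            nearby devices over D2D links."""
            direct = min(bs_capacity, n_devices)
            relayed = min(direct * relays_per_ue, n_devices - direct)
            return direct + relayed

        # One base station with 200 direct connections serves up to 400 devices
        # when each connected UE relays one neighbour.
        print(served_devices(bs_capacity=200, n_devices=1000))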

    Traffic classification and prediction, and fast uplink grant allocation for machine type communications via support vector machines and long short-term memory

    Get PDF
    Abstract. The current random access (RA) allocation techniques suffer from congestion and high signaling overhead while serving machine type communication (MTC) applications. Therefore, 3GPP has introduced the need to use fast uplink grant (FUG) allocation. This thesis proposes a novel FUG allocation based on support vector machine (SVM) and long short-term memory (LSTM). First, MTC devices are prioritized using an SVM classifier. Second, an LSTM architecture is used to predict the activation time of each device. Both results are used to achieve an efficient resource scheduler in terms of average latency and total throughput. Furthermore, a set of correction techniques is introduced to overcome classification and prediction errors. The Coupled Markov Modulated Poisson Process (CMMPP) traffic model is applied to compare the proposed FUG allocation to other existing allocation techniques. In addition, an extended traffic model based on CMMPP is used to evaluate the proposed algorithm in a denser network. Our simulation results show that the proposed model outperforms the existing RA allocation schemes by achieving the highest throughput and the lowest access delay when serving the target massive and critical MTC applications.
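    A hedged sketch of the two-stage pipeline described above, using scikit-learn's SVC for device prioritization and a small Keras LSTM for activation-time prediction; the features, labels, placeholder data and the final grant-ordering rule are illustrative assumptions, not the thesis' exact design.

        import numpy as np
        from sklearn.svm import SVC
        from tensorflow.keras.models import Sequential
        from tensorflow.keras.layers import LSTM, Dense

        rng = np.random.default_rng(0)

        # Stage 1: SVM classifier prioritising MTC devices (placeholder features,
        # e.g. traffic class indicators and recent activity statistics).
        X_dev = rng.normal(size=(500, 4))           # 500 devices, 4 features each
        y_prio = rng.integers(0, 2, size=500)       # 1 = high priority (critical MTC)
        prioritizer = SVC(kernel="rbf").fit(X_dev, y_prio)

        # Stage 2: LSTM predicting the next activation time of each device from
        # its recent inter-arrival history (placeholder sequences).
        X_seq = rng.random(size=(500, 10, 1))       # last 10 inter-arrival times
        y_next = rng.random(size=(500, 1))          # time until next activation
        predictor = Sequential([LSTM(32, input_shape=(10, 1)), Dense(1)])
        predictor.compile(optimizer="adam", loss="mse")
        predictor.fit(X_seq, y_next, epochs=2, verbose=0)

        # Scheduler sketch: grant uplink resources first to high-priority devices,
        # ordered by their predicted activation time.
        priority = prioritizer.predict(X_dev)
        activation = predictor.predict(X_seq, verbose=0).ravel()
        grant_order = np.lexsort((activation, -priority))   # priority first, then soonest
        print(grant_order[:10])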