11 research outputs found

    Predictive resource allocation in the LTE uplink for event based M2M applications

    Get PDF
    For certain event-based M2M applications, it is possible to predict when devices will or may need to send data on the LTE uplink. For example, in a wireless sensor network, the fact that one sensor has triggered may increase the probability that other sensors in the vicinity will also trigger in quick succession. The existing reactive LTE uplink access protocol, in which a device with pending data sends a scheduling request to the eNodeB at its next scheduled opportunity and the eNodeB responds with an uplink grant, can lead to high latencies. This is particularly the case when the system uses a long scheduling request period (of up to 80 ms) to support a large number of devices in a cell, which is characteristic of M2M deployments. In this paper, we introduce, analyze and simulate a new predictive/proactive resource allocation scheme for the LTE uplink for use with event-based M2M applications. In this scheme, when one device in a group sends a scheduling request, the eNodeB identifies neighbor devices in the same group which may benefit from a predictive resource allocation, instead of waiting for those neighbors to send a scheduling request at their next scheduled opportunity. We demonstrate how the minimum uplink latency can be reduced from 6 ms to 5 ms and how the mean uplink latency can be reduced by more than 50% (in certain scenarios) using this method.
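    A rough feel for the reported gain can be obtained from a minimal Monte Carlo sketch of the scheme described above: when the first scheduling request (SR) of a group event is served, the eNodeB grants the remaining group members proactively instead of waiting for their own SR opportunities. The timing constants, group size and uniform SR offsets below are illustrative assumptions, not the paper's model.

import random

SR_PERIOD_MS = 80      # scheduling request period, upper end mentioned in the abstract
SR_TO_GRANT_MS = 3     # SR -> usable uplink grant delay (illustrative)
GRANT_TO_DATA_MS = 3   # grant decoding -> PUSCH transmission delay (illustrative)
GROUP_SIZE = 5         # devices triggered by the same event (illustrative)

random.seed(1)
reactive, predictive = [], []
for _ in range(10_000):
    # Every device in the group sees the event at t = 0 but has its own SR offset.
    waits = [random.uniform(0, SR_PERIOD_MS) for _ in range(GROUP_SIZE)]
    first = min(waits)                       # first group member to reach an SR opportunity
    for w in waits:
        # Reactive: wait for own SR opportunity, then SR -> grant -> data.
        reactive.append(w + SR_TO_GRANT_MS + GRANT_TO_DATA_MS)
        # Predictive: the eNodeB grants the whole group once the first SR is served.
        predictive.append(first + SR_TO_GRANT_MS + GRANT_TO_DATA_MS)

print(f"mean reactive latency:   {sum(reactive) / len(reactive):.1f} ms")
print(f"mean predictive latency: {sum(predictive) / len(predictive):.1f} ms")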

    Cost-efficient Low Latency Communication Infrastructure for Synchrophasor Applications in Smart Grids

    Get PDF
    With the introduction of distributed renewable energy resources and new loads, such as electric vehicles, the power grid is evolving into a highly dynamic system that necessitates continuous and fine-grained observability of its operating conditions. In the context of the medium voltage (MV) grid, this has motivated the deployment of Phasor Measurement Units (PMUs), which offer high-precision synchronized grid monitoring, enabling mission-critical applications such as fault detection/location. However, PMU-based applications have stringent delay requirements, raising a significant challenge for the communication infrastructure. In contrast to the high voltage domain, there is no clear vision for the communication and network topologies of the MV grid; a full-fledged optical fiber-based communication infrastructure is a costly approach due to the density of PMUs required. In this work, we focus on the support of low-latency PMU-based applications in the MV domain, identifying and addressing the trade-off between communication infrastructure deployment costs and the corresponding performance. We study a large set of real MV grid topologies to gain an in-depth understanding of the key latency factors. Building on the gained insights, we propose three algorithms for the careful placement of high capacity links, targeting a balance between deployment costs and achieved latencies. Extensive simulations demonstrate that the proposed algorithms result in low-latency network topologies while reducing deployment costs by up to 80% in comparison to a ubiquitous deployment of costly high capacity links.
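    The placement idea can be illustrated with a small greedy sketch: model an MV feeder as a graph whose edges all start as slow links, and repeatedly upgrade the edge that most reduces the mean PMU-to-PDC latency until a link budget is exhausted. The topology, per-hop latencies and greedy criterion are illustrative assumptions, not the three algorithms proposed in the paper.

import networkx as nx

SLOW_MS, FAST_MS = 8.0, 0.5   # per-hop latency of low/high capacity links (assumed)
BUDGET = 3                    # number of links that can be upgraded (assumed)

# Small radial feeder: node 0 hosts the phasor data concentrator (PDC), others host PMUs.
g = nx.Graph()
g.add_edges_from([(0, 1), (1, 2), (2, 3), (3, 4), (1, 5), (5, 6), (6, 7), (2, 8)],
                 delay=SLOW_MS)

def mean_latency(graph):
    d = nx.single_source_dijkstra_path_length(graph, 0, weight="delay")
    pmus = [n for n in graph if n != 0]
    return sum(d[n] for n in pmus) / len(pmus)

for _ in range(BUDGET):
    best_edge, best_val = None, mean_latency(g)
    for u, v in g.edges():
        if g[u][v]["delay"] == FAST_MS:
            continue
        g[u][v]["delay"] = FAST_MS          # tentatively upgrade this link
        val = mean_latency(g)
        if val < best_val:
            best_edge, best_val = (u, v), val
        g[u][v]["delay"] = SLOW_MS          # roll back before trying the next edge
    if best_edge is None:
        break
    g[best_edge[0]][best_edge[1]]["delay"] = FAST_MS
    print(f"upgrade link {best_edge}: mean PMU-to-PDC latency -> {best_val:.1f} ms")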

    The State of the Art in Smart Grid Domain: A Network Modeling Approach

    Get PDF
    Agent-based computing and multi-agent systems are important tools in the smart grid domain. Various properties of agents, such as self-organization, co-operation, and autonomous behavior, allow researchers to represent smart grid applications and models well. Over the past few decades, various research attempts have been made in the smart grid domain by adopting agent-based computing technology. The number of research publications is growing, which makes it difficult to locate and identify the dynamics and trends in the research. Scientometric analysis is a useful tool to perform a comprehensive bibliographic review. It allows not only an understanding of the key areas of research but also a visual representation of each entity involved in the research. In this study, we provide a detailed statistical as well as visual analysis of agent-based smart grid research by adopting a complex network-based analytical approach. The study covers all scientific literature available online in the Web of Science database. We are interested in identifying key papers, authors, and journals. Furthermore, we also investigate core countries, institutions, and categories.
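    As a hedged illustration of the network-modeling step, the sketch below builds a small co-authorship graph from (paper, authors) records and ranks authors by weighted PageRank and degree as simple proxies for "key authors". The records are fabricated placeholders; the study itself works from Web of Science exports.

from itertools import combinations
import networkx as nx

# Placeholder records standing in for a Web of Science export.
records = [
    {"title": "P1", "authors": ["A", "B", "C"]},
    {"title": "P2", "authors": ["B", "C"]},
    {"title": "P3", "authors": ["C", "D"]},
    {"title": "P4", "authors": ["A", "D", "E"]},
]

g = nx.Graph()
for rec in records:
    for a, b in combinations(rec["authors"], 2):
        # Accumulate co-authorship weight on each author pair.
        weight = g[a][b]["weight"] + 1 if g.has_edge(a, b) else 1
        g.add_edge(a, b, weight=weight)

ranking = nx.pagerank(g, weight="weight")
for author, score in sorted(ranking.items(), key=lambda kv: -kv[1]):
    print(f"{author}: pagerank={score:.3f}, co-authors={g.degree(author)}")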

    A predictive resource allocation algorithm in the LTE uplink for event based M2M applications

    Get PDF
    Some M2M applications, such as event monitoring, involve a group of devices in a vicinity that act in a co-ordinated manner. An LTE network can exploit the correlated traffic characteristics of such devices by proactively assigning resources to devices based upon the activity of neighboring devices in the same group. This can reduce latency compared to waiting for each device in the group to request resources reactively per the standard LTE protocol. In this paper, we specify a new low-complexity predictive resource allocation algorithm, known as the one-way algorithm, for use with delay-sensitive event-based M2M applications in the LTE uplink. This algorithm requires minimal incremental processing power and memory resources at the eNodeB, yet can reduce the mean uplink latency below the minimum possible value for a non-predictive resource allocation algorithm. We develop mathematical models for the probability of a prediction, the probability of a successful prediction, the probability of an unsuccessful prediction, resource usage/wastage probabilities and mean uplink latency. The validity of these models is demonstrated by comparison with the results from a simulation. The models can be used offline by network operators or online in real time by the eNodeB scheduler to optimize performance.
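    The kind of quantities the models capture can be approximated with a simple Monte Carlo sketch for one neighbour device B whose group member A has just triggered: a proactive grant is issued for B when A's scheduling request is served, and it is wasted if B has no data ready by then. The correlation probability, trigger spread and timing constants are illustrative assumptions, not the one-way algorithm's analytical parameters.

import random

SR_PERIOD_MS = 80       # scheduling request period (assumed)
SR_TO_GRANT_MS = 3      # SR (or prediction) -> usable grant (assumed)
GRANT_TO_DATA_MS = 3    # grant -> PUSCH transmission (assumed)
P_B_TRIGGERS = 0.7      # prob. B also triggers after A (assumed correlation)
TRIGGER_SPREAD_MS = 15  # B's trigger delay after A, uniform (assumed)
N = 100_000

random.seed(3)
success = wasted = 0
latencies = []
for _ in range(N):
    a_sr_wait = random.uniform(0, SR_PERIOD_MS)          # A waits for its SR opportunity
    grant_t = a_sr_wait + SR_TO_GRANT_MS                  # proactive grant for B is usable
    if random.random() < P_B_TRIGGERS:
        b_trigger = random.uniform(0, TRIGGER_SPREAD_MS)  # B's data arrives at this time
        if b_trigger <= grant_t:
            success += 1                                  # data ready: prediction pays off
            latencies.append(grant_t + GRANT_TO_DATA_MS - b_trigger)
        else:
            wasted += 1                                   # grant unused: B falls back to its own SR
            b_sr_wait = random.uniform(0, SR_PERIOD_MS)
            latencies.append(b_sr_wait + SR_TO_GRANT_MS + GRANT_TO_DATA_MS)
    else:
        wasted += 1                                       # B never triggers: resources wasted

print(f"P(successful prediction)              = {success / N:.2f}")
print(f"P(wasted grant)                       = {wasted / N:.2f}")
print(f"mean uplink latency of B (if it triggers) = {sum(latencies) / len(latencies):.1f} ms")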

    Performance comparison of LTE FDD and TDD based Smart Grid communications networks for uplink biased traffic

    Full text link
    LTE is a candidate wide area communications network for the Smart Grid and can enable applications such as AMI, Demand Response and WAMS. We compare the uplink performance of the LTE FDD and TDD modes for a typical Smart Grid scenario involving a large number of devices sending small to medium sized packets, to understand the advantages and disadvantages of these two modes. An OPNET simulation model is employed to facilitate realistic comparisons based upon latency and channel utilization. We demonstrate that there is a critical packet size above which there is a step increase in uplink latency due to the nature of the LTE uplink resource scheduling process. It is shown that FDD leads to better uplink performance in terms of latency, while TDD can provide greater flexibility when the split between uplink and downlink data is asymmetrical (as it is expected to be in a Smart Grid environment). It is also demonstrated that the capacity of both FDD and TDD systems, in terms of the number of serviced devices, is control channel (PDCCH) limited for small infrequent packets, but TDD has the advantage that the capacity remains data channel (PUSCH) limited for smaller packet sizes and lower data burst rates than in an FDD system. © 2012 IEEE
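    The step increase can be illustrated with a toy latency model: once a packet no longer fits in the first uplink grant, the remainder is reported via a buffer status report and carried in later grants, each adding roughly one scheduling round trip. The grant sizes and per-step delays below are illustrative assumptions, not the OPNET model's parameters.

FIRST_GRANT_BYTES = 120     # payload carried by the first PUSCH grant (assumed)
NEXT_GRANT_BYTES = 120      # payload per follow-up grant (assumed)
BASE_LATENCY_MS = 6         # SR -> grant -> first transmission (assumed)
EXTRA_GRANT_RTT_MS = 8      # BSR -> additional grant -> transmission (assumed)

def uplink_latency_ms(packet_bytes: int) -> float:
    # Packets that fit in the first grant finish in one scheduling cycle.
    if packet_bytes <= FIRST_GRANT_BYTES:
        return BASE_LATENCY_MS
    remaining = packet_bytes - FIRST_GRANT_BYTES
    extra_grants = -(-remaining // NEXT_GRANT_BYTES)   # ceiling division
    return BASE_LATENCY_MS + extra_grants * EXTRA_GRANT_RTT_MS

for size in (50, 120, 121, 300, 600):
    print(f"{size:4d} bytes -> {uplink_latency_ms(size):5.1f} ms")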

    Control Channel Interference Measurement in LTE-TDD Heterogeneous Network

    Get PDF
    Deploying low-power eNodeBs inside macro-cells is an effective way to enhance indoor coverage. By reusing frequency between macro-cells and indoor femto-cells, the efficiency of expensive licensed spectrum can be further increased. This thesis measures Physical Downlink Control Channel (PDCCH) performance in such a heterogeneous LTE-TDD network. Four USRP software radio terminals and connected Linux workstations were deployed to build a test environment, acting as eNodeBs and UEs respectively. During the test, the femto-cell was configured to coordinate its radio frame with the macro-cell. Several criteria, including received block error rate, payload bit error rate and symbol signal-to-interference-plus-noise ratio, were used to evaluate PDCCH performance in the macro-cell under the heterogeneous environment.
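    A hedged post-processing sketch for the three criteria named above (received block error rate, payload bit error rate, symbol SINR) is shown below; the data layout, with per-candidate bit arrays and per-symbol power estimates, is an assumed interface rather than the actual measurement tooling used in the thesis.

import numpy as np

def pdcch_metrics(tx_blocks, rx_blocks, signal_power, noise_plus_interf_power):
    """tx/rx_blocks: lists of equal-length 0/1 numpy arrays, one per PDCCH block."""
    block_errors = sum(not np.array_equal(t, r) for t, r in zip(tx_blocks, rx_blocks))
    bit_errors = sum(int(np.sum(t != r)) for t, r in zip(tx_blocks, rx_blocks))
    total_bits = sum(len(t) for t in tx_blocks)
    bler = block_errors / len(tx_blocks)
    ber = bit_errors / total_bits
    sinr_db = 10 * np.log10(np.mean(signal_power) / np.mean(noise_plus_interf_power))
    return bler, ber, sinr_db

# Toy example: three 72-bit blocks, one corrupted, plus per-symbol power estimates.
rng = np.random.default_rng(0)
tx = [rng.integers(0, 2, 72) for _ in range(3)]
rx = [b.copy() for b in tx]
rx[1][::9] ^= 1                                   # flip every ninth bit of block 1
bler, ber, sinr = pdcch_metrics(tx, rx,
                                signal_power=rng.uniform(0.8, 1.2, 100),
                                noise_plus_interf_power=rng.uniform(0.05, 0.15, 100))
print(f"BLER={bler:.2f}  BER={ber:.3f}  SINR={sinr:.1f} dB")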

    On the Feasibility of Utilizing Commercial 4G LTE Systems for Mission-Critical IoT Applications

    Full text link
    Emerging Internet of Things (IoT) applications and services, including e-healthcare, intelligent transportation systems, the smart grid, smart homes, smart cities and smart workplaces, are poised to become part of every aspect of our daily lives. The IoT will enable billions of sensors, actuators, and smart devices to be interconnected and managed remotely via the Internet. Cellular-based Machine-to-Machine (M2M) communications is one of the key IoT enabling technologies with huge market potential for cellular service providers deploying Long Term Evolution (LTE) networks. There is an emerging consensus that Fourth Generation (4G) and 5G cellular technologies will enable and support these applications, as they will provide the global mobile connectivity to the anticipated tens of billions of things/devices that will be attached to the Internet. Many vital utilities and service industries are considering the use of commercially available LTE cellular networks to provide critical connections to users, sensors, and smart M2M devices on their networks, due to their low cost and availability. Many of these emerging IoT applications are mission-critical with stringent requirements in terms of reliability and end-to-end (E2E) delay bound. The delay bound specified for each application refers to the device-to-device latency, which is defined as the combined delay resulting from both application-level processing time and communication latency. Each IoT application has its own distinct performance requirements in terms of latency, availability, and reliability. Typically, the uplink (UL) traffic of most of these IoT applications dominates the network traffic (it is much higher than the total downlink (DL) traffic). Thus, efficient LTE UL scheduling algorithms at the base station ("Evolved NodeB (eNB)" per 3GPP standards) are more critical for M2M applications. LTE, however, was not originally intended for IoT applications, where traffic generated by M2M devices (running IoT applications) has totally different characteristics than that from traditional Human-to-Human (H2H)-based voice/video and data communications. In addition, due to the anticipated massive deployment of M2M devices and the limited available radio spectrum, the problem of efficient radio resource management (RRM) and UL scheduling poses a serious challenge in adopting LTE for M2M communications. Existing LTE quality of service (QoS) standards and UL scheduling algorithms were mainly optimized for H2H services and cannot accommodate such a wide range of diverging performance requirements of these M2M-based IoT applications. Though 4G LTE networks can support a very low Packet Loss Ratio (PLR) at the physical layer, such reliability comes at the expense of increased latency, from tens to hundreds of ms, due to the aggressive use of retransmission mechanisms. Current 4G LTE technologies may satisfy a single performance metric of these mission-critical applications, but not the simultaneous support of ultra-high reliability and low latency as well as high data rates. Numerous QoS-aware LTE UL scheduling algorithms for supporting M2M applications as well as H2H services have been reported in the literature. Most of these algorithms, however, were not intended for the support of mission-critical IoT applications, as they are not latency-aware. In addition, these algorithms are simplified and do not fully conform to LTE's signaling and QoS standards.
For instance, a common practice is the assumption that the time-domain UL scheduler located at the eNB prioritizes user equipment (UE)/M2M device connection requests based on the head-of-line (HOL) packet waiting time at the UE/device transmission buffer. However, as will be detailed below, the LTE standard does not support a mechanism that enables the UEs/devices to inform the eNB uplink scheduler about the waiting time of uplink packets residing in their transmission buffers. The Ultra-Reliable Low-Latency Communication (URLLC) paradigm has recently emerged to enable a new range of mission-critical applications and services, including industrial automation, real-time operation and control of the smart grid, and inter-vehicular communications for improved safety and self-driving vehicles. URLLC is one of the most innovative 5G New Radio (NR) features. URLLC and its supporting 5G NR technologies might become a commercial reality in the future, but it may be a rather distant future. Thus, deploying viable mission-critical IoT applications will have to be postponed until URLLC and 5G NR technologies are commercially feasible. Because IoT applications, specifically mission-critical ones, will have a significant impact on the welfare of all humanity, the immediate or near-term deployment of these applications is of utmost importance. It is the purpose of this thesis to explore whether current commercial 4G LTE cellular networks have the potential to support some of the emerging mission-critical IoT applications. The smart grid is selected in this work as an illustrative IoT example because it is one of the most demanding IoT applications: it includes diverse use cases ranging from mission-critical applications that have stringent requirements in terms of E2E latency and reliability to those that require support of a massive number of connected M2M devices with relaxed latency and reliability requirements. The purpose of this thesis is twofold. First, a user-friendly MATLAB-based open-source software package to model commercial 4G LTE systems is developed. In contrast to mainstream commercial LTE software packages, the developed package is specifically tailored to accurately model mission-critical IoT applications and, above all, fully conforms to commercial 4G LTE signaling and QoS standards. Second, utilizing the developed software package, we present a detailed, realistic LTE UL performance analysis to assess the feasibility of commercial 4G LTE cellular networks when used to support such a diverse set of emerging IoT applications as well as typical H2H services.
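    To make the standard-conformance point above concrete, the sketch below shows a latency-aware uplink scheduler that uses only what the eNB can actually observe: the arrival time of each buffer status report (BSR) and the bearer's delay budget, rather than a head-of-line delay the UE never reports. The field names, priority rule and resource figures are illustrative assumptions, not the thesis' scheduler.

from dataclasses import dataclass

@dataclass
class UlRequest:
    ue_id: int
    bsr_bytes: int          # pending data reported in the BSR
    bsr_arrival_ms: float   # when the eNB received the BSR
    delay_budget_ms: float  # packet delay budget of the bearer's QoS class

def schedule(requests, now_ms, prbs_available, bytes_per_prb=36):
    """Greedy allocation: serve the requests closest to their delay budget first."""
    def urgency(req):
        waited = now_ms - req.bsr_arrival_ms          # observable proxy for HOL delay
        return waited / req.delay_budget_ms
    grants = []
    for req in sorted(requests, key=urgency, reverse=True):
        if prbs_available == 0:
            break
        need = -(-req.bsr_bytes // bytes_per_prb)     # ceiling division
        prbs = min(need, prbs_available)
        prbs_available -= prbs
        grants.append((req.ue_id, prbs))
    return grants

reqs = [UlRequest(1, 300, bsr_arrival_ms=90, delay_budget_ms=50),
        UlRequest(2, 100, bsr_arrival_ms=95, delay_budget_ms=300),
        UlRequest(3, 800, bsr_arrival_ms=60, delay_budget_ms=100)]
print(schedule(reqs, now_ms=100, prbs_available=20))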

    Integrated Filters and Couplers for Next Generation Wireless Transceivers

    Get PDF
    The main focus of this thesis is to investigate the critical nonlinear distortion issues affecting RF/microwave components such as power amplifiers (PAs), and to develop new and improved solutions that improve the efficiency and linearity of next generation RF/microwave mobile wireless communication systems. This research involves evaluating the nonlinear distortions in PAs for different analog and digital signals, which have been a major concern. The second harmonic injection technique is explored and used to effectively suppress nonlinear distortions. This method consists of simultaneously feeding back the second harmonics at the output of the power amplifier (PA) into the input of the PA. Simulated and measured results show improved linearity. However, with increasing signal bandwidth the suppression capability is reduced, which is a limitation for 4G LTE and 5G networks that require larger bandwidths (above 5 MHz). This thesis explores creative ways to deal with this major drawback. The injection technique was modified with the aid of a well-designed band-stop filter. The compact narrowband notch filter designed was able to suppress nonlinear distortions very effectively when used before the PA. The notch filter is also integrated in the injection technique for LTE carrier aggregation (CA) with multiple carriers, and a significant improvement in nonlinear distortion performance was observed. This thesis also considers maximizing efficiency alongside improved linearity performance. To improve the efficiency of the PA, the balanced PA configuration was investigated. However, another major challenge is that the couplers used in this configuration are very large at the desired operating frequency. In this thesis, this problem was solved by designing a compact branch-line coupler. The novel coupler was simulated, fabricated and measured, with performance comparable to its conventional equivalent while achieving substantial size reduction over other designs. The coupler is implemented in the balanced PA configuration, giving improved input and output matching. The proposed balanced PA is also implemented in 4G LTE and 5G wireless transmitters. This thesis provides simulation and measured results for all balanced PA cases, with substantial efficiency and linearity improvements observed even for higher bandwidths (above 5 MHz). Additionally, the coupler is successfully integrated with rectifiers for improved energy harvesting performance, giving improved RF-DC conversion efficiency.
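    The cancellation mechanism behind second harmonic injection can be reproduced numerically with a memoryless polynomial PA model: the quadratic term mixes an injected second harmonic with the other fundamental tone and lands on the third-order intermodulation (IMD3) frequency with opposite sign. The PA coefficients, tone frequencies and injection level below are illustrative assumptions, not measured amplifier data.

import numpy as np

a1, a2, a3 = 10.0, 0.5, -0.1        # memoryless PA: y = a1*x + a2*x^2 + a3*x^3 (assumed)
fs, n = 1000.0, 4000                # coherent sampling grid (bin spacing 0.25 Hz)
t = np.arange(n) / fs
f1, f2, amp = 100.0, 110.0, 0.5     # two-tone test signal (assumed)

def imd3_db(x):
    """Level of the 2*f1 - f2 intermodulation product at the PA output, in dB."""
    y = a1 * x + a2 * x**2 + a3 * x**3
    spec = np.abs(np.fft.rfft(y)) / n
    freqs = np.fft.rfftfreq(n, 1 / fs)
    return 20 * np.log10(spec[np.argmin(np.abs(freqs - (2 * f1 - f2)))] + 1e-15)

two_tone = amp * (np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t))

# Injection level chosen so the quadratic cross-product cancels the cubic IMD3 term:
# a2 * amp * b = -(3/4) * a3 * amp**3  =>  b = -3 * a3 * amp**2 / (4 * a2)
b = -3 * a3 * amp**2 / (4 * a2)
injection = b * (np.cos(2 * np.pi * 2 * f1 * t) + np.cos(2 * np.pi * 2 * f2 * t))

print(f"IMD3 without injection: {imd3_db(two_tone):7.1f} dB")
print(f"IMD3 with injection:    {imd3_db(two_tone + injection):7.1f} dB")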

    Cellular networks for smart grid communication

    Get PDF
    The next-generation electric power system, known as the smart grid, relies on a robust and reliable underlying communication infrastructure to improve the efficiency of electricity distribution. Cellular networks, e.g., LTE/LTE-A systems, appear as a promising technology to facilitate the smart grid evolution. Their inherent performance characteristics and well-established ecosystem could potentially unlock unprecedented use cases, enabling real-time and autonomous distribution grid operations. However, cellular technology was not originally intended for smart grid communication, which is associated with highly reliable message exchange and massive device connectivity requirements. The fundamental differences between smart grid and human-type communication challenge the classical design of cellular networks and introduce important research questions that have not been sufficiently addressed so far. Motivated by these challenges, this doctoral thesis investigates novel radio access network (RAN) design principles and performance analysis for the seamless integration of smart grid traffic in future cellular networks. Specifically, we focus on addressing the fundamental RAN problems of network scalability in massive smart grid deployments and radio resource management for smart grid and human-type traffic. The main objective of the thesis lies in the design, analysis and performance evaluation of RAN mechanisms that would render cellular networks the key enabler for emerging smart grid applications. The first part of the thesis addresses the radio access limitations in LTE-based networks for reliable and scalable smart grid communication. We first identify the congestion problem in LTE random access that arises in large-scale smart grid deployments. To overcome this, a novel random access mechanism is proposed that can efficiently support real-time distribution automation services with negligible impact on the background traffic. Motivated by the stringent reliability requirements of various smart grid operations, we then develop an analytical model of the LTE random access procedure that allows us to assess the performance of event-based monitoring traffic under various load conditions and network configurations. We further extend our analysis to include the relation between the cell size and the availability of orthogonal random access resources, and we identify an additional challenge for reliable smart grid connectivity. To this end, we devise an interference- and load-aware cell planning mechanism that enhances reliability in substation automation services. Finally, we couple the problem of state estimation in wide-area monitoring systems with the reliability challenges in information acquisition. Using our developed analytical framework, we quantify the impact of imperfect communication reliability on the state estimation accuracy and we provide useful insights for the design of reliability-aware state estimators. The second part of the thesis builds on the previous one and focuses on the RAN problem of resource scheduling and sharing for smart grid and human-type traffic. We introduce a novel scheduler that achieves low latency for distribution automation traffic while resource allocation is performed in a way that keeps the degradation of cellular users at a minimum level. In addition, we investigate the benefits of Device-to-Device (D2D) transmission mode for event-based message exchange in substation automation scenarios.
We design a joint mode selection and resource allocation mechanism which results in higher data rates with respect to the conventional transmission mode via the base station. An orthogonal resource partition scheme between cellular and D2D links is further proposed to prevent the underutilization of the scarce cellular spectrum. The research findings of this thesis aim to deliver novel solutions to important RAN performance issues that arise when cellular networks support smart grid communication.
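    A minimal sketch of the mode-selection idea from the second part: for each event-triggered link, direct D2D transmission is chosen when its estimated Shannon rate beats the two-hop path through the base station, with cellular and D2D links drawing from an orthogonal bandwidth split. The path-loss model, powers and bandwidths are illustrative assumptions, not the thesis' joint optimization.

import math

def rate_bps(distance_m, bandwidth_hz, tx_power_dbm, noise_dbm=-100.0, ple=3.5):
    """Shannon rate with a simple log-distance path-loss model (assumed)."""
    path_loss_db = 40.0 + 10 * ple * math.log10(max(distance_m, 1.0))
    snr_db = tx_power_dbm - path_loss_db - noise_dbm
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

CELL_BW, D2D_BW = 5e6, 5e6        # orthogonal resource partition (assumed 50/50 split)

def select_mode(d_direct_m, d_to_enb_m, d_enb_to_rx_m):
    d2d = rate_bps(d_direct_m, D2D_BW, tx_power_dbm=10)
    # Two-hop cellular path: end-to-end rate limited by the weaker of the UL and DL hops.
    cellular = min(rate_bps(d_to_enb_m, CELL_BW, 23), rate_bps(d_enb_to_rx_m, CELL_BW, 43))
    return ("D2D" if d2d > cellular else "cellular"), d2d, cellular

for link in [(50, 400, 500), (250, 300, 200), (900, 600, 100)]:
    mode, rd, rc = select_mode(*link)
    print(f"link {link}: D2D={rd/1e6:5.1f} Mb/s  cellular={rc/1e6:5.1f} Mb/s -> {mode}")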