
    Key performance aspects of an LTE FDD based Smart Grid communications network

    The Smart Grid will enable a new era of electricity generation, transmission, distribution and consumption driven by efficiency, reliability, flexibility and environmental concerns. A key component of the Smart Grid is a communications infrastructure for data acquisition, monitoring, control and protection. In this paper, we evaluate the key performance aspects of an LTE Release 8 FDD network as the wide area communications network for Smart Grid applications. We develop analytical results for latency and channel utilization and discuss the implications for Smart Grid traffic sources, particularly the fact that system capacity is likely to be control channel limited. We also develop an OPNET based discrete event simulation model for a PMU based fault monitoring system using LTE FDD as its communication medium and use it to validate the analytical findings. In particular, we demonstrate how uplink data plane latencies of less than 10 ms can only be achieved using small application layer packets. These findings can be used to understand how best to deploy an LTE FDD network in a Smart Grid environment and also in the development of new radio resource management algorithms tailored specifically to Smart Grid traffic sources.
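
    As a rough illustration of the small-packet result, the back-of-envelope sketch below adds up the main LTE FDD uplink latency components: the wait for a scheduling request (SR) opportunity, the grant delay, and one 1 ms TTI per transport block segment. All parameter values are illustrative assumptions, not the paper's model; the point is that only packets small enough to avoid segmentation stay inside a sub-10 ms budget.

        # Back-of-envelope LTE FDD uplink latency budget (illustrative values,
        # not the paper's model). LTE FDD uses a 1 ms TTI; a UE without a grant
        # must send a scheduling request, wait for an uplink grant, then
        # transmit. Packets larger than one transport block are segmented
        # across several TTIs, each adding at least 1 ms.

        TTI_MS = 1.0              # LTE FDD transmission time interval
        SR_PERIOD_MS = 5.0        # assumed SR opportunity period (configurable)
        GRANT_DELAY_MS = 3.0      # assumed eNodeB grant + UE processing delay
        TB_SIZE_BYTES = 300       # assumed transport block size per TTI/grant

        def mean_uplink_latency_ms(app_packet_bytes: int) -> float:
            """Mean one-way uplink latency for a packet arriving at a random time."""
            sr_wait = SR_PERIOD_MS / 2.0                   # mean wait for next SR slot
            n_tti = -(-app_packet_bytes // TB_SIZE_BYTES)  # ceil division: TTIs needed
            return sr_wait + GRANT_DELAY_MS + n_tti * TTI_MS

        for size in (100, 300, 1500, 6000):
            print(f"{size:5d} B -> {mean_uplink_latency_ms(size):5.1f} ms")

    With these assumed values, a single-TTI packet comes in around 6.5 ms, while a 6000 B packet needing 20 TTIs exceeds 25 ms, which matches the qualitative finding that sub-10 ms uplink latency requires small application layer packets.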

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges that include compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one feature common to legacy networks, i.e., tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm that has the potential to address most of the aforementioned challenges. In this article, we review the various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how and to what degree various SARC proposals address the four main challenges in network densification, namely: energy efficiency, system level capacity maximization, interference management and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted at wide scale in legacy networks and thus remain a hallmark of 5G, i.e., coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can serve as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey on SARC, CoMP and D2D. Most importantly, the article provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201
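
    To make the mobility management benefit concrete, here is a minimal sketch of the control/data separation idea. It is my illustration, not taken from the survey: with coupled planes, every small-cell crossing triggers full handover signaling, whereas with SARC the macro cell anchors the control plane, so data-plane cells can be switched locally beneath a single macro association.

        # Minimal sketch (illustration only, not from the survey) of why
        # control/data plane separation eases mobility in ultra-dense networks.
        from dataclasses import dataclass

        @dataclass
        class MobilityEvent:
            macro_changed: bool   # UE left the macro cell's coverage umbrella
            small_changed: bool   # UE moved between small (data-plane) cells

        def coupled_handovers(events: list[MobilityEvent]) -> int:
            # Legacy tight coupling: any cell change is a full handover.
            return sum(e.macro_changed or e.small_changed for e in events)

        def sarc_handovers(events: list[MobilityEvent]) -> int:
            # SARC: only macro (control-anchor) changes need full signaling.
            return sum(e.macro_changed for e in events)

        # Toy trace: a UE crossing many small cells under one macro cell.
        trace = [MobilityEvent(False, True)] * 9 + [MobilityEvent(True, True)]
        print("coupled handovers:", coupled_handovers(trace))  # 10
        print("SARC handovers:   ", sarc_handovers(trace))     # 1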

    A predictive resource allocation algorithm in the LTE uplink for event based M2M applications

    Some M2M applications, such as event monitoring, involve a group of devices in the same vicinity that act in a coordinated manner. An LTE network can exploit the correlated traffic characteristics of such devices by proactively assigning resources to devices based upon the activity of neighboring devices in the same group. This can reduce latency compared to waiting for each device in the group to request resources reactively per the standard LTE protocol. In this paper, we specify a new low complexity predictive resource allocation algorithm, known as the one way algorithm, for use with delay sensitive event based M2M applications in the LTE uplink. This algorithm requires minimal incremental processing power and memory resources at the eNodeB, yet can reduce the mean uplink latency below the minimum possible value for a non-predictive resource allocation algorithm. We develop mathematical models for the probability of a prediction, the probability of a successful prediction, the probability of an unsuccessful prediction, resource usage/wastage probabilities and the mean uplink latency. The validity of these models is demonstrated by comparison with simulation results. The models can be used offline by network operators, or online in real time by the eNodeB scheduler, to optimize performance.
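
    The sketch below conveys the general proactive-grant idea in scheduler pseudocode form; it is a hedged illustration of the concept, not the paper's one way algorithm, and the group/device names are hypothetical. When one device in a correlated group requests resources, the scheduler speculatively grants resources to its neighbors, betting that the same event will trigger them too; a wrong bet wastes the granted resources, which is the usage/wastage trade-off the paper models.

        # Hedged sketch of proactive uplink grants for correlated M2M groups
        # (illustration of the concept, not the paper's "one way algorithm").

        class PredictiveScheduler:
            def __init__(self, groups: dict[str, list[str]]):
                self.groups = groups
                self.group_of = {dev: g for g, devs in groups.items() for dev in devs}
                self.pending_grants: set[str] = set()

            def on_scheduling_request(self, device: str) -> list[str]:
                """Grant the requester and proactively grant its group neighbors."""
                grants = [device]
                for neighbor in self.groups[self.group_of[device]]:
                    if neighbor != device and neighbor not in self.pending_grants:
                        # Speculative grant: saves the neighbor its own
                        # SR round trip if the event also triggers it.
                        grants.append(neighbor)
                self.pending_grants.update(grants)
                return grants

        # Hypothetical group of PMUs monitoring the same feeder.
        sched = PredictiveScheduler({"feeder7": ["pmu1", "pmu2", "pmu3"]})
        print(sched.on_scheduling_request("pmu1"))  # ['pmu1', 'pmu2', 'pmu3']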

    Massive MIMO for Internet of Things (IoT) Connectivity

    Massive MIMO is considered to be one of the key technologies in the emerging 5G systems, but also a concept applicable to other wireless systems. Exploiting the large number of degrees of freedom (DoFs) of massive MIMO is essential for achieving high spectral efficiency, high data rates and extreme spatial multiplexing of densely distributed users. On the one hand, the benefits of applying massive MIMO for broadband communication are well known, and there has been a large body of research on designing communication schemes to support high rates. On the other hand, using massive MIMO for the Internet of Things (IoT) is still a developing topic, as IoT connectivity has requirements and constraints that are significantly different from those of broadband connections. In this paper, we investigate the applicability of massive MIMO to IoT connectivity. Specifically, we treat the two generic types of IoT connections envisioned in 5G: massive machine-type communication (mMTC) and ultra-reliable low-latency communication (URLLC). This paper fills an important gap by identifying the opportunities and challenges in exploiting massive MIMO for IoT connectivity. We provide insights into the trade-offs that emerge when massive MIMO is applied to mMTC or URLLC and present a number of suitable communication schemes. The discussion continues to the questions of network slicing of the wireless resources and the use of massive MIMO to simultaneously support IoT connections with very heterogeneous requirements. The main conclusion is that massive MIMO can bring benefits to scenarios with IoT connectivity, but it requires tight integration of the physical-layer techniques with the protocol design.
    Comment: Submitted for publication
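
    The DoF argument can be seen numerically with a short simulation, a minimal sketch under standard textbook assumptions (i.i.d. Rayleigh fading, perfect CSI, unit-variance noise) rather than anything from this paper: when the antenna count M greatly exceeds the user count K, user channel vectors become nearly orthogonal, so even simple maximum-ratio combining (MRC) separates K simultaneous uplink users.

        # Numerical sketch of massive MIMO spatial multiplexing with MRC
        # (textbook assumptions; illustrative, not from the paper).
        import numpy as np

        rng = np.random.default_rng(0)
        M, K, SNR = 128, 16, 1.0   # antennas, users, per-user transmit SNR (linear)

        # i.i.d. Rayleigh channel: column k is user k's channel vector.
        H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)

        sinr = np.empty(K)
        for k in range(K):
            w = H[:, k]                                # MRC combiner for user k
            signal = SNR * np.abs(w.conj() @ H[:, k]) ** 2
            interf = SNR * sum(np.abs(w.conj() @ H[:, j]) ** 2
                               for j in range(K) if j != k)
            noise = np.linalg.norm(w) ** 2             # noise power after combining
            sinr[k] = signal / (interf + noise)

        print(f"sum spectral efficiency: {np.log2(1 + sinr).sum():.1f} bit/s/Hz")
        # Growing M with K fixed pushes each user's SINR up roughly linearly in M.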

    Performance Evaluation of Communication Technologies and Network Structure for Smart Grid Applications

    The design of an effective and reliable communication network supporting smart grid applications requires the selection of appropriate communication technologies and protocols. The objective of this study is to quantify the capability of an advanced metering infrastructure (AMI) to support the simultaneous operation of major smart grid functions, including smart metering, price-induced controls, distribution automation, demand response, and electric vehicle charging/discharging applications, in terms of throughput and latency. OPNET is used to simulate the performance of the selected communication technologies and protocols. Research findings indicate that smart grid applications can operate simultaneously by piggybacking on an existing AMI infrastructure and still achieve their latency requirements.
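
    A first-order version of the piggybacking question can be posed before any detailed simulation. The sketch below is a hedged feasibility check with entirely illustrative numbers (the rates, capacity, and latency budgets are my assumptions, not the study's data): it compares the aggregate offered load of several applications against a shared AMI uplink's capacity and uses an M/M/1 mean-delay approximation against each latency budget.

        # First-order check that several smart grid applications can share one
        # AMI uplink (illustrative numbers, not the study's data).

        LINK_BPS = 250_000          # assumed shared AMI uplink capacity
        PKT_BITS = 2_000            # assumed mean packet size

        # app: (offered packets per second, latency budget in s) -- assumptions
        apps = {
            "smart metering":          (20.0, 5.0),
            "demand response":         (5.0, 1.0),
            "distribution automation": (10.0, 0.1),
            "EV charging control":     (5.0, 2.0),
        }

        mu = LINK_BPS / PKT_BITS                       # service rate, packets/s
        lam = sum(rate for rate, _ in apps.values())   # aggregate arrival rate
        assert lam < mu, "link saturated: applications cannot coexist"

        mean_delay = 1.0 / (mu - lam)                  # M/M/1 mean sojourn time
        for name, (_, budget) in apps.items():
            verdict = "meets" if mean_delay <= budget else "misses"
            print(f"{name:24s} {verdict} its {budget:.1f} s budget "
                  f"(mean delay {mean_delay * 1000:.1f} ms)")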

    The history of WiMAX: a complete survey of the evolution in certification and standardization for IEEE 802.16 and WiMAX

    Most researchers are familiar with the technical features of WiMAX technology, but the evolution that WiMAX went through, in terms of standardization and certification, is largely undocumented and unknown to most people. Knowledge of this historical process would, however, help in understanding how WiMAX has become the widespread technology that it is today. Furthermore, it would give insight into the steps to undertake for anyone aiming to introduce a new wireless technology on a worldwide scale. Therefore, this article presents a survey of all relevant activities that took place within three important organizations: the 802.16 Working Group of the IEEE (Institute of Electrical and Electronics Engineers) for technology development and standardization, the WiMAX Forum for product certification, and the ITU (International Telecommunication Union) for international recognition. An elaborate and comprehensive overview of all those activities is given, which reveals the importance of the willingness to innovate and to continuously incorporate new ideas in the IEEE standardization process, and the importance of the WiMAX Forum certification label granting process in ensuring interoperability. We also emphasize the steps that were taken in cooperation with the ITU to improve the international esteem of the technology. Finally, a WiMAX trend analysis is made. We show how industry interest has fluctuated over time and quantify the evolution in WiMAX product certification and deployments. It is shown that most interest went to the 2.5 GHz and 3.5 GHz frequencies, that most deployments are in geographic regions with many developing countries, and that the highest population coverage is achieved in Asia Pacific. This elaborate description of all standardization and certification activities, from the very start up to now, will help the reader comprehend how past and future steps are taken in the development process of new WiMAX features.