
    Massive Non-Orthogonal Multiple Access for Cellular IoT: Potentials and Limitations

    The Internet of Things (IoT) promises ubiquitous connectivity of everything everywhere, representing the biggest technology trend in the years to come. It is expected that by 2020 over 25 billion devices will be connected to cellular networks, far beyond the number of devices in current wireless networks. Machine-to-Machine (M2M) communications aims at providing the communication infrastructure for enabling IoT by facilitating billions of multi-role devices to communicate with each other and with the underlying data transport infrastructure without, or with little, human intervention. Providing this infrastructure will require a dramatic shift from the current protocols, which were mostly designed for human-to-human (H2H) applications. This article reviews recent 3GPP solutions for enabling massive cellular IoT and investigates random access strategies for M2M communications, showing that cellular networks must evolve to handle the new ways in which devices will connect and communicate with the system. A massive non-orthogonal multiple access (NOMA) technique is then presented as a promising solution to support a massive number of IoT devices in cellular networks, and we also identify its practical challenges and future research directions. Comment: To appear in IEEE Communications Magazine
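
    To make the power-domain NOMA idea concrete, here is a minimal sketch of two users sharing one resource element and being separated at the receiver by successive interference cancellation (SIC). This is an illustrative toy, not the article's scheme; the power split, channel gains, and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                        # BPSK symbols per user
p_far, p_near = 0.8, 0.2          # power split: more power to the far (weak) user
h_far, h_near = 0.3, 1.0          # illustrative channel gains
noise_std = 0.05

s_far = rng.choice([-1.0, 1.0], n)
s_near = rng.choice([-1.0, 1.0], n)

# Superposition coding: both users transmit on the same resource element.
x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near

# Near (strong) user applies SIC: decode the far user's high-power
# signal first, subtract it, then decode its own signal.
y_near = h_near * x + noise_std * rng.standard_normal(n)
s_far_hat = np.sign(y_near)
residual = y_near - h_near * np.sqrt(p_far) * s_far_hat
s_near_hat = np.sign(residual)

# Far (weak) user decodes directly, treating the near user's
# low-power signal as extra noise.
y_far = h_far * x + noise_std * rng.standard_normal(n)
s_far_direct = np.sign(y_far)

print("near-user BER after SIC:", np.mean(s_near_hat != s_near))
print("far-user BER (direct):  ", np.mean(s_far_direct != s_far))
```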

    SNR Gain Evaluation in Narrowband IoT Uplink Data Transmission with Repetition Increment: A Simulation Approach

    Deploying the Internet of Things (IoT) on a large scale necessitates widespread network infrastructures supporting Machine Type Communication. Integrating IoT into cellular networks like LTE, known as Narrowband IoT (NB-IoT), can fulfill this infrastructure need. 3GPP Release 13 introduces NB-IoT's repetition feature, expanding radio transmission coverage while maintaining LTE performance. Focusing on uplink data traffic, this study examines NB-IoT's repetition mechanism, grid resource distribution, and NPUSCH performance through simulations. Results show that at SNR greater than -5 dB, the maximum of 128 repetitions yields the highest BLER, while the minimum of 2 repetitions results in the lowest. Quadrupling the number of repetitions increases the SNR gain by 5 dB, emphasizing repetition's role in error mitigation and uplink reliability, especially in challenging SNR conditions. For optimal throughput at SNR above -5 dB, the maximum of 128 repetitions for NPUSCH format 1 is recommended. These findings underscore the importance of repetition in enhancing NB-IoT performance, offering insights for system optimization: increasing the number of repetitions generally leads to higher SNR gain. The BLER and throughput values attained in the simulations highlight the robustness of data transmission across varying channel conditions, affirming NB-IoT's applicability to a wide range of IoT applications.
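
    The reported numbers can be sanity-checked against the ideal coherent-combining bound, under which R repetitions of a transport block yield a 10·log10(R) dB SNR gain; quadrupling repetitions is then worth about 6 dB, close to the roughly 5 dB observed once practical combining losses are taken into account. A back-of-the-envelope check (not the paper's simulation):

```python
import math

def ideal_repetition_gain_db(r: int) -> float:
    """Ideal SNR gain from coherently combining r repetitions."""
    return 10 * math.log10(r)

for r in (2, 4, 8, 32, 128):
    print(f"R={r:3d}: ideal combining gain = {ideal_repetition_gain_db(r):5.2f} dB")

# Quadrupling repetitions ideally adds 10*log10(4) ~ 6.02 dB; the ~5 dB
# reported in simulation reflects practical combining losses.
```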

    Improving Energy Efficiency for IoT Communications in 5G Networks

    The increase in the number of Internet of Things (IoT) devices is quickly changing how mobile networks are used, shifting more traffic to uplink rather than downlink transmissions. Currently, mobile network uplinks use Single Carrier Frequency Division Multiple Access (SC-FDMA) because of its low Peak to Average Power Ratio (PAPR) compared with Orthogonal Frequency Division Multiple Access (OFDMA). From an IoT perspective, power ratios are highly important for effective battery usage, since devices are typically resource-constrained. Fifth Generation (5G) mobile networks are expected to be the future standard that will handle the influx of IoT device uplinks while preserving the quality of service (QoS) that current Long Term Evolution Advanced (LTE-A) networks provide. In this paper, the Enhanced OEA algorithm is proposed; simulations showed a reduction in device energy consumption and an increase in the power efficiency of uplink transmissions while preserving the QoS rate provided with SC-FDMA in 5G networks. Furthermore, computational complexity was reduced by inserting a sorting step prior to resource allocation.
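
    A quick way to see why SC-FDMA suits battery-constrained IoT devices is to compare the PAPR of a plain OFDMA waveform with a DFT-precoded (SC-FDMA-style) one. The toy simulation below uses illustrative parameters and is unrelated to the Enhanced OEA algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sc, n_fft, n_symbols = 64, 512, 2000

def papr_db(x):
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

papr_ofdma, papr_scfdma = [], []
for _ in range(n_symbols):
    qpsk = (rng.choice([-1, 1], n_sc) + 1j * rng.choice([-1, 1], n_sc)) / np.sqrt(2)

    # OFDMA: map QPSK symbols straight onto contiguous subcarriers.
    grid = np.zeros(n_fft, complex)
    grid[:n_sc] = qpsk
    papr_ofdma.append(papr_db(np.fft.ifft(grid)))

    # SC-FDMA: DFT-precode first (DFT-spread OFDM), then map.
    grid = np.zeros(n_fft, complex)
    grid[:n_sc] = np.fft.fft(qpsk) / np.sqrt(n_sc)
    papr_scfdma.append(papr_db(np.fft.ifft(grid)))

print(f"median PAPR, OFDMA  : {np.median(papr_ofdma):.1f} dB")
print(f"median PAPR, SC-FDMA: {np.median(papr_scfdma):.1f} dB")
```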

    Non-Orthogonal Narrowband Internet of Things: A Design for Saving Bandwidth and Doubling the Number of Connected Devices

    Narrowband IoT (NB-IoT) is a low power wide area network (LPWAN) technique introduced in 3GPP Release 13. The narrowband transmission scheme enables high capacity, wide coverage and low power consumption communications. With the increasing demand for services over the air, wireless spectrum is becoming scarce, and new techniques are required to boost the number of connected devices within a limited spectral resource to meet service requirements. This work provides a compressed signal waveform solution, termed fast orthogonal frequency division multiplexing (Fast-OFDM), to potentially double the number of connected devices by compressing the occupied bandwidth of each device without compromising data rate or bit error rate (BER) performance. Simulations first evaluate Fast-OFDM against single-carrier frequency division multiple access (SC-FDMA); results indicate the same performance for both systems in an additive white Gaussian noise (AWGN) channel. Experimental measurements are also presented to show the bandwidth-saving benefits of Fast-OFDM: in a line-of-sight (LOS) scenario, Fast-OFDM has similar performance to SC-FDMA but with a 50% bandwidth saving. This research paves the way for extended coverage, enhanced capacity and improved data rates for NB-IoT in 5th generation (5G) new radio (NR) networks.
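
    The 50% saving stems from packing subcarriers at half the conventional OFDM spacing (Δf = 1/(2T)): orthogonality then survives only in the real dimension, which is sufficient for one-dimensional modulations such as BPSK. A toy numerical check of that property (not the authors' transceiver) follows; the real part of the cross-correlation vanishes for all distinct half-spaced subcarriers, while the imaginary part does not.

```python
import numpy as np

n = 1024                       # samples per symbol period T (normalized)
t = np.arange(n) / n

# Correlate half-spaced subcarriers exp(j*2*pi*(k/2)*t): only the real
# part is (near-)zero for k != m, which is what BPSK detection needs.
for k in range(3):
    for m in range(3):
        corr = np.mean(np.exp(2j * np.pi * (k / 2) * t) *
                       np.conj(np.exp(2j * np.pi * (m / 2) * t)))
        print(f"k={k}, m={m}: Re={corr.real:+.4f}  Im={corr.imag:+.4f}")
```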

    On the Fundamental Limits of Random Non-orthogonal Multiple Access in Cellular Massive IoT

    Machine-to-machine (M2M) communication constitutes the paradigm at the basis of the Internet of Things (IoT) vision. M2M solutions allow billions of multi-role devices to communicate with each other, or with the underlying data transport infrastructure, without or with minimal human intervention. Current solutions for wireless transmission, originally designed for human-based applications, thus require a substantial shift to cope with the capacity issues of managing a huge number of M2M devices. In this paper, we consider multiple access techniques as promising solutions to support a large number of devices in cellular systems with limited radio resources. We focus on non-orthogonal multiple access (NOMA) where, with the aim of increasing channel efficiency, devices share the same radio resources for their data transmission; this has been shown to provide optimal throughput from an information-theoretic point of view. We consider a realistic system model and characterise the system performance in terms of throughput and energy efficiency in a NOMA scenario with a random packet arrival model, and we also derive the stability condition under which the system guarantees this performance. Comment: To appear in IEEE JSAC Special Issue on Non-Orthogonal Multiple Access for 5G Systems
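
    For rough intuition about the throughput gain, the sketch below simulates a slotted random-access channel in which up to K packets colliding in a slot are still resolved by power-domain NOMA reception; the all-or-nothing collision model and the Poisson arrivals are simplifying assumptions, not the paper's analytical derivation.

```python
import numpy as np

rng = np.random.default_rng(2)

def throughput(load, k_max, n_slots=100_000):
    """Mean delivered packets/slot when up to k_max overlapping
    packets are separable (power-domain NOMA); more are all lost."""
    arrivals = rng.poisson(load, n_slots)
    delivered = np.where(arrivals <= k_max, arrivals, 0)
    return delivered.mean()

for load in (0.5, 1.0, 1.5, 2.0):
    aloha = throughput(load, k_max=1)   # classic slotted ALOHA
    noma = throughput(load, k_max=2)    # 2-user NOMA reception
    print(f"load={load:.1f} pkt/slot: ALOHA={aloha:.3f}  NOMA(K=2)={noma:.3f}")
```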

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions to the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables, submitted for possible future publication in IEEE Communications Surveys and Tutorials
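
    As a flavor of the low-complexity Q-learning approach, here is a stateless Q-learning sketch in the spirit of collaborative RA-slot selection, where each MTC device learns a collision-free random-access slot; all dimensions, rewards, and hyperparameters are illustrative assumptions rather than a scheme taken from the survey.

```python
import numpy as np

rng = np.random.default_rng(3)
n_devices, n_slots = 20, 20          # toy dimensions (assumed)
alpha, eps, n_frames = 0.1, 0.05, 2000

# One Q-value per (device, RA slot); no state, so this stays cheap
# enough for constrained MTC devices.
q = np.zeros((n_devices, n_slots))

for _ in range(n_frames):
    # Epsilon-greedy slot choice per device.
    greedy = q.argmax(axis=1)
    explore = rng.integers(0, n_slots, n_devices)
    choice = np.where(rng.random(n_devices) < eps, explore, greedy)

    # Reward +1 if the device was alone in its slot, -1 on collision.
    counts = np.bincount(choice, minlength=n_slots)
    reward = np.where(counts[choice] == 1, 1.0, -1.0)
    idx = np.arange(n_devices)
    q[idx, choice] += alpha * (reward - q[idx, choice])

unique = np.mean(np.bincount(q.argmax(axis=1), minlength=n_slots) == 1)
print(f"fraction of slots settled on by exactly one device: {unique:.2f}")
```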

    UE Uplink Power Distribution for M2M over LTE


    Clock Error Impact on NB-IoT Radio Link Performance

    3GPP has recently addressed improvements in the Radio Access Network (RAN) and specified new technologies such as enhanced Machine Type Communication (eMTC) and Narrowband Internet of Things (NB-IoT) in its Release 13, also known as LTE-Advanced Pro. These technologies mainly target the development and deployment of cellular IoT services. NB-IoT is less complex, can be deployed through a software upgrade, and is compatible with legacy cellular networks such as GSM and 4G, which makes it a suitable candidate for IoT. NB-IoT strongly supports LPWAN use cases and can thus be deployed for smart cities and other fields such as smart electricity, smart agriculture, smart health services and smart homes. NB-IoT targets low device cost, low power consumption, relaxed delay sensitivity and easy deployment, all of which benefit the above fields. This thesis studies the impact of clock error on radio link performance for uplink transmission on an NB-IoT testbed based on Cloud-RAN, using Software Defined Radios (SDR) on an LTE protocol stack. An external clock error is introduced into the network, and the resulting radio link performance issues are analyzed through received power, packet loss, retransmissions, BLER and SINR for different MCS indices. The analysis indicates packet loss of up to 51% and retransmission of packets up to 128 times at low SINR and high clock errors. Clock errors also produce a CFO of up to 1.25 ppm, which results in poor synchronization between the UE and the eNodeB.
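
    The practical weight of a ppm-level clock error is easy to quantify, since the induced carrier frequency offset (CFO) scales linearly with the carrier frequency. A small helper illustrates this; the 900 MHz carrier is an assumed example, as actual NB-IoT deployment bands vary.

```python
def cfo_hz(error_ppm: float, carrier_hz: float) -> float:
    """Carrier frequency offset caused by a clock error given in ppm."""
    return error_ppm * 1e-6 * carrier_hz

carrier = 900e6               # assumed carrier frequency (band-dependent)
spacing = 15e3                # NB-IoT multi-tone subcarrier spacing
                              # (3.75 kHz in single-tone mode)

offset = cfo_hz(1.25, carrier)
print(f"1.25 ppm at {carrier/1e6:.0f} MHz -> CFO = {offset:.0f} Hz "
      f"({100 * offset / spacing:.1f}% of a 15 kHz subcarrier)")
```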

    On the Feasibility of Utilizing Commercial 4G LTE Systems for Mission-Critical IoT Applications

    Emerging Internet of Things (IoT) applications and services, ranging from e-healthcare, intelligent transportation systems, the smart grid, and smart homes to smart cities and smart workplaces, are poised to become part of every aspect of our daily lives. The IoT will enable billions of sensors, actuators, and smart devices to be interconnected and managed remotely via the Internet. Cellular-based Machine-to-Machine (M2M) communications is one of the key IoT enabling technologies, with huge market potential for cellular service providers deploying Long Term Evolution (LTE) networks. There is an emerging consensus that Fourth Generation (4G) and 5G cellular technologies will enable and support these applications, as they will provide global mobile connectivity to the anticipated tens of billions of things/devices attached to the Internet. Many vital utilities and service industries are considering commercially available LTE cellular networks to provide critical connections to users, sensors, and smart M2M devices on their networks, due to LTE's low cost and availability. Many of these emerging IoT applications are mission-critical, with stringent requirements in terms of reliability and end-to-end (E2E) delay bound. The delay bound specified for each application refers to the device-to-device latency, defined as the combined delay resulting from application-level processing time and communication latency; each IoT application has its own distinct performance requirements in terms of latency, availability, and reliability. Typically, uplink (UL) traffic dominates the network traffic of these IoT applications (much higher than total downlink (DL) traffic), so efficient LTE UL scheduling algorithms at the base station ("Evolved NodeB (eNB)" per 3GPP standards) are especially critical for M2M applications.
    LTE, however, was not originally intended for IoT applications: traffic generated by M2M devices running IoT applications has totally different characteristics from traditional Human-to-Human (H2H) voice/video and data communications. In addition, given the anticipated massive deployment of M2M devices and the limited available radio spectrum, efficient radio resource management (RRM) and UL scheduling pose a serious challenge in adopting LTE for M2M communications. Existing LTE quality of service (QoS) standards and UL scheduling algorithms were mainly optimized for H2H services and cannot accommodate the wide range of diverging performance requirements of M2M-based IoT applications. Though 4G LTE networks can support a very low Packet Loss Ratio (PLR) at the physical layer, such reliability comes at the expense of latency increased from tens to hundreds of milliseconds due to the aggressive use of retransmission mechanisms. Current 4G LTE technologies may satisfy a single performance metric of these mission-critical applications, but not the simultaneous support of ultra-high reliability and low latency as well as high data rates. Numerous QoS-aware LTE UL scheduling algorithms for supporting M2M applications as well as H2H services have been reported in the literature. Most of these algorithms, however, were not intended to support mission-critical IoT applications, as they are not latency-aware. In addition, these algorithms are simplified and do not fully conform to LTE's signaling and QoS standards. For instance, a common assumption is that the time-domain UL scheduler located at the eNB prioritizes user equipment (UE)/M2M device connection requests based on the head-of-line (HOL) packet waiting time at the UE/device transmission buffer. However, as detailed in this thesis, the LTE standard does not support a mechanism that enables UEs/devices to inform the eNB uplink scheduler about the waiting time of uplink packets residing in their transmission buffers.
    The Ultra-Reliable Low-Latency Communication (URLLC) paradigm has recently emerged to enable a new range of mission-critical applications and services, including industrial automation, real-time operation and control of the smart grid, and inter-vehicular communications for improved safety and self-driving vehicles. URLLC is one of the most innovative 5G New Radio (NR) features. URLLC and its supporting 5G NR technologies might become a commercial reality in the future, but it may be a rather distant future; deploying viable mission-critical IoT applications would thus have to be postponed until URLLC and 5G NR technologies are commercially feasible. Because IoT applications, specifically mission-critical ones, will have a significant impact on the welfare of all humanity, their immediate or near-term deployment is of utmost importance. It is the purpose of this thesis to explore whether current commercial 4G LTE cellular networks have the potential to support some of the emerging mission-critical IoT applications. The smart grid is selected as an illustrative IoT example because it is one of the most demanding IoT applications, with diverse use cases ranging from mission-critical applications that have stringent E2E latency and reliability requirements to those that require support of a massive number of connected M2M devices with relaxed latency and reliability requirements. The purpose of the thesis is twofold. First, a user-friendly MATLAB-based open source software package to model commercial 4G LTE systems is developed; in contrast to mainstream commercial LTE software packages, the developed package is specifically tailored to accurately model mission-critical IoT applications and, above all, fully conforms to commercial 4G LTE signaling and QoS standards. Second, utilizing the developed software package, we present a detailed, realistic LTE UL performance analysis to assess the feasibility of commercial 4G LTE cellular networks when used to support such a diverse set of emerging IoT applications as well as typical H2H services.
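
    To illustrate the kind of latency-aware UL scheduling the thesis argues for, here is a hypothetical earliest-deadline-first sketch. Since the LTE standard carries no head-of-line delay in buffer status reports, the deadline margin must be estimated at the eNB; every name and parameter below is an illustrative assumption, not part of the thesis's software package.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class UlRequest:
    # Priority key: estimated remaining deadline margin. The eNB cannot
    # learn the true head-of-line delay from the UE, so it must estimate
    # the margin, e.g. from the arrival time of the buffer status report.
    margin_ms: float
    ue_id: int = field(compare=False)
    backlog_bytes: int = field(compare=False)

def schedule(requests, n_rbs, bytes_per_rb=100):
    """Grant resource blocks earliest-deadline-first until RBs run out."""
    heapq.heapify(requests)                 # smallest margin served first
    grants = {}
    while requests and n_rbs > 0:
        req = heapq.heappop(requests)
        need = -(-req.backlog_bytes // bytes_per_rb)    # ceil division
        grants[req.ue_id] = min(n_rbs, need)
        n_rbs -= grants[req.ue_id]
    return grants

reqs = [UlRequest(margin_ms=8.0, ue_id=1, backlog_bytes=300),     # mission-critical
        UlRequest(margin_ms=470.0, ue_id=2, backlog_bytes=900)]   # delay-tolerant
print(schedule(reqs, n_rbs=6))   # -> {1: 3, 2: 3}
```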