
    Goodbye, ALOHA!

    The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to its high impact on the overall performance of wireless communications. The majority of research activities in this field deal with variations of protocols based in one way or another on ALOHA, either with or without listen-before-talk, i.e., carrier sensing multiple access. These protocols operate well under low traffic loads and with a low number of simultaneous devices; however, they suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of the existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by these, we identify a family of simple algorithms based on distributed queueing (DQ), which can operate for an infinite number of devices generating any traffic load and pattern. A description of the DQ mechanism is provided, and the most relevant existing studies of DQ applied in different scenarios are described. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, a description of the very first demo of DQ for its use in the IoT is also included.
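
    To make the DQ idea concrete, the following is a minimal, self-contained sketch (not the paper's implementation) of a frame-based distributed queueing access scheme: contending devices pick one of a few contention minislots, collided groups are appended to a collision resolution queue and re-contend later, and successful devices join a data transmission queue that is served collision-free. The minislot count, arrival model, and feedback handling are simplifying assumptions made for illustration.

```python
import random

def simulate_dq(num_devices=50, num_minislots=3, frames=200, arrival_prob=0.1):
    """Toy frame-by-frame simulation of the distributed queueing (DQ) idea.

    Each frame has `num_minislots` contention minislots and one data slot.
    New arrivals contend only when the collision resolution queue (CRQ) is
    empty; collided groups are appended to the CRQ and re-contend later, so
    the contention load does not grow with the total device count.
    """
    crq = []            # collision resolution queue: groups of device ids
    dtq = []            # data transmission queue: device ids ready to send
    backlog = set()     # devices with a packet but not yet enqueued
    delivered = 0

    for _ in range(frames):
        # New packet arrivals.
        for dev in range(num_devices):
            if random.random() < arrival_prob:
                backlog.add(dev)

        # Contenders this frame: the head group of the CRQ, or (if the CRQ
        # is empty) all backlogged newcomers.
        contenders = crq.pop(0) if crq else list(backlog)
        backlog -= set(contenders)

        # Each contender picks a minislot at random.
        slots = {m: [] for m in range(num_minislots)}
        for dev in contenders:
            slots[random.randrange(num_minislots)].append(dev)

        # DQ feedback rules: success -> DTQ, collision -> back of the CRQ.
        for devs in slots.values():
            if len(devs) == 1:
                dtq.append(devs[0])
            elif len(devs) > 1:
                crq.append(devs)

        # One collision-free data transmission per frame from the DTQ.
        if dtq:
            dtq.pop(0)
            delivered += 1

    return delivered

if __name__ == "__main__":
    print("packets delivered:", simulate_dq())
```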

    2D Time-frequency interference modelling using stochastic geometry for performance evaluation in Low-Power Wide-Area Networks

    In wireless networks, interference between transmissions is modelled either in the time or the frequency domain. In this article, we jointly analyze interference in the time-frequency domain using a stochastic geometry model, assuming the total time-frequency resources to be a two-dimensional plane and transmissions from Internet of Things (IoT) devices to be time-frequency patterns on this plane. To evaluate the interference, we quantify the overlap between information packets: provided that the overlap is not too strong, the packets are not necessarily lost, thanks to the capture effect. This flexible model can be used for multiple medium access scenarios and is especially suited to the random time-frequency access schemes used in Low-Power Wide-Area Networks (LPWANs). By characterizing the outage probability and throughput, our approach makes it possible to evaluate the performance of two representative LPWA technologies, Sigfox® and LoRaWAN®.
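
    The overlap-based capture model lends itself to a simple Monte Carlo check. The sketch below is an illustration, not the article's exact model: interfering packets are dropped as rectangles on a time-frequency plane according to a 2D Poisson process, and an outage is declared when the overlapped fraction of a reference packet exceeds a capture threshold. The plane size, packet shapes, density, and threshold are assumed example values.

```python
import numpy as np

def outage_probability(density=0.02, plane_t=1000.0, plane_f=200.0,
                       pkt_t=5.0, pkt_f=1.0, capture_frac=0.5,
                       trials=2000, seed=0):
    """Monte Carlo sketch of time-frequency overlap with a capture rule.

    Interferers form a 2D Poisson process on a [0, plane_t] x [0, plane_f]
    plane; every transmission occupies a pkt_t x pkt_f rectangle. The
    reference packet, placed at the centre of the plane, is declared lost
    when the overlapped fraction of its rectangle exceeds `capture_frac`.
    """
    rng = np.random.default_rng(seed)
    ref_t0, ref_f0 = plane_t / 2, plane_f / 2
    ref_area = pkt_t * pkt_f
    outages = 0
    for _ in range(trials):
        n = rng.poisson(density * plane_t * plane_f)
        t0 = rng.uniform(0, plane_t, n)
        f0 = rng.uniform(0, plane_f, n)
        # Rectangle intersection of each interferer with the reference packet.
        dt = np.minimum(t0 + pkt_t, ref_t0 + pkt_t) - np.maximum(t0, ref_t0)
        df = np.minimum(f0 + pkt_f, ref_f0 + pkt_f) - np.maximum(f0, ref_f0)
        overlap = np.clip(dt, 0, None) * np.clip(df, 0, None)
        if overlap.sum() > capture_frac * ref_area:
            outages += 1
    return outages / trials

print("estimated outage probability:", outage_probability())
```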

    A Multi-Service Oriented Multiple-Access Scheme for Next-Generation Mobile Networks

    One of the key requirements for fifth-generation (5G) cellular networks is their ability to handle densely connected devices with different quality of service (QoS) requirements. In this article, we present multi-service oriented multiple access (MOMA), an integrated access scheme for massive numbers of connections with diverse QoS profiles and/or traffic patterns originating from both handheld devices and machine-to-machine (M2M) transmissions. MOMA is based on a) establishing separate classes of users according to relevant criteria that go beyond the simple handheld/M2M split, b) class-dependent hierarchical spreading of the data signal, and c) a mix of multiuser and single-user detection schemes at the receiver. Practical implementations of the MOMA principle are provided for base stations (BSs) equipped with a large number of antenna elements. Finally, it is shown that such a massive multiple-input multiple-output (MIMO) scenario enables all the benefits of MOMA to be achieved even with a simple receiver structure that concentrates the receiver complexity where it is actually needed.
    Comment: 6 pages, 3 figures, accepted to the European Conference on Networks and Communications (EuCNC 2016).
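
    One way to picture the class-dependent hierarchical spreading in b) and the single-user detection in c) is the toy construction below, in which each user's symbol is spread by a user code within its class and then by a class code (a Kronecker product of orthogonal codes), so a matched filter on the composite code is enough to separate users. This is only an interpretation for illustration; the code lengths, Hadamard construction, and class/user assignment are assumptions, not the scheme defined in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def hadamard(n):
    """Hadamard matrix of size n (a power of two) used as orthogonal codes."""
    h = np.array([[1.0]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

# Illustrative setup: class codes of length 4, user codes of length 4.
class_codes = hadamard(4)          # one row per class
user_codes = hadamard(4)           # one row per user within a class

def spread(symbol, cls, user):
    """Hierarchical spreading: user code within the class, then class code."""
    return symbol * np.kron(class_codes[cls], user_codes[user])

# Superimpose a few users from different classes plus noise.
tx = (spread(+1, cls=0, user=1)
      + spread(-1, cls=1, user=2)
      + spread(+1, cls=2, user=0))
rx = tx + 0.1 * rng.standard_normal(tx.size)

# Single-user detection of (class 0, user 1) via a matched filter on its
# composite code; orthogonality of the hierarchical codes removes the rest.
code = np.kron(class_codes[0], user_codes[1])
estimate = rx @ code / code.size
print("detected symbol for class 0, user 1:", np.sign(estimate))
```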

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
    Comment: 37 pages, 8 figures, 7 tables, submitted for a possible future publication in IEEE Communications Surveys and Tutorials.
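
    As a flavour of the low-complexity Q-learning approach mentioned above, the sketch below applies stateless Q-learning to random-access preamble selection: each device keeps one Q-value per preamble, rewards collision-free attempts, and penalizes collisions, so devices gradually spread over distinct preambles. The device and preamble counts, reward values, and epsilon-greedy policy are illustrative assumptions, not the specific schemes surveyed in the paper.

```python
import random

def q_learning_access(num_devices=10, num_preambles=16, frames=500,
                      alpha=0.1, epsilon=0.1, seed=0):
    """Toy stateless Q-learning for random-access preamble selection.

    Each device keeps one Q-value per preamble, gets a +1 reward for a
    collision-free attempt and -1 for a collision, and picks preambles
    epsilon-greedily, so devices gradually settle on distinct preambles.
    """
    random.seed(seed)
    q = [[0.0] * num_preambles for _ in range(num_devices)]
    successes = 0

    for _ in range(frames):
        # Epsilon-greedy preamble choice for every device.
        choices = []
        for dev in range(num_devices):
            if random.random() < epsilon:
                choices.append(random.randrange(num_preambles))
            else:
                choices.append(max(range(num_preambles), key=lambda p: q[dev][p]))

        # Collision outcome and Q-value update.
        counts = [choices.count(p) for p in range(num_preambles)]
        for dev, a in enumerate(choices):
            reward = 1.0 if counts[a] == 1 else -1.0
            q[dev][a] += alpha * (reward - q[dev][a])
            if counts[a] == 1:
                successes += 1

    return successes / (frames * num_devices)

print("per-attempt success rate:", q_learning_access())
```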

    Narrowband LTE in machine-to-machine satellite communication

    Recent trends toward wireless Machine-to-Machine (M2M) communication and the Internet of Things (IoT) have created a new demand for more efficient low-throughput wireless data connections. Alongside the traditional wireless standards, which focus on high-bandwidth data transfer, a new generation of Low Power Wide Area Networks (LPWAN) has emerged, targeting less power-demanding, low-throughput devices that require inexpensive data connections. The recently released NB-IoT (Narrowband IoT) specification extends the existing 4G/LTE standard, providing more easily accessible LPWAN cellular connectivity for IoT devices. A narrower bandwidth and lower data rates, combined with a simplified air interface, make it less resource demanding while still benefiting from the widely deployed LTE technologies and infrastructure. Applications such as wide-scale sensor or asset-tracking networks can benefit from global network coverage and readily available low-cost user equipment, which new narrowband IoT satellite networks could make possible. In this thesis, the NB-IoT specification and its applicability to satellite communication are discussed. LTE and NB-IoT are designed primarily for terrestrial use, and their utilization in Earth-to-space communication raises new challenges, such as timing and frequency synchronization requirements when using Orthogonal Frequency-Division Multiplexing (OFDM) techniques. Many of these challenges can be overcome by specification adaptations and other existing techniques, making minimal changes to the standard and allowing the extension of terrestrial cellular networks to global satellite access.
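
    A quick back-of-envelope calculation illustrates why terrestrial timing and frequency assumptions break on a satellite link. The numbers below are assumed example values (a sub-GHz carrier, a 600 km LEO orbit, and a worst-case radial velocity of 7.5 km/s), not figures taken from the thesis; they show that the Doppler shift can exceed the NB-IoT subcarrier spacing and that the propagation delay far exceeds typical terrestrial timing-advance ranges.

```python
# Back-of-envelope check (hedged example values) of why terrestrial NB-IoT
# timing and frequency assumptions break on a LEO satellite link.
C = 3.0e8                 # speed of light [m/s]

carrier_hz = 900e6        # assumed NB-IoT carrier in a sub-GHz band
sat_velocity = 7.5e3      # typical LEO orbital speed [m/s], worst-case radial
altitude_m = 600e3        # assumed LEO altitude, zenith pass

doppler_hz = sat_velocity / C * carrier_hz
one_way_delay_s = altitude_m / C

print(f"max Doppler shift: {doppler_hz/1e3:.1f} kHz "
      f"(vs. 15 kHz / 3.75 kHz NB-IoT subcarrier spacing)")
print(f"round-trip delay : {2*one_way_delay_s*1e3:.2f} ms at zenith")
```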

    Narrowband Interference Suppression in Wireless OFDM Systems

    Signal distortions in communication systems occur between the transmitter and the receiver; these distortions normally cause bit errors at the receiver. In addition, interference from other signals may further degrade the performance of the communication link. To achieve reliable communication, the effects of channel distortion and interfering signals must be reduced using appropriate techniques. The aim of this paper is to introduce the fundamentals of Orthogonal Frequency Division Multiplexing (OFDM) and Orthogonal Frequency Division Multiple Access (OFDMA), to review and examine the effects of interference in a digital data communication link, and to explore methods for mitigating or compensating for these effects.
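
    As a small illustration of one mitigation method in this context, the sketch below builds an OFDM symbol, adds a strong narrowband tone, and performs frequency-domain excision: subcarriers whose received power greatly exceeds the median are flagged as interfered and erased. The subcarrier count, interferer frequency, and power threshold are assumed example values; this is a generic excision sketch, not the specific method of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                                            # OFDM subcarriers
bits = rng.integers(0, 2, size=(N, 2))
X = ((2*bits[:, 0] - 1) + 1j*(2*bits[:, 1] - 1)) / np.sqrt(2)   # QPSK mapping

tx = np.fft.ifft(X) * np.sqrt(N)                  # OFDM modulator (IFFT)

# Narrowband interferer: a strong tone near subcarrier 10, plus mild noise.
n = np.arange(N)
rx = (tx + 4.0 * np.exp(2j*np.pi*10.2*n/N)
      + 0.05*(rng.standard_normal(N) + 1j*rng.standard_normal(N)))

Y = np.fft.fft(rx) / np.sqrt(N)                   # OFDM demodulator (FFT)

# Frequency-domain excision: flag subcarriers whose power greatly exceeds
# the median (the bins captured by the narrowband interferer) and erase them.
power = np.abs(Y)**2
flagged = np.flatnonzero(power > 8*np.median(power))
kept = np.setdiff1d(np.arange(N), flagged)

err = np.abs(Y - X)                               # per-subcarrier distortion
print("flagged subcarriers        :", flagged)
print("worst distortion, all bins :", err.max().round(2))
print("worst distortion, kept bins:", err[kept].max().round(2))
```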