6 research outputs found

    Collision-Free Transmissions in an IoT Monitoring Application Based on LoRaWAN

    With the Internet of Things (IoT), the number of deployed monitoring applications is increasing considerably, whatever the field considered: smart cities, smart agriculture, environment monitoring and air pollution monitoring, to name a few. The LoRaWAN (Long Range Wide Area Network) architecture, with its long-range communication, its robustness to interference and its reduced energy consumption, is an excellent candidate to support such applications. However, if the number of end devices is high, the reliability of LoRaWAN, measured by the Packet Delivery Ratio (PDR), becomes unacceptable due to an excessive number of collisions. In this paper, we propose two different families of solutions ensuring collision-free transmissions. The first family is TDMA (Time-Division Multiple Access)-based: all clusters transmit in sequence, and up to six end devices of the same cluster with different spreading factors are allowed to transmit in parallel. The second family is FDMA (Frequency-Division Multiple Access)-based: all clusters transmit in parallel, each cluster on its own frequency, and within each cluster all end devices transmit in sequence. Their performance is compared in terms of PDR, energy consumption per end device and maximum number of end devices supported. Simulation results corroborate the theoretical results and show the high efficiency of the proposed solutions.
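
As a rough illustration of the two families (not the paper's exact algorithms), the sketch below builds both kinds of collision-free schedules under simplifying assumptions: devices are already grouped into clusters, the six spreading factors SF7-SF12 are treated as mutually orthogonal, and every function and variable name is invented for the example.

```python
# Minimal scheduling sketch, not the paper's algorithm: devices are
# pre-grouped into clusters, and one device per spreading factor may
# transmit in the same time slot without colliding.

from collections import defaultdict

SFS = [7, 8, 9, 10, 11, 12]   # six spreading factors usable in parallel

def tdma_schedule(clusters):
    """Clusters transmit in sequence; inside a cluster, up to six devices
    with distinct SFs share each slot. Returns {slot: [(device, sf), ...]}."""
    schedule, slot = {}, 0
    for cluster in clusters:
        for i in range(0, len(cluster), len(SFS)):     # six devices per slot
            schedule[slot] = list(zip(cluster[i:i + len(SFS)], SFS))
            slot += 1
    return schedule

def fdma_schedule(clusters, channels):
    """Clusters transmit in parallel, one channel per cluster; inside a
    cluster, devices transmit in sequence. Returns {slot: [(device, channel), ...]}."""
    schedule = defaultdict(list)
    for cluster, channel in zip(clusters, channels):
        for slot, device in enumerate(cluster):
            schedule[slot].append((device, channel))
    return dict(schedule)

if __name__ == "__main__":
    clusters = [[f"ed{c}-{i}" for i in range(8)] for c in range(3)]
    print(tdma_schedule(clusters))                          # 2 slots per cluster, 6 in total
    print(fdma_schedule(clusters, [868.1, 868.3, 868.5]))   # 8 slots, 3 parallel channels
```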

    Machine Learning Aided Orthogonal Resource Allocation In Heterogeneous Low Power Wide Area Networks

    In recent years, with the development of the IoT, requirements such as long-range communication, ultra-low power consumption and massive simultaneous connectivity have become increasingly important. To meet these demands, LPWAN (Low Power Wide Area Network) standards, typified by LoRaWAN (Long Range Wide Area Network), have been attracting attention. In these standards, simple mechanisms are used at each layer to keep the power consumption of the wireless nodes low. For example, at the MAC layer, a multiple access scheme is adopted in which frequency resources are shared by having each wireless node perform random access autonomously and in a distributed manner, rather than under centralized control. With such simple communication control, a major problem is that packet collisions occur frequently as the number of wireless nodes increases. One solution to this problem is resource allocation that makes effective use of the radio resources. In the LoRaWAN environment, however, only the physical-layer modulation parameter called the spreading factor (SF) is assigned, while for the multiple frequency resources available to the system random hopping is applied, in which each device moves to a randomly chosen frequency at every packet transmission. Furthermore, most existing studies are based on explicit feedback or channel estimation, and such schemes require additional overhead for resource allocation. In addition, when operation in the 920 MHz band is assumed, it is not realistic for a single system to monopolize the band, so multiple systems end up sharing the spectrum; one example is the mutual interference between LoRaWAN and Wi-SUN systems whose operating bands overlap. Thus, in practice, resource allocation that takes the influence of other systems into account is important. In this thesis, we propose an efficient frequency resource allocation scheme based on reinforcement learning for a LoRaWAN environment that uses Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) as the MAC-layer multiple access scheme. In the proposed scheme, reinforcement learning is performed using only information observable at the LoRaWAN fusion center (FC), namely the number of packets correctly received from each LoRaWAN node, so that frequency resources can be allocated efficiently without explicit feedback from the LoRaWAN nodes or processing such as channel estimation. In addition, we propose an interference detection scheme and a radio resource reallocation scheme that can track variations in external inter-system interference, using distribution change detection based on density-ratio estimation. Computer simulations show that the proposed method improves the Packet Delivery Rate (PDR) by about 13% on average compared with conventional methods, that it can detect changes in the state of external interference within at most about three observations, and that it improves the PDR by up to about 10% on average compared with the case where interference detection is not performed.
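
A minimal sketch of the general idea, under assumptions that go beyond the abstract: the fusion center is modelled as a per-node epsilon-greedy bandit that reassigns channels using only the number of packets delivered per node in each round; `reward_fn`, `toy_reward` and all parameters are hypothetical stand-ins for the real gateway counters, and the density-ratio change detector is only indicated as a comment.

```python
# Toy fusion-center-side channel assignment driven only by per-node
# delivered-packet counts (no node feedback, no channel estimation).
# This is an epsilon-greedy bandit, a stand-in for the thesis's learner.

import random

def assign_channels(num_nodes, num_channels, reward_fn,
                    rounds=500, epsilon=0.1, alpha=0.1):
    # q[n][c]: running estimate of delivery success of node n on channel c
    q = [[0.0] * num_channels for _ in range(num_nodes)]
    assignment = [random.randrange(num_channels) for _ in range(num_nodes)]
    for _ in range(rounds):
        for n in range(num_nodes):
            if random.random() < epsilon:                  # explore
                assignment[n] = random.randrange(num_channels)
            else:                                          # exploit
                assignment[n] = max(range(num_channels), key=lambda c: q[n][c])
        delivered = reward_fn(assignment)   # packets received per node this round
        for n in range(num_nodes):
            c = assignment[n]
            q[n][c] += alpha * (delivered[n] - q[n][c])
        # a density-ratio-based change detector could reset q here when the
        # delivered-packet distribution shifts due to external interference
    return assignment

def toy_reward(assignment):
    # crude collision proxy: delivery degrades with the load on each channel
    load = {c: assignment.count(c) for c in set(assignment)}
    return [1.0 / load[c] for c in assignment]

print(assign_channels(num_nodes=12, num_channels=4, reward_fn=toy_reward))
```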

    Performance Evaluation of Class A LoRa Communications

    Recently, Low Power Wide Area Networks (LPWANs) have attracted great interest due to the need to connect more and more devices to the so-called Internet of Things (IoT). This thesis explores LoRa’s suitability and performance within this paradigm, through a theoretical approach as well as through practical data acquired in multiple field campaigns. First, a performance evaluation model of LoRa Class A devices is proposed. The model characterizes the performance of LoRa’s uplink communications, taking both the physical layer (PHY) and the medium access control (MAC) layer into account. By assuming a uniform spatial distribution of the devices, the performance of the PHY layer is characterized through the derivation of the probability of successfully decoding multiple frames that were transmitted with the same spreading factor and at the same time. The MAC performance is evaluated by assuming that the inter-arrival time of the frames generated by each LoRa device is exponentially distributed. A typical LoRaWAN operating scenario is considered, where the transmissions of LoRa Class A devices suffer path loss, shadowing and Rayleigh fading. Numerical results obtained with the modeling methodology are compared with simulation results, and the validation of the proposed model is discussed for different levels of traffic load and PHY-layer conditions. Due to the possibility of capturing multiple frames simultaneously, the maximum achievable performance of the PHY/MAC LoRa scheme according to the signal-to-interference-plus-noise ratio (SINR) is considered. The contribution of this model is primarily focused on studying the average number of successfully received LoRa frames, which establishes a performance upper bound due to the optimal capture condition considered in the PHY layer. In the second stage of this work, a practical LoRa point-to-point network was deployed to characterize LoRa’s performance in practice. Performance was assessed through data collected in the course of several experiments, positioning the transmitter in diverse locations and environments. This work reports statistics of the received packets and different metrics gathered from the physical layer.
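
To make the modelling assumptions concrete, here is a back-of-the-envelope Monte Carlo sketch in the same spirit: Poisson uplink traffic (exponential inter-arrival times), devices placed uniformly in a disc around the gateway, path loss plus Rayleigh fading, same-SF interferers only, and a frame counted as captured when its SINR exceeds a threshold. Every numeric parameter below is an illustrative assumption, not a value taken from the thesis.

```python
# Rough Monte Carlo estimate of the packet delivery ratio under Poisson
# traffic, uniform device placement, path loss, Rayleigh fading and an
# SINR capture threshold. Illustrative parameters only.

import math, random

def pdr_estimate(n_devices=500, radius_m=3000.0, lam=1 / 600.0, airtime_s=0.37,
                 capture_db=6.0, path_loss_exp=3.0, noise_w=1e-12,
                 tx_power_w=0.025, trials=10000):
    def rx_power():
        r = radius_m * math.sqrt(random.random())   # uniform placement in the disc
        fading = random.expovariate(1.0)            # |h|^2 for Rayleigh fading
        return tx_power_w * fading / max(r, 1.0) ** path_loss_exp

    # probability that another device's frame overlaps the reference frame
    # (pure-ALOHA vulnerability window of roughly two airtimes)
    p_overlap = 1.0 - math.exp(-lam * 2.0 * airtime_s)

    captured = 0
    for _ in range(trials):
        k = sum(random.random() < p_overlap for _ in range(n_devices - 1))
        signal = rx_power()
        interference = sum(rx_power() for _ in range(k))
        sinr_db = 10.0 * math.log10(signal / (interference + noise_w))
        captured += sinr_db >= capture_db
    return captured / trials

print(pdr_estimate())
```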

    Low-Power Wide-Area Networks: A Broad Overview of its Different Aspects

    Low-power wide-area networks (LPWANs) are gaining popularity in the research community due to their low power consumption, low cost, and wide geographical coverage. LPWAN technologies complement and outperform short-range and traditional cellular wireless technologies in a variety of applications, including smart city development, machine-to-machine (M2M) communications, healthcare, intelligent transportation, industrial applications, climate-smart agriculture, and asset tracking. This review paper discusses the design objectives and the methodologies used by LPWANs to provide extensive coverage for low-power devices. We also explore how the presented LPWAN architectures employ various topologies such as star and mesh. We examine many current and emerging LPWAN technologies, as well as their system architectures and standards, and evaluate their ability to meet each design objective. In addition, the possible coexistence of LPWAN with other technologies, combining the best attributes of each to provide an optimum solution, is also explored and reported in the current overview. Following that, a comparison of various LPWAN technologies is performed and their market opportunities are also investigated. Furthermore, an analysis of various LPWAN use cases is performed, highlighting their benefits and drawbacks. This aids in the selection of the best LPWAN technology for various applications. Before concluding the work, the open research issues and challenges in designing LPWANs are presented.

    Enabling Technologies for Internet of Things: Licensed and Unlicensed Techniques

    The Internet of Things (IoT) is a novel paradigm which is shaping the evolution of the future Internet. According to the vision underlying the IoT, the next step in increasing the ubiquity of the Internet, after connecting people anytime and everywhere, is to connect inanimate objects. By providing objects with embedded communication capabilities and a common addressing scheme, a highly distributed and ubiquitous network of seamlessly connected heterogeneous devices is formed, which can be fully integrated into the current Internet and mobile networks, thus allowing for the development of new intelligent services available anytime, anywhere, by anyone and anything. Such a vision is also becoming known under the name of Machine-to-Machine (M2M), where the absence of human interaction in the system dynamics is further emphasized. A massive number of wireless devices will have the ability to connect to the Internet through the IoT framework. With the accelerating pace of bringing such a framework to market, the new wireless communications standards are studying and proposing solutions to incorporate the services needed for the IoT. However, with an estimate of 30 billion connected devices, many challenges face current wireless technologies. In our research, we address a variety of technology candidates for enabling such a massive framework. Mainly, we focus on underlay cognitive radio networks as the unlicensed candidate for the IoT. On the other hand, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT in current cellular networks; specifically, we survey the new features and the new user equipment categories added to the physical layer of the LTE-A. In particular, we study the performance of a dual-hop cognitive radio network sharing the spectrum of a primary network in an underlay fashion. The cognitive network consists of a source, a destination, and multiple nodes employed as amplify-and-forward relays. To improve the spectral efficiency, all relays are allowed to instantaneously transmit to the destination over the same frequency band. We present the optimal power allocation that maximizes the received signal-to-noise ratio (SNR) at the destination while satisfying the interference constraints of the primary network. The optimal power allocation is obtained through an eigen-solution of a channel-dependent matrix, and is shown to transform the transmission over the non-orthogonal relays into parallel channels. Furthermore, while the secondary destination is equipped with multiple antennas, we propose an antenna selection scheme to select the antenna with the highest SNR. To this end, we propose a clustering scheme to subgroup the available relays and use antenna selection at the receiver to extract the same diversity order. We show that random clustering causes the system to lose some of the available degrees of freedom. We provide analytical expressions for the outage probability of the system under random clustering and under the proposed maximum-SNR clustering scheme with antenna selection. In addition, we adapt our design to increase the energy efficiency of the overall network without significant loss in the data rate. In the second part of this thesis, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT in current cellular networks.
Specifically, we present the new features and the new user equipment categories added to the physical layer of the LTE-A. We study some of the challenges facing the LTE-A when dealing with Machine Type Communications (MTC). In particular, the MTC Physical Downlink Control Channel (MPDCCH) is among the newly introduced features in the LTE-A and carries the downlink control information (DCI) for MTC devices. Correctly decoding the MPDCCH mainly depends on the channel estimation used to compensate for the channel errors introduced during transmission, and the choice of such a technique affects both the complexity and the performance of the user equipment. We propose and assess the performance of a simple channel estimation technique that depends in essence on the Least Squares (LS) estimates of the pilot signals and linear interpolation, for the low-Doppler channels associated with MTC applications.
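
The first part of the abstract obtains the optimal relay power allocation from an eigen-solution of a channel-dependent matrix. As a hedged illustration of that kind of solution (not the thesis's derivation), the sketch below chooses relay weights as the principal generalized eigenvector that maximises the destination SNR per unit of interference caused to the primary receiver, then scales them to meet a single aggregate interference cap; `h`, `g` and `i_max` are hypothetical names, and per-relay power limits are ignored.

```python
# Illustrative relay weighting for an underlay cognitive relay network:
# maximise |h^H w|^2 / |g^H w|^2 via a generalized eigenvector, then scale
# w so the interference at the primary receiver equals the cap i_max.

import numpy as np
from scipy.linalg import eigh

def relay_weights(h, g, i_max, eps=1e-9):
    h = np.asarray(h, dtype=complex)   # relay -> secondary destination channel
    g = np.asarray(g, dtype=complex)   # relay -> primary receiver channel
    A = np.outer(h, h.conj())                          # numerator:   |h^H w|^2
    B = np.outer(g, g.conj()) + eps * np.eye(len(g))   # denominator: |g^H w|^2
    _, vecs = eigh(A, B)               # generalized eigenvectors, ascending eigenvalues
    w = vecs[:, -1]                    # principal generalized eigenvector
    # scale so that |g^H w|^2 = i_max holds with equality
    return w * np.sqrt(i_max / abs(np.vdot(g, w)) ** 2)

rng = np.random.default_rng(0)
h = rng.normal(size=4) + 1j * rng.normal(size=4)
g = rng.normal(size=4) + 1j * rng.normal(size=4)
w = relay_weights(h, g, i_max=0.1)
print(abs(np.vdot(h, w)) ** 2, abs(np.vdot(g, w)) ** 2)   # received power vs interference
```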
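
For the channel estimator described at the end, the following minimal sketch shows least-squares estimation at the pilot positions followed by linear interpolation across the remaining subcarriers, the kind of low-complexity, low-Doppler estimator the abstract refers to; the pilot spacing, grid size and names are illustrative, not the exact LTE-A/MPDCCH layout.

```python
# Least-squares channel estimates at the pilot positions (H_p = Y_p / X_p),
# then linear interpolation over the remaining subcarriers. Pilot layout
# and sizes are illustrative, not the actual LTE-A resource grid.

import numpy as np

def ls_linear_estimate(rx_grid, pilot_idx, pilot_symbols, n_subcarriers):
    h_pilot = rx_grid[pilot_idx] / pilot_symbols        # LS estimate at pilots
    k = np.arange(n_subcarriers)
    # np.interp handles real data only, so interpolate real and imaginary parts
    return (np.interp(k, pilot_idx, h_pilot.real)
            + 1j * np.interp(k, pilot_idx, h_pilot.imag))

# toy usage: 72 subcarriers (six resource blocks), a pilot every 6th subcarrier
n_sc = 72
pilot_idx = np.arange(0, n_sc, 6)
pilot_sym = np.ones(len(pilot_idx), dtype=complex)        # known pilot values
true_h = np.exp(1j * 2 * np.pi * np.arange(n_sc) / 64.0)  # smooth toy channel
rx_grid = true_h * np.ones(n_sc, dtype=complex)           # noiseless received symbols
h_hat = ls_linear_estimate(rx_grid, pilot_idx, pilot_sym, n_sc)
print(np.max(np.abs(h_hat - true_h)))                     # interpolation error
```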