    LoRa Transmission Parameter Selection

    Low-Power Wide-Area Network (LPWAN) technologies such as Long Range (LoRa) are emerging that enable power-efficient wireless communication over very long distances. LPWAN devices typically communicate directly with a sink node, which removes the need to construct and maintain a complex multi-hop network. However, to ensure efficient and reliable communication, LPWAN devices offer a large number of transmission parameters. For example, a LoRa device can be configured to use different spreading factors, bandwidth settings, coding rates, and transmission powers, resulting in over 6720 possible settings. It is a challenge to determine the setting that minimises transmission energy cost while meeting the required communication performance. This paper is the first to present a thorough analysis of the impact of LoRa transmission parameter selection on communication performance. We study in detail the impact of parameter settings on energy consumption and communication reliability. Using this study, we develop a link probing regime which enables us to quickly determine transmission settings that satisfy performance requirements. The presented work is a first step towards an automated mechanism for LoRa transmission parameter selection, which a deployed LoRa network requires but which is not yet specified within the Long Range Wide Area Network (LoRaWAN) framework.
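    As a rough illustration of the parameter space discussed above, the sketch below enumerates candidate spreading factor, bandwidth, coding rate, and transmission power combinations and ranks them by an energy proxy built on the standard Semtech time-on-air formula. It is not the paper's probing mechanism; the candidate value ranges and the current-draw table are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): enumerate LoRa settings,
# estimate time-on-air with the standard Semtech airtime formula, and rank
# candidate settings by a rough energy proxy. The current-draw figures and
# the candidate parameter ranges below are assumptions for illustration.
import math
from itertools import product

def lora_airtime(payload_bytes, sf, bw_hz, cr, preamble=8,
                 explicit_header=True, crc=True):
    """Time-on-air in seconds for one LoRa frame (Semtech formula)."""
    t_sym = (2 ** sf) / bw_hz                      # symbol duration
    de = 1 if t_sym > 0.016 else 0                 # low-data-rate optimisation
    ih = 0 if explicit_header else 1
    payload_symbols = 8 + max(
        math.ceil((8 * payload_bytes - 4 * sf + 28 + 16 * int(crc) - 20 * ih)
                  / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble + 4.25 + payload_symbols) * t_sym

# Hypothetical TX current draw (mA) per output power (dBm) -- radio dependent.
TX_CURRENT_MA = {2: 24, 5: 27, 8: 31, 11: 37, 14: 44}

def rank_settings(payload_bytes=20, supply_v=3.3):
    """Return all candidate settings sorted by estimated energy per frame."""
    candidates = []
    for sf, bw, cr, tp in product(range(7, 13), (125e3, 250e3, 500e3),
                                  (1, 2, 3, 4), TX_CURRENT_MA):
        toa = lora_airtime(payload_bytes, sf, bw, cr)
        energy_mj = toa * TX_CURRENT_MA[tp] * supply_v   # mA * V * s = mJ
        candidates.append((energy_mj, sf, bw, cr, tp))
    return sorted(candidates)

if __name__ == "__main__":
    for energy, sf, bw, cr, tp in rank_settings()[:5]:
        print(f"SF{sf} BW{int(bw/1e3)}kHz CR4/{cr+4} {tp}dBm -> {energy:.2f} mJ")
```

    A probing regime such as the one described in the paper would then test only the cheapest candidates against the required link reliability, rather than exhaustively trying every setting.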

    Towards the efficient use of LoRa for wireless sensor networks

    Since their inception in 1998 with the Smart Dust project at the University of California, Berkeley, Wireless Sensor Networks (WSNs) have had a tremendous impact on both science and society, influencing many (new) research fields, such as Cyber-Physical Systems (CPS), Machine to Machine (M2M) communication, and the Internet of Things (IoT). In over two decades, WSN researchers have delivered a wide range of hardware, communication protocols, operating systems, and applications to deal with the now classic problems of resource-constrained devices, limited energy sources, and harsh communication environments. However, WSN research has mostly been carried out on the same kind of hardware. With wireless communication and embedded hardware evolving, there are new opportunities to resolve the long-standing issues of scaling, deploying, and maintaining a WSN. To this end, we explore in this work the most recent advances in low-power, long-range wireless communication and the new challenges these communication techniques introduce. Specifically, we focus on the most promising such technology: LoRa. LoRa is a novel low-power, long-range communication technology which promises a single-hop network with millions of sensor nodes. Using practical experiments, we evaluate the unique properties of LoRa, such as orthogonal spreading factors, non-destructive concurrent transmissions, and carrier activity detection. Utilising these unique properties, we build a novel TDMA-style multi-hop Medium Access Control (MAC) protocol called LoRaBlink. Based on empirical results, we develop a communication model and simulator called LoRaSim to explore the scalability of a LoRa network. We conclude that, in its current deployment, LoRa cannot support the scale it is envisioned to operate at. One way to address this scalability issue is Adaptive Data Rate (ADR). We develop two ADR protocols, Probing and Optimistic Probing, and compare them with the de facto standard ADR protocol used in the crowdsourced TTN LoRaWAN network. We demonstrate that our algorithms are more responsive and energy efficient and reach an efficient configuration more quickly; although they settle on a suboptimal configuration for poor links, this is offset by the energy saved through faster convergence. Overall, this work provides theoretical and empirical evidence that LoRa can tackle some of the long-standing problems within WSNs. We envision that future work, in particular on ADR and MAC protocols for LoRa and other low-power, long-range communication technologies, will help push these new communication technologies to mainstream status in WSNs.
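    The probing idea can be pictured with the minimal sketch below: a device starts at the most robust spreading factor and probes progressively faster settings until the link breaks. This is only an illustration of the general approach, not the thesis's Probing or Optimistic Probing algorithms; send_and_wait_ack is a hypothetical callback standing in for a confirmed LoRaWAN uplink.

```python
# Illustrative ADR-style probing loop (an assumption, not the thesis's
# algorithms). `send_and_wait_ack(sf)` is a hypothetical callback that sends
# one confirmed uplink at the given spreading factor and returns True on ACK.
def probe_data_rate(send_and_wait_ack, sf_max=12, sf_min=7, retries=3):
    """Walk from the most robust spreading factor (SF12) towards the fastest
    one (SF7), keeping the last setting whose probe frames were acknowledged."""
    best_sf = None
    for sf in range(sf_max, sf_min - 1, -1):        # SF12, SF11, ..., SF7
        acked = any(send_and_wait_ack(sf) for _ in range(retries))
        if acked:
            best_sf = sf                            # this SF still works
        else:
            break                                   # link broke: stop probing
    return best_sf                                  # None if even SF12 failed
```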

    Predicting LoRaWAN behavior: how machine learning can help

    Large-scale deployments of Internet of Things (IoT) networks are becoming a reality. From a technology perspective, a large amount of information related to device parameters, channel states, and network and application data is stored in databases and can be used for extensive analysis to improve the functionality of IoT systems in terms of network performance and user services. LoRaWAN (Long Range Wide Area Network) is one of the emerging IoT technologies, with a simple protocol based on LoRa modulation. In this work, we discuss whether and how machine learning approaches can be used to improve network performance. To this aim, we describe a methodology to process LoRaWAN packets and apply a machine learning pipeline to: (i) perform device profiling, and (ii) predict the inter-arrival times of IoT packets. This latter analysis is closely related to channel and network usage and can be leveraged in the future for system performance enhancements. Our analysis mainly focuses on the use of k-means, Long Short-Term Memory neural networks, and decision trees. We test these approaches on a real large-scale LoRaWAN network whose overall captured traffic is stored in a proprietary database. Our study shows how profiling techniques enable a machine learning prediction algorithm even when training is not possible because of the high error rates perceived by some devices. In this challenging case, the prediction of packet inter-arrival times has an error of about 3.5% for 77% of the real sequences.
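    As an illustration of the device-profiling step, the sketch below clusters devices by simple traffic features with k-means. It is a stand-in under stated assumptions, not the authors' pipeline; the feature set and cluster count are illustrative.

```python
# Minimal device-profiling sketch (assumption, not the authors' pipeline):
# cluster LoRaWAN devices by simple traffic features with k-means. Each
# resulting profile could then seed a per-cluster inter-arrival-time model.
import numpy as np
from sklearn.cluster import KMeans

def profile_devices(features, n_profiles=4, seed=0):
    """features: array of shape (n_devices, n_features); illustrative columns
    could be [mean inter-arrival time (s), std of inter-arrival time (s),
    mean payload size (B), packet error rate]. Returns a profile label per
    device."""
    km = KMeans(n_clusters=n_profiles, n_init=10, random_state=seed)
    return km.fit_predict(np.asarray(features, dtype=float))

# Example with made-up feature rows for three devices:
labels = profile_devices([[600, 30, 24, 0.02],
                          [3600, 400, 12, 0.15],
                          [610, 25, 24, 0.03]], n_profiles=2)
print(labels)
```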

    The SF12 well in LoRaWAN: problem and end-device-based solutions

    LoRaWAN has become a popular technology for Internet of Things (IoT) device connectivity. One of the expected properties of LoRaWAN is high network scalability. However, LoRaWAN network performance may be compromised when even a relatively small number of devices use link-layer reliability. After a failed frame delivery, such devices typically reduce their physical-layer bit rate by increasing their spreading factor (SF). This reaction increases channel utilization, which may further degrade network performance, even to the point of congestion collapse. When this problem arises, all the devices performing reliable frame transmission end up using SF12 (i.e., the highest SF in LoRaWAN). In this paper, we identify and characterize this network condition, which we call the SF12 Well, in a range of scenarios and by means of extensive simulations. The results show that by using alternative SF-management techniques it is possible to avoid the problem while increasing the packet delivery ratio by up to a factor of 4.7.
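    The mechanism behind the SF12 Well can be pictured with the toy sketch below, which contrasts a default-style reaction (every delivery failure pushes the device towards SF12) with a capped, backoff-based alternative. Neither function reproduces the paper's end-device techniques; the retry threshold and SF cap are illustrative assumptions.

```python
# Toy sketch of the end-device reaction behind the "SF12 Well" and one
# possible alternative; neither is the paper's exact algorithm, and the
# backoff threshold and SF cap below are illustrative assumptions.
def naive_sf_update(sf, delivery_failed):
    """Default-style reaction: every failed confirmed frame raises the SF,
    driving devices towards SF12 and increasing channel utilisation."""
    return min(sf + 1, 12) if delivery_failed else sf

def capped_sf_update(sf, consecutive_failures, cap=10, backoff_after=3):
    """Alternative reaction: only raise the SF after several consecutive
    failures and never beyond a cap, trading some per-frame robustness for
    much lower aggregate airtime."""
    if consecutive_failures >= backoff_after:
        return min(sf + 1, cap)
    return sf
```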

    ChirpOTLE: A Framework for Practical LoRaWAN Security Evaluation

    Low-power wide-area networks (LPWANs) are becoming an integral part of the Internet of Things. As a consequence, businesses, administrations, and, ultimately, society itself depend on the reliability and availability of these communication networks. Released in 2015, LoRaWAN gained popularity and attracted the focus of security research, which revealed a number of vulnerabilities and led to the revised LoRaWAN 1.1 specification in late 2017. Most previous work has focused on simulation and theoretical approaches. Interoperability and the variety of implementations complicate the risk assessment for a specific LoRaWAN network. In this paper, we address these issues by introducing ChirpOTLE, a LoRa and LoRaWAN security evaluation framework suitable for rapid iteration and testing of attacks in testbeds and for assessing the security of real-world networks. We demonstrate the potential of our framework by verifying the applicability of a novel denial-of-service attack targeting the adaptive data rate mechanism in a testbed using common off-the-shelf hardware. Furthermore, we show the feasibility of the Class B beacon spoofing attack, which has not been demonstrated in practice before. (11 pages, 14 figures; accepted at ACM WiSec 2020, the 13th ACM Conference on Security and Privacy in Wireless and Mobile Networks.)