    Optimizing Power Allocation in LoRaWAN IoT Applications

    Long Range Wide Area Network (LoRaWAN) is one of the most promising IoT technologies and is widely adopted for low-power wide-area networks (LPWAN). LoRaWAN faces scalability issues because a large number of nodes connect to the same gateway and share the same channel. LoRa networks therefore pursue two main objectives: a high packet delivery ratio and efficient energy consumption. This paper proposes a novel game-theoretic framework for LoRaWAN, named Best Equal LoRa (BE-LoRa), to jointly optimize the packet delivery ratio and the energy efficiency (bit/Joule). The utility function of a LoRa node is defined as the ratio of its throughput to its transmit power. LoRa nodes act as rational users (players) that seek to maximize their utility. The aim of the BE-LoRa algorithm is to maximize the utility of LoRa nodes while maintaining the same signal-to-interference-and-noise ratio (SINR) for each spreading factor (SF). The power allocation algorithm is implemented at the network server, which leads to optimal SINR, SF, and transmission power settings for all nodes. Numerical and simulation results show that the proposed BE-LoRa power allocation algorithm significantly improves the packet delivery ratio and energy efficiency compared to the Adaptive Data Rate (ADR) algorithm of legacy LoRaWAN. For instance, in very dense networks (624 nodes), BE-LoRa improves the delivery ratio by 17.44% and reduces the power consumed by 46% compared with LoRaWAN ADR.

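    The abstract defines node utility as throughput divided by transmit power (bit/Joule). The sketch below illustrates that quantity only; it is not the authors' code. The step-like packet-success model, the SINR threshold value, and all function and parameter names are assumptions made for illustration, while the nominal LoRa bit-rate formula (SF * BW / 2^SF * CR) is the standard one.

    # Minimal sketch, assuming a step-like packet-success model and an
    # illustrative SINR threshold; the paper's exact utility form is not given here.

    def lora_bit_rate(sf, bandwidth_hz=125e3, coding_rate=4/5):
        """Nominal LoRa bit rate (bit/s) for a given spreading factor."""
        return sf * (bandwidth_hz / (2 ** sf)) * coding_rate

    def packet_success(sinr_db, sinr_threshold_db):
        """Assumed model: delivery succeeds when SINR meets the SF's threshold."""
        return 1.0 if sinr_db >= sinr_threshold_db else 0.0

    def utility(sf, tx_power_w, sinr_db, sinr_threshold_db):
        """Node utility: effective throughput (bit/s) per watt, i.e. bit/Joule."""
        throughput = lora_bit_rate(sf) * packet_success(sinr_db, sinr_threshold_db)
        return throughput / tx_power_w

    # Example: an SF7 node at 14 dBm (~25 mW) meeting an assumed -7.5 dB threshold.
    print(utility(sf=7, tx_power_w=0.025, sinr_db=-5.0, sinr_threshold_db=-7.5))

    Under this reading, raising transmit power beyond what is needed to clear the threshold only lowers utility, which is consistent with the abstract's claim that equalizing SINR per SF saves energy without hurting delivery.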