
    A Platform for Large-Scale Regional IoT Networks

    The Internet of Things (IoT) promises to let everyday objects connect to the Internet and interact with users and other machines ubiquitously. Central to this vision is a pervasive wireless communication network connecting each end device. For individual IoT applications it is costly to deploy a dedicated network or connect to an existing cellular network, especially as these applications do not fully utilize the bandwidth provided by modern high-speed networks (e.g., WiFi, 4G LTE). On the other hand, decades of wireless research have produced numerous low-cost chip radios and effective networking stacks designed for short-range communication in the Industrial, Scientific, and Medical (ISM) radio band. In this thesis, we consider adapting this existing technology to construct shared regional low-power networks using commercially available ISM-band transceivers. To maximize network coverage, we focus on low-power wide-area wireless communication, which enables links that reliably cover 10 km or more, depending on terrain, when transmitting at up to 1 W Equivalent Isotropically Radiated Power (EIRP). With potentially thousands of energy-constrained IoT devices vying for extremely limited bandwidth, minimizing network coordination overhead and maximizing channel utilization is essential. To address these challenges, we propose DQ-N, a MAC protocol based on distributed queueing (DQ). DQ-N exhibits excellent performance, supporting thousands of IoT devices from a single base station. In the future, such networks could accommodate a heterogeneous set of IoT applications, simplifying the IoT application development cycle, reducing total system cost, improving application reliability, and greatly enhancing the user experience.
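    The abstract names distributed queueing only at a high level, so the sketch below is a minimal, illustrative simulation of the generic DQ access mechanism rather than DQ-N itself: collided access requests are split into subgroups tracked in a collision resolution queue (CRQ), while successful requests line up in a data transmission queue (DTQ), so the data slot itself is never contended. The function name simulate_dq, the value of M_SLOTS, and the one-data-slot-per-frame framing are assumptions made for illustration.

```python
import random

M_SLOTS = 3  # contention minislots per frame (illustrative value)

def simulate_dq(num_devices, frames):
    """Toy frame-by-frame simulation of distributed queueing (DQ):
    collided access requests are split into subgroups resolved via a
    collision resolution queue (CRQ); successful requests wait in a
    data transmission queue (DTQ), keeping data slots collision-free."""
    waiting = list(range(num_devices))  # devices with a pending packet
    crq, dtq = [], []                   # CRQ holds subgroups, DTQ holds devices
    delivered = 0
    for _ in range(frames):
        # The subgroup at the head of the CRQ contends; if the CRQ is
        # empty, all newly arrived devices contend instead.
        if crq:
            contenders = crq.pop(0)
        else:
            contenders, waiting = waiting, []
        slots = {}
        for dev in contenders:
            slots.setdefault(random.randrange(M_SLOTS), []).append(dev)
        for devs in slots.values():
            if len(devs) == 1:
                dtq.append(devs[0])     # success: queue up for a data slot
            else:
                crq.append(devs)        # collision: re-queue as a subgroup
        if dtq:
            dtq.pop(0)                  # one collision-free data slot per frame
            delivered += 1
    return delivered

print(simulate_dq(num_devices=1000, frames=5000))
```

    Because data transmissions are scheduled from the DTQ rather than contended for, the data channel stays near one packet per frame even as the device count grows, which is the scaling property DQ-based protocols aim for.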

    Achieving Max-Min Throughput in LoRa Networks

    With growing popularity, LoRa networks are pivotally enabling long-range connectivity for low-cost, power-constrained user equipments (UEs). Due to the wide coverage area, a critical issue is to allocate wireless resources effectively so as to support a potentially massive number of UEs in the cell while resolving the prominent near-far fairness problem for cell-edge UEs. This is challenging to address because the LoRa network lacks a tractable analytical model and, in practice, requires low-complexity, low-overhead designs. To achieve massive connectivity with fairness, we investigate the problem of maximizing the minimum throughput of all UEs in the LoRa network by jointly designing high-level policies for spreading factor (SF) allocation, power control, and duty cycle adjustment, based only on average channel statistics and the spatial UE distribution. By leveraging the Poisson rain model, with modifications tailored to the considered LoRa network, we account for channel fading, aggregate interference, and accurate packet overlapping, and still obtain a tractable yet accurate closed-form formula for the packet success probability and hence the throughput. We further propose an iterative balancing (IB) method to allocate the SFs in the cell such that the overall max-min throughput is achieved within the considered time period and cell area. Numerical results show that the proposed scheme with optimized design greatly alleviates the near-far fairness issue and significantly improves the cell-edge throughput.
    Comment: 6 pages, 4 figures; published in Proc. International Conference on Computing, Networking and Communications (ICNC), 2020. This paper proposes a stochastic-geometry-based analytical framework for a single-cell LoRa network, with joint optimization to achieve max-min throughput for the users. Extended journal version for a large-scale multi-cell LoRa network: arXiv:2008.0743
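    The iterative balancing method is only named in the abstract, so the following is a toy sketch of the general max-min balancing idea under strong simplifying assumptions (collision-free equal airtime sharing per SF, no fading or interference, standard 125 kHz bandwidth / coding rate 4/5 LoRa data rates): repeatedly move a UE out of the bottleneck SF whenever some feasible target SF can give it more than the current minimum throughput. The names iterative_balance and per_user_throughput and the feasibility model are hypothetical, not the paper's algorithm.

```python
# Approximate LoRa data rates (kbps) per spreading factor at 125 kHz
# bandwidth and coding rate 4/5.
RATE = {7: 5.47, 8: 3.13, 9: 1.76, 10: 0.98, 11: 0.54, 12: 0.29}

def per_user_throughput(alloc):
    """Per-user throughput if the UEs on each SF share its airtime
    equally (crude collision-free model, no fading or interference)."""
    return {sf: RATE[sf] / len(ues) for sf, ues in alloc.items() if ues}

def iterative_balance(feasible, max_steps=10_000):
    """Toy iterative balancing: start every UE on its fastest feasible
    SF, then repeatedly move one UE out of the bottleneck SF into a
    feasible SF with more headroom while that raises the minimum."""
    alloc = {sf: [] for sf in RATE}
    for ue, sfs in feasible.items():
        alloc[min(sfs)].append(ue)      # lowest SF = fastest feasible rate
    for _ in range(max_steps):
        tput = per_user_throughput(alloc)
        bottleneck = min(tput, key=tput.get)
        best = None
        for ue in alloc[bottleneck]:
            for sf in feasible[ue]:
                if sf == bottleneck:
                    continue
                # per-user throughput on the target SF after accepting ue
                gain = RATE[sf] / (len(alloc[sf]) + 1)
                if gain > tput[bottleneck] and (best is None or gain > best[2]):
                    best = (ue, sf, gain)
        if best is None:
            break                        # the minimum can no longer improve
        ue, sf, _ = best
        alloc[bottleneck].remove(ue)
        alloc[sf].append(ue)
    return alloc

# Example: near UEs (0-49) can use any SF; far UEs (50-59) need SF >= 10.
feasible = {ue: range(7, 13) for ue in range(50)}
feasible.update({ue: range(10, 13) for ue in range(50, 60)})
print({sf: len(ues) for sf, ues in iterative_balance(feasible).items()})
```

    In this toy model the minimum per-user throughput never decreases across accepted moves, mirroring the max-min objective; the paper's actual IB method instead operates on its closed-form packet success probability rather than this crude sharing model.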

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big-data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CRs), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services and scenarios in future wireless networks.
    Comment: 46 pages, 22 figures
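    As one concrete instance of the reinforcement learning techniques such surveys cover for cognitive radios, here is a small, self-contained sketch of stateless (bandit-style) Q-learning for dynamic channel selection; the scenario, constants, and the function name learn_channel are illustrative and not drawn from the article.

```python
import random

NUM_CHANNELS = 4
EPSILON, ALPHA = 0.1, 0.05   # exploration rate and learning rate

def learn_channel(free_prob, rounds=20_000):
    """Toy stateless Q-learning for channel selection: the secondary
    radio learns by trial and error which channel is most often free
    of primary users, receiving reward 1 for an idle channel, 0 else."""
    q = [0.0] * NUM_CHANNELS
    for _ in range(rounds):
        if random.random() < EPSILON:               # explore a random channel
            a = random.randrange(NUM_CHANNELS)
        else:                                       # exploit the best estimate
            a = max(range(NUM_CHANNELS), key=lambda c: q[c])
        reward = 1.0 if random.random() < free_prob[a] else 0.0
        q[a] += ALPHA * (reward - q[a])             # incremental value update
    return q

# Channel 2 is idle 90% of the time; its Q-value should end up highest.
print(learn_channel([0.2, 0.5, 0.9, 0.4]))
```

    Epsilon-greedy exploration keeps sampling seldom-used channels, so each Q-value converges toward its channel's idle probability while the radio mostly exploits the best channel found so far.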