17 research outputs found

    Deep Reinforcement Learning for Real-Time Optimization in NB-IoT Networks

    NarrowBand-Internet of Things (NB-IoT) is an emerging cellular-based technology that offers a range of flexible configurations for massive IoT radio access from groups of devices with heterogeneous requirements. A configuration specifies the amount of radio resource allocated to each group of devices for random access and for data transmission. Assuming no knowledge of the traffic statistics, an important challenge is how to determine, in an online fashion, the configuration that maximizes the long-term average number of served IoT devices at each Transmission Time Interval (TTI). Given the complexity of searching for the optimal configuration, we first develop real-time configuration selection based on tabular Q-learning (tabular-Q), Linear Approximation based Q-learning (LA-Q), and Deep Neural Network based Q-learning (DQN) in the single-parameter single-group scenario. Our results show that the proposed reinforcement learning based approaches considerably outperform the conventional heuristic approach based on load estimation (LE-URC) in terms of the number of served IoT devices, and that LA-Q and DQN are good alternatives to tabular-Q, achieving almost the same performance with much less training time. We further advance LA-Q and DQN via Actions Aggregation (AA-LA-Q and AA-DQN) and via Cooperative Multi-Agent learning (CMA-DQN) for the multi-parameter multi-group scenario, thereby solving the problem that Q-learning agents do not converge in high-dimensional configurations. In this scenario, the advantage of the proposed Q-learning approaches over the conventional LE-URC approach grows significantly as the configuration dimension increases, and CMA-DQN outperforms the other approaches in both throughput and training efficiency.
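
    To make the approach concrete, the following is a minimal sketch of the tabular-Q idea in the single-parameter single-group setting: an agent picks a configuration each TTI and is rewarded with the number of served devices. The state and action sizes, the toy environment, and all constants are hypothetical placeholders, not the paper's implementation.

```python
# Minimal tabular Q-learning sketch for online configuration selection.
# Everything here (environment, sizes, constants) is illustrative only.
import numpy as np

N_CONFIGS = 8          # candidate configurations (single parameter, single group)
N_STATES = 16          # e.g. quantized observation of recent traffic/backlog
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

Q = np.zeros((N_STATES, N_CONFIGS))

def step(state, config):
    """Hypothetical environment: returns (served_devices, next_state)."""
    served = np.random.poisson(lam=5 + config)     # stand-in for a real NB-IoT simulator
    next_state = np.random.randint(N_STATES)
    return served, next_state

state = 0
for tti in range(10_000):
    # epsilon-greedy selection of the configuration for this TTI
    if np.random.rand() < EPS:
        config = np.random.randint(N_CONFIGS)
    else:
        config = int(np.argmax(Q[state]))
    reward, next_state = step(state, config)
    # standard Q-learning update toward the long-term number of served devices
    Q[state, config] += ALPHA * (reward + GAMMA * Q[next_state].max() - Q[state, config])
    state = next_state
```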

    Modeling, Analysis, and Optimization of Grant-Free NOMA in Massive MTC via Stochastic Geometry

    Massive machine-type communications (mMTC) is a crucial scenario for supporting booming Internet of Things (IoT) applications. In mMTC, although a large number of devices are registered to an access point (AP), very few of them are active with uplink short-packet transmission at the same time, which calls for novel designs of protocols and receivers to enable efficient data transmission and accurate multi-user detection (MUD). To address this problem, the grant-free non-orthogonal multiple access (GF-NOMA) protocol has been proposed. In GF-NOMA, active devices directly transmit their preambles and data symbols together within one time frame, without a grant from the AP. Compressive sensing (CS)-based receivers are adopted for non-orthogonal preamble (NOP)-based MUD, and successive interference cancellation is exploited to decode the superimposed data signals. In this paper, we model, analyze, and optimize the CS-based GF-NOMA mMTC system via stochastic geometry (SG) from the perspective of network deployment. Based on the SG network model, we first analyze the success probability as well as the channel estimation error of the CS-based MUD in the preamble phase, and then analyze the average aggregate data rate in the data phase. As IoT applications demand low energy consumption, low infrastructure cost, and flexible deployment, we optimize the energy efficiency and AP coverage efficiency of GF-NOMA via numerical methods. The validity of our analysis is verified via Monte Carlo simulations. Simulation results also show that CS-based GF-NOMA with NOP yields better MUD and data rate performance than contention-based GF-NOMA with orthogonal preambles and than CS-based grant-free orthogonal multiple access.
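
    As a rough illustration of CS-based multi-user detection with non-orthogonal preambles (not the paper's receiver), the sketch below uses orthogonal matching pursuit to recover which of many registered devices are active from a single superimposed observation; all dimensions and the preamble matrix are invented.

```python
# Compressive-sensing-style activity detection: few active users, many registered.
import numpy as np

N_DEVICES, PREAMBLE_LEN, N_ACTIVE = 200, 64, 6
rng = np.random.default_rng(0)

A = rng.normal(size=(PREAMBLE_LEN, N_DEVICES)) / np.sqrt(PREAMBLE_LEN)  # non-orthogonal preambles
x = np.zeros(N_DEVICES)
active = rng.choice(N_DEVICES, N_ACTIVE, replace=False)
x[active] = rng.normal(size=N_ACTIVE) + 1.0        # effective gains of the active devices
y = A @ x + 0.01 * rng.normal(size=PREAMBLE_LEN)   # received superimposed preamble signal

def omp(A, y, k):
    """Greedy orthogonal matching pursuit: pick the k columns of A that best explain y."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    return sorted(support)

print("true active devices:", sorted(active.tolist()))
print("detected devices:   ", omp(A, y, N_ACTIVE))
```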

    Analysis of Random Access in NB-IoT Networks with Three Coverage Enhancement Groups: A Stochastic Geometry Approach

    NarrowBand-Internet of Things (NB-IoT) is a new 3GPP radio access technology designed to provide better coverage for Low Power Wide Area (LPWA) networks. To provide reliable connections with extended coverage, a repetition transmission scheme and up to three Coverage Enhancement (CE) groups are introduced into NB-IoT for both the Random Access CHannel (RACH) procedure and the data transmission procedure, where each CE group is configured with different repetition values and transmission resources. To characterize the RACH performance of an NB-IoT network with three CE groups, this paper develops a novel traffic-aware spatio-temporal model to analyze the RACH success probability, where both the preamble transmission outage and the collision events of each CE group jointly determine the traffic evolution and the RACH success probability. Based on this analytical model, we derive analytical expressions for the RACH success probability of a randomly chosen IoT device in each CE group over multiple time slots under different RACH schemes, including the baseline, back-off (BO), access class barring (ACB), and hybrid ACB and BO (ACB&BO) schemes. Our results show that the RACH success probabilities of devices in the three-CE-group network can exceed those of a single-CE-group network, though not for every group; the outcome depends on the choice of the parameters used to categorize devices into CE groups. This mathematical model and analytical framework can be applied to evaluate the performance of multi-group users in other networks with spatial separation.
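
    The collision component of this model can be illustrated with a toy Monte Carlo experiment: in each CE group, active devices pick a preamble uniformly at random, and a tagged device succeeds only if its preamble is not chosen by anyone else and no transmission outage occurs. The group sizes, preamble pool sizes, and outage probabilities below are made up for illustration and are not the paper's parameters.

```python
# Toy per-slot RACH success estimate for three hypothetical CE groups.
import numpy as np

rng = np.random.default_rng(1)
groups = {                     # (active devices, preambles, P[no transmission outage]) per CE group
    "CE0": (40, 48, 0.95),
    "CE1": (25, 24, 0.90),
    "CE2": (10, 12, 0.80),
}

def rach_success_prob(n_active, n_preambles, p_tx, trials=20_000):
    ok = 0
    for _ in range(trials):
        picks = rng.integers(n_preambles, size=n_active)
        # tagged device is index 0: success = unique preamble choice AND no outage
        collided = np.any(picks[1:] == picks[0])
        ok += (not collided) and (rng.random() < p_tx)
    return ok / trials

for name, (n, m, p) in groups.items():
    print(name, round(rach_success_prob(n, m, p), 3))
```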

    Spectrum Sharing for Massive Access in Ultra-Narrowband IoT Systems

    Ultra-narrowband (UNB) communication has become a signature feature of many emerging low-power wide-area (LPWA) networks. Specifically, using extremely narrowband signals helps the network connect more Internet of Things (IoT) devices within a given band. It also improves robustness to interference, extending the coverage of the network. In this paper, we study the coexistence capability of UNB networks and their scalability for massive access. To this end, we develop a stochastic geometry framework to analyze and model large-scale UNB networks. The framework captures the unique characteristics of UNB communications, including asynchronous time-frequency access, signal repetition, and the absence of base station (BS) association. Closed-form expressions for the transmission success probability and network connection density are presented for several UNB protocols. We further discuss multiband access for UNB networks and propose a low-complexity protocol. Our analysis reveals several insights on the geographical diversity achieved when devices are not tied to a single BS, the optimal number of signal repetitions, and how to utilize multiple bands without increasing BS complexity. Simulation results validate the analysis and show that UNB communications enable a single BS to connect thousands of devices even when the spectrum is shared with other networks.
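
    The repetition trade-off mentioned above can be sketched with a crude slotted-ALOHA-style approximation (a stand-in for, not a result of, the stochastic geometry analysis): each repetition is lost if it overlaps another transmission, while more repetitions also raise the offered load, so an intermediate repetition count can maximize the per-message success probability. The load value and vulnerability window below are invented.

```python
# Crude illustration of why an intermediate number of repetitions can be optimal.
import numpy as np

device_density = 0.02      # offered load per repetition slot (hypothetical)
for R in range(1, 9):
    load = device_density * R                     # repetitions multiply the offered load
    p_rep_ok = np.exp(-2 * load)                  # ALOHA-like two-slot vulnerability window
    p_msg_ok = 1 - (1 - p_rep_ok) ** R            # message succeeds if any repetition survives
    print(f"R={R}: per-repetition {p_rep_ok:.3f}, per-message {p_msg_ok:.3f}")
```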

    A Resource Allocation Scheme for Packet Delay Minimization in Multi-Tier Cellular-Based IoT Networks

    With advances in Internet of Things (IoT) technologies, billions of devices are becoming connected, enabling unprecedented sensing and control of physical environments. IoT devices have diverse quality of service (QoS) requirements, including data rate, latency, reliability, and energy consumption. Meeting these diverse QoS requirements presents great challenges to existing fifth-generation (5G) cellular networks, especially in new scenarios such as connected vehicle networks, where strict packet latency requirements must be met. In this paper, we propose a multi-tier cellular-based IoT network to address this challenge, with a particular focus on meeting application latency requirements. In the multi-tier network, access points (APs) can relay and forward packets from IoT devices or from other APs, supporting higher data rates over multiple hops between IoT devices and cellular base stations. However, multi-hop relaying may introduce additional delay, which is critical for delay-sensitive applications, so we develop new schemes to mitigate this adverse impact. First, we design a traffic-prioritization scheduling scheme that classifies packets into different priorities at each AP based on the age of information (AoI). We then design different channel-access protocols for the transmission of packets according to their priorities, to ensure QoS and the effective utilization of limited network resources. A queuing-theory-based model is proposed to analyze the packet delay for each type of packet at each tier of the multi-tier IoT network. An optimal algorithm for allocating spectrum and power resources is developed to reduce the overall packet delay across the tiers. Numerical results for a two-tier cellular-based IoT network show that the target packet delay for delay-sensitive applications can be achieved without a large cost in terms of traffic fairness.
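
    A minimal sketch of the AoI-based prioritization idea (with invented sources and threshold, not the paper's scheduler): each queued packet is scored by the current age of information of its source, and sources whose AoI exceeds a threshold are promoted to the delay-sensitive class.

```python
# AoI per source = time since the generation of the freshest delivered update.
last_delivered_gen = {"vehicle-7": 0.2, "meter-3": 4.0}   # hypothetical sources (seconds)
AOI_THRESHOLD = 1.0                                        # hypothetical cut-off (seconds)

def priority(source: str, now: float) -> str:
    aoi = now - last_delivered_gen.get(source, 0.0)        # current age of information
    return "high" if aoi > AOI_THRESHOLD else "low"

now = 5.0
for source in ["vehicle-7", "meter-3"]:
    aoi = now - last_delivered_gen[source]
    print(f"{source}: AoI = {aoi:.1f}s -> {priority(source, now)} priority")
```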