
    Design and Performance Analysis of Next Generation Heterogeneous Cellular Networks for the Internet of Things

    The Internet of Things (IoT) is a system of interconnected computing devices, objects, and mechanical and digital machines, together with the communications between these devices/objects and other Internet-enabled systems. Scalable, reliable, and energy-efficient IoT connectivity will bring huge benefits to society, especially in transportation, connected self-driving vehicles, healthcare, education, smart cities, and smart industries. The objective of this dissertation is to model and analyze the performance of large-scale heterogeneous two-tier IoT cellular networks and to offer design insights that maximize their performance. Using stochastic geometry, we develop realistic yet tractable models to study the performance of such networks. In particular, we propose solutions to the following research problems:
    - We propose a novel analytical model to estimate the mean uplink device data rate utility function under two spectrum allocation schemes, full spectrum reuse (FSR) and orthogonal spectrum partition (OSP), for uplink two-hop IoT networks. We develop constrained gradient-ascent optimization algorithms to obtain the optimal aggregator association bias (for the FSR scheme) and the jointly optimal spectrum partition ratio and aggregator association bias (for the OSP scheme).
    - We study the performance of two-tier IoT cellular networks in which one tier operates in the traditional sub-6 GHz spectrum and the other in the millimeter wave (mm-wave) spectrum. In particular, we characterize the meta distributions of the downlink signal-to-interference ratio (sub-6 GHz spectrum), the signal-to-noise ratio (mm-wave spectrum), and the data rate of a typical device in such a hybrid spectrum network. Finally, we obtain the meta distributions of the SIR/SNR and data rate of a typical device by substituting the moments of the conditional success probability (CSP) of a user device into the Gil-Pelaez inversion theorem.
    - We propose splitting the control plane (C-plane) and user plane (U-plane) as a potential solution to harvest the densification gain in heterogeneous two-tier networks while minimizing the handover rate and network control overhead. We develop a tractable mobility-aware model for a two-tier downlink cellular network with high-density small cells and a C-plane/U-plane split architecture. The developed model is then used to quantify the effect of mobility on the foreseen densification gain with and without C-plane/U-plane splitting.
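
    Regarding the Gil-Pelaez step in the second item above, the following is a minimal sketch of the standard form of that inversion (an assumed textbook formulation, not an excerpt from the dissertation): the complementary CDF of the conditional success probability P_s is recovered from its imaginary moments M_{jt}.

        \bar{F}_{P_s}(x) \;=\; \Pr\left[P_s > x\right]
        \;=\; \frac{1}{2} + \frac{1}{\pi}\int_{0}^{\infty}
        \frac{\Im\left\{ e^{-j t \ln x}\, M_{jt} \right\}}{t}\,\mathrm{d}t,
        \qquad M_{jt} \;=\; \mathbb{E}\left[P_s^{\,jt}\right].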

    Fine-grained performance analysis of massive MTC networks with scheduling and data aggregation

    The Internet of Things (IoT) represents a substantial shift within wireless communication and constitutes a relevant topic of social, economic, and overall technical impact. It refers to resource-constrained devices communicating with little or no human intervention. However, communication among machines imposes several challenges compared to traditional human type communication (HTC). Moreover, as the number of devices increases exponentially, different network management techniques and technologies are needed. Data aggregation is an efficient approach to handle the congestion introduced by a massive number of machine type devices (MTDs). The aggregators not only collect data but also implement scheduling mechanisms to cope with scarce network resources. This thesis provides an overview of the most common IoT applications and the network technologies to support them. We describe the most important challenges in machine type communication (MTC). We use a stochastic geometry (SG) tool known as the meta distribution (MD) of the signal-to-interference ratio (SIR), i.e., the distribution of the conditional SIR distribution given the wireless nodes’ locations, to provide a fine-grained description of the per-link reliability. Specifically, we analyze the performance of two scheduling methods for data aggregation in MTC: random resource scheduling (RRS) and channel-aware resource scheduling (CRS). The results show the fraction of users in the network that achieves a target reliability, which is an important aspect to consider when designing wireless systems with stringent service requirements. Finally, we investigate how increasing the aggregator density affects the fraction of MTDs that communicate with a target reliability.
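
    To make the per-link reliability notion concrete, below is an illustrative Monte Carlo sketch in Python (not taken from the thesis; the bipolar network model and all parameter values are assumptions) that estimates each link's conditional success probability given the node locations and then the fraction of links meeting a target reliability, i.e., one point of the SIR meta distribution.

        import numpy as np

        rng = np.random.default_rng(0)

        # Assumed toy parameters (not from the thesis)
        area = 1.0          # square side [km]
        density = 50        # transmitters per km^2
        link_dist = 0.05    # desired link distance [km]
        alpha = 4.0         # path-loss exponent
        theta = 1.0         # SIR threshold (0 dB)
        target_rel = 0.9    # per-link reliability target
        n_fading = 2000     # fading realizations per network snapshot

        # One snapshot of transmitter locations (Poisson point process)
        n_tx = rng.poisson(density * area**2)
        tx = rng.uniform(0, area, size=(n_tx, 2))
        # Each transmitter has a receiver at distance link_dist in a random direction
        ang = rng.uniform(0, 2 * np.pi, n_tx)
        rx = tx + link_dist * np.c_[np.cos(ang), np.sin(ang)]

        # Distances from every transmitter to every receiver: d[i, j] = |rx_i - tx_j|
        d = np.linalg.norm(rx[:, None, :] - tx[None, :, :], axis=2)

        cond_success = np.zeros(n_tx)
        for _ in range(n_fading):
            h = rng.exponential(1.0, size=(n_tx, n_tx))   # Rayleigh fading power gains
            p = h * d ** (-alpha)                         # received powers
            sig = np.diag(p)                              # desired signal at each receiver
            interf = p.sum(axis=1) - sig                  # all other transmitters interfere
            cond_success += (sig / interf > theta)
        cond_success /= n_fading  # per-link success probability given the locations

        # Fraction of links meeting the target reliability (one point of the meta distribution)
        print("fraction of links with reliability >", target_rel, ":",
              np.mean(cond_success >= target_rel))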

    Channel Access Management for Massive Cellular IoT Applications

    As part of the steps taken towards improving the quality of life, many everyday activities as well as technological advancements rely more and more on smart devices. In the future, it is expected that every electric device will be a smart device that can be connected to the Internet. This gives rise to a new network paradigm known as massive cellular IoT, where a large number of simple, battery-powered, heterogeneous devices collectively work for the betterment of humanity in all aspects. However, different from traditional cellular communication networks, IoT applications produce uplink-heavy data traffic composed of a large number of small data packets with different quality of service (QoS) requirements. These unique characteristics pose a challenge to the current cellular channel access process and, hence, new and revolutionary access mechanisms are much needed. These access mechanisms need to be cost-effective, scalable, practical, energy and radio resource efficient, and able to support a massive number of devices. Furthermore, due to the low computational capabilities of the devices, they cannot handle heavy networking intelligence and, thus, the designed channel access should be simple and lightweight. Accordingly, in this research, we evaluate the suitability of the current channel access mechanism for massive applications and propose an energy-efficient and resource-preserving clustering and data aggregation solution tailored to the needs of future IoT applications. First, we recognize that for many anticipated cellular IoT applications, providing energy-efficient and delay-aware access is crucial. However, in cellular networks, before devices transmit their data, they use a contention-based association protocol, known as the random access channel (RACH) procedure, which introduces extensive access delays and energy wastage as the number of contending devices increases. Modeling the performance of the RACH protocol is a challenging task due to the complexity of uplink transmission, which exhibits a wide range of interference components; nonetheless, it is an essential process that helps determine the applicability of the cellular IoT communication paradigm and sheds light on the main challenges. Consequently, we develop a novel mathematical framework based on stochastic geometry to evaluate the RACH protocol and identify its limitations in the context of cellular IoT applications with a massive number of devices. To do so, we study the traditional cellular association process and establish a mathematical model for its association success probability. The model accounts for device density, spatial characteristics of the network, the employed power control, and mutual interference among the devices. Our analysis and results highlight the shortcomings of the RACH protocol and give insight into the potential benefits of employing power control techniques. Second, based on the analysis of the RACH procedure, we determine that, as the number of devices increases, the contention over the limited network radio resources increases, leading to network congestion. Accordingly, to avoid network congestion while supporting a large number of devices, we propose to use node clustering and data aggregation.
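
    As a rough illustration of the kind of RACH contention analyzed here, the snippet below is a toy Monte Carlo sketch in Python; the preamble count, fractional power control rule, and thresholds are assumptions for illustration, not the dissertation's model. It shows how the association success probability emerges from devices contending over a shared set of preambles under power control and mutual interference.

        import numpy as np

        rng = np.random.default_rng(1)

        # Assumed toy parameters (illustrative only)
        n_devices = 500       # contending MTDs in one cell
        n_preambles = 54      # RACH preambles available for contention
        cell_radius = 0.5     # km
        alpha = 3.8           # path-loss exponent
        epsilon = 0.8         # fractional power control factor
        sinr_thresh = 1.0     # detection threshold (0 dB)
        noise = 1e-3          # normalized noise power

        # Devices uniformly distributed in the cell; fractional power control
        # partially inverts the path loss toward the serving base station.
        r = cell_radius * np.sqrt(rng.uniform(size=n_devices))
        tx_power = r ** (alpha * epsilon)            # P = d^(alpha*epsilon), normalized
        rx_power = tx_power * r ** (-alpha)          # received power at the base station
        rx_power *= rng.exponential(1.0, n_devices)  # Rayleigh fading

        # Each device picks a preamble at random; devices sharing a preamble interfere.
        choice = rng.integers(0, n_preambles, n_devices)
        success = np.zeros(n_devices, dtype=bool)
        for k in range(n_preambles):
            idx = np.where(choice == k)[0]
            if idx.size == 0:
                continue
            total = rx_power[idx].sum()
            sinr = rx_power[idx] / (total - rx_power[idx] + noise)
            success[idx] = sinr > sinr_thresh
        print("RACH association success probability:", success.mean())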
As the number of supported devices increases and their QoS requirements become diverse, optimizing the node clustering and data aggregation processes becomes critical in order to handle the many trade-offs that arise among different network performance metrics. Furthermore, for cost effectiveness, we propose that the data aggregator nodes be cellular devices; it is therefore desirable to keep the number of aggregators to a minimum so that the RACH channel is not congested, while maximizing the number of successfully supported devices. Consequently, to tackle these issues, we explore the possibility of combining data aggregation and non-orthogonal multiple access (NOMA), and we propose a novel two-hop NOMA-enabled network architecture. Concepts from queuing theory and stochastic geometry are jointly exploited to derive mathematical expressions for different network performance metrics such as coverage probability, two-hop access delay, and the number of served devices per transmission frame. The established models characterize relations among various network metrics and hence facilitate the design of the two-stage transmission architecture. Numerical results demonstrate that the proposed solution improves the overall access delay and energy efficiency compared to traditional OMA-based clustered networks. Last, we recognize that under the proposed two-hop network architecture, devices are subject to access point association decisions, i.e., the access point with which a device associates plays a major role in determining the overall network performance and the service perceived by the devices. Accordingly, in the third part of the work, we optimize the two-hop network from the point of view of user association such that the number of QoS-satisfied devices is maximized while the overall device energy consumption is minimized. We formulate the problem as a joint access point association, resource utilization, and energy-efficient communication optimization problem that takes into account various networking factors such as the number of devices, the number of data aggregators, the number of available resource units, interference, the transmission power limitations of the devices, aggregator transmission performance, and channel conditions. The objective is to show the usefulness of data aggregation and shed light on the importance of network design when the number of devices is massive. We propose a coalition game theory based algorithm, PAUSE, that transforms the optimization problem into a simpler form solvable in polynomial time. Different network scenarios are simulated to showcase the effectiveness of PAUSE and to draw observations on cost-effective, data aggregation enabled two-hop network design.
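
    For intuition on the NOMA aggregation step discussed above, the following is a minimal Python sketch of uplink power-domain NOMA with successive interference cancellation (SIC) at a single aggregator; the decoding order, thresholds, and parameters are assumptions for illustration rather than the work's exact two-hop model.

        import numpy as np

        rng = np.random.default_rng(2)

        # Assumed toy parameters (illustrative only)
        n_mtds = 4            # devices multiplexed on the same resource unit
        alpha = 3.5           # path-loss exponent
        sinr_thresh = 0.5     # decoding threshold
        noise = 1e-2

        d = rng.uniform(0.01, 0.2, n_mtds)        # device-aggregator distances [km]
        h = rng.exponential(1.0, n_mtds)          # Rayleigh fading power gains
        rx = h * d ** (-alpha)                    # received powers at the aggregator

        # SIC: decode devices from strongest to weakest received power; each
        # successfully decoded signal is removed from the interference seen by
        # the remaining devices.
        order = np.argsort(rx)[::-1]
        remaining = rx.sum()
        decoded = []
        for i in order:
            interference = remaining - rx[i]
            if rx[i] / (interference + noise) > sinr_thresh:
                decoded.append(i)
                remaining -= rx[i]     # cancel the decoded signal
            # if decoding fails, the signal stays as interference for later devices
        print("devices decoded on this resource unit:", len(decoded), "of", n_mtds)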

    Massive Machine Type Communication with Data Aggregation and Resource Scheduling

    To enable massive machine type communication (mMTC), data aggregation is a promising approach to reduce the congestion caused by a massive number of machine type devices (MTDs). In this paper, we consider a two-phase cellular-based mMTC network, where MTDs transmit to aggregators (i.e., the aggregation phase) and the aggregated data is then relayed to base stations (i.e., the relaying phase). Due to the limited resources, the aggregators not only aggregate data but also schedule resources among MTDs. We consider two scheduling schemes: random resource scheduling (RRS) and channel-aware resource scheduling (CRS). By leveraging stochastic geometry, we present a tractable analytical framework to investigate the signal-to-interference ratio (SIR) for each phase, thereby computing the MTD success probability, the average number of successful MTDs, and the probability of successful channel utilization, which are the key metrics characterizing the overall mMTC performance. Our numerical results show that, although the CRS outperforms the RRS in terms of SIR at the aggregation phase, the simpler RRS has almost the same performance as the CRS in most cases with regard to the overall mMTC performance. Furthermore, the provision of more resources at the aggregation phase is not always beneficial to the mMTC performance. This work was supported by the Australian Research Council’s Discovery Project Funding Scheme (Project number DP170100939).
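
    As a toy illustration of the two scheduling schemes, the Python snippet below contrasts RRS (resources assigned to MTDs at random) with CRS (resources assigned to the MTDs with the strongest channels) at a single aggregator; it deliberately ignores inter-cluster interference and uses assumed parameters, so it mirrors only the scheduling step, not the paper's SIR analysis.

        import numpy as np

        rng = np.random.default_rng(3)

        # Assumed toy parameters (illustrative only)
        n_mtds = 40          # MTDs attached to one aggregator
        n_channels = 10      # resource units available at the aggregator
        gain_thresh = 1.5    # minimum channel power gain for a successful transmission
        n_trials = 10000

        succ_rrs = succ_crs = 0
        for _ in range(n_trials):
            g = rng.exponential(1.0, n_mtds)                          # Rayleigh channel gains
            rrs_pick = rng.choice(n_mtds, n_channels, replace=False)  # random scheduling
            crs_pick = np.argsort(g)[-n_channels:]                    # channel-aware scheduling
            succ_rrs += np.count_nonzero(g[rrs_pick] > gain_thresh)
            succ_crs += np.count_nonzero(g[crs_pick] > gain_thresh)

        print("avg. successful MTDs per frame, RRS:", succ_rrs / n_trials)
        print("avg. successful MTDs per frame, CRS:", succ_crs / n_trials)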

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead, and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions, and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges, and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
    Comment: 37 pages, 8 figures, 7 tables, submitted for possible publication in IEEE Communications Surveys and Tutorials
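
    As one concrete example of the low-complexity Q-learning approach mentioned above, the sketch below shows distributed, stateless Q-learning in Python in which each MTD learns a collision-free RACH slot; the reward scheme, learning rate, and scenario sizes are assumptions for illustration, not the survey's specific algorithm.

        import numpy as np

        rng = np.random.default_rng(4)

        # Assumed toy parameters (illustrative only)
        n_devices = 20
        n_slots = 20          # RACH opportunities per frame
        lr = 0.1              # learning rate
        n_frames = 2000

        Q = np.zeros((n_devices, n_slots))  # one stateless Q-table per device

        def collisions(choices):
            """Number of devices whose chosen slot was also chosen by another device."""
            _, counts = np.unique(choices, return_counts=True)
            return int(counts[counts > 1].sum())

        for frame in range(n_frames):
            eps = max(0.05, 1.0 - frame / n_frames)          # decaying exploration rate
            explore = rng.uniform(size=n_devices) < eps
            choices = np.where(explore,
                               rng.integers(0, n_slots, n_devices),
                               Q.argmax(axis=1))
            slot_count = np.bincount(choices, minlength=n_slots)
            reward = np.where(slot_count[choices] == 1, 1.0, -1.0)  # collision-free => +1
            idx = np.arange(n_devices)
            Q[idx, choices] += lr * (reward - Q[idx, choices])

        print("colliding devices after learning:", collisions(Q.argmax(axis=1)))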

    Models and Methods for Network Selection and Balancing in Heterogeneous Scenarios

    The advent of 5G technologies for wireless communications can be considered a response to the need for widespread coverage, in terms of connectivity and bandwidth, to guarantee broadband services such as streaming or on-demand programs offered by the main television networks, as well as new-generation services based on augmented and virtual reality (AR/VR). The study conducted for this thesis aims to solve two of the main problems that will arise with the spread of 5G: finding the best possible connectivity, in order to offer users the resources necessary to take advantage of new-generation services, and supporting multicast as required by eMBMS. The aim of the thesis is the design of innovative algorithms that obtain the best connectivity, offering users the resources necessary to use 5G services in a heterogeneous scenario, together with the study of a UF that improves the search for the best candidate network and achieves a load balance that avoids congestion of the selected networks. To address these two focuses, I studied the main mathematical methods that make it possible to select the network based on QoS parameters according to the type of traffic generated by users; a further goal was to improve their computational performance. Furthermore, I developed an innovative algorithm for the management of multicast; the implemented algorithm meets the requirements of eMBMS in realistic scenarios.
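
    As an illustration of QoS-parameter-based network selection of the kind studied here, the following Python sketch applies simple additive weighting (SAW), a classical multi-attribute decision method; the candidate networks, attribute values, and traffic-dependent weights are all assumed and are not taken from the thesis.

        import numpy as np

        # Candidate networks and their QoS attributes (assumed example values)
        networks = ["LTE", "5G-NR", "WiFi"]
        # columns: bandwidth [Mbps], delay [ms], packet loss [%], monetary cost
        attrs = np.array([
            [ 75.0, 40.0, 0.5, 0.6],
            [300.0, 10.0, 0.2, 0.9],
            [ 90.0, 25.0, 1.0, 0.2],
        ])
        benefit = np.array([True, False, False, False])  # bandwidth is a benefit; the rest are costs

        # Traffic-dependent weights (rows sum to 1): streaming favours bandwidth,
        # conversational traffic favours low delay.
        weights = {
            "streaming":      np.array([0.5, 0.2, 0.2, 0.1]),
            "conversational": np.array([0.2, 0.5, 0.2, 0.1]),
        }

        # Normalize each attribute to [0, 1], inverting cost attributes.
        norm = (attrs - attrs.min(axis=0)) / (attrs.max(axis=0) - attrs.min(axis=0))
        norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

        for traffic, w in weights.items():
            scores = norm @ w
            print(traffic, "->", networks[int(scores.argmax())], np.round(scores, 3))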