
    Resource Allocation and Service Management in Next Generation 5G Wireless Networks

    The accelerated evolution towards next-generation networks is expected to dramatically increase mobile data traffic, posing challenging requirements for future radio cellular communications. User connections are multiplying, while data-hungry content dominates wireless services, putting significant pressure on the network's available spectrum. Ensuring energy-efficient and low-latency transmissions, while maintaining advanced Quality of Service (QoS) and high standards of user experience, is of profound importance in order to address diversifying user requirements and ensure superior and sustainable network performance. At the same time, the rise of 5G networks and the Internet of Things (IoT) is transforming wireless infrastructure towards greater heterogeneity, multi-tier architectures and standards, as well as new disruptive telecommunication technologies. These developments require a rethinking of how wireless networks are designed and operated, together with a more holistic understanding of how users interact with the network and with each other. In this dissertation, we tackle the problem of efficient resource allocation and service management in various network topologies under a user-centric approach. For ad-hoc and self-organizing networks, where the decision-making process lies at the user level, we develop a novel and sufficiently generic framework capable of solving a wide array of resource-distribution problems in an adaptable and multi-disciplinary manner. Aiming to maximize user satisfaction while achieving high-performance, low-power resource utilization, the theory of network utility maximization is adopted, with the examined problems formulated as non-cooperative games. The considered games are solved via the principles of Game Theory and Optimization, while iterative, low-complexity algorithms establish their convergence to stable operational outcomes, i.e., Nash Equilibrium points. This thesis constitutes a meaningful contribution to the state of the art in wireless network optimization by allowing users to control multiple degrees of freedom with regard to their transmission, by treating mobile customers and their strategies as the key elements for improving network performance, and by adopting novel technologies in the resource management problems. First, multi-variable resource allocation problems are studied for multi-tier architectures with the use of femtocells, addressing efficient power and/or rate control; the topic is also examined in Visible Light Communication (VLC) networks under various access technologies. Next, customized resource pricing is considered as a separate and bounded resource to be optimized under distinct scenarios, expressing users' willingness to pay rather than being imposed by a central administrator in the form of penalties. The investigation is further expanded by examining service provider selection in competitive telecommunication markets, where providers aim to increase their market share by applying different pricing policies, while users model the selection process by behaving as learning automata under a Machine Learning framework.
Additionally, the problem of resource allocation is examined for heterogeneous services, where users can dynamically pick the modules needed for their transmission based on their preferences, via the concept of Service Bundling. Moreover, this thesis examines the correlation of users' energy requirements with their transmission needs, by allowing adaptive energy harvesting to reflect the power consumed in the subsequent information transmission in Wireless Powered Communication Networks (WPCNs). Furthermore, a fresh perspective on resource allocation is provided under real-life conditions, by modeling user behavior through Prospect Theory. Subjectivity in user decisions under high uncertainty is thus introduced in a more pragmatic manner than in the literature, where users behave as blind utility maximizers. In addition, network spectrum is treated as a fragile resource that might collapse if over-exploited, following the principles of the Tragedy of the Commons, hence allowing users to sense risk and redefine their strategies accordingly. This framework is applied in different cases where users must choose between a safe resource and a common pool of resources (CPR), e.g., licensed and unlicensed bands or different access technologies, while the impact of pricing in protecting resource fragility is also studied. Additionally, the above resource allocation problems are extended to Public Safety Networks (PSNs) assisted by Unmanned Aerial Vehicles (UAVs), and aspects related to network security against malicious user behavior are examined. Finally, all the above problems are thoroughly evaluated via a series of numerical simulations, both with regard to the main characteristics of their operation and against other approaches from the literature. In each case, important performance gains are identified in terms of overall energy savings and increased spectrum utilization, while the advantages of the proposed framework are also reflected in the improved satisfaction and superior Quality of Service of each user within the network. Lastly, the flexibility and scalability of this work allow for interesting applications in other domains related to resource allocation in wireless networks and beyond.
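    To make the game-theoretic machinery above concrete, the following is a minimal sketch, not the dissertation's exact model, of a non-cooperative uplink power-control game: each user repeatedly best-responds to the interference created by the others so as to maximize a logarithmic utility minus a linear power cost, and the iteration settles at a Nash Equilibrium. The utility form, channel gains and all parameters are illustrative assumptions.

```python
import numpy as np

# Illustrative non-cooperative uplink power-control game (assumed model, not the thesis's).
# Each user i maximizes  u_i(p_i) = log(1 + g_i * p_i / (I_i + noise)) - price * p_i
# over 0 <= p_i <= p_max, by best-responding to the other users' current powers.

rng = np.random.default_rng(0)
n_users, p_max, noise, price = 4, 1.0, 1e-3, 2.0
g = rng.uniform(0.2, 1.0, n_users)                  # direct channel gains (assumed)
h = rng.uniform(0.01, 0.1, (n_users, n_users))      # cross gains causing interference
np.fill_diagonal(h, 0.0)

p = np.full(n_users, 0.1)                           # initial transmit powers
for it in range(100):
    p_old = p.copy()
    for i in range(n_users):
        interference = h[i] @ p + noise
        # Closed-form best response for log utility with linear pricing:
        # d/dp [log(1 + g*p/I) - price*p] = 0  =>  p* = 1/price - I/g, clipped to [0, p_max]
        p[i] = np.clip(1.0 / price - interference / g[i], 0.0, p_max)
    if np.max(np.abs(p - p_old)) < 1e-9:            # fixed point reached: Nash Equilibrium
        break

print(f"converged after {it + 1} iterations, powers = {np.round(p, 4)}")
```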

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
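    As a concrete illustration of the low-complexity Q-learning direction highlighted above, the following is a minimal sketch, with rewards and parameters assumed for illustration rather than taken from the paper, in which each MTC device keeps one Q-value per random-access slot and reinforces collision-free choices, so that devices gradually spread over the available slots.

```python
import numpy as np

# Toy distributed Q-learning for RACH slot selection (illustrative parameters).
# A slot chosen by exactly one device in a frame is a success (+1); a collision gives -1.

rng = np.random.default_rng(1)
n_devices, n_slots, alpha, epsilon, frames = 20, 20, 0.1, 0.1, 500
Q = np.zeros((n_devices, n_slots))

for frame in range(frames):
    # epsilon-greedy slot choice per device
    explore = rng.random(n_devices) < epsilon
    choices = np.where(explore, rng.integers(0, n_slots, n_devices), Q.argmax(axis=1))
    counts = np.bincount(choices, minlength=n_slots)
    rewards = np.where(counts[choices] == 1, 1.0, -1.0)      # success iff slot not shared
    Q[np.arange(n_devices), choices] += alpha * (rewards - Q[np.arange(n_devices), choices])

success_rate = np.mean(np.bincount(choices, minlength=n_slots)[choices] == 1)
print(f"fraction of devices transmitting collision-free in the last frame: {success_rate:.2f}")
```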

    Fine-grained performance analysis of massive MTC networks with scheduling and data aggregation

    The Internet of Things (IoT) represents a substantial shift within wireless communication and constitutes a relevant topic of social, economic, and overall technical impact. It refers to resource-constrained devices communicating with little or no human intervention. However, communication among machines imposes several challenges compared to traditional human-type communication (HTC). Moreover, as the number of devices increases exponentially, different network management techniques and technologies are needed. Data aggregation is an efficient approach to handle the congestion introduced by a massive number of machine-type devices (MTDs). The aggregators not only collect data but also implement scheduling mechanisms to cope with scarce network resources. This thesis provides an overview of the most common IoT applications and the network technologies that support them, and describes the most important challenges in machine-type communication (MTC). We use a stochastic geometry (SG) tool known as the meta distribution (MD) of the signal-to-interference ratio (SIR), which is the distribution of the conditional SIR success probability given the wireless nodes' locations, to provide a fine-grained description of the per-link reliability. Specifically, we analyze the performance of two scheduling methods for data aggregation in MTC: random resource scheduling (RRS) and channel-aware resource scheduling (CRS). The results show the fraction of users in the network that achieve a target reliability, which is an important aspect to consider when designing wireless systems with stringent service requirements. Finally, the impact of increasing the aggregator density on the fraction of MTDs that communicate with a target reliability is investigated.
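    The following is a minimal Monte-Carlo sketch of the SIR meta distribution idea used above: for each sampled network realization, the per-link conditional success probability is estimated over fading realizations, and the fraction of links exceeding a target reliability is then reported. The Poisson bipolar layout, Rayleigh fading, ALOHA access and all parameters are illustrative assumptions, not the thesis's RRS/CRS setup.

```python
import numpy as np

# Monte-Carlo sketch of the SIR meta distribution for a Poisson bipolar network
# (Rayleigh fading, fixed link distance, ALOHA access). All parameters are assumptions.

rng = np.random.default_rng(2)
lam, area, r0, alpha_pl, theta, p_tx = 0.05, 20.0, 1.0, 4.0, 1.0, 0.5
n_nets, n_fading = 200, 200

cond_success = []
for _ in range(n_nets):                          # sample interferer locations (the "network")
    n_pts = rng.poisson(lam * area * area)
    xy = rng.uniform(-area / 2, area / 2, (n_pts, 2))
    d = np.linalg.norm(xy, axis=1)
    d = d[d > 1e-3]
    ok = 0
    for _ in range(n_fading):                    # sample fading + ALOHA given the locations
        active = rng.random(d.size) < p_tx
        I = np.sum(rng.exponential(1.0, d.size)[active] * d[active] ** -alpha_pl)
        S = rng.exponential(1.0) * r0 ** -alpha_pl
        ok += (S / I) > theta if I > 0 else 1
    cond_success.append(ok / n_fading)           # per-link reliability P(SIR > theta | locations)

cond_success = np.array(cond_success)
for x in (0.5, 0.8, 0.9):
    print(f"fraction of links with reliability > {x:.1f}: {np.mean(cond_success > x):.2f}")
```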

    View on 5G Architecture: Version 2.0

    The 5G Architecture Working Group, as part of the 5GPPP Initiative, is looking at capturing novel trends and key technological enablers for the realization of the 5G architecture. It also aims to present, in a harmonized way, the architectural concepts developed in various projects and initiatives (not limited to 5GPPP projects) so as to provide a consolidated view of the technical directions for architecture design in the 5G era. The first version of the white paper, released in July 2016, captured novel trends and key technological enablers for the realization of the 5G architecture vision, along with harmonized architectural concepts from 5GPPP Phase 1 projects and initiatives. Capitalizing on the architectural vision and framework set by the first version, this Version 2.0 of the white paper presents the latest findings and analyses, with a particular focus on concept evaluations, and accordingly presents the consolidated overall architecture design.

    Channel Access Management for Massive Cellular IoT Applications

    As part of the steps taken towards improving the quality of life, many everyday activities as well as technological advancements rely more and more on smart devices. In the future, it is expected that every electric device will be a smart device that can be connected to the internet. This gives rise to the new network paradigm known as massive cellular IoT, where a large number of simple, battery-powered, heterogeneous devices work collectively for the betterment of humanity in all aspects. However, different from traditional cellular communication networks, IoT applications produce uplink-heavy data traffic composed of a large number of small data packets with different quality of service (QoS) requirements. These unique characteristics pose a challenge to the current cellular channel access process and, hence, new and revolutionary access mechanisms are much needed. These access mechanisms need to be cost-effective, scalable, practical, energy and radio resource efficient, and able to support a massive number of devices. Furthermore, due to the low computational capabilities of the devices, they cannot handle heavy networking intelligence and, thus, the designed channel access should be simple and lightweight. Accordingly, in this research, we evaluate the suitability of the current channel access mechanism for massive applications and propose an energy-efficient and resource-preserving clustering and data aggregation solution tailored to the needs of future IoT applications. First, we recognize that for many anticipated cellular IoT applications, providing energy-efficient and delay-aware access is crucial. However, in cellular networks, before devices transmit their data, they use a contention-based association protocol, known as the random access channel (RACH) procedure, which introduces extensive access delays and energy wastage as the number of contending devices increases. Modeling the performance of the RACH protocol is a challenging task due to the complexity of uplink transmission, which exhibits a wide range of interference components; nonetheless, it is an essential process that helps determine the applicability of the cellular IoT communication paradigm and sheds light on the main challenges. Consequently, we develop a novel mathematical framework based on stochastic geometry to evaluate the RACH protocol and identify its limitations in the context of cellular IoT applications with a massive number of devices. To do so, we study the traditional cellular association process and establish a mathematical model for its association success probability. The model accounts for device density, the spatial characteristics of the network, the power control employed, and mutual interference among the devices. Our analysis and results highlight the shortcomings of the RACH protocol and give insights into the potential benefits of employing power control techniques. Second, based on the analysis of the RACH procedure, we determine that, as the number of devices increases, the contention over the limited network radio resources increases, leading to network congestion. Accordingly, to avoid network congestion while supporting a large number of devices, we propose to use node clustering and data aggregation.
As the number of supported devices increases and their QoS requirements become diverse, optimizing the node clustering and data aggregation processes becomes critical to handle the many trade-offs that arise among different network performance metrics. Furthermore, for cost effectiveness, we propose that the data aggregator nodes be cellular devices themselves, and it is therefore desirable to keep the number of aggregators to a minimum so as to avoid congesting the RACH while maximizing the number of successfully supported devices. Consequently, to tackle these issues, we explore the possibility of combining data aggregation and non-orthogonal multiple access (NOMA) and propose a novel two-hop NOMA-enabled network architecture. Concepts from queueing theory and stochastic geometry are jointly exploited to derive mathematical expressions for different network performance metrics such as coverage probability, two-hop access delay, and the number of served devices per transmission frame. The established models characterize the relations among various network metrics and hence facilitate the design of the two-stage transmission architecture. Numerical results demonstrate that the proposed solution improves the overall access delay and energy efficiency compared to traditional OMA-based clustered networks. Last, we recognize that under the proposed two-hop network architecture, devices are subject to access point association decisions, i.e., the access point to which a device associates plays a major role in determining the overall network performance and the service perceived by the devices. Accordingly, in the third part of the work, we consider the optimization of the two-hop network from the point of view of user association, such that the number of QoS-satisfied devices is maximized while the overall device energy consumption is minimized. We formulate the problem as a joint access point association, resource utilization, and energy-efficient communication optimization problem that takes into account various networking factors such as the number of devices, the number of data aggregators, the number of available resource units, interference, the transmission power limitation of the devices, aggregator transmission performance, and channel conditions. The objective is to show the usefulness of data aggregation and shed light on the importance of network design when the number of devices is massive. We propose a coalition game theory based algorithm, PAUSE, to transform the optimization problem into a simpler form that can be solved in polynomial time. Different network scenarios are simulated to showcase the effectiveness of PAUSE and to draw observations on cost-effective, data-aggregation-enabled two-hop network design.
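    As a toy illustration of why NOMA-enabled aggregation can outperform OMA in the two-hop architecture discussed above, the snippet below compares uplink sum rates for two devices sharing an aggregator, assuming power-domain NOMA with successive interference cancellation versus equal orthogonal resource splitting. The channel gains and powers are arbitrary assumptions; the thesis's actual analysis relies on stochastic geometry and queueing models instead.

```python
import numpy as np

# Toy comparison of uplink NOMA (with SIC at the aggregator) vs. OMA for two devices.
g_strong, g_weak = 1.0, 0.2     # normalized channel gains toward the aggregator (assumed)
p, noise = 1.0, 0.1             # per-device transmit power and noise power (assumed)

# NOMA: both devices share the whole band; the aggregator decodes the strong device
# first (treating the weak one as interference), cancels it, then decodes the weak one.
r_strong_noma = np.log2(1 + p * g_strong / (p * g_weak + noise))
r_weak_noma = np.log2(1 + p * g_weak / noise)

# OMA: each device gets half of the orthogonal time/frequency resources.
r_strong_oma = 0.5 * np.log2(1 + p * g_strong / noise)
r_weak_oma = 0.5 * np.log2(1 + p * g_weak / noise)

print(f"NOMA sum rate: {r_strong_noma + r_weak_noma:.2f} bit/s/Hz "
      f"(strong {r_strong_noma:.2f}, weak {r_weak_noma:.2f})")
print(f"OMA  sum rate: {r_strong_oma + r_weak_oma:.2f} bit/s/Hz "
      f"(strong {r_strong_oma:.2f}, weak {r_weak_oma:.2f})")
```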

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) and Ultra-Reliable and Low Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices in cellular networks, which is the main focus of this paper. The related challenges include Quality of Service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and Narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in the mMTC scenario, along with recent advances towards enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
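    Access Class Barring (ACB) is a standard 3GPP-style control for the RAN congestion problem discussed above. The toy simulation below, with assumed device counts, preamble numbers and barring factors rather than values from the paper, shows how throttling the fraction of devices that attempt random access trades fewer attempts for more collision-free preambles.

```python
import numpy as np

# Toy Access Class Barring (ACB) sketch: each backlogged device draws a random number
# and attempts the RACH only if it falls below the barring factor; a preamble picked
# by exactly one device succeeds. All parameters are illustrative assumptions.

rng = np.random.default_rng(3)
n_devices, n_preambles = 1000, 54

for acb_factor in (1.0, 0.3, 0.05):
    attempting = rng.random(n_devices) < acb_factor           # devices passing the ACB check
    picks = rng.integers(0, n_preambles, attempting.sum())    # each picks a random preamble
    counts = np.bincount(picks, minlength=n_preambles)
    successes = np.sum(counts == 1)                           # preambles chosen exactly once
    print(f"ACB factor {acb_factor:.2f}: {attempting.sum():4d} attempts, "
          f"{successes:2d} collision-free preambles")
```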

    UAV Connectivity over Cellular Networks: Investigation of Command and Control Link Reliability


    Optimizing performance and energy efficiency of group communication and internet of things in cognitive radio networks

    Data traffic in wireless networks has grown at an unprecedented rate. While traditional wireless networks follow fixed spectrum assignment, the spectrum scarcity problem becomes a major challenge in the next generations of wireless networks. Cognitive radio is a promising candidate technology that can mitigate this critical challenge by allowing dynamic spectrum access and increasing spectrum utilization. As user and data traffic demands increase, more efficient communication methods are needed to support communication in general, and group communication in particular. On the other hand, the limited battery capacity of wireless network devices makes energy a bottleneck for enhancing the performance of wireless networks. In this thesis, the problem of optimizing the performance of group communication in Cognitive Radio Networks (CRNs) is studied. Moreover, energy-efficient and wireless-powered group communication in CRNs is considered. Additionally, a cognitive mobile base station and a cognitive UAV are proposed for the purpose of optimizing energy transfer and data dissemination, respectively. First, a multi-objective optimization for many-to-many communication in CRNs is considered. Given a many-to-many communication request, the goal is to support message routing from each user in the many-to-many group to every other user. The objectives are minimizing the delay and the number of used links while maximizing the data rate. The network is modeled using a multi-layer hypergraph, and the secondary users' transmissions are scheduled after establishing the conflict graph. Due to the difficulty of solving the problem optimally, a modified version of an Ant Colony meta-heuristic algorithm is employed. Additionally, energy-efficient multicast communication in CRNs is introduced, considering both directional and omnidirectional antennas. The multicast service is supported such that the total energy consumption of data transmission and channel switching is minimized. The optimization problem is formulated as a Mixed Integer Linear Program (MILP), and a heuristic algorithm is proposed to solve it in polynomial time. Second, wireless-powered machine-to-machine multicast communication in cellular networks is studied. To incentivize Internet of Things (IoT) devices to participate in forwarding the multicast messages, each IoT device that participates in message forwarding receives Radio Frequency (RF) energy from Energy Transmitters (ETs) no less than the amount of energy it spends on forwarding. The objective is to minimize the total energy transferred by the ETs. The problem is formulated mathematically as a Mixed Integer Nonlinear Program (MINLP), and a Generalized Bender Decomposition with Successive Convex Programming (GBD-SCP) algorithm is introduced to obtain an approximate solution, since there is no efficient way in general to solve the problem optimally. Moreover, another algorithm, Constraints Decomposition with SCP and Binary Variable Relaxation (CDR), is proposed to obtain an approximate solution more efficiently. In addition, a cognitive mobile base station is proposed to transfer data and energy to a group of IoT devices underlying a primary network. The total energy consumed by the cognitive base station in its mobility, data transmission and energy transfer is minimized.
Moreover, the cognitive base station adjusts its location, transmission power and transmission schedule such that data and energy demands are supported within a certain tolerable time and the primary users are protected from harmful interference. Finally, we consider a cognitive Unmanned Aerial Vehicle (UAV) that disseminates data to IoT devices. The UAV senses the spectrum and finds an idle channel, then predicts when the corresponding primary user of the selected channel will become active, based on the elapsed time of the off period. Accordingly, it starts its transmission at the beginning of the next frame right after finding the channel idle. Moreover, it decides the number of consecutive transmission slots it will use such that the number of slots interfering with the corresponding primary user does not exceed a certain threshold. A mathematical problem is formulated to maximize the minimum number of bits received by the IoT devices. A successive convex programming-based algorithm is used to obtain a solution efficiently, and it is shown that the algorithm converges to a Karush-Kuhn-Tucker point.
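    The UAV's slot-selection rule described above can be sketched as follows: assuming a Gamma-distributed OFF period for the primary user, the UAV conditions on the elapsed idle time and picks the largest number of consecutive slots for which the expected number of interfering slots stays below a threshold. The OFF-time distribution, the expected-value criterion and all parameters are assumptions for illustration, not the thesis's exact formulation.

```python
import numpy as np
from scipy.stats import gamma

# Sketch of a residual-idle-time based slot selection rule (assumed model and parameters).
off = gamma(a=3.0, scale=2.0)          # assumed OFF-period distribution (mean = 6 time units)
t_elapsed, slot_len, max_interf = 2.0, 0.5, 0.2

def expected_interfering_slots(k):
    # Slot j interferes if the primary user returns before the end of slot j,
    # conditioned on the channel having already been idle for t_elapsed.
    surv_now = off.sf(t_elapsed)
    p_back = np.array([1 - off.sf(t_elapsed + j * slot_len) / surv_now
                       for j in range(1, k + 1)])
    return p_back.sum()

k = 0
while expected_interfering_slots(k + 1) <= max_interf:
    k += 1
print(f"UAV transmits for {k} consecutive slots "
      f"(expected interfering slots = {expected_interfering_slots(k):.3f})")
```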