7 research outputs found

    Performance Enhancement of IEEE 802.11AX in Ultra-Dense Wireless Networks

    IEEE 802.11ax, an emerging WLAN standard, aims to provide highly efficient communication in ultra-dense wireless networks. However, the large number of stations (STAs) in dense deployment scenarios and the diversity of services to be supported raise several technical challenges. First, a potentially high packet collision rate significantly degrades WLAN efficiency. This thesis proposes an adaptive station (STA) grouping scheme to address this challenge in IEEE 802.11ax Uplink OFDMA Random Access (UORA). To achieve optimal utilization of resource units (RUs), the relationship between group size and RU efficiency is first analyzed. Based on this result, an adaptive STA grouping algorithm is proposed to cope with the performance fluctuation of 802.11ax caused by remainder stations left over after grouping. Analysis and simulation results demonstrate that the adaptive grouping algorithm dramatically improves the performance of both the overall system and each STA in the ultra-dense wireless network. Meanwhile, because of the limited RU efficiency of UORA, the proposed grouping scheme is adopted in the Buffer Status Report (BSR)-based two-stage mechanism (BTM) to enhance Uplink (UL) Multi-User (MU) access in 802.11ax, and an adaptive BTM grouping scheme is then proposed. Analytical results are derived for the average RU share of each STA and for the average throughput of the whole system and of each STA. The numerical results show that the proposed adaptive grouping scheme provides 2.55, 413.02 and 3712.04 times gains in throughput over UORA grouping, conventional BTM, and conventional UORA, respectively. Furthermore, to provide a better QoS experience in ultra-dense networks with diverse IoT services, a Hybrid BTM Grouping algorithm is proposed to guarantee the QoS requirements of high-priority STAs. The concept of "QoS utility" is introduced to evaluate the satisfaction of transmission. The numerical results demonstrate that the proposed Hybrid BTM grouping scheme achieves a better BSR delivery rate and higher QoS utility than the conventional BTM grouping.
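
    The group-size/RU-efficiency trade-off that the thesis analyzes can be illustrated with a toy Monte Carlo model. This is a simplified stand-in for UORA contention, not the thesis's actual algorithm, and the RU count of 9 used below is an assumption (matching a common 20 MHz random-access configuration):

```python
import random

def ru_efficiency(group_size: int, n_rus: int, trials: int = 10000) -> float:
    """Monte-Carlo estimate of the fraction of RUs carrying exactly one
    STA (a successful RU) when group_size STAs each pick one of n_rus
    random-access RUs uniformly, as in a simplified UORA round."""
    successes = 0
    for _ in range(trials):
        counts = [0] * n_rus
        for _ in range(group_size):
            counts[random.randrange(n_rus)] += 1
        successes += sum(1 for c in counts if c == 1)
    return successes / (trials * n_rus)

def best_group_size(n_rus: int, max_group: int = 20, trials: int = 2000) -> int:
    """Group size that maximises the estimated RU efficiency -- a simple
    stand-in for the adaptive grouping criterion described above."""
    return max(range(1, max_group + 1),
               key=lambda g: ru_efficiency(g, n_rus, trials))
```

    In this model, efficiency peaks when the group size is close to the number of random-access RUs (roughly 1/e of the RUs succeed), which illustrates why splitting a dense cell into groups sized to the available RUs pays off.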

    Reinforcement-Learning-Enabled Massive Internet of Things for 6G Wireless Communications

    Recently, extensive research efforts have been devoted to developing beyond-fifth-generation (B5G), also referred to as sixth-generation (6G), wireless networks aimed at delivering ultra-reliable low-latency communication services. 6G is expected to extend 5G capabilities to higher communication levels where numerous connected devices and sensors can operate seamlessly. One major research focus of 6G is enabling massive Internet of Things (mIoT) applications. Like Wi-Fi 6 (IEEE 802.11ax), forthcoming wireless communication networks are likely to serve massively deployed devices and new smart applications, such as smart cities, for mIoT. However, channel scarcity persists because a massive number of connected devices access the common spectrum resources. Accordingly, next-generation Wi-Fi 6 and beyond for mIoT are anticipated to have inherent machine-intelligence capabilities to select the optimum channel resources for their performance optimization. Unfortunately, current wireless communication network standards do not support the ensuing needs of machine learning (ML)-aware frameworks for resource allocation optimization. With this issue in mind, we propose a reinforcement-learning-based framework (reinforcement learning being one of the ML techniques) for the wireless channel access mechanism of IEEE 802.11 standards (i.e., Wi-Fi) in mIoT. The proposed mechanism exploits a practically measured channel collision probability, collected as a dataset from the wireless environment, to select optimal resource allocations in mIoT for upcoming 6G wireless communications.
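
    The collision-probability-driven channel selection idea can be sketched with a minimal epsilon-greedy learner. The three-channel environment and its collision probabilities below are illustrative assumptions, not values from the paper:

```python
import random

class ChannelAccessAgent:
    """Minimal action-value learner: a node estimates the success rate
    of each channel and gradually prefers the least-congested one."""

    def __init__(self, n_channels: int, alpha: float = 0.1,
                 epsilon: float = 0.1):
        self.q = [0.0] * n_channels   # estimated success rate per channel
        self.alpha = alpha            # learning rate
        self.epsilon = epsilon        # exploration probability

    def select(self) -> int:
        if random.random() < self.epsilon:                      # explore
            return random.randrange(len(self.q))
        return max(range(len(self.q)), key=self.q.__getitem__)  # exploit

    def update(self, channel: int, collided: bool) -> None:
        reward = 0.0 if collided else 1.0
        self.q[channel] += self.alpha * (reward - self.q[channel])

# Toy environment: channel 2 has the lowest (measured) collision probability.
random.seed(42)
collision_prob = [0.9, 0.6, 0.2]
agent = ChannelAccessAgent(n_channels=3)
for _ in range(2000):
    ch = agent.select()
    agent.update(ch, collided=random.random() < collision_prob[ch])
```

    After training, the agent's value estimates favor the channel whose observed collision probability is lowest, which is the core of the proposed mechanism: measured collision statistics drive the resource selection.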

    URLLC for 5G and Beyond: Requirements, Enabling Incumbent Technologies and Network Intelligence

    The tactile internet (TI) is believed to be the prospective advancement of the internet of things (IoT), comprising human-to-machine and machine-to-machine communication. TI focuses on enabling real-time interactive techniques with a portfolio of engineering, social, and commercial use cases. For this purpose, the prospective 5th generation (5G) technology focuses on achieving ultra-reliable low-latency communication (URLLC) services. TI applications demand extraordinarily high reliability and low latency. The 3rd generation partnership project (3GPP) specifies that URLLC is expected to provide 99.99% reliability for a single transmission of a 32-byte packet with a latency of less than one millisecond. 3GPP proposes to include an adjustable orthogonal frequency division multiplexing (OFDM) technique, called 5G new radio (5G NR), as a new radio access technology (RAT). With the emergence of a new physical-layer RAT, however, the need arises to design prospective next-generation technologies, especially with a focus on network intelligence. In such situations, machine learning (ML) techniques are expected to be essential in designing intelligent network resource allocation protocols that meet 5G NR URLLC requirements. Therefore, this survey presents the possibility of using the federated reinforcement learning (FRL) technique, one of the ML techniques, for 5G NR URLLC requirements and summarizes the corresponding achievements for URLLC. We provide a comprehensive discussion of the MAC-layer channel access mechanisms that enable URLLC in 5G NR for TI. In addition, we identify seven critical future use cases of FRL as potential enablers for URLLC in 5G NR.
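
    The quoted URLLC targets imply a simple retransmission budget, which a back-of-the-envelope check makes concrete. The 10% initial block error rate and the 125 µs per-attempt duration below are assumed illustrative values, not 3GPP constants:

```python
def transmissions_needed(bler: float, target_reliability: float) -> int:
    """Smallest k such that 1 - bler**k >= target_reliability, i.e. how
    many independent transmission attempts a packet needs to hit the
    reliability target when each attempt fails with probability bler."""
    residual = 1.0 - target_reliability
    k, p_fail = 1, bler
    while p_fail > residual * (1.0 + 1e-9):  # tolerance for float rounding
        p_fail *= bler
        k += 1
    return k

def fits_latency_budget(k: int, attempt_us: float,
                        budget_us: float = 1000.0) -> bool:
    """Do k back-to-back attempts fit the 1 ms URLLC latency budget?"""
    return k * attempt_us <= budget_us

# With a 10% initial block error rate, 99.99% reliability needs 4 attempts,
# which fits in 1 ms only if each attempt is short enough (e.g. a mini-slot).
k = transmissions_needed(bler=0.1, target_reliability=0.9999)
```

    This kind of arithmetic is what motivates 5G NR mini-slots and grant-free access: the latency budget leaves room for only a handful of attempts, so each one must be both short and reliable.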

    Advanced Technologies Enabling Unlicensed Spectrum Utilization in Cellular Networks

    With the rapid progress of, and pleasant user experience offered by, Internet-based services, there is an increasing demand for high data rates in wireless communication systems. Unlicensed spectrum utilization in Long Term Evolution (LTE) networks is a promising technique to meet this massive traffic demand. There are two effective methods for using unlicensed bands to deliver LTE traffic. One is offloading LTE traffic to Wi-Fi. The other is LTE-unlicensed (LTE-U), which aims to use LTE protocols and infrastructure directly over the unlicensed spectrum. It has also been pointed out that applying the two methods simultaneously could further improve system performance. However, avoiding severe performance degradation of the Wi-Fi network is a challenging issue when utilizing unlicensed spectrum in LTE networks. Specifically, first, inter-system spectrum sharing, or more specifically the coexistence of LTE and Wi-Fi in the same unlicensed spectrum, is the major challenge in implementing LTE-U. Second, to use the LTE and Wi-Fi integration approach, mobile operators have to manage two disparate networks in licensed and unlicensed spectrum. Third, optimizing joint data offloading to Wi-Fi and LTE-U in multi-cell scenarios poses further challenges, because inter-cell interference must be addressed. This thesis focuses on solving problems related to these challenges. First, the effect of bursty traffic in an LTE and Wi-Fi aggregation (LWA)-enabled network is investigated. To enhance resource efficiency, the Wi-Fi access point (AP) is designed to operate in both the native mode and the LWA mode simultaneously. Specifically, the LWA-mode Wi-Fi AP cooperates with the LTE base station (BS) to transmit bearers to the LWA user, which aggregates packets from both LTE and Wi-Fi. The native-mode Wi-Fi AP transmits Wi-Fi packets to those native Wi-Fi users without LWA capability. The thesis proposes a priority-based Wi-Fi transmission scheme with congestion control and studies the throughput of the native Wi-Fi network, as well as the LWA user's delay when the native Wi-Fi users are under heavy traffic conditions. The results provide fundamental insights into the throughput and delay behavior of the considered network. Second, the above work is extended to larger topologies. A stochastic geometry model is used to model and analyze the performance of an MPTCP proxy-based LWA network with intra-tier and cross-tier dependence. Under the considered network model and the activation conditions of LWA-mode Wi-Fi, three approximations for the density of active LWA-mode Wi-Fi APs are obtained through different approaches. A tractable analysis is provided for evaluating the downlink (DL) performance of large-scale LWA networks. The impact of different parameters on network performance is analyzed, validating the significant gain of LWA in terms of boosted data rates and improved spectrum reuse. Third, the thesis takes a significant step in analyzing a joint multi-cell LTE-U and Wi-Fi network, taking into account different LTE-U and Wi-Fi inter-working schemes. In particular, two technologies enabling data offloading from LTE to Wi-Fi are considered: LWA, and Wi-Fi offloading under a power-gain-based user offloading scheme. The LTE cells in this work are subject to load coupling due to inter-cell interference. New system frameworks are proposed for maximizing the demand scaling factor for all users in both the Wi-Fi and multi-cell LTE networks. The potential of such networks to achieve optimal capacity with arbitrary topologies is explored, accounting for both resource limits and inter-cell interference. Theoretical analyses of the optimization problems are developed, resulting in algorithms that achieve global optimality. Numerical results show the algorithms' effectiveness and the benefits of jointly using data offloading and the direct use of LTE over the unlicensed band. All derived results are validated by Monte Carlo simulations in MATLAB, and the conclusions drawn from the results can provide guidelines for future unlicensed spectrum utilization in LTE networks.
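
    The load-coupling idea mentioned above (a cell's load depends on its neighbours' loads through inter-cell interference) can be sketched as a fixed-point iteration. This is the textbook single-user-per-cell form of the model, not the thesis's exact formulation, and the two-cell gains, noise level, and demands in the test are illustrative assumptions:

```python
import math

def coupled_loads(demands_bps, gains, noise, bandwidth_hz, iters=200):
    """Jacobi fixed-point iteration for a simple LTE load-coupling model:
    cell i's load rho_i is the fraction of resources needed to serve its
    demand, given that each neighbouring cell j interferes in proportion
    to its own load rho_j. gains[i][j] is the channel gain from cell j's
    BS to cell i's representative user (normalized units)."""
    n = len(demands_bps)
    rho = [0.0] * n
    for _ in range(iters):
        rho = [
            min(1.0, demands_bps[i] / (bandwidth_hz * math.log2(
                1.0 + gains[i][i] / (noise + sum(gains[i][j] * rho[j]
                                                 for j in range(n)
                                                 if j != i)))))
            for i in range(n)
        ]
    return rho
```

    Maximizing the demand scaling factor then amounts to scaling all demands by the largest factor for which the resulting fixed-point loads stay at or below one, e.g. via bisection on that factor.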

    Link Scheduling Algorithms For In-Band Full-Duplex Wireless Networks

    In the last two decades, wireless networks and their corresponding data traffic have grown significantly, because wireless networks have become an indispensable and critical communication infrastructure in modern society. An ongoing challenge in communication systems is meeting the continuous increase in traffic demands, driven by the proliferation of electronic devices, such as smartphones with a Wi-Fi interface, along with their bandwidth-intensive applications. Moreover, in the near future, the sensor devices that form the Internet of Things (IoT) ecosystem will also add to traffic growth. One promising approach to meet growing traffic demands is to equip nodes with an In-Band Full-Duplex (IBFD) radio, which allows nodes to transmit and receive data concurrently over the same frequency band. Another approach to increase network or link capacity is to exploit the benefits of Multiple-Input Multiple-Output (MIMO) technologies, namely (i) spatial diversity gain, which improves the Signal-to-Noise Ratio (SNR) and thus directly affects the data rate used by nodes, and (ii) spatial multiplexing gain, whereby nodes are able to form concurrent links to neighbors.
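
    The capacity argument for IBFD can be made concrete with a toy sum-rate comparison. The residual self-interference model and the example numbers are illustrative assumptions; real-world gains depend on the quality of self-interference cancellation:

```python
import math

def half_duplex_sum_rate(snr: float, bandwidth_hz: float) -> float:
    """Bidirectional sum rate when the two directions time-share the
    channel: each direction gets half the airtime at the full SNR,
    so the sum equals the one-way Shannon rate."""
    return bandwidth_hz * math.log2(1.0 + snr)

def full_duplex_sum_rate(snr: float, residual_si: float,
                         bandwidth_hz: float) -> float:
    """Both directions transmit concurrently; residual self-interference
    (residual_si, expressed in units of the noise power) lowers each
    link's SINR, so the gain over half duplex is less than 2x."""
    sinr = snr / (1.0 + residual_si)
    return 2.0 * bandwidth_hz * math.log2(1.0 + sinr)
```

    For example, at 20 dB SNR with residual self-interference at the noise floor, this toy model gives roughly a 1.7x (rather than the ideal 2x) sum-rate gain, while poor cancellation can make full duplex worse than half duplex.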

    Cognitive backoff mechanism for IEEE802.11ax high-efficiency WLANs

    No full text