
    Collaborative Multi-Resource Allocation in Terrestrial-Satellite Network Towards 6G

    Terrestrial-satellite networks (TSNs) are envisioned to play a significant role in sixth-generation (6G) wireless networks. In such networks, hot air balloons are useful because they can relay signals between satellites and ground stations. Most existing works assume that the hot air balloons are deployed at the same height and with the same minimum elevation angle to the satellites, which may not be practical due to possible route conflicts with airplanes and other flight equipment. In this paper, we consider a TSN containing hot air balloons at different heights and with different minimum elevation angles, which creates the challenge of non-uniform available serving time for communication between the hot air balloons and the satellites. Jointly considering caching, computing, and communication (3C) resource management for both the ground-balloon-satellite links and the inter-satellite laser links, our objective is to maximize the network energy efficiency. First, by proposing a tapped water-filling algorithm, we schedule the traffic to be relayed among satellites according to their available serving time. Then, we generate a series of configuration matrices, based on which we formulate the relation between relay time and the power consumption involved in relaying among satellites. Finally, the collaborative resource allocation problem for the TSN is modeled and solved by geometric programming with Taylor series approximation. Simulation results demonstrate the effectiveness of our proposed scheme.
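The abstract's "tapped water-filling" scheduler is not specified in detail here, but the name suggests it builds on the classic water-filling idea: pour a total power budget over channels so that stronger channels receive more power. A minimal sketch of that baseline, assuming a bisection search on the water level (function name and tolerance are illustrative, not from the paper):

```python
def water_filling(gains, p_total, tol=1e-9):
    """Classic water-filling power allocation over channels with power
    gains g_i, maximizing sum log2(1 + p_i * g_i) subject to
    sum p_i <= p_total. Solved by bisection on the water level mu,
    with the per-channel optimum p_i = max(0, mu - 1/g_i)."""
    lo, hi = 0.0, p_total + max(1.0 / g for g in gains)
    while hi - lo > tol:
        mu = (lo + hi) / 2.0
        used = sum(max(0.0, mu - 1.0 / g) for g in gains)
        if used > p_total:
            hi = mu   # water level too high, budget exceeded
        else:
            lo = mu   # budget not fully used, raise the level
    mu = (lo + hi) / 2.0
    return [max(0.0, mu - 1.0 / g) for g in gains]
```

With gains [1.0, 0.5] and a budget of 2.0, the level settles at mu = 2.5, giving powers [1.5, 0.5]: the better channel gets more power, as expected.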

    A Survey of Scheduling in 5G URLLC and Outlook for Emerging 6G Systems

    Future wireless communication is expected to be a paradigm shift beyond the three basic service classes of the fifth generation (5G): enhanced Mobile Broadband (eMBB), Ultra-Reliable and Low-Latency Communication (URLLC), and massive Machine-Type Communication (mMTC). Integrating the three heterogeneous services into a single system is a challenging task, and it raises several design issues, including scheduling network resources across the various services. In particular, scheduling URLLC packets alongside eMBB and mMTC packets needs more attention, as URLLC is a promising service of 5G and beyond systems: it must meet stringent Quality of Service (QoS) requirements and is used in time-critical applications. Thus, a thorough understanding of packet scheduling issues in existing systems and of potential future challenges is necessary. This paper surveys recent work addressing packet scheduling algorithms for 5G and beyond systems. It provides a state-of-the-art review covering three main perspectives: decentralised, centralised, and joint scheduling techniques. The conventional decentralised algorithms are discussed first, followed by the centralised algorithms, with specific focus on single- and multi-connected network perspectives. Joint scheduling algorithms are also discussed in detail. To provide an in-depth understanding of the key scheduling approaches, the performances of some prominent scheduling algorithms are evaluated and analysed. This paper also provides insight into the potential challenges and future research directions from the scheduling perspective.
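A core coexistence mechanism the scheduling literature surveyed here deals with is preemptive puncturing: eMBB traffic fills the resource blocks (RBs) of a slot, and latency-critical URLLC arrivals in each mini-slot preempt some of those RBs rather than wait for the next slot. A toy sketch of that priority rule (all names and RB counts are illustrative, not from any specific surveyed algorithm):

```python
def schedule_minislot(n_rbs, embb_demand, urllc_demand):
    """Toy puncturing-based coexistence for one mini-slot.
    URLLC is served first; eMBB keeps whatever remains of its
    original grant. Returns (urllc_rbs, embb_rbs, punctured)."""
    urllc_rbs = min(urllc_demand, n_rbs)       # URLLC has strict priority
    embb_planned = min(embb_demand, n_rbs)     # eMBB's original grant
    embb_rbs = min(embb_planned, n_rbs - urllc_rbs)
    punctured = embb_planned - embb_rbs        # RBs taken back from eMBB
    return urllc_rbs, embb_rbs, punctured
```

For example, with 10 RBs, a full eMBB grant, and 3 urgent URLLC packets, 3 eMBB RBs are punctured; the surveyed algorithms differ mainly in how (and whether) this eMBB loss is then compensated.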

    MM-Wave HetNet in 5G and beyond Cellular Networks Reinforcement Learning Method to improve QoS and Exploiting Path Loss Model

    This paper presents high-density heterogeneous networks (HetNets), one of the most promising technologies for the fifth-generation (5G) cellular network. Since 5G will be in service for a long time, previous-generation networking systems will need customization and updates. We examine the merits and drawbacks of legacy and Q-Learning (QL)-based adaptive resource allocation systems, and we compare various methods and schemes to evaluate solutions for future generations. Microwave macro cells are used to enable extra-high capacity, such as Long-Term Evolution (LTE), eNodeB (eNB), and Multimedia Communications Wireless technology (MC), in which they are most likely to be deployed. This paper also presents four scenarios for 5G mm-Wave implementation, including proposed system architectures. The QL algorithm allocates optimal power to the small cell base station (SBS) to satisfy the minimum necessary capacity of macro cell user equipment (MUEs) and small cell user equipment (SUEs) in order to provide quality of service (QoS). The challenges of dense HetNets and the massive backhaul traffic they generate are discussed in this study. Finally, a cluster-based core HetNet design is presented, aimed at reducing backhaul traffic. According to our findings, mm-Wave HetNets and MEC can be useful in a wide range of applications, including ultra-high data rate and low latency communications in 5G and beyond. We also used the NYUSIM channel model simulator to examine the directional power delay profile with received signal power, path loss, and path loss exponent (PLE) for both LOS and NLOS conditions, using uniform linear array (ULA) 2x2 and 64x16 antenna configurations in the 38 GHz and 73 GHz mmWave bands. The simulation results show the performance of several path loss models in the mmWave and sub-6 GHz bands. The path loss in the close-in (CI) model at mmWave bands is higher than that of the free-space and two-ray path loss models because it accounts for all shadowing and reflection effects between transmitter and receiver. We also compared the suggested method to existing models such as Amiri, Su, Alsobhi, Iqbal, and greedy (non-adaptive), and found that it not only improved the MUE and SUE minimum capacities and reduced BT complexity, but also established a new minimum QoS threshold. We also discuss future 6G research. Our simulation findings show that, compared to using the dual-slope path loss model alone in a hybrid heterogeneous network, decoupling is more visible when employing the dual-slope path loss model, which enhances system performance in terms of coverage and data rate.
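The close-in (CI) model discussed above is a standard one-parameter mmWave path loss model: free-space path loss at a 1 m reference distance plus a distance term scaled by the path loss exponent (PLE) and a lognormal shadowing term. A minimal sketch (function name is my own; the formula is the standard CI model used in NYUSIM-style studies):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def ci_path_loss_db(freq_hz, dist_m, ple, shadow_db=0.0):
    """Close-in free-space reference distance (CI) path loss model:
    PL(f, d) = FSPL(f, 1 m) + 10 * n * log10(d / 1 m) + X_sigma,
    where n is the path loss exponent (PLE) and X_sigma is the
    lognormal shadowing term in dB."""
    fspl_1m = 20.0 * math.log10(4.0 * math.pi * freq_hz / C)
    return fspl_1m + 10.0 * ple * math.log10(dist_m) + shadow_db
```

At 38 GHz the 1 m free-space anchor is about 64 dB, so a LOS link (PLE near 2) at 100 m sees roughly 104 dB of path loss before shadowing; NLOS PLEs above 3 push this up quickly, which is why the CI model predicts higher loss than the idealized free-space and two-ray models.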

    one6G white paper, 6G technology overview: Second Edition, November 2022

    6G is supposed to address the demands for consumption of mobile networking services in 2030 and beyond. These are characterized by a variety of diverse, often conflicting requirements, from technical ones such as extremely high data rates, unprecedented scale of communicating devices, high coverage, low communication latency, flexibility of extension, etc., to non-technical ones such as enabling sustainable growth of society as a whole, e.g., through energy efficiency of deployed networks. On the one hand, 6G is expected to fulfil all these individual requirements, thus extending the limits set by the previous generations of mobile networks (e.g., ten times lower latencies, or a hundred times higher data rates than in 5G). On the other hand, 6G should also enable use cases characterized by combinations of these requirements never seen before (e.g., both extremely high data rates and extremely low communication latency). In this white paper, we give an overview of the key enabling technologies that constitute the pillars for the evolution towards 6G. They include: terahertz frequencies (Section 1), 6G radio access (Section 2), next generation MIMO (Section 3), integrated sensing and communication (Section 4), distributed and federated artificial intelligence (Section 5), intelligent user plane (Section 6), and flexible programmable infrastructures (Section 7). For each enabling technology, we first give the background on how and why the technology is relevant to 6G, backed up by a number of relevant use cases. After that, we describe the technology in detail, outline the key problems and difficulties, and give a comprehensive overview of the state of the art in that technology. 6G is, however, not limited to these seven technologies. They merely represent our current understanding of the technological environment in which 6G is being born. Future versions of this white paper may include other relevant technologies as well, and may discuss how these technologies can be glued together in a coherent system.

    Spectrum Allocation with Adaptive Sub-band Bandwidth for Terahertz Communication Systems

    We study spectrum allocation for terahertz (THz) band communication (THzCom) systems, while considering the frequency- and distance-dependent nature of THz channels. Different from existing studies, we explore multi-band-based spectrum allocation with adaptive sub-band bandwidth (ASB) by allowing the spectrum of interest to be divided into sub-bands with unequal bandwidths. Also, we investigate the impact of sub-band assignment on multi-connectivity (MC) enabled THzCom systems, where users associate and communicate with multiple access points simultaneously. We formulate resource allocation problems, with the primary focus on spectrum allocation, to determine the sub-band assignment, the sub-band bandwidth, and the optimal transmit power. Thereafter, we propose reasonable approximations and transformations, and develop iterative algorithms based on the successive convex approximation technique to analytically solve the formulated problems. Aided by numerical results, we show that by enabling and optimizing ASB, significantly higher throughput can be achieved as compared to adopting equal sub-band bandwidth, and this throughput gain is most profound when the power budget constraint is more stringent. We also show that our sub-band assignment strategy in MC-enabled THzCom systems outperforms the state-of-the-art sub-band assignment strategies, and the performance gain is most profound when the spectrum with the lowest average molecular absorption coefficient is selected during spectrum allocation. Comment: This work has been accepted for publication in IEEE Transactions on Communications.
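The intuition behind the ASB gain claimed above can be shown with a toy model: THz link gain combines free-space spreading loss with exponential molecular absorption exp(-k*d), so links at different distances benefit from unequal bandwidth shares. The sketch below uses a coarse grid search as a stand-in for the paper's successive convex approximation; the carrier frequency, absorption coefficients, and noise density are illustrative assumptions, not values from the paper:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def thz_throughput(bw_hz, p_tx_w, dist_m, k_abs, noise_psd=4e-21):
    """Shannon throughput of one THz sub-band with free-space spreading
    loss and exponential molecular absorption exp(-k_abs * d)."""
    freq = 300e9  # assumed sub-band centre frequency (300 GHz)
    spread = (C / (4.0 * math.pi * dist_m * freq)) ** 2
    snr = p_tx_w * spread * math.exp(-k_abs * dist_m) / (noise_psd * bw_hz)
    return bw_hz * math.log2(1.0 + snr)

def adaptive_vs_equal(total_bw, p_tx_w, links):
    """Compare an equal two-link bandwidth split against the best split
    found by grid search. links = [(dist_m, k_abs), (dist_m, k_abs)]."""
    (d1, k1), (d2, k2) = links
    rate = lambda f: (thz_throughput(f * total_bw, p_tx_w, d1, k1) +
                      thz_throughput((1.0 - f) * total_bw, p_tx_w, d2, k2))
    equal = rate(0.5)
    adaptive = max(rate(i / 1000.0) for i in range(1, 1000))
    return equal, adaptive
```

Since the equal split is one point on the grid, the adaptive result can never be worse; for links with dissimilar distances and absorption, it is strictly better, mirroring the throughput gain the abstract reports.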

    Survey on 6G Frontiers: Trends, Applications, Requirements, Technologies and Future Research

    Emerging applications such as the Internet of Everything, holographic telepresence, collaborative robots, and space and deep-sea tourism are already highlighting the limitations of existing fifth-generation (5G) mobile networks. These limitations concern data rate, latency, reliability, availability, processing, connection density, and global coverage spanning ground, underwater, and space. The sixth generation (6G) of mobile networks is expected to burgeon in the coming decade to address these limitations. The development of the 6G vision, applications, technologies, and standards has already become a popular research theme in academia and industry. In this paper, we provide a comprehensive survey of the current developments towards 6G. We highlight the societal and technological trends that initiate the drive towards 6G. Emerging applications that realize the demands raised by these driving trends are discussed subsequently. We also elaborate on the requirements that are necessary to realize the 6G applications, and then present the key enabling technologies in detail. We further outline current research projects and activities, including standardization efforts, towards the development of 6G. Finally, we summarize lessons learned from state-of-the-art research and discuss technical challenges that shed new light on future research directions towards 6G.