
    Five Facets of 6G: Research Challenges and Opportunities

    Whilst fifth-generation (5G) systems are being rolled out across the globe, researchers have turned their attention to the exploration of radical next-generation solutions. At this early evolutionary stage we survey five main research facets of this field, namely Facet 1: next-generation architectures, spectrum and services; Facet 2: next-generation networking; Facet 3: Internet of Things (IoT); Facet 4: wireless positioning and sensing; and Facet 5: applications of deep learning in 6G networks. We provide a critical appraisal of the literature on promising techniques, ranging from the associated architectures and networking to applications and designs. We portray a plethora of heterogeneous architectures relying on cooperative hybrid networks supported by diverse access and transmission mechanisms. The vulnerabilities of these techniques are also carefully considered in order to highlight the most promising future research directions. Additionally, we list a rich suite of learning-driven optimization techniques. We conclude by observing the evolutionary paradigm shift that has taken place from pure single-component bandwidth-efficiency, power-efficiency or delay optimization towards multi-component designs, as exemplified by the twin-component ultra-reliable low-latency mode of the 5G system. We advocate a further evolutionary step towards multi-component Pareto optimization, which requires the exploration of the entire Pareto front of all optimal solutions, where none of the components of the objective function may be improved without degrading at least one of the other components.
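    The multi-component Pareto optimization advocated above can be illustrated with a minimal sketch: given a set of candidate operating points, the Pareto front is the subset of non-dominated points, where no objective component can be improved without degrading another. The (delay, power) values below are purely illustrative, not from the survey.

    ```python
    def pareto_front(points):
        """Return the points not dominated by any other point.

        With all objectives minimized, p dominates q if p is no worse
        than q in every component and strictly better in at least one.
        """
        front = []
        for p in points:
            dominated = any(
                all(a <= b for a, b in zip(q, p)) and any(a < b for a, b in zip(q, p))
                for q in points if q is not p
            )
            if not dominated:
                front.append(p)
        return front

    # Hypothetical (delay, power) trade-off points of candidate designs.
    designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 2.5), (2.5, 4.0)]
    print(pareto_front(designs))  # (2.5, 4.0) is dominated by (2.0, 3.0)
    ```

    On the resulting front, lowering delay always costs power and vice versa, which is exactly the multi-component trade-off the abstract describes.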

    A CTC and D2D based Network Architecture for Reliable and Energy-Efficient Public Safety Communication

    Public Safety Communication (PSC) is responsible for providing reliable communications between first responders and victims in public safety scenarios. State-of-the-art wireless communication technologies, such as Cross-Technology Communication (CTC) and Device-to-Device (D2D) communication, offer new possibilities for connectivity amongst different communication devices. For instance, CTC enables communication between heterogeneous wireless devices (e.g. Wi-Fi, ZigBee, and Bluetooth) operating in the same ISM band, and D2D communication allows direct communication between wireless devices without traversing a base station. These features make them promising candidates for establishing reliable PSC networks, replacing traditional wireless communication technologies that were not specifically designed for PSC. In this work, we propose a novel PSC network architecture based on CTC and D2D communication technologies. Specifically, we propose a novel device clustering scheme to expand the coverage of the PSC network. Cluster heads and cluster gateways in the scheme are chosen from a group of user equipment (UE) based on particular metrics, e.g., residual battery power and received signal strength indicator. Moreover, we propose a scheduling scheme for managing the UE in our PSC network to improve energy efficiency. The simulation results demonstrate that our proposed PSC network architecture can provide reliable public safety communications with high energy efficiency.
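    Cluster-head selection of the kind described above is often realized as a weighted score over per-UE metrics; a minimal sketch using the two metrics the abstract names (residual battery power and RSSI). The weights, normalization, and data layout are illustrative assumptions, not the paper's actual scheme.

    ```python
    def select_cluster_head(ues, w_battery=0.6, w_rssi=0.4):
        """Pick the UE with the highest weighted score of residual battery
        level and normalized RSSI (both assumed pre-scaled to 0..1).
        The weights are illustrative, not from the paper."""
        return max(ues, key=lambda ue: w_battery * ue["battery"] + w_rssi * ue["rssi"])

    # Hypothetical candidate UEs with normalized metrics.
    ues = [
        {"id": "UE1", "battery": 0.9, "rssi": 0.4},
        {"id": "UE2", "battery": 0.5, "rssi": 0.9},
        {"id": "UE3", "battery": 0.8, "rssi": 0.7},
    ]
    print(select_cluster_head(ues)["id"])
    ```

    Weighting battery more heavily than RSSI reflects the energy-efficiency goal: a head with little residual power would drain quickly and force frequent re-clustering.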

    MM-Wave HetNet in 5G and beyond Cellular Networks Reinforcement Learning Method to improve QoS and Exploiting Path Loss Model

    This paper presents high-density heterogeneous networks (HetNets), which are among the most promising technologies for the fifth-generation (5G) cellular network. Since 5G will be in service for a long time, previous-generation networking systems will need customization and updates. We examine the merits and drawbacks of legacy and Q-Learning (QL)-based adaptive resource allocation systems. Furthermore, various comparisons between methods and schemes are made to evaluate solutions for future generations. Microwave macro cells are used to enable extra-high capacity, such as Long-Term Evolution (LTE), eNodeB (eNB), and Multimedia Communications Wireless technology (MC), in which they are most likely to be deployed. This paper also presents four scenarios for 5G mm-Wave implementation, including proposed system architectures. The QL algorithm allocates optimal power to the small cell base station (SBS) to satisfy the minimum necessary capacity of macro cell user equipment (MUEs) and small cell user equipment (SUEs) in order to provide quality of service (QoS). The challenges of dense HetNets and the massive backhaul traffic they generate are also discussed. Finally, a cluster-based core HetNet design is proposed to reduce backhaul traffic. According to our findings, mm-Wave HetNets and MEC can be useful in a wide range of applications, including ultra-high data rate and low-latency communications in 5G and beyond. We also used the NYUSIM channel model simulator to examine the directional power delay profile with received signal power, path loss, and path loss exponent (PLE) for both LOS and NLOS conditions, using uniform linear array (ULA) 2x2 and 64x16 antenna configurations at the 38 GHz and 73 GHz mmWave bands. The simulation results show the performance of several path loss models in the mmWave and sub-6 GHz bands. The path loss in the close-in (CI) model at mmWave bands is higher than that of the free-space and two-ray path loss models because it accounts for all shadowing and reflection effects between the transmitter and receiver. We also compared the suggested method to existing models such as Amiri, Su, Alsobhi, Iqbal, and greedy (non-adaptive), and found that it not only enhanced the minimum capacities of MUEs and SUEs and reduced BT complexity, but also established a new minimum QoS threshold. We also discuss future directions for 6G research. When compared to using the dual-slope path loss model alone in a hybrid heterogeneous network, our simulation findings show that decoupling is more visible when employing the dual-slope path loss model, which enhances system performance in terms of coverage and data rate.
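    The close-in (CI) model mentioned above has a standard form: free-space path loss at a 1 m reference distance, plus a distance term scaled by the path loss exponent (PLE), plus lognormal shadowing. A minimal sketch; the PLE values are illustrative, not the paper's fitted parameters.

    ```python
    import math

    def ci_path_loss_db(freq_ghz, dist_m, ple, shadow_db=0.0):
        """Close-in free-space reference path loss model (d0 = 1 m):
        PL(f, d) = FSPL(f, 1 m) + 10 * n * log10(d) + X_sigma  [dB]."""
        c = 3e8  # speed of light, m/s
        fspl_1m = 20 * math.log10(4 * math.pi * freq_ghz * 1e9 / c)
        return fspl_1m + 10 * ple * math.log10(dist_m) + shadow_db

    # 38 GHz at 100 m: LOS-like PLE vs. a higher NLOS-like PLE (illustrative).
    print(round(ci_path_loss_db(38, 100, 2.0), 1))  # LOS-like
    print(round(ci_path_loss_db(38, 100, 3.2), 1))  # NLOS-like, higher loss
    ```

    Because shadowing and reflection effects are absorbed into the fitted PLE and the shadowing term, the CI prediction at mmWave exceeds the idealized free-space and two-ray values, as the abstract notes.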

    Neural Network based Non Orthogonal Random Access for 6G NTN-IoT

    Pervasive and distributed Internet of Things (IoT) devices demand ubiquitous coverage, even beyond no-man's land. To satisfy the plethora of IoT devices requiring resilient connectivity, Non-Terrestrial Networks (NTN) will be pivotal in assisting and complementing terrestrial systems. In a massive MTC scenario over NTN, characterized by sporadic uplink data reports, all terminals within a satellite beam must be served during the short visibility window of the flying platform, generating congestion due to simultaneous access attempts by IoT devices on the same radio resource. The more terminals collide, the longer the average time needed to complete an access, owing to the decreased number of successful attempts caused by the back-off commands of legacy methods. A possible countermeasure is a Non-Orthogonal Multiple Access scheme, which requires knowledge of the number of superimposed NPRACH preambles. This work addresses this problem by proposing a Neural Network (NN) algorithm to cope with the uncoordinated random access performed by a prodigious number of Narrowband-IoT devices. Our proposed method classifies the number of colliding users and, for each, estimates the Time of Arrival (ToA). The performance assessment, under Line of Sight (LoS) and Non-LoS conditions in sub-urban environments with two different satellite configurations, shows significant benefits of the proposed NN algorithm with respect to traditional methods for ToA estimation.
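    A classical baseline for the ToA estimation that such an NN is measured against is cross-correlation of the received signal with the known preamble, taking the lag of the correlation peak as the arrival time. A minimal single-user sketch; the preamble length, delay, and noise level are illustrative, and the real NPRACH problem involves superimposed preambles from multiple colliding users.

    ```python
    import numpy as np

    def estimate_toa(rx, preamble):
        """Estimate the ToA (in samples) of a known preamble as the
        lag of the peak of its cross-correlation with the received signal."""
        corr = np.correlate(rx, preamble, mode="valid")
        return int(np.argmax(np.abs(corr)))

    rng = np.random.default_rng(0)
    preamble = rng.standard_normal(64)     # known transmitted preamble
    delay = 37                             # true ToA in samples
    rx = np.concatenate([np.zeros(delay), preamble, np.zeros(27)])
    rx += 0.1 * rng.standard_normal(rx.size)  # additive noise
    print(estimate_toa(rx, preamble))
    ```

    With multiple superimposed preambles the correlation exhibits several overlapping peaks, which is precisely where a learned classifier for the number of colliders becomes useful.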

    Domain Generalization in Machine Learning Models for Wireless Communications: Concepts, State-of-the-Art, and Open Issues

    Data-driven machine learning (ML) is promoted as a potential technology for next-generation wireless systems. This has led to a large body of research applying ML techniques to problems in different layers of the wireless transmission link. However, most of these applications rely on supervised learning, which assumes that the source (training) and target (test) data are independent and identically distributed (i.i.d.). This assumption is often violated in the real world due to domain or distribution shifts between the source and target data. Thus, it is important to ensure that these algorithms generalize to out-of-distribution (OOD) data. In this context, domain generalization (DG) tackles OOD-related issues by learning models on different and distinct source domains/datasets with generalization capabilities to unseen new domains, without additional fine-tuning. Motivated by the importance of DG requirements for wireless applications, we present a comprehensive overview of recent developments in DG and the different sources of domain shift. We also summarize existing DG methods, review their applications in selected wireless communication problems, and conclude with insights and open questions.
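    The difference between ordinary pooled-risk training and a DG-style objective can be sketched by evaluating a model's loss per source domain and optimizing the worst case rather than the average, in the spirit of group-robust DG methods. The linear model, synthetic domains, and input shifts below are illustrative, not from the survey.

    ```python
    import numpy as np

    def domain_losses(w, domains):
        """Mean squared error of a linear model on each source domain."""
        return [float(np.mean((X @ w - y) ** 2)) for X, y in domains]

    def worst_case_loss(w, domains):
        """DG-style criterion: the worst per-domain loss. Minimizing this
        (rather than the pooled average) discourages models that fit
        some source domains well while failing on others."""
        return max(domain_losses(w, domains))

    rng = np.random.default_rng(1)
    w_true = np.array([1.0, -2.0])
    domains = []
    for shift in (0.0, 0.5, 1.0):  # each domain's inputs are shifted: a covariate shift
        X = rng.standard_normal((50, 2)) + shift
        y = X @ w_true + 0.1 * rng.standard_normal(50)
        domains.append((X, y))

    w_hat = np.array([1.0, -2.0])  # hypothetical fitted weights
    print(worst_case_loss(w_hat, domains))
    ```

    A model matching the shared mechanism keeps every per-domain loss near the noise floor, whereas one that exploits a single domain's input statistics shows a large worst-case loss; this gap is what DG training criteria target.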