1,974 research outputs found
Analysis and Design of Non-Orthogonal Multiple Access (NOMA) Techniques for Next Generation Wireless Communication Systems
The current surge in wireless connectivity, expected to grow significantly in future wireless technologies, brings a new wave of users. Given the impracticality of endlessly expanding bandwidth, there is a pressing need for communication techniques that efficiently serve this growing user base with limited resources. Multiple Access (MA) techniques, notably Orthogonal Multiple Access (OMA), have long addressed bandwidth constraints; however, as user numbers escalate, OMA’s orthogonality becomes limiting for emerging wireless technologies. Non-Orthogonal Multiple Access (NOMA), employing superposition coding, serves more users within the same bandwidth as OMA by allocating different power levels to users, whose signals can then be separated at the receiver by exploiting the power difference through successive interference cancellation (SIC), thus offering superior spectral efficiency and massive connectivity. This thesis examines the integration of NOMA with cooperative relaying, EXtrinsic Information Transfer (EXIT) chart analysis, and deep learning for enhancing 6G and beyond communication systems. The adopted methodology aims to optimize system performance, spanning from bit-error rate (BER) versus signal-to-noise ratio (SNR) to overall system efficiency and data rates. In the cooperative relaying context, NOMA notably improved diversity gains, demonstrating the advantage of combining NOMA with cooperative relaying over standalone NOMA. With EXIT chart analysis, NOMA achieved low BER at mid-range SNR as well as optimal user fairness in the power allocation stage. Additionally, a trained neural network enhanced signal detection for NOMA in the deep learning scenario, yielding a simpler detector that addresses NOMA’s complex receiver problem.
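The superposition-coding and SIC mechanism described in the abstract can be sketched in a few lines. This is a hedged illustration only: the BPSK mapping, power split, and noise level are assumptions, not the thesis's simulation parameters.

```python
import numpy as np

# Two-user power-domain NOMA sketch (illustrative assumptions throughout).
rng = np.random.default_rng(0)
n = 1000
b_far = rng.integers(0, 2, n)                 # far (weak-channel) user's bits
b_near = rng.integers(0, 2, n)                # near (strong-channel) user's bits
s_far, s_near = 2.0 * b_far - 1, 2.0 * b_near - 1   # BPSK symbols
p_far, p_near = 0.8, 0.2                      # more power to the far user
x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near   # superposition coding

# Near user's received signal over a high-SNR channel.
y = x + 0.05 * rng.normal(size=n)

# SIC at the near user: decode the stronger (far-user) signal first,
# subtract its contribution, then decode the near user's own signal.
s_far_hat = np.sign(y)
residual = y - np.sqrt(p_far) * s_far_hat
b_near_hat = (np.sign(residual) + 1) / 2
b_far_hat = (s_far_hat + 1) / 2               # far bits as seen by the SIC stage
```

The far user, by contrast, would decode its own (stronger) signal directly, treating the near user's low-power signal as noise; this asymmetry is what lets both users share the same time-frequency resource.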
Split Federated Learning for 6G Enabled-Networks: Requirements, Challenges and Future Directions
Sixth-generation (6G) networks are anticipated to intelligently support a wide
range of smart services and innovative applications. Such a context urges a
heavy usage of Machine Learning (ML) techniques, particularly Deep Learning
(DL), to foster innovation and ease the deployment of intelligent network
functions/operations, which are able to fulfill the various requirements of the
envisioned 6G services. Specifically, collaborative ML/DL consists of deploying
a set of distributed agents that collaboratively train learning models without
sharing their data, thus improving data privacy and reducing the
time/communication overhead. This work provides a comprehensive study on how
collaborative learning can be effectively deployed over 6G wireless networks.
In particular, our study focuses on Split Federated Learning (SFL), a
recently emerged technique that promises better performance than existing
collaborative learning approaches. We first provide an overview of three
emerging collaborative learning paradigms, including federated learning, split
learning, and split federated learning, as well as of 6G networks along with
their main vision and timeline of key developments. We then highlight the need
for split federated learning in upcoming 6G networks across multiple aspects,
including 6G technologies (e.g., intelligent physical layer, intelligent edge
computing, zero-touch network management, intelligent resource management) and
6G use cases (e.g., smart grid 2.0, Industry 5.0, connected and autonomous
systems). Furthermore, we review existing datasets along with frameworks that
can help in implementing SFL for 6G networks. We finally identify key technical
challenges, open issues, and future research directions related to SFL-enabled
6G networks.
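The split federated learning idea surveyed above can be made concrete with a toy round. This is a hedged sketch: the model split, layer sizes, learning rate, and toy targets are illustrative assumptions, not the survey's setup. Each client keeps the first layer locally and sends only "smashed" activations to the server, which holds the second layer; client-side weights are then federated-averaged.

```python
import numpy as np

# Toy split federated learning (SFL) round with a two-layer linear/ReLU model.
rng = np.random.default_rng(0)

def client_forward(Wc, x):
    return np.maximum(Wc @ x, 0.0)            # client-side layer (ReLU)

def server_step(Ws, a, y, lr=0.1):
    """Server forward/backward on smashed data; returns grad w.r.t. a."""
    err = Ws @ a - y                          # server-side prediction error
    grad_a = Ws.T @ err                       # gradient sent back to the client
    Ws -= lr * np.outer(err, a)               # server-side update (in place)
    return grad_a

clients = [rng.normal(size=(4, 3)) for _ in range(3)]   # client-side weights
Ws = rng.normal(size=(1, 4))                            # server-side weights

for _ in range(20):                           # SFL training rounds
    for Wc in clients:
        x = rng.normal(size=3)                # raw data never leaves the client
        y = np.array([x.sum()])               # toy regression target
        a = client_forward(Wc, x)             # smashed data sent to the server
        grad_a = server_step(Ws, a, y)
        Wc -= 0.1 * np.outer(grad_a * (a > 0), x)   # client backward pass
    avg = sum(clients) / len(clients)         # federated averaging of client parts
    clients = [avg.copy() for _ in clients]
```

Only activations and their gradients cross the client-server boundary, which is the privacy and communication argument the survey makes for SFL over plain federated learning.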
Integration of hybrid networks, AI, Ultra Massive-MIMO, THz frequency, and FBMC modulation toward 6G requirements: A Review
The fifth-generation (5G) wireless communications have been deployed in many countries with the following features: a peak data rate of 20 Gbps, a latency of 1 ms, reliability of 99.999%, maximum mobility of 500 km/h, a bandwidth of 1 GHz, and an area traffic capacity of up to 10 Mbps/m2. Nonetheless, the rapid growth of applications, such as extended/virtual reality (XR/VR), online gaming, telemedicine, cloud computing, smart cities, the Internet of Everything (IoE), and others, demands lower latency, higher data rates, ubiquitous coverage, and better reliability. These higher requirements are the main problems that have challenged 5G while concurrently encouraging researchers and practitioners to introduce viable solutions. This review paper discusses how sixth-generation (6G) technology could overcome the 5G limitations, achieve the higher requirements, and support future applications. The integration of multiple access techniques, terahertz (THz) communications, visible light communications (VLC), ultra-massive multiple-input multiple-output (UM-MIMO), hybrid networks, cell-free massive MIMO, and artificial intelligence (AI)/machine learning (ML) has been proposed for 6G. The main contributions of this paper are a comprehensive review of the 6G vision, key performance indicators (KPIs), and advanced potential technologies, together with their operating principles. In addition, this paper reviews multiple access and modulation techniques, concentrating on Filter-Bank Multicarrier (FBMC) as a potential technology for 6G. The paper ends by discussing potential applications, along with challenges and lessons identified from prior studies, to pave the path for future research.
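FBMC's appeal over plain OFDM is spectral containment: shaping each subcarrier with a smooth prototype filter suppresses out-of-band leakage. The sketch below is a deliberately simplified stand-in (a Hann window instead of a true FBMC prototype filter such as PHYDYAS, and a single subcarrier); all numbers are illustrative assumptions.

```python
import numpy as np

# Compare out-of-band leakage of a rectangular (OFDM-like) subcarrier pulse
# with a smoothly windowed (FBMC-like) one.
N = 64                                   # samples per symbol
k = 8                                    # subcarrier index
n = np.arange(4 * N)                     # oversampled analysis window
rect = np.exp(2j * np.pi * k * n / N) * (n < N)          # rectangular pulse
win = np.zeros(4 * N)
win[:N] = 0.5 * (1 - np.cos(2 * np.pi * np.arange(N) / N))  # Hann window
smooth = np.exp(2j * np.pi * k * n / N) * win

def oob_power(x, guard=16):
    """Fraction of signal power falling outside a band around the subcarrier."""
    X = np.abs(np.fft.fft(x)) ** 2
    c = 4 * k                            # subcarrier bin in the 4N-point FFT
    band = np.zeros_like(X, dtype=bool)
    band[max(c - guard, 0):c + guard] = True
    return X[~band].sum() / X.sum()
```

Evaluating `oob_power` on both pulses shows the windowed pulse leaks far less power into neighbouring bands, which is the property that makes filtered multicarrier schemes attractive for the fragmented spectrum scenarios the review discusses.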
Rate-splitting multiple access for non-terrestrial communication and sensing networks
Rate-splitting multiple access (RSMA) has emerged as a powerful and flexible
non-orthogonal transmission, multiple access (MA) and interference management
scheme for future wireless networks. This thesis is concerned with the application of
RSMA to non-terrestrial communication and sensing networks. Various scenarios
and algorithms are presented and evaluated.
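The core rate-splitting idea that runs through the thesis can be illustrated with a toy two-user single-antenna example. This is a hedged sketch: the channel gains, SNR, and power split below are assumptions for illustration, not values from the thesis. Each message is split into a common part (decoded by both users, then removed by SIC) and a private part (decoded with the other user's private stream treated as noise).

```python
import numpy as np

def rsma_rates(g1, g2, p_common, snr=10.0):
    """Toy two-user SISO rate-splitting: returns the max-min (MMF) rate."""
    p_priv = (1.0 - p_common) / 2.0      # equal private power split
    # Common-stream rate: limited by the weaker user's decoding ability,
    # with both private streams treated as noise.
    rc = min(
        np.log2(1 + snr * g1 * p_common / (1 + snr * g1 * 2 * p_priv)),
        np.log2(1 + snr * g2 * p_common / (1 + snr * g2 * 2 * p_priv)),
    )
    # Private rates after SIC of the common stream; the other user's
    # private stream remains as interference.
    r1 = np.log2(1 + snr * g1 * p_priv / (1 + snr * g1 * p_priv))
    r2 = np.log2(1 + snr * g2 * p_priv / (1 + snr * g2 * p_priv))
    # Max-min fairness: split the common rate rc between the users so the
    # minimum total rate is maximized.
    c1 = np.clip((r2 - r1 + rc) / 2.0, 0.0, rc)
    return min(r1 + c1, r2 + rc - c1)

mmf_rsma = rsma_rates(g1=1.0, g2=0.3, p_common=0.5)
mmf_no_common = rsma_rates(g1=1.0, g2=0.3, p_common=0.0)  # interference as noise
```

With these assumed gains, allocating power to the common stream improves the max-min rate over simply treating all interference as noise, which is the flexibility the thesis exploits in its MMF beamforming designs.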
First, we investigate a novel multigroup/multibeam multicast beamforming strategy
based on RSMA in both terrestrial multigroup multicast and multibeam satellite
systems with imperfect channel state information at the transmitter (CSIT). The
max-min fairness (MMF)-degree of freedom (DoF) of RSMA is derived and shown
to provide gains compared with the conventional strategy. The MMF beamforming
optimization problem is formulated and solved using the weighted minimum mean
square error (WMMSE) algorithm. Physical layer design and link-level simulations
are also investigated. RSMA is demonstrated to be very promising for multigroup
multicast and multibeam satellite systems taking into account CSIT uncertainty
and practical challenges in multibeam satellite systems.
Next, we extend the scope of research from multibeam satellite systems to
satellite-terrestrial integrated networks (STINs). Two RSMA-based STIN schemes
are investigated, namely a coordinated scheme relying on CSI sharing and a
cooperative scheme relying on CSI and data sharing. Joint beamforming
algorithms are proposed based on the successive convex approximation (SCA)
approach to optimize the beamforming and achieve MMF amongst all users. The
effectiveness and robustness of the proposed RSMA schemes for STINs are
demonstrated.
Finally, we consider RSMA for a multi-antenna integrated sensing and communications (ISAC) system, which simultaneously serves multiple communication users
and estimates the parameters of a moving target. Simulation results demonstrate
that RSMA is beneficial to both terrestrial and multibeam satellite ISAC systems by
evaluating the trade-off between the communication MMF rate and the sensing
Cramér-Rao bound (CRB).
A Memory-Efficient Learning Framework for Symbol Level Precoding with Quantized NN Weights
This paper proposes a memory-efficient deep neural network (DNN)-based framework for symbol-level precoding (SLP). We focus on a DNN with realistic finite-precision weights and adopt an unsupervised deep learning (DL)-based SLP model (SLP-DNet). We apply a stochastic quantization (SQ) technique to obtain its corresponding quantized version, called SLP-SQDNet. The proposed scheme offers a scalable performance-versus-memory trade-off by quantizing a scalable percentage of the DNN weights, and we explore binary and ternary quantizations. Our results show that while SLP-DNet provides near-optimal performance, its quantized versions obtained through SQ yield ~3.46× and ~2.64× model compression for the binary-based and ternary-based SLP-SQDNets, respectively. We also find that our proposals offer ~20× and ~10× computational complexity reductions compared to the optimization-based SLP and SLP-DNet, respectively.
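The stochastic quantization idea at the heart of SLP-SQDNet can be illustrated on a single weight matrix. This is a hedged sketch of the generic SQ principle, not the paper's implementation: each weight in [-a, a] is rounded to ±a with a probability linear in its value, so the quantizer is unbiased in expectation.

```python
import numpy as np

# Stochastic binary quantization of NN weights (illustrative sketch).
rng = np.random.default_rng(1)

def stochastic_binarize(w):
    a = np.abs(w).max()                  # per-tensor scaling factor
    p = (w / a + 1) / 2                  # P(quantize to +a), linear in w
    return np.where(rng.random(w.shape) < p, a, -a)

w = rng.normal(size=(4, 4))              # full-precision weights
q = stochastic_binarize(w)               # one binary draw: every entry is ±a
# Unbiasedness: averaging many independent draws recovers w.
est = np.mean([stochastic_binarize(w) for _ in range(5000)], axis=0)
```

Storing only the sign pattern plus one scale per tensor is what yields the memory compression the paper reports; a ternary variant would add a zero level for weights near the origin.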
BDS GNSS for Earth Observation
For millennia, human communities have wondered about the possibility of observing
phenomena in their surroundings, in particular those affecting the Earth on which they live.
This activity can broadly be defined as Earth observation (EO): the collection of
information about the biological, chemical, and physical systems of planet Earth. It can be
undertaken through sensors in direct contact with the ground, through airborne platforms (such
as weather balloons and stations), or through remote-sensing technologies. However, the notion
of EO has only become significant in the last 50 years, since it became possible to place
artificial satellites into Earth orbit.
Referring strictly to civil applications, satellites of this type were initially designed to provide
satellite images; later, their purpose expanded to include the study of information on land
characteristics, growing vegetation, crops, and environmental pollution. The data collected are used
for several purposes, including the identification of natural resources and the production of accurate
cartography. Satellite observations can cover the land, the atmosphere, and the oceans.
Remote-sensing satellites may be equipped with passive instrumentation, such as infrared
sensors or cameras imaging the visible spectrum, or with active instrumentation such as radar.
Generally, such satellites are non-geostationary: they move at a certain speed along orbits
inclined with respect to the Earth’s equatorial plane, often polar orbits, at low or medium
altitude (Low Earth Orbit, LEO, and Medium Earth Orbit, MEO), thus covering the entire Earth’s
surface within a certain scan time (properly called ’temporal resolution’), i.e., within a
certain number of orbits around the Earth.
The first remote-sensing satellites belonged to the American NASA/USGS Landsat Program;
subsequently, the European ENVISAT (ENVironmental SATellite), ERS (European Remote-Sensing
satellite), and RapidEye, the French SPOT (Satellite Pour l’Observation de la Terre), and the
Canadian RADARSAT satellites were launched. The IKONOS, QuickBird, and GeoEye-1 satellites
were dedicated to cartography. The WorldView-1 and WorldView-2 satellites and the
COSMO-SkyMed system are more recent. The latest generation comprises low-payload platforms
called Small Satellites, e.g., the Chinese BuFeng-1 and Fengyun-3 series.
Global Navigation Satellite Systems (GNSSs) have also captured the attention of researchers
worldwide for a multitude of Earth monitoring and exploration applications. Over the past
40 years, GNSSs have become an essential part of many human activities. As is widely
noted, there are currently four fully operational GNSSs: two were originally developed for
military purposes (the American NAVSTAR GPS and the Russian GLONASS), whilst two others were
developed for civil purposes, namely the Chinese BeiDou Navigation Satellite System (BDS) and
the European Galileo. In addition, several regional systems, such as the South Korean
Positioning System (KPS), the Japanese Quasi-Zenith Satellite System (QZSS), and the Indian
Regional Navigation Satellite System (IRNSS/NavIC), will become available in the next few
years, offering enormous potential for scientific applications and geomatics professionals.
In addition to their traditional role of providing global positioning, navigation, and timing (PNT)
information, GNSS navigation signals are now being used in new and innovative ways. Across the
globe, new fields of scientific study are opening up to examine how these signals can provide
information about the characteristics of the atmosphere, and even of the surfaces from which
they are reflected before being collected by a receiver.
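The reflected-signal idea (GNSS reflectometry, GNSS-R) rests on simple geometry. As a hedged, illustrative example not drawn from the text: for a receiver at height h above a flat reflecting surface, the reflected signal travels roughly 2·h·sin(elevation) farther than the direct one, and that extra path maps to a measurable code or carrier delay that carries information about the surface.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def reflection_delay(h_m, elev_deg):
    """Extra path length (m) and delay (s) of the surface-reflected signal."""
    extra_path = 2.0 * h_m * math.sin(math.radians(elev_deg))
    return extra_path, extra_path / C

# Assumed example: a coastal receiver 300 m above sea level, satellite at
# 30 degrees elevation -> 2 * 300 * sin(30°) = 300 m of extra path.
path, delay = reflection_delay(h_m=300.0, elev_deg=30.0)
```

Tracking how this delay (and the reflected signal's power and polarization) varies is what lets EO researchers infer sea level, soil moisture, or ice extent from signals originally intended only for positioning.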
EO researchers monitor global environmental systems using in situ and remote monitoring tools.
Their findings provide tools to support decision makers in various areas of interest, from security
to the natural environment. GNSS signals are considered an important new source of information
because they are a free, real-time, and globally available resource for the EO community.
Evolution of High Throughput Satellite Systems: Vision, Requirements, and Key Technologies
High throughput satellites (HTS), with their digital payload technology, are
expected to play a key role as enablers of the upcoming 6G networks. HTS are
mainly designed to provide higher data rates and capacities. Fueled by
technological advancements including beamforming, advanced modulation
techniques, reconfigurable phased array technologies, and electronically
steerable antennas, HTS have emerged as a fundamental component of future
network generations. This paper offers a comprehensive state-of-the-art review
of HTS systems, with a focus on standardization, patents, channel multiple
access techniques, routing, load balancing, and the role of software-defined
networking (SDN). In addition, we provide a vision for next-generation
satellite systems, which we name extremely-HTS (EHTS), toward autonomous
satellites, supported by the main requirements and key technologies expected
for these systems. The EHTS
system will be designed such that it maximizes spectrum reuse and data rates,
and flexibly steers the capacity to satisfy user demand. We introduce a novel
architecture for future regenerative payloads while summarizing the challenges
imposed by this architecture.
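The spectrum-reuse argument behind HTS capacity can be made concrete with a toy calculation. The numbers below are assumptions for illustration, not figures from the paper: with B Hz of allocated spectrum and a K-colour frequency-reuse scheme, each of N spot beams gets B/K Hz, so aggregate bandwidth scales with N/K.

```python
def hts_aggregate_bandwidth(n_beams, total_bw_hz, n_colours):
    """Aggregate bandwidth across all beams under K-colour frequency reuse."""
    per_beam = total_bw_hz / n_colours   # bandwidth available to each beam
    return n_beams * per_beam

# Assumed example: 100 beams, 2 GHz of spectrum, 4-colour reuse.
agg = hts_aggregate_bandwidth(n_beams=100, total_bw_hz=2e9, n_colours=4)
```

This is why narrower beams (larger N) and more aggressive reuse (smaller K, ultimately full reuse with interference mitigation) are central to the EHTS vision of maximizing spectrum reuse and data rates.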