System Modelling and Design Aspects of Next Generation High Throughput Satellites
Future generation wireless networks are targeting the convergence of fixed,
mobile and broadcasting systems with the integration of satellite and
terrestrial systems to exploit their mutual benefits. Satellite
Communications (SatCom) is envisioned to play a vital role in providing
integrated services seamlessly over heterogeneous networks. Compared to
terrestrial systems, the design of SatCom systems requires a different approach
due to differences in wave propagation, operating frequency, antenna
structures, interfering sources, limitations of onboard processing, power
limitations and transceiver impairments. In this regard, this letter aims to
identify and discuss important modeling and design aspects of the next
generation High Throughput Satellite (HTS) systems. First, communication models
of HTSs including the ones for multibeam and multicarrier satellites, multiple
antenna techniques, and for SatCom payloads and antennas are highlighted and
discussed. Subsequently, various design aspects of SatCom transceivers
including impairments related to the transceiver, payload and channel, and
traffic-based coverage adaptation are presented. Finally, some open topics for
the design of next generation HTSs are identified and discussed.
Comment: submitted to IEEE Journal
Precoding in multibeam satellite communications: present and future challenges
Whenever multibeam satellite systems target very aggressive frequency reuse in their coverage area, inter-beam interference becomes the major obstacle to increasing the overall system throughput. As a matter of fact, users located at the beam edges suffer from severe interference even under a moderately aggressive reuse-2 plan. Although solutions for inter-beam interference management have been investigated at the satellite terminal, it turns out that the performance improvement does not justify the increased terminal complexity and cost. In this article, we focus on interference mitigation techniques that take place at the transmitter (i.e., the gateway). Based on this understanding, we provide our vision on advanced precoding techniques and user clustering methods for multibeam broadband fixed satellite communications. We also discuss practical challenges in deploying precoding schemes and the support introduced in the recently published DVB-S2X standard. Future challenges for novel configurations employing precoding are also discussed.
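As a simplified instance of the gateway-side precoding surveyed in the article, the sketch below applies a zero-forcing precoder to a known synthetic multibeam channel; the channel model, dimensions, and normalization are illustrative assumptions, not the article's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
K = 4                                     # beams = single-antenna users (illustrative)
# Synthetic multibeam channel: strong direct link plus weak inter-beam leakage.
H = np.eye(K) + 0.3 * rng.standard_normal((K, K))

# Zero-forcing precoder: W = H^H (H H^H)^{-1}, normalized to unit total power.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
W /= np.linalg.norm(W)

s = rng.standard_normal(K)                # unit-power data symbols (real for brevity)
y = H @ (W @ s)                           # noiseless received signal at the K users

# ZF removes inter-beam interference: every user receives a common-scaled
# copy of its own symbol only.
scale = y[0] / s[0]
print(np.max(np.abs(y - scale * s)))      # ~0 up to numerical precision
```

Zero forcing is the simplest member of the precoder family the article covers; regularized and power-constrained variants trade some interference suppression for better noise behavior.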
Hardware Precoding Demonstration in Multi-Beam UHTS Communications under Realistic Payload Characteristics
In this paper, we present a new hardware test-bed to demonstrate closed-loop precoded communications for interference mitigation in multi-beam ultra high throughput satellite systems under realistic payload and channel impairments. We build the test-bed to demonstrate real-time channel-aided precoded transmission under realistic conditions such as power constraints and satellite-payload non-linearities. We develop a scalable architecture of an SDR platform with DVB-S2X piloting. The SDR platform consists of two parts: a radio frequency (RF) front-end with analog-to-digital (ADC) and digital-to-analog (DAC) converters, and a Field-Programmable Gate Array (FPGA) backend. The former introduces realistic impairments into the transmission chain, such as carrier frequency and phase misalignments, quantization noise of the multichannel ADC and DAC, and non-linearities of RF components. This allows evaluating the performance of precoded transmission in a more realistic environment than numerical simulations alone. We benchmark the performance of the communication standard in realistic channel scenarios, evaluate the received signal-to-noise ratio (SNR), and measure the actual channel throughput using LDPC codes.
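The payload non-linearities such a test-bed reproduces in hardware are commonly modeled in simulation with the Saleh model of a travelling-wave-tube amplifier (TWTA). The abstract does not specify this model, so the sketch below is purely illustrative, using the classic parameter values from Saleh's 1981 paper:

```python
import numpy as np

def saleh_twta(x, alpha_a=2.1587, beta_a=1.1517, alpha_p=4.0033, beta_p=9.1040):
    """Saleh TWTA model: AM/AM compression and AM/PM rotation applied to a
    complex baseband signal x. Defaults are Saleh's classic fitted values."""
    r = np.abs(x)
    am_am = alpha_a * r / (1.0 + beta_a * r**2)       # output amplitude
    am_pm = alpha_p * r**2 / (1.0 + beta_p * r**2)    # phase rotation (rad)
    return am_am * np.exp(1j * (np.angle(x) + am_pm))

# An 8-PSK ring of symbols driven near saturation gets compressed and rotated,
# which is exactly the distortion a precoder must cope with on a real payload.
x = 0.8 * np.exp(1j * 2 * np.pi * np.arange(8) / 8)
y = saleh_twta(x)
print(np.round(np.abs(y[0]), 3), np.round(np.angle(y[0]) - np.angle(x[0]), 3))
```

In a closed-loop test-bed, distortion of this kind is what distinguishes measured throughput from what idealized link-level simulations predict.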
Cognitive-Based Solutions to Spectrum Issues in Future Satellite Communication Systems
With particular attention to Satellite Communications (SatComs), cognitive-based solutions are investigated. By cognitive-based solutions we refer to all those techniques that aim at improving utilization of the available spectrum and rely on knowledge of the environment in which the systems operate. As a matter of fact, improved spectrum utilization enables the higher throughput capacities needed to satisfy the future markets and demands of an increasingly connected world.
Throughout the thesis, several techniques are proposed, developed, and assessed with respect to specific scenarios of interest. Particular focus has been put on spectrum awareness techniques for system coexistence, and on spectrum exploitation techniques for improved efficiency in terms of resource utilization.
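Spectrum awareness techniques of the kind described often build on simple detectors. Below is a minimal energy-detector sketch; the signal model is hypothetical and the threshold is picked by eye rather than derived from a target false-alarm probability:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1000                       # samples per sensing window
noise_var = 1.0                # assumed known receiver noise power

def energy_detect(x, threshold):
    """Declare the band occupied if average sample energy exceeds threshold."""
    return np.mean(np.abs(x) ** 2) > threshold

# Illustrative threshold: a little above the expected noise-only energy.
threshold = 1.2 * noise_var

noise_only = rng.normal(scale=np.sqrt(noise_var), size=N)
signal_plus_noise = noise_only + rng.normal(scale=1.0, size=N)  # 0 dB SNR

print(energy_detect(noise_only, threshold))
print(energy_detect(signal_plus_noise, threshold))
```

Practical cognitive SatCom systems go well beyond this, e.g. exploiting knowledge of incumbent carriers and geometry, but the decision structure (measure, compare, act) is the same.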
Revolutionizing Future Connectivity: A Contemporary Survey on AI-empowered Satellite-based Non-Terrestrial Networks in 6G
Non-Terrestrial Networks (NTN) are expected to be a critical component of 6th
Generation (6G) networks, providing ubiquitous, continuous, and scalable
services. Satellites emerge as the primary enabler for NTN, leveraging their
extensive coverage, stable orbits, scalability, and adherence to international
regulations. However, satellite-based NTN presents unique challenges, including
long propagation delay, high Doppler shift, frequent handovers, spectrum
sharing complexities, and intricate beam and resource allocation, among others.
The integration of NTNs into existing terrestrial networks in 6G introduces a
range of novel challenges, including task offloading, network routing, network
slicing, and many more. To tackle all these obstacles, this paper proposes
Artificial Intelligence (AI) as a promising solution, harnessing its ability to
capture intricate correlations among diverse network parameters. We begin by
providing a comprehensive background on NTN and AI, highlighting the potential
of AI techniques in addressing various NTN challenges. Next, we present an
overview of existing works, emphasizing AI as an enabling tool for
satellite-based NTN, and explore potential research directions. Furthermore, we
discuss ongoing research efforts that aim to enable AI in satellite-based NTN
through software-defined implementations, while also discussing the associated
challenges. Finally, we conclude by providing insights and recommendations for
enabling AI-driven satellite-based NTN in future 6G networks.
Comment: 40 pages, 19 figures, 10 tables. Survey
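The long propagation delay and high Doppler shift mentioned above are straightforward to quantify. A back-of-the-envelope sketch for a hypothetical LEO link (600 km altitude and a 2 GHz carrier are assumed values, not from the survey):

```python
import math

MU = 3.986004418e14          # Earth's gravitational parameter (m^3/s^2)
R_EARTH = 6371e3             # mean Earth radius (m)
C = 299_792_458.0            # speed of light (m/s)

h = 600e3                    # hypothetical orbit altitude (m)
fc = 2e9                     # hypothetical carrier frequency (Hz)

v = math.sqrt(MU / (R_EARTH + h))   # circular orbital speed
max_doppler = fc * v / C            # upper bound: full speed along line of sight
one_way_delay = h / C               # best case: satellite at zenith

print(f"orbital speed ~ {v/1e3:.2f} km/s")
print(f"max Doppler  ~ {max_doppler/1e3:.0f} kHz")
print(f"min one-way delay ~ {one_way_delay*1e3:.1f} ms")
```

Tens of kilohertz of Doppler and millisecond-scale delays, versus near-zero values in terrestrial cells, are what drive the NTN-specific synchronization, handover, and scheduling challenges the survey enumerates.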
Array Architectures and Physical Layer Design for Millimeter-Wave Communications Beyond 5G
Ever-increasing demands in mobile data rates have resulted in the exploration of millimeter-wave (mmW) frequencies for the next generation (5G) wireless networks. Communication at mmW frequencies presents two key challenges. Firstly, high propagation loss requires base stations (BSs) and user equipment (UEs) to use a large number of antennas and narrow beams to close the link with sufficient received signal power. Consequently, communication using narrow beams creates a new challenge in channel estimation and link establishment based on fine angular probing. Current mmW systems use analog phased arrays that can probe only one angle at a time, which results in high latency during link establishment and channel tracking. It is desirable to design low-latency beam training by exploring both physical layer designs and array architectures that could replace current 5G approaches and pave the way to communications in higher mmW bands and the sub-THz region, where larger antenna arrays and communication bandwidths can be exploited. To this end, we propose novel signal processing techniques exploiting unique properties of the mmW channel, and show, theoretically as well as in simulations and experiments, their advantages over conventional approaches. Secondly, we explore different array architecture designs and analyze their trade-offs among spectral efficiency, power consumption, and area. For a comprehensive comparison, we have developed a methodology for the optimal design of system parameters for different array architecture candidates based on a spectral efficiency target, and use these parameters to estimate array area and power consumption based on circuits reported in the literature.
We show that hybrid analog-digital architectures have severe scalability concerns in radio frequency signal distribution as array size and spatial multiplexing levels increase, while fully-digital array architectures have the best performance and power/area trade-offs.
The developed approaches are based on cross-disciplinary research that combines innovation in model-based signal processing, machine learning, and radio hardware. This work is the first to apply compressive sensing (CS), a signal processing tool that exploits the sparsity of the mmW channel model, to accelerate beam training of mmW cellular systems. The algorithm is designed to address practical issues including the requirement of cell discovery and synchronization, which involves estimation of the angular channel together with carrier frequency and timing offsets. We have analyzed the algorithm's performance in 5G-compliant simulations and showed that an order-of-magnitude saving is achieved in initial access latency for the desired channel estimation accuracy. Moreover, we are the first to develop and implement a neural-network-assisted compressive beam alignment to deal with hardware impairments in mmW radios. We have used a 60 GHz mmW testbed to perform experiments and show that the neural network approach enhances the alignment rate compared to CS. To further accelerate beam training, we proposed novel frequency-selective probing beams using the true-time-delay (TTD) analog array architecture. Our approach utilizes different subcarriers to scan different directions, and achieves single-shot beam alignment, the fastest approach reported to date. Our comprehensive analysis of different array architectures and exploration of emerging architectures enabled us to develop order-of-magnitude faster and more energy-efficient approaches for initial access and channel estimation in mmW systems.
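The compressive beam-training idea can be sketched as follows: instead of an exhaustive sweep over all angles, a few random probing beams are measured and the sparse channel is recovered by correlation (a single matching-pursuit step). Array size, angular grid, and measurement count below are hypothetical, and the sketch is far simpler than the dissertation's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 64                                   # ULA antennas, half-wavelength spacing
G = 64                                   # angular grid (orthogonal DFT dictionary)

n = np.arange(N)
grid = np.linspace(-1, 1, G, endpoint=False)             # sin(angle) grid
A = np.exp(1j * np.pi * np.outer(n, grid)) / np.sqrt(N)  # unit-norm steering vectors

# Sparse mmW channel: one dominant on-grid path (index chosen arbitrarily).
true_idx = 20
h = A[:, true_idx]

# Compressive probing: M << N random beams instead of an exhaustive 64-beam sweep.
M = 24
Phi = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2 * M)
y = Phi @ h + 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

# One matching-pursuit step recovers a 1-sparse channel: correlate the
# measurements against every compressed dictionary column, pick the strongest.
est_idx = int(np.argmax(np.abs((Phi @ A).conj().T @ y)))
print(est_idx, true_idx)
```

The latency advantage comes directly from the measurement count: M probing slots replace a G-slot exhaustive sweep, at the cost of the correlation search and sensitivity to hardware impairments, which is where the neural-network-assisted variant comes in.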