
    Hybrid Satellite-Terrestrial Communication Networks for the Maritime Internet of Things: Key Technologies, Opportunities, and Challenges

    With the rapid development of marine activities, there has been an increasing number of maritime mobile terminals, as well as a growing demand for high-speed and ultra-reliable maritime communications to keep them connected. Traditionally, the maritime Internet of Things (IoT) is enabled by maritime satellites. However, satellites are seriously restricted by their high latency and relatively low data rate. As an alternative, shore- and island-based base stations (BSs) can be built to extend the coverage of terrestrial networks using fourth-generation (4G), fifth-generation (5G), and beyond-5G services. Unmanned aerial vehicles can also be exploited to serve as aerial maritime BSs. Despite all these approaches, open issues remain for an efficient maritime communication network (MCN). For example, due to the complicated electromagnetic propagation environment, the limited geometrically available BS sites, and the rigorous service demands of mission-critical applications, conventional communication and networking theories and methods should be tailored for maritime scenarios. Towards this end, we provide a survey on the demand for maritime communications, the state-of-the-art MCNs, and key technologies for enhancing transmission efficiency, extending network coverage, and provisioning maritime-specific services. Future challenges in developing an environment-aware, service-driven, and integrated satellite-air-ground MCN that is smart enough to utilize external auxiliary information, e.g., sea state and atmospheric conditions, are also discussed.

    Downlink Coverage and Rate Analysis of Low Earth Orbit Satellite Constellations Using Stochastic Geometry

    As low Earth orbit (LEO) satellite communication systems are gaining increasing popularity, new theoretical methodologies are required to investigate such networks' performance at large. This is because the deterministic and location-based models that have previously been applied to analyze satellite systems are typically restricted to supporting simulations only. In this paper, we derive analytical expressions for the downlink coverage probability and average data rate of generic LEO networks, regardless of the actual satellite locations and their service area geometry. Our solution stems from stochastic geometry, which abstracts the generic networks into uniform binomial point processes. Applying the proposed model, we then study the performance of the networks as a function of key constellation design parameters. Finally, to fit the theoretical modeling more precisely to real deterministic constellations, we introduce the effective number of satellites as a parameter to compensate for the practical uneven distribution of satellites across latitudes. In addition to deriving exact network performance metrics, the study reveals several guidelines for selecting the design parameters of future massive LEO constellations, e.g., the number of frequency channels and the altitude.
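
    A minimal Monte Carlo sketch of the binomial-point-process abstraction described above: N satellites are dropped uniformly on a sphere at altitude h, and the coverage probability is estimated as the fraction of realizations in which the nearest visible satellite delivers an SNR above a threshold. Interference is ignored, and all parameter values (altitude, N, EIRP, noise power, threshold) are illustrative assumptions rather than settings from the paper.

        # Monte Carlo sketch: downlink coverage of a LEO constellation modeled as a
        # binomial point process (N satellites uniform on a sphere of radius R_E + h).
        # All numbers are illustrative assumptions, not taken from the paper.
        import numpy as np

        R_E = 6371e3          # Earth radius [m]
        h = 550e3             # assumed satellite altitude [m]
        N = 720               # assumed number of satellites
        f_c = 2e9             # assumed carrier frequency [Hz]
        EIRP_dBW = 30.0       # assumed satellite EIRP [dBW]
        G_rx_dB = 0.0         # assumed user antenna gain [dB]
        noise_dBW = -130.0    # assumed receiver noise power [dBW]
        snr_thresh_dB = 0.0   # coverage threshold [dB]
        trials = 20000
        rng = np.random.default_rng(1)

        def coverage_probability():
            covered = 0
            user = np.array([0.0, 0.0, R_E])              # user on the surface, WLOG at the pole
            for _ in range(trials):
                # Drop N satellites uniformly on the sphere of radius R_E + h.
                v = rng.normal(size=(N, 3))
                sats = (R_E + h) * v / np.linalg.norm(v, axis=1, keepdims=True)
                # A satellite is above the local horizon iff (s - u) . u >= 0.
                visible = (sats @ user) > R_E ** 2
                if not visible.any():
                    continue
                d_min = np.linalg.norm(sats[visible] - user, axis=1).min()  # serve nearest visible
                fspl_dB = 20 * np.log10(4 * np.pi * d_min * f_c / 3e8)
                snr_dB = EIRP_dBW + G_rx_dB - fspl_dB - noise_dBW
                covered += snr_dB >= snr_thresh_dB
            return covered / trials

        print(f"estimated coverage probability: {coverage_probability():.3f}")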

    Database-assisted spectrum sharing in satellite communications: A survey

    This survey paper discusses the feasibility of sharing spectrum between satellite telecommunication networks and terrestrial and other satellite networks, on the basis of a comprehensive study carried out as part of the European Space Agency's (ESA) Advanced Research in Telecommunications Systems (ARTES) programme. The main area of investigation is the use of spectrum databases to enable a controlled sharing environment. Future satellite systems can benefit greatly from the ability to access spectrum bands beyond their dedicated licensed band. Potential spectrum sharing scenarios are classified as: a) secondary use of the satellite spectrum by terrestrial systems, b) the satellite system as a secondary user of spectrum, c) extension of a terrestrial network by using the satellite network, and d) two satellite systems sharing the same spectrum. We define practical use cases for each scenario and identify suitable techniques. The proposed scenarios and use cases cover several frequency bands and satellite orbits. Among all the scenarios reviewed, owing to the announcement of many different mega-constellation satellite networks, we focus on analysing the feasibility of spectrum sharing between geostationary orbit (GSO) and non-geostationary orbit (NGSO) satellite systems. The performance is primarily analysed on the basis of widely accepted recommendations of the Radiocommunication Sector of the International Telecommunication Union (ITU-R). Finally, future research directions are identified.
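
    One recurring ingredient of GSO/NGSO sharing studies of this kind is the angular separation, as seen from an earth station, between an NGSO satellite and the geostationary arc: the NGSO system mutes or limits power whenever that angle drops below a protection threshold. The sketch below computes this separation by brute-force sampling of the arc; the station and satellite positions and the 10-degree threshold are illustrative assumptions, not values taken from the ITU-R recommendations the paper relies on.

        # Sketch: exclusion-angle test used in GSO/NGSO spectrum-sharing studies.
        # An NGSO satellite avoids transmitting (or reduces power) towards an earth
        # station whenever its angular separation from the GSO arc, as seen by that
        # station, falls below a protection threshold. Geometry and threshold are
        # illustrative assumptions.
        import numpy as np

        R_E, R_GSO = 6371e3, 42164e3

        def ecef(lat_deg, lon_deg, r):
            lat, lon = np.radians(lat_deg), np.radians(lon_deg)
            return r * np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

        def min_angle_to_gso_arc(station, ngso_sat, samples=3600):
            """Smallest angle, at the earth station, between the NGSO satellite and any
            point on the geostationary arc (sampled in longitude)."""
            lons = np.linspace(-180, 180, samples, endpoint=False)
            gso_points = np.stack([ecef(0.0, l, R_GSO) for l in lons])
            v_ngso = ngso_sat - station
            v_gso = gso_points - station
            cosang = (v_gso @ v_ngso) / (np.linalg.norm(v_gso, axis=1) * np.linalg.norm(v_ngso))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))).min()

        station = ecef(45.0, 10.0, R_E)            # assumed earth-station location
        ngso = ecef(44.0, 12.0, R_E + 1200e3)      # assumed NGSO satellite position
        threshold_deg = 10.0                       # assumed protection angle
        angle = min_angle_to_gso_arc(station, ngso)
        print(f"separation from GSO arc: {angle:.1f} deg ->",
              "transmit" if angle > threshold_deg else "mute/limit power")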

    Evolution of High Throughput Satellite Systems: Vision, Requirements, and Key Technologies

    High throughput satellites (HTS), with their digital payload technology, are expected to play a key role as enablers of the upcoming 6G networks. HTS are mainly designed to provide higher data rates and capacities. Fueled by technological advancements including beamforming, advanced modulation techniques, reconfigurable phased array technologies, and electronically steerable antennas, HTS have emerged as a fundamental component of future network generations. This paper offers a comprehensive state-of-the-art review of HTS systems, with a focus on standardization, patents, channel multiple access techniques, routing, load balancing, and the role of software-defined networking (SDN). In addition, we provide a vision for next-generation satellite systems, which we name extremely-HTS (EHTS), toward autonomous satellites, supported by the main requirements and key technologies expected for these systems. The EHTS system will be designed to maximize spectrum reuse and data rates and to flexibly steer capacity to satisfy user demand. We introduce a novel architecture for future regenerative payloads and summarize the challenges imposed by this architecture.
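
    As a toy illustration of the capacity-steering idea mentioned above, the sketch below shares a payload's total capacity across beams in proportion to their outstanding demand, subject to per-beam caps, iterating so that capacity freed by capped beams is redistributed. The beam count, demands, caps, and the 18 Gbps total are made-up numbers, and real flexible payloads of course optimize over far richer resources (frequency plan, beam hopping, power).

        # Toy sketch of demand-driven capacity steering across beams of a flexible
        # HTS payload: allocate total capacity proportionally to outstanding demand,
        # respecting per-beam caps and redistributing leftovers. All figures are
        # assumptions for illustration only.
        def steer_capacity(demand, cap, total):
            alloc = [0.0] * len(demand)
            limits = [min(d, c) for d, c in zip(demand, cap)]
            for _ in range(len(demand)):                  # at most len(demand) rounds
                active = [i for i in range(len(demand)) if alloc[i] < limits[i] - 1e-9]
                remaining = total - sum(alloc)
                if not active or remaining <= 1e-9:
                    break
                need = sum(limits[i] - alloc[i] for i in active)
                for i in active:
                    share = remaining * (limits[i] - alloc[i]) / need
                    alloc[i] += min(limits[i] - alloc[i], share)
            return alloc

        demand = [12.0, 3.0, 7.5, 1.0]        # assumed per-beam demand [Gbps]
        cap = [10.0, 10.0, 10.0, 10.0]        # assumed per-beam maximum [Gbps]
        print(steer_capacity(demand, cap, total=18.0))    # demand-proportional split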

    SRML: Space Radio Machine Learning

    Space-based communications systems to be employed by future artificial satellites, or by spacecraft during exploration missions, can potentially benefit from software-defined radio adaptation capabilities. Multiple communication requirements could compete for radio resources, whose availability may vary during the spacecraft's operational life span. Electronic components are prone to failure, and new instructions will eventually be received through software updates. Consequently, these changes may require a whole new set of near-optimal parameter combinations to be derived on the fly, without instantaneous human interaction or even without a human in the loop. Thus, achieving a sufficiently good set of radio parameters can be challenging, especially when the communication channels change dynamically due to orbital dynamics as well as atmospheric and space-weather-related impairments. This dissertation presents an analysis and discussion of novel algorithms proposed to enable a cognition control layer for adaptive communication systems operating in space, using an architecture that merges machine learning techniques with wireless communication principles. The proposed cognitive engine proof of concept reasons over time through an efficient accumulated learning process. An implementation of the conceptual design is expected to be delivered to the SDR system located on the International Space Station as part of an experimental program. To support the proposed cognitive engine algorithm development, more realistic satellite-based communication channels are proposed, along with rain attenuation synthesizers for LEO orbits, channel state detection algorithms, and multipath coefficients as a function of the reflector's electrical characteristics. The achieved performance of the proposed solutions is compared with the state of the art, and novel performance benchmarks are provided for future research to reference.
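
    The "cognition control layer" idea can be pictured with a generic decision loop; the sketch below is an epsilon-greedy bandit that accumulates per-configuration reward statistics and keeps selecting modulation-and-coding settings as the channel drifts. It is not the dissertation's algorithm: the action set, the SNR thresholds, and the toy channel change are all assumptions made purely for illustration.

        # Generic sketch of a cognitive-engine decision loop: an epsilon-greedy bandit
        # that accumulates reward statistics per radio configuration and adapts as the
        # channel changes. Action set, thresholds, and channel model are assumptions.
        import random

        CONFIGS = [("QPSK", 1/2), ("QPSK", 3/4), ("8PSK", 2/3), ("16APSK", 3/4)]

        class CognitiveEngine:
            def __init__(self, epsilon=0.1):
                self.epsilon = epsilon
                self.count = [0] * len(CONFIGS)
                self.mean_reward = [0.0] * len(CONFIGS)

            def choose(self):
                if random.random() < self.epsilon:        # explore occasionally
                    return random.randrange(len(CONFIGS))
                return max(range(len(CONFIGS)), key=lambda i: self.mean_reward[i])

            def update(self, i, reward):                  # running-mean update
                self.count[i] += 1
                self.mean_reward[i] += (reward - self.mean_reward[i]) / self.count[i]

        def measured_throughput(config, snr_db):
            """Placeholder link model: spectral efficiency if the mod/cod is assumed
            decodable at this SNR (made-up thresholds), otherwise zero."""
            bits = {"QPSK": 2, "8PSK": 3, "16APSK": 4}[config[0]]
            required_snr_db = {2: 2.0, 3: 6.0, 4: 9.0}[bits]
            return bits * config[1] if snr_db >= required_snr_db else 0.0

        engine = CognitiveEngine()
        for step in range(500):
            snr_db = 8.0 if step < 250 else 4.0           # toy fade (e.g. rain) halfway through
            i = engine.choose()
            engine.update(i, measured_throughput(CONFIGS[i], snr_db))

        best = max(range(len(CONFIGS)), key=lambda i: engine.mean_reward[i])
        print("preferred configuration:", CONFIGS[best])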

    The Role of Physical Layer Security in Satellite-Based Networks

    In the coming years, 6G will revolutionize the world with large amounts of bandwidth, high data rates, and extensive coverage in remote and rural areas. These goals can only be achieved by integrating terrestrial networks with non-terrestrial networks. On the other hand, these advancements raise greater concerns about malicious attacks on satellite-terrestrial links than on other wireless links, due to their openness. Over the years, physical layer security (PLS) has emerged as a good candidate for dealing with such security threats by exploiting the randomness of wireless channels. In this direction, this paper reviews how PLS methods are implemented in satellite communications. Firstly, we discuss the ongoing research on satellite-based networks by highlighting the key points in the literature. Then, we revisit the research activities on PLS in satellite-based networks by categorizing the different system architectures. Finally, we highlight research directions and opportunities for leveraging PLS in future satellite-based networks.
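
    The baseline quantity behind most PLS schemes surveyed in such work is the secrecy capacity of a wiretap link, C_s = [log2(1 + SNR_legitimate) - log2(1 + SNR_eavesdropper)]^+. The short sketch below averages it over Rayleigh fading; the 10 dB and 3 dB mean SNRs are illustrative assumptions, not figures from the paper.

        # Sketch of the baseline PLS metric: per-realization secrecy capacity of a
        # Gaussian wiretap link, averaged over fading. Channel statistics are assumed.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        snr_legit = 10 ** (10 / 10) * rng.exponential(size=n)   # Rayleigh fading, 10 dB mean (assumed)
        snr_eve = 10 ** (3 / 10) * rng.exponential(size=n)      # eavesdropper, 3 dB mean (assumed)

        secrecy = np.maximum(np.log2(1 + snr_legit) - np.log2(1 + snr_eve), 0.0)
        print(f"ergodic secrecy rate ~ {secrecy.mean():.2f} bit/s/Hz")
        print(f"fraction of realizations with zero secrecy: {np.mean(secrecy == 0):.1%}")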

    Hybrid satellite–terrestrial networks toward 6G: key technologies and open issues

    Future wireless networks will be required to provide more wireless services at higher data rates and with global coverage. However, existing homogeneous wireless networks, such as cellular and satellite networks, may not be able to meet such requirements individually, especially in remote terrain, including seas and mountains. One possible solution is to use diversified wireless networks that exploit the interconnectivity between satellites, aerial base stations (BSs), and terrestrial BSs over interconnected space, ground, and aerial networks. Hence, enabling wireless communication in one integrated network has attracted attention from both industry and the research community. In this work, we provide a comprehensive survey of the most recent work on hybrid satellite–terrestrial networks (HSTNs), focusing on system architecture, performance analysis, design optimization, and secure communication schemes for different cooperative and cognitive HSTN architectures. The different key technologies are compared, and, based on this comparison, several open issues for future research are discussed.

    Evaluation of MU-MIMO Digital Beamforming Algorithms in B5G/6G LEO Satellite Systems

    Satellite Communication (SatCom) systems will be a key component of 5G and 6G networks in achieving the goal of providing unlimited and ubiquitous communications and deploying smart and sustainable networks. To meet the ever-increasing demand for higher throughput in 5G and beyond, aggressive frequency reuse schemes (i.e., full frequency reuse), combined with digital beamforming techniques to cope with the massive co-channel interference, are recognized as a key solution. Aiming to (i) avoid the joint optimization problem over the beamforming vectors of all users, (ii) split it into distinct per-user problems, and (iii) obtain a closed-form solution, we propose a beamforming algorithm based on maximizing the Signal-to-Leakage-and-Noise Ratio (SLNR) of the users served by a Low Earth Orbit (LEO) satellite. We investigate and assess the performance of several beamforming algorithms, including both those based on Channel State Information (CSI) at the transmitter, i.e., Minimum Mean Square Error (MMSE) and Zero-Forcing (ZF), and those requiring only the users' locations, i.e., Switchable Multi-Beam (MB). Through a detailed numerical analysis, we provide a thorough comparison of the per-user achievable spectral efficiency of the aforementioned beamforming schemes, and we show that the proposed SLNR beamforming technique outperforms both the MMSE and ZF schemes in the presented SatCom scenario.
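
    For readers unfamiliar with the SLNR criterion, a minimal sketch of its well-known closed form against ZF and MMSE (regularized ZF) precoding is given below for a K-user downlink with an N-antenna transmitter. An i.i.d. Gaussian channel stands in for the LEO satellite channel, and the antenna count, user count, and noise power are assumptions; none of the numbers reproduce the paper's results.

        # Minimal sketch of the closed-form SLNR beamformer versus ZF and MMSE for a
        # K-user MU-MIMO downlink with an N-antenna transmitter. A random i.i.d. channel
        # stands in for the LEO satellite channel model used in the paper (assumption).
        import numpy as np

        rng = np.random.default_rng(0)
        N, K, noise = 16, 4, 1.0                     # assumed antennas, users, noise power
        H = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(2)  # row k = h_k^H

        def normalize(W):
            return W / np.linalg.norm(W, axis=0, keepdims=True)

        # ZF: pseudo-inverse of the channel matrix.
        W_zf = normalize(np.linalg.pinv(H))
        # MMSE (regularized ZF): H^H (H H^H + noise*K*I)^{-1}.
        W_mmse = normalize(H.conj().T @ np.linalg.inv(H @ H.conj().T + noise * K * np.eye(K)))
        # SLNR closed form (per-user noise normalization assumed):
        # w_k proportional to (noise*I + sum_{j!=k} h_j h_j^H)^{-1} h_k.
        W_slnr = np.zeros((N, K), dtype=complex)
        for k in range(K):
            H_others = np.delete(H, k, axis=0)
            A = noise * np.eye(N) + H_others.conj().T @ H_others
            W_slnr[:, k] = np.linalg.solve(A, H[k].conj())
        W_slnr = normalize(W_slnr)

        def sum_spectral_efficiency(W, power=1.0):
            G = np.abs(H @ W) ** 2 * power           # |h_k^H w_j|^2 link gains
            sig = np.diag(G)
            interf = G.sum(axis=1) - sig
            return np.sum(np.log2(1 + sig / (interf + noise)))

        for name, W in [("ZF", W_zf), ("MMSE", W_mmse), ("SLNR", W_slnr)]:
            print(f"{name:5s} sum SE: {sum_spectral_efficiency(W):.2f} bit/s/Hz")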