    Quasisynchronous LoRa for LEO nanosatellite communications

    Perfect synchronization in LoRa communications between Low Earth Orbit (LEO) satellites and ground base stations remains challenging. Even when atomic clocks are incorporated in LEO satellites and their high precision is leveraged to enhance the overall synchronization process, perfect synchronization is infeasible due to a combination of factors such as signal propagation delay, Doppler effects, clock drift, and atmospheric effects. These challenges require the development of advanced synchronization techniques and algorithms to mitigate their effects and ensure reliable communication to and from LEO satellites. By maintaining acceptable levels of synchronization rather than striving for perfection, quasisynchronous (QS) communication can be adopted, which preserves communication reliability, improves resource utilization, reduces power consumption, and ensures scalability as more devices join the network. Overall, QS communication offers a practical, adaptive, and robust solution that enables LEO satellite communications to support the growing demands of IoT applications and global connectivity. In our investigation, we explore different chip waveforms such as rectangular and raised cosine. Furthermore, for the first time, we study the Symbol Error Rate (SER) performance of QS LoRa communication, for different spreading factors (SF), over Additive White Gaussian Noise (AWGN) channels.
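
    To make the SER study concrete, below is a minimal Python sketch of a conventional, perfectly synchronized LoRa (chirp spread spectrum) link simulated over an AWGN channel with rectangular chips at one sample per chip. It is not the paper's quasisynchronous scheme, which would additionally model timing offsets and the chosen chip waveform; the spreading factor, SNR, and symbol count are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    SF = 7                        # spreading factor (illustrative)
    N = 2 ** SF                   # chips per symbol
    snr_db = -10.0                # per-sample SNR in dB (illustrative)
    num_symbols = 20_000

    n = np.arange(N)
    base_chirp = np.exp(1j * 2 * np.pi * n ** 2 / N)   # up-chirp for symbol 0

    def lora_mod(s):
        """Discrete-time chirp-spread-spectrum symbol for index s in {0, ..., N-1}."""
        return np.exp(1j * 2 * np.pi * (s + n) * n / N)

    def lora_demod(r):
        """Dechirp with the base chirp, take the DFT; the peak bin is the decision."""
        return int(np.argmax(np.abs(np.fft.fft(r * np.conj(base_chirp)))))

    noise_std = np.sqrt(0.5 * 10 ** (-snr_db / 10))    # unit-energy chips assumed
    symbols = rng.integers(0, N, num_symbols)
    errors = 0
    for s in symbols:
        noise = noise_std * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        errors += lora_demod(lora_mod(s) + noise) != s

    print(f"SF={SF}, SNR={snr_db} dB -> SER ~ {errors / num_symbols:.4f}")
    ```

    Replacing the implicit rectangular chip with a raised-cosine pulse and introducing a fractional chip-timing offset would move this sketch toward the quasisynchronous setting studied in the paper.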

    FEMTOSAT-Based Air Quality Monitoring: Leveraging Satellite Data and LoRa Communication for Improved AQI Predictions

    The exponential growth of the global population has given rise to an alarming surge in air pollution levels, with profound repercussions for economies, ecosystems, and human well-being. The predominant method for assessing air quality involves deploying sensors atop buildings at regular intervals. However, this approach faces significant drawbacks, including the heightened energy consumption of each building's sensor infrastructure and its inapplicability in sparsely populated rural areas. Addressing these limitations, FEMTOSAT technology emerges as a solution, capitalizing on satellite data for autonomous air pollution monitoring, analysis, and mitigation. Amid the evolving scientific landscape, the integration of LoRa (Long Range), a low-power wide-area network (LPWAN) technology, assumes a pivotal role within the Internet of Things (IoT), facilitating long-range data communication with minimal power consumption. In this context, LoRa communication serves as the conduit for transmitting and receiving data from satellites via RF signals. The satellite-derived environmental data thus collected serves as the foundation for computing the Air Quality Index (AQI) at specific locations, a critical metric that indicates whether the air is clean or polluted. The AQI computation factors in various pollutants, including NO2, CO, O3, PM2.5, SO2, and PM10, all of which significantly influence air quality. This study employs a range of machine learning (ML) techniques, including time series analysis, linear regression, Support Vector Machines (SVM), and logistic regression, to predict and forecast AQI values. These models amalgamate AQI data from diverse sources, yielding robust and dependable AQI prediction models. Notably, modern sensor technology simplifies data collection and enhances its accuracy. In the realm of environmental data analysis, ML algorithms are well suited to processing vast datasets and generating precise and trustworthy predictions. The incorporation of integrated sensors as payload in the FEMTOSAT mission epitomizes the mission's objectives. The system is characterized by its cost-effectiveness, lightweight design, durability, redundancy, and user-friendly interface, requiring minimal power consumption for operation.
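
    For context, the usual AQI construction maps each pollutant concentration to a sub-index via a piecewise-linear breakpoint table and reports the maximum sub-index across pollutants. The Python sketch below illustrates that structure only; the breakpoint values and pollutant set are placeholders, not an official AQI standard or the tables used in this study.

    ```python
    # Illustrative breakpoints (conc_low, conc_high, aqi_low, aqi_high) per pollutant.
    # These values are placeholders, not an official AQI table.
    BREAKPOINTS = {
        "PM2.5": [(0, 12, 0, 50), (12, 35, 51, 100), (35, 55, 101, 150), (55, 150, 151, 200)],
        "PM10":  [(0, 54, 0, 50), (54, 154, 51, 100), (154, 254, 101, 150), (254, 354, 151, 200)],
    }

    def sub_index(pollutant, conc):
        """Piecewise-linear sub-index for one pollutant concentration."""
        for c_lo, c_hi, i_lo, i_hi in BREAKPOINTS[pollutant]:
            if c_lo <= conc <= c_hi:
                return i_lo + (i_hi - i_lo) * (conc - c_lo) / (c_hi - c_lo)
        return float("nan")          # concentration outside the tabulated range

    def aqi(readings):
        """Overall AQI is the maximum sub-index across the reported pollutants."""
        return max(sub_index(p, c) for p, c in readings.items())

    print(aqi({"PM2.5": 40.0, "PM10": 80.0}))   # dominated by the PM2.5 sub-index
    ```

    The ML models listed above would then be trained to predict or forecast this index from the satellite-derived pollutant readings.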

    Envisioning the Future Role of 3D Wireless Networks in Preventing and Managing Disasters and Emergency Situations

    In an era marked by unprecedented climatic upheavals and evolving urban landscapes, the role of advanced communication networks in disaster prevention and management is becoming increasingly critical. This paper explores the transformative potential of 3D wireless networks, an innovative amalgamation of terrestrial, aerial, and satellite technologies, in enhancing disaster response mechanisms. We delve into a myriad of use cases, ranging from large facility evacuations to wildfire management, underscoring the versatility of these networks in ensuring timely communication, real-time situational awareness, and efficient resource allocation during crises. We also present an overview of cutting-edge prototypes, highlighting the practical feasibility and operational efficacy of 3D wireless networks in real-world scenarios. Simultaneously, we acknowledge the challenges posed by cybersecurity, cross-border coordination, and physical-layer technological hurdles, and propose future directions for research and development in this domain.

    Network Size Estimation for Direct-to-Satellite IoT

    The worldwide adoption of the Internet of Things (IoT) depends on the massive deployment of sensor nodes and timely data collection. However, installing the required ground infrastructure in remote or inaccessible areas can be economically unattractive or unfeasible. Cost-effective nanosatellites deployed in low Earth orbits (LEO) are emerging as an alternative solution: on-board IoT gateways provide access to remote IoT devices, according to direct-to-satellite IoT (DtS-IoT) architectures. One of the main challenges of DtS-IoT is to devise communication protocols that scale to thousands of highly constrained devices served by likewise constrained orbiting gateways. In this paper, we tackle this issue by first estimating the (varying) size of the device set underneath the (mobile) nanosatellite footprint. Then, we demonstrate the applicability of the estimation when used to intelligently throttle DtS-IoT access protocols. Since recent works have shown that MAC protocols improve the throughput and energy efficiency of a DtS-IoT network when a network size estimation is available, we present here a novel and computationally efficient network size estimator for DtS-IoT: our optimistic collision information (OCI) based estimator. We evaluate OCI's effectiveness with extensive simulations of DtS-IoT scenarios. Results show that when using network size estimations, the scalability of a frame slotted Aloha-based DtS-IoT network is boosted 8-fold, serving up to 4 × 10³ devices, without energy efficiency penalties. We also show the effectiveness of the OCI mechanism given realistic detection ratios and demonstrate its low-computational-cost implementation, making it a strong candidate for network size estimation in DtS-IoT.
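
    The paper's OCI estimator is not reproduced here, but the sketch below shows the general idea behind collision-information-based network size estimation in frame slotted Aloha: observe the empty/success/collision outcomes of one frame and invert the expected-empty-slot relation E[empty] = L(1 - 1/L)^n to obtain an estimate of n. The frame length and device count are illustrative assumptions.

    ```python
    import math, random

    def simulate_frame(n_devices, n_slots, rng):
        """Each device transmits in one uniformly chosen slot; count slot outcomes."""
        counts = [0] * n_slots
        for _ in range(n_devices):
            counts[rng.randrange(n_slots)] += 1
        empty = sum(c == 0 for c in counts)
        success = sum(c == 1 for c in counts)
        collision = n_slots - empty - success
        return empty, success, collision

    def estimate_from_empties(empty, n_slots):
        """Invert E[empty] = L * (1 - 1/L)^n to estimate the number of contenders n."""
        if empty == 0:
            return float("inf")       # frame saturated; the estimate is unbounded
        return math.log(empty / n_slots) / math.log(1 - 1 / n_slots)

    rng = random.Random(1)
    L = 64                            # slots per frame (illustrative)
    true_n = 150                      # devices under the footprint (illustrative)
    e, s, c = simulate_frame(true_n, L, rng)
    print(f"empty={e} success={s} collision={c} -> estimated n ~ {estimate_from_empties(e, L):.0f}")
    ```

    Collision counts can be folded in similarly; the OCI estimator described in the abstract builds on this kind of collision information, though its exact form is not reproduced here.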

    Five Facets of 6G: Research Challenges and Opportunities

    Whilst the fifth-generation (5G) systems are being rolled out across the globe, researchers have turned their attention to the exploration of radical next-generation solutions. At this early evolutionary stage we survey five main research facets of this field, namely Facet 1: next-generation architectures, spectrum and services; Facet 2: next-generation networking; Facet 3: Internet of Things (IoT); Facet 4: wireless positioning and sensing; as well as Facet 5: applications of deep learning in 6G networks. In this paper, we provide a critical appraisal of the literature on promising techniques, ranging from the associated architectures and networking to applications and designs. We portray a plethora of heterogeneous architectures relying on cooperative hybrid networks supported by diverse access and transmission mechanisms. The vulnerabilities of these techniques are also addressed and carefully considered to highlight the most promising future research directions. Additionally, we list a rich suite of learning-driven optimization techniques. We conclude by observing the evolutionary paradigm shift that has taken place from pure single-component bandwidth-efficiency, power-efficiency or delay optimization towards multi-component designs, as exemplified by the twin-component ultra-reliable low-latency mode of the 5G system. We advocate a further evolutionary step towards multi-component Pareto optimization, which requires the exploration of the entire Pareto front of all optimal solutions, where none of the components of the objective function may be improved without degrading at least one of the other components.
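
    As a concrete illustration of the multi-component Pareto optimization advocated above, the sketch below filters a set of hypothetical design points, each scored on several objectives that are all to be minimized, down to its non-dominated Pareto front; the objectives and candidate values are placeholders, not results from the survey.

    ```python
    import random

    def dominates(a, b):
        """a dominates b if it is no worse in every objective and strictly better in one
        (all objectives are minimized)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        """Return the non-dominated subset of the candidate objective vectors."""
        return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

    rng = random.Random(0)
    # Hypothetical candidates scored on (power consumption, delay, 1/spectral-efficiency).
    candidates = [(rng.random(), rng.random(), rng.random()) for _ in range(200)]
    front = pareto_front(candidates)
    print(f"{len(front)} Pareto-optimal designs out of {len(candidates)} candidates")
    ```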

    A Survey on Non-Geostationary Satellite Systems: The Communication Perspective

    The next phase of satellite technology is being characterized by a new evolution in non-geostationary orbit (NGSO) satellites, which brings exciting new communication capabilities to provide non-terrestrial connectivity solutions and to support a wide range of digital technologies from various industries. NGSO communication systems are known for a number of key features, such as lower propagation delay, smaller size, and lower signal losses in comparison to conventional geostationary orbit (GSO) satellites, which can potentially enable latency-critical applications to be served through satellites. NGSO systems promise a substantial boost in communication speed and energy efficiency, thus tackling the main factors that have inhibited broader commercial utilization of GSO satellites. These promised improvements have motivated this paper to provide a comprehensive survey of state-of-the-art NGSO research focusing on the communication prospects, including physical layer and radio access technologies along with the networking aspects and the overall system features and architectures. Beyond this, many NGSO deployment challenges remain to be addressed to ensure seamless integration not only with GSO systems but also with terrestrial networks. These unprecedented challenges are also discussed in this paper, including coexistence with GSO systems in terms of spectrum access and regulatory issues, satellite constellation and architecture designs, resource management problems, and user equipment requirements. Finally, we outline a set of innovative research directions and new opportunities for future NGSO research.
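
    To put the propagation-delay advantage in perspective, the short sketch below computes the best-case (zenith) one-way delay for a representative LEO altitude and for GSO; the 550 km altitude is an illustrative choice, not a figure from the survey.

    ```python
    C_KM_S = 299_792.458                 # speed of light in km/s

    def one_way_delay_ms(altitude_km):
        """Zenith (best-case) one-way propagation delay to a satellite at the given altitude."""
        return altitude_km / C_KM_S * 1e3

    for name, alt_km in [("LEO shell at 550 km (illustrative)", 550),
                         ("GSO at 35,786 km", 35_786)]:
        print(f"{name}: {one_way_delay_ms(alt_km):.1f} ms one-way at zenith")
    ```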

    DESIGN MODULAR COMMAND AND DATA HANDLING SUBSYSTEM HARDWARE ARCHITECTURES

    Over the past few years, on-board computing systems for satellites have offered only a limited level of modularity. Modularity is the ability to reuse and reconstruct a system from a set of predesigned units with minimal additional engineering effort. Currently available Command and Data Handling Subsystem (CDHS) hardware has a limited ability to scale with mission needs. This thesis addresses the integration of the smaller form factor CDHS modules used for nanosatellites with the larger counterparts used for larger missions. In particular, the thesis discusses the interfacing between modular computer systems based on open standards commonly used in large spacecraft and the PC/104 standard used for nanosatellites. It also aims to create a set of layers that would represent a hardware library of COTS-like modules. First, a review of related and previous work was conducted to identify the gaps in previous studies and to better understand modular computer systems based on open standards commonly used in large spacecraft, such as cPCI Serial Space and SpaceVPX. Next, the design requirements were set to achieve the thesis objectives, which included conducting a pre-study of system alternatives before creating a modular CDHS hardware architecture that was later tested. Afterwards, hardware suitable for this architecture was chosen based on the specified requirements, and the PCB was designed according to global standards. Subsequently, several functional and communication tests were conducted to assess the practicality of the proposed architecture. Finally, thermal vacuum testing was performed on one of the architecture’s layers to test its ability to withstand the space environment, with the aim of performing vibration testing of the full modular architecture in the future. The aim of this thesis was achieved after going through several tests, comparing interfaces, and understanding the process of interfacing between different levels of the CDHS. The findings of this study pave the way for future research in the field and offer valuable insights that could contribute to the development of modular architectures for other satellite subsystems.

    Futurecasting ecological research: the rise of technoecology

    Increasingly complex research questions and global challenges (e.g., climate change and biodiversity loss) are driving rapid development, refinement, and uses of technology in ecology. This trend is spawning a distinct sub‐discipline, here termed “technoecology.” We highlight recent ground‐breaking and transformative technological advances for studying species and environments: bio‐batteries, low‐power and long‐range telemetry, the Internet of things, swarm theory, 3D printing, mapping molecular movement, and low‐power computers. These technologies have the potential to revolutionize ecology by providing “next‐generation” ecological data, particularly when integrated with each other, and in doing so could be applied to address a diverse range of requirements (e.g., pest and wildlife management, informing environmental policy and decision making). Critical to technoecology's rate of advancement and uptake by ecologists and environmental managers will be fostering increased interdisciplinary collaboration. Ideally, such partnerships will span the conception, implementation, and enhancement phases of ideas, bridging the university, public, and private sectors.