
    Role of satellite communications in 5G ecosystem: perspectives and challenges

    The next generation of mobile radio communication systems – so-called 5G – will differ markedly from the generations to date. Coping with huge increases in data traffic at reduced latency and improved quality of user experience, together with a major reduction in energy usage, are key challenges. In addition, future systems will need to connect billions of objects – the so-called Internet of Things (IoT) – which raises new challenges. Visions of 5G are now available from regions across the world, and research towards new standards is ongoing. The consensus is a flatter architecture that adds a dense network of small cells operating in the millimetre-wave bands, adaptable and software controlled. But what is the place for satellites in such a vision? The chapter examines several potential roles for satellites in 5G, including coverage extension, IoT, resilience, content caching and multicast, and the integrated architecture. Furthermore, recent advances in satellite communications, together with the challenges associated with using satellites in an integrated satellite-terrestrial architecture, are also discussed.

    Space-Based Information Infrastructure Architecture for Broadband Services

    This study addressed four tasks: (1) identify satellite-addressable information infrastructure markets; (2) perform network analysis for space-based information infrastructure; (3) develop conceptual architectures; and (4) assess the economics of those architectures. The report concludes that satellites will play a major role in the national and global information infrastructure, requiring seamless integration between terrestrial and satellite networks. The proposed LEO, MEO, and GEO satellite systems vary widely in their characteristics, which include delay, delay variation, poorer link quality, and beam/satellite handover. The barriers to seamless interoperability between satellite and terrestrial networks are discussed: a lack of compatible parameters, standards, and protocols, which are presently being evaluated and reduced.

    From the oceans to the cloud: Opportunities and challenges for data, models, computation and workflows.

    © The Author(s), 2019. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published as Vance, T. C., Wengren, M., Burger, E., Hernandez, D., Kearns, T., Medina-Lopez, E., Merati, N., O'Brien, K., O'Neil, J., Potemra, J. T., Signell, R. P., & Wilcox, K. From the oceans to the cloud: Opportunities and challenges for data, models, computation and workflows. Frontiers in Marine Science, 6(211), (2019), doi:10.3389/fmars.2019.00211.
    Advances in ocean observations and models mean increasing flows of data. Integrating observations between disciplines over spatial scales from regional to global presents challenges. Running ocean models and managing the results is computationally demanding. The rise of cloud computing presents an opportunity to rethink traditional approaches. This includes developing shared data processing workflows utilizing common, adaptable software to handle data ingest and storage, and an associated framework to manage and execute downstream modeling. Working in the cloud presents challenges: migration of legacy technologies and processes, cloud-to-cloud interoperability, and the translation of legislative and bureaucratic requirements for “on-premises” systems to the cloud. To respond to the scientific and societal needs of a fit-for-purpose ocean observing system, and to maximize the benefits of more integrated observing, research on utilizing cloud infrastructures for sharing data and models is underway. Cloud platforms and the services/APIs they provide offer new ways for scientists to observe and predict the ocean’s state. High-performance mass storage of observational data, coupled with on-demand computing to run model simulations in close proximity to the data, tools to manage workflows, and a framework to share and collaborate, enables a more flexible and adaptable observation and prediction computing architecture.
Model outputs are stored in the cloud, and researchers either download subsets for their area of interest or feed them into their own simulations without leaving the cloud. Expanded storage and computing capabilities make it easier to create, analyze, and distribute products derived from long-term datasets. In this paper, we provide an introduction to cloud computing, describe current uses of the cloud for management and analysis of observational data and model results, and describe workflows for running models and streaming observational data. We discuss topics that must be considered when moving to the cloud: costs, security, and organizational limitations on cloud use. Future uses of the cloud via computational sandboxes, and the practicalities and considerations of using the cloud to archive data, are explored. We also consider the ways in which the human elements of ocean observations are changing – the rise of a generation of researchers whose observations are likely to be made remotely rather than hands-on – and how their expectations and needs drive research towards the cloud. In conclusion, visions of a future where cloud computing is ubiquitous are discussed. This is PMEL contribution 4873.
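The subset-in-place workflow described in this abstract – extracting only a region of interest from cloud-hosted model output instead of downloading the full dataset – can be sketched as follows. This is a minimal illustration in plain Python under assumed conventions; the grid layout, variable names, and the `subset_bbox` helper are hypothetical and not taken from the paper.

```python
# Minimal sketch of subsetting gridded model output by a lat/lon bounding
# box, keeping only the region of interest. In a real cloud workflow this
# selection would run next to the data; the grid here is a toy stand-in.

def subset_bbox(field, lats, lons, lat_min, lat_max, lon_min, lon_max):
    """Return the rows/columns of a 2-D field inside a bounding box.

    field      : list of rows, indexed [lat][lon]
    lats, lons : coordinate values for each row / column
    """
    lat_idx = [i for i, v in enumerate(lats) if lat_min <= v <= lat_max]
    lon_idx = [j for j, v in enumerate(lons) if lon_min <= v <= lon_max]
    return [[field[i][j] for j in lon_idx] for i in lat_idx]

# Toy 4x4 sea-surface-temperature grid (degrees C)
lats = [10.0, 20.0, 30.0, 40.0]
lons = [-150.0, -140.0, -130.0, -120.0]
sst = [[15 + 0.1 * i + 0.01 * j for j in range(4)] for i in range(4)]

# Extract only the 20-30 N, 140-130 W corner instead of moving the full grid
region = subset_bbox(sst, lats, lons, 20.0, 30.0, -140.0, -130.0)
print(region)  # a 2x2 block of the original grid
```

In practice the same index-then-slice pattern is what libraries built for cloud-hosted arrays perform server-side, so only the selected block crosses the network.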

    Rapid Response Command and Control (R2C2): a systems engineering analysis of scaleable communications for Regional Combatant Commanders

    Includes supplementary material. Disaster relief operations, such as those following the 2005 tsunami and Hurricane Katrina, and wartime operations, such as Operation Enduring Freedom and Operation Iraqi Freedom, have identified the need for a standardized command and control system interoperable among Joint, Coalition, and Interagency entities. The Systems Engineering Analysis Cohort 9 (SEA-9) Rapid Response Command and Control (R2C2) integrated project team completed a systems engineering (SE) process to address the military’s command and control capability gap. During the process, the R2C2 team conducted mission analysis, generated requirements, developed and modeled architectures, and analyzed and compared current operational systems against the team’s R2C2 system. The R2C2 system provided a reachback capability to the Regional Combatant Commander’s (RCC) headquarters, a local communications network for situational assessments, and Internet access for civilian counterparts participating in Humanitarian Assistance/Disaster Relief operations. Because the team designed the R2C2 system to be modular, the analysis concluded that the R2C2 system was the preferred means of providing the RCC with the flexibility and scalability required to deliver a rapidly deployable command and control capability across the range of military operations.

    Dutkat: A Privacy-Preserving System for Automatic Catch Documentation and Illegal Activity Detection in the Fishing Industry

    United Nations' Sustainable Development Goal 14 aims to conserve and sustainably use the oceans and their resources for the benefit of people and the planet. This includes protecting marine ecosystems, preventing pollution and overfishing, and increasing scientific understanding of the oceans. Achieving this goal will help ensure the health and well-being of marine life and of the millions of people who rely on the oceans for their livelihoods. Ensuring sustainable fishing practices requires a system for automatic catch documentation. This thesis presents our research on the design and development of Dutkat, a privacy-preserving, edge-based system for catch documentation and detection of illegal activities in the fishing industry. Utilising machine learning techniques, Dutkat can analyse large amounts of data and identify patterns that may indicate illegal activities such as overfishing or illegal discard of catch. Additionally, the system can assist in catch documentation by automating the identification and counting of fish species, reducing potential human error and increasing efficiency. Specifically, our research has consisted of developing various components of the Dutkat system, evaluating them through experimentation, exploring existing data, and organising machine learning competitions. We have also followed a compliance-by-design approach to ensure that the system complies with data protection laws and regulations such as the GDPR. Our goal with Dutkat is to promote sustainable fishing practices, in line with Sustainable Development Goal 14, while simultaneously protecting the privacy and rights of fishing crews.
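The kind of pattern flagging described in this abstract can be illustrated with a deliberately simple baseline. This is not Dutkat's actual model – the thesis does not specify this method – but a hedged sketch of one way anomalous catch reports might be surfaced for closer inspection; the data and function name are invented.

```python
import statistics

def flag_anomalous_catches(volumes_kg, threshold=3.5):
    """Flag entries far from the median of a vessel's catch series,
    using the modified z-score based on the median absolute deviation
    (MAD), which stays robust to the very outliers it is hunting."""
    med = statistics.median(volumes_kg)
    mad = statistics.median(abs(v - med) for v in volumes_kg)
    if mad == 0:
        return []  # series is (nearly) constant; nothing to flag
    return [i for i, v in enumerate(volumes_kg)
            if 0.6745 * abs(v - med) / mad > threshold]

# Daily catch volumes for one vessel: day 5 is suspiciously low (possible
# unreported discard) and day 9 suspiciously high (possible overfishing).
catches = [820, 790, 805, 830, 810, 120, 800, 815, 795, 2400]
print(flag_anomalous_catches(catches))  # -> [5, 9]
```

A production system would of course combine many signals (position tracks, species counts from imagery, gear sensors) rather than a single volume series, but the flag-and-review structure is the same.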

    Multi-attribute tradespace exploration for survivability

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2009. Cataloged from PDF version of thesis. Includes bibliographical references (p. 235-249).
    Survivability is the ability of a system to minimize the impact of a finite-duration disturbance on value delivery (i.e., stakeholder benefit at cost), achieved through (1) the reduction of the likelihood or magnitude of a disturbance, (2) the satisfaction of a minimally acceptable level of value delivery during and after a disturbance, and/or (3) a timely recovery. Traditionally specified as a requirement in military systems, survivability is an increasingly important consideration for all engineering systems given the proliferation of natural and artificial threats. Although survivability is an emergent system property that arises from interactions between a system and its environment, conventional approaches to survivability engineering are reductionist in nature. Furthermore, current methods neither accommodate dynamic threat environments nor facilitate stakeholder communication for conducting trade-offs among system lifecycle cost, mission utility, and operational survivability. Multi-Attribute Tradespace Exploration (MATE) for Survivability is introduced as a system analysis methodology to improve the generation and evaluation of survivable alternatives during conceptual design. MATE for Survivability applies decision theory to the parametric modeling of thousands of design alternatives across representative distributions of disturbance environments. To improve the generation of survivable alternatives, seventeen empirically-validated survivability design principles are introduced. The general set of design principles allows the consideration of structural and behavioral strategies for mitigating the impact of disturbances over the lifecycle of a given encounter.
To improve the evaluation of survivability, value-based metrics are introduced for the assessment of survivability as a dynamic, continuous, and path-dependent system property. Two of these metrics, time-weighted average utility loss and threshold availability, are used to evaluate survivability based on the relationship between stochastic utility trajectories of system state and stakeholder expectations across nominal and perturbed environments. Finally, the survivability "tear(drop)" tradespace is introduced to enable the identification of inherently survivable architectures that efficiently balance performance metrics of cost, utility, and survivability. The internal validity and prescriptive value of the design principles, metrics, and tradespaces comprising MATE for Survivability are established through applications to the designs of an orbital transfer vehicle and a satellite radar system. By Matthew G. Richards. Ph.D.
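The two named metrics can be made concrete with a small numerical example. The formulation below is one plausible reading of "time-weighted average utility loss" and "threshold availability" for a sampled utility trajectory; the exact definitions in the thesis may differ, and the trajectory data are invented.

```python
def time_weighted_avg_utility_loss(times, utilities, expected=1.0):
    """Average shortfall (expected - u(t)) over the timeline,
    integrated with the trapezoidal rule and normalized by duration."""
    loss = 0.0
    for (t0, u0), (t1, u1) in zip(zip(times, utilities),
                                  zip(times[1:], utilities[1:])):
        loss += 0.5 * ((expected - u0) + (expected - u1)) * (t1 - t0)
    return loss / (times[-1] - times[0])

def threshold_availability(times, utilities, threshold):
    """Fraction of total time during which utility stays at or above
    the stakeholder's threshold (piecewise-constant approximation,
    using each interval's starting utility)."""
    total = times[-1] - times[0]
    ok = sum(t1 - t0 for t0, t1, u in zip(times, times[1:], utilities)
             if u >= threshold)
    return ok / total

# A disturbance at t=2 drops utility to 0.4; recovery completes by t=8.
times = [0, 2, 4, 6, 8, 10]
utility = [1.0, 1.0, 0.4, 0.9, 1.0, 1.0]
print(time_weighted_avg_utility_loss(times, utility))   # approx. 0.14
print(threshold_availability(times, utility, 0.8))      # approx. 0.8
```

Evaluating these two numbers across thousands of simulated disturbance encounters is what lets a tradespace plot cost, utility, and survivability for each candidate design.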

    Towards 6G Through SDN and NFV-Based Solutions for Terrestrial and Non-Terrestrial Networks

    As societal needs continue to evolve, there has been a marked rise in a wide variety of emerging use cases that cannot be served adequately by existing networks. For example, increasing industrial automation has not only resulted in a massive rise in the number of connected devices, but has also brought forth the need for remote monitoring and reconnaissance at scale, often in remote locations characterized by a lack of connectivity options. Going beyond 5G, which has largely focused on enhancing the quality-of-experience for end devices, the next generation of wireless communications is expected to be centered around the idea of "wireless ubiquity". The concept of wireless ubiquity mandates that the quality of connectivity is not only determined by classical metrics such as throughput, reliability, and latency, but also by the level of coverage offered by the network. In other words, the upcoming sixth generation of wireless communications should be characterized by networks that exhibit high throughput and reliability with low latency, while also providing robust connectivity to a multitude of devices spread across the surface of the Earth, without any geographical constraints. The objective of this PhD thesis is to design novel architectural solutions for the upcoming sixth generation of cellular and space communications systems with a view to enabling wireless ubiquity with software-defined networking and network function virtualization at its core. Towards this goal, this thesis introduces a novel end-to-end system architecture for cellular communications characterized by innovations such as the AirHYPE wireless hypervisor. Furthermore, within the cellular systems domain, solutions for radio access network design with software-defined mobility management, and containerized core network design optimization have also been presented. On the other hand, within the space systems domain, this thesis introduces the concept of the Internet of Space Things (IoST). 
IoST is a novel cyber-physical system centered on nanosatellites and is capable of delivering ubiquitous connectivity for a wide variety of use cases, ranging from monitoring and reconnaissance to in-space backhauling. In this direction, contributions relating to constellation design, routing, and automatic network slicing form a key aspect of this thesis. Ph.D.
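As one concrete flavor of the constellation-design problem mentioned above, a first-order sizing estimate computes each satellite's coverage footprint from its altitude and a minimum elevation angle, then counts how many satellites a single circular orbital plane needs for a continuous "street of coverage". This standard geometric bound is an illustrative sketch, not IoST's actual design method; the altitude and elevation figures are assumptions.

```python
import math

EARTH_RADIUS_KM = 6371.0

def footprint_half_angle(altitude_km, min_elevation_deg):
    """Earth-central half-angle (radians) of a satellite's coverage
    circle, from standard spherical geometry: a ground user sees the
    satellite at or above the minimum elevation within this angle."""
    eps = math.radians(min_elevation_deg)
    ratio = EARTH_RADIUS_KM / (EARTH_RADIUS_KM + altitude_km)
    return math.acos(ratio * math.cos(eps)) - eps

def sats_per_plane(altitude_km, min_elevation_deg):
    """Satellites needed in one circular plane so adjacent footprints
    touch or overlap (simple street-of-coverage bound)."""
    lam = footprint_half_angle(altitude_km, min_elevation_deg)
    return math.ceil(math.pi / lam)

# A 500 km nanosatellite orbit with a 10-degree minimum elevation angle
print(sats_per_plane(500.0, 10.0))
```

Real constellation design must also choose the number of planes, their inclination and phasing, and trade coverage against launch and inter-satellite-link costs, which is where the optimization contributions of such theses come in.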