
    Amazon Web Services (AWS) Cloud Platform for Satellite Data Processing

    As part of NOAA’s Environmental Satellite Processing and Distribution System (ESPDS) program, Solers created a cloud platform for satellite data management and processing. It consists of Enterprise Data Management (EDM) and Enterprise Product Generation (EPG) services, hosted in an Amazon Web Services (AWS) cloud environment, leveraging AWS cloud services and existing NOAA product generation algorithms. While this cloud platform was developed in the context of NOAA/NESDIS satellite data management and processing requirements, it also has tremendous applicability and cost effectiveness for small satellite data management and processing needs. An attractive method for ingesting data from small satellites is AWS Ground Station, which can help small satellite operators avoid the real-estate, hardware/software, and labor costs of deploying and operating their own ground stations. The data is ingested via AWS-managed antennas and made available for further processing in the AWS cloud using COTS RF/baseband-over-IP transport services. Once this data has been ingested and made available, the flexible REST APIs of the EDM and EPG services in the AWS cloud make it easy and cost-effective for small satellite operators to catalog and process the data into consumable products and make them available to end users.
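
    As a rough illustration of how the cataloging and product-generation workflow described above might be driven through REST, the sketch below posts a hypothetical granule record to an EDM-style endpoint and then requests processing from an EPG-style endpoint. The endpoint paths, payload fields, and authentication scheme are illustrative assumptions, not the documented ESPDS interfaces.

```python
# Hypothetical sketch: cataloging a granule ingested via AWS Ground Station
# with an EDM-style REST API, then requesting product generation from EPG.
# Hosts, paths, field names, and the auth scheme are illustrative assumptions.
import json
import urllib.request

EDM_BASE = "https://edm.example.com/api/v1"   # placeholder host
EPG_BASE = "https://epg.example.com/api/v1"   # placeholder host

def post_json(url, payload, token):
    """POST a JSON payload with a bearer token and return the parsed response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

def catalog_and_process(s3_uri, token):
    # 1) Register (catalog) the raw downlinked data object with the EDM service.
    granule = post_json(f"{EDM_BASE}/granules", {
        "location": s3_uri,                  # e.g. an S3 object written by Ground Station
        "sensor": "example-smallsat-imager", # illustrative sensor name
        "receivedVia": "aws-ground-station",
    }, token)
    # 2) Ask the EPG service to run a product-generation algorithm on that granule.
    job = post_json(f"{EPG_BASE}/jobs", {
        "granuleId": granule["id"],
        "algorithm": "level1b-calibration",  # illustrative algorithm name
    }, token)
    return job["id"]
```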

    From 5G to 6G: Revolutionizing Satellite Networks through TRANTOR Foundation

    5G technology will drastically change the way satellite internet providers deliver services by offering higher data speeds, massive network capacity, reduced latency, improved reliability, and increased availability. A standardised 5G ecosystem will enable adapting 5G to satellite needs. The EU-funded TRANTOR project will seek to develop novel and secure satellite network management solutions that allow scaling up heterogeneous satellite traffic demands and capacities in a cost-effective and highly dynamic way. Researchers also target the development of flexible 6G non-terrestrial access architectures. The focus will be on the design of a multi-orbit and multi-band antenna for satellite user equipment (UE), as well as the development of gNodeB (gNB) and UE 5G non-terrestrial network equipment to support multi-connectivity.

    Environmental Sustainability In Lagos Periphery Housing

    Peri-urban residential settlements in Nigerian cities have grown phenomenally as a result of rapid urbanisation, but these spontaneous housing settlements are enmeshed in development-driven urban challenges. This study examines the environmental sustainability of a typical peri-urban settlement in Lagos, Ikorodu. A combination of case-study-based examination and application of the International Urban Sustainability Indicators List (IUSIL) framework was used to achieve an integrated research method. Data were collected through primary and secondary sources, including observation, structured questionnaires, interviews, and satellite images. Two-stage cluster sampling was used to select 384 household heads as the study sample. Quantitative data were analysed using descriptive statistics, while satellite image analysis was used for the qualitative data. Investigations were carried out on the state of infrastructure, locational quality, and commuting patterns. Findings show fair environmental sustainability, as evidenced by access to public water services and an effective transportation system. Residents were satisfied with access to fresh water, reduced reliance on automobiles, and an effective transportation system, which culminates in shorter commuting hours in the study area. Dissatisfaction was recorded over an inadequate drainage system and poor waste management, resulting in poor environmental quality. Also noted was deviation from the master plan: zoning was not adhered to, causing noise pollution due to the encroachment of manufacturing industries on residential areas. These findings can be a useful template for all stakeholders in enabling the sustainability of emerging settlements on the periphery of Lagos.

    MOSAiC goes O2A - Arctic Expedition Data Flow from Observations to Archives

    During the largest polar expedition in history, starting in September 2019, the German research icebreaker Polarstern spends a whole year drifting with the ice through the Arctic Ocean. The MOSAiC expedition takes the closest look ever at the Arctic, even throughout the polar winter, to gain fundamental insights and unique on-site data for a better understanding of global climate change. Hundreds of researchers from 20 countries are involved. Scientists will use the in situ gathered data instantaneously in near-real-time mode as well as long afterwards all around the globe, taking climate research to a completely new level. Hence, proper data management, sampling strategies prepared beforehand, and monitoring of the actual data flow, as well as processing, analysis, and sharing of data during and long after the MOSAiC expedition, are the most essential tools for scientific gain and progress. To prepare for that challenge, we adapted and integrated the research data management framework O2A “Data flow from Observations to Archives” to the needs of the MOSAiC expedition, both on board Polarstern and on land for data storage and access at the Alfred Wegener Institute Computing and Data Center in Bremerhaven, Germany. Our O2A framework assembles a modular research infrastructure comprising a collection of tools and services. These components allow researchers to register all necessary sensor metadata beforehand, linked to automated data ingestion, to ensure and monitor data flow, and to process, analyze, and publish data in order to turn the most valuable and uniquely gained Arctic data into scientific outcomes. The framework further allows for the integration of data obtained with discrete sampling devices into the data flow. These requirements have led us to adapt the generic and cost-effective framework O2A to enable, control, and access the flow of sensor observations to archives in a cloud-like infrastructure on board Polarstern and later on to land-based repositories for international availability. Major challenges for the MOSAiC-O2A data flow framework are (i) the increasing number and complexity of research platforms, devices, and sensors, (ii) the heterogeneous, interdisciplinary requirements regarding, e.g., satellite data, sensor monitoring, in situ sample collection, quality assessment and control, processing, analysis, and visualization, and (iii) the demand for near-real-time analyses on board as well as on land with limited satellite bandwidth. The key modules of O2A's digital research infrastructure, established by AWI, implement the FAIR principles:
    - SENSORWeb, to register sensor applications and sampling devices and to capture controlled metadata before and alongside any measurements in the field;
    - Data ingest, allowing researchers to feed data into storage systems and processing pipelines in a prepared and documented way, at best in controlled near-real-time data streams;
    - Dashboards, allowing researchers to find and access data and to share and collaborate among partners;
    - Workspace, enabling researchers to access and use data with research software on a cloud-based virtualized infrastructure that allows massive amounts of data to be analyzed on the spot;
    - Archiving and publishing of data via repositories and Digital Object Identifiers (DOIs).
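
    A minimal sketch of the core O2A principle that sensor metadata must be registered before any observation enters the ingest pipeline is given below; the class and field names are illustrative assumptions, not the actual SENSORWeb or ingest schema.

```python
# Minimal in-memory sketch of the idea "register sensor metadata first, then
# every ingested observation references that registration". Names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SensorRegistration:
    urn: str                      # e.g. "vessel:polarstern:ctd:sn1234" (made-up URN)
    parameters: list              # measured quantities with units
    metadata: dict = field(default_factory=dict)

@dataclass
class Observation:
    sensor_urn: str
    timestamp: str
    values: dict

class O2AFlow:
    def __init__(self):
        self.registry = {}        # stands in for a SENSORWeb-like registry
        self.archive = []         # stands in for ingest -> storage -> archive

    def register(self, registration: SensorRegistration):
        self.registry[registration.urn] = registration

    def ingest(self, obs: Observation):
        # Reject observations from unregistered sensors: metadata must exist
        # before data flows, which is the point this sketch illustrates.
        if obs.sensor_urn not in self.registry:
            raise ValueError(f"sensor {obs.sensor_urn} not registered")
        self.archive.append(obs)

flow = O2AFlow()
flow.register(SensorRegistration(
    urn="vessel:polarstern:ctd:sn1234",
    parameters=[{"name": "sea_water_temperature", "unit": "degC"}],
))
flow.ingest(Observation(
    sensor_urn="vessel:polarstern:ctd:sn1234",
    timestamp=datetime.now(timezone.utc).isoformat(),
    values={"sea_water_temperature": -1.7},
))
```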

    Radio Resource Management Satellite Communication Network MCDM Method

    Worldwide deployment of heterogeneous wireless networks is growing as a result of consumer demand for connectivity at all times and in all places. These customers' interest in multimedia apps like video streaming and VoIP, which demand tight Quality of Service (QoS) support, is growing at the same time. With such constraints, provisioning network resources is a difficult undertaking. In fact, it can be challenging for a network operator to identify trustworthy criteria to choose the optimum network that ensures user satisfaction while maximising network utilisation, given the availability of numerous access technologies (WiFi, WiMAX, or cellular networks). To solve this problem, in our proposal each eNB only needs to learn the traffic conditions or patterns of its own cell. Wireless communication systems depend heavily on radio resource management (RRM). To ensure the efficient and successful operation of wireless networks, it involves the allocation and control of radio frequency spectrum, power, and other resources. RRM is significant because it can use scarce radio resources as efficiently as possible, enhancing capacity, lowering interference, and improving service quality. Successful deployment and operation of wireless communication systems like cellular networks, Wi-Fi, and Bluetooth depends on effective RRM approaches. The need for wireless communication is growing, and new technologies and standards are constantly being developed. The methodology of RRM involves a variety of techniques and algorithms designed to allocate radio resources in a way that maximizes network performance while minimizing interference. The alternatives considered are laser communication, optical networks, satellite optical communication, vibrations, and satellite networks; the evaluation parameters are solar radiation power, thermal bending, micro-meteorite impact, solar and lunar gravity, and Earth oblateness. In the evaluation, satellite optical communication reached nearly 2000 in the data set, compared with the other alternatives. The operation of wireless communication networks depends on RRM: without effective RRM procedures, wireless networks would suffer interference, congestion, and a lacklustre level of service. RRM is therefore a key component in ensuring that wireless communication systems can provide users with dependable and high-quality services.
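
    To make the network-selection side of this concrete, the sketch below applies a simple weighted-sum multi-criteria scoring over candidate access networks. The candidate networks, criteria, and weights are illustrative assumptions, not the paper's data set or its specific MCDM method.

```python
# Hedged sketch of weighted-sum MCDM network selection. Criteria marked "max"
# are better when larger (throughput, availability); "min" criteria are better
# when smaller (delay, load). Values and weights are example numbers only.
CRITERIA = [
    ("throughput_mbps",  "max"),
    ("delay_ms",         "min"),
    ("load_pct",         "min"),
    ("availability_pct", "max"),
]
WEIGHTS = [0.4, 0.3, 0.2, 0.1]   # example operator preferences, sum to 1

def normalize(values, sense):
    """Min-max normalize a column so every criterion contributes on a 0..1 scale."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if sense == "max":
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def rank_networks(candidates):
    """candidates: dict name -> dict of criterion values. Returns best first."""
    names = list(candidates)
    scores = {n: 0.0 for n in names}
    for (crit, sense), w in zip(CRITERIA, WEIGHTS):
        column = [candidates[n][crit] for n in names]
        for n, norm in zip(names, normalize(column, sense)):
            scores[n] += w * norm
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_networks({
    "wifi":     {"throughput_mbps": 54, "delay_ms": 30, "load_pct": 70, "availability_pct": 90},
    "wimax":    {"throughput_mbps": 40, "delay_ms": 50, "load_pct": 40, "availability_pct": 95},
    "cellular": {"throughput_mbps": 20, "delay_ms": 80, "load_pct": 55, "availability_pct": 99},
})
print(ranking)   # highest weighted score first
```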

    Robo-line storage: Low latency, high capacity storage systems over geographically distributed networks

    Rapid advances in high performance computing are making possible more complete and accurate computer-based modeling of complex physical phenomena, such as weather front interactions, dynamics of chemical reactions, numerical aerodynamic analysis of airframes, and ocean-land-atmosphere interactions. Many of these 'grand challenge' applications are as demanding of the underlying storage system, in terms of their capacity and bandwidth requirements, as they are on the computational power of the processor. A global view of the Earth's ocean chlorophyll and land vegetation requires over 2 terabytes of raw satellite image data. In this paper, we describe our planned research program in high capacity, high bandwidth storage systems. The project has four overall goals. First, we will examine new methods for high capacity storage systems, made possible by low cost, small form factor magnetic and optical tape systems. Second, access to the storage system will be low latency and high bandwidth. To achieve this, we must interleave data transfer at all levels of the storage system, including devices, controllers, servers, and communications links. Latency will be reduced by extensive caching throughout the storage hierarchy. Third, we will provide effective management of a storage hierarchy, extending the techniques already developed for the Log Structured File System. Finally, we will construct a prototype high capacity file server, suitable for use on the National Research and Education Network (NREN). Such research must be a cornerstone of any coherent program in high performance computing and communications.
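
    The interleaving idea described above can be illustrated with a small striping sketch: a large object is split into fixed-size blocks that are read from several devices concurrently, so the aggregate bandwidth approaches the sum of the per-device rates. The device behavior and the stripe unit size are assumptions for the example, not parameters from the paper.

```python
# Illustrative striping sketch: round-robin blocks across devices and read them
# concurrently, then reassemble in the original order.
from concurrent.futures import ThreadPoolExecutor

BLOCK_SIZE = 4 * 1024 * 1024          # 4 MiB stripe unit (illustrative)

class FakeTapeDevice:
    """Toy device returning deterministic bytes; a placeholder for real hardware."""
    def read(self, offset, length):
        return bytes([offset % 256]) * length

def read_block(device, block_index):
    # Stand-in for a real device/controller read; returns (index, data).
    data = device.read(block_index * BLOCK_SIZE, BLOCK_SIZE)
    return block_index, data

def striped_read(devices, num_blocks):
    """Assign block i to device i mod N and issue the reads in parallel."""
    assignments = [(devices[i % len(devices)], i) for i in range(num_blocks)]
    blocks = [None] * num_blocks
    with ThreadPoolExecutor(max_workers=len(devices)) as pool:
        for index, data in pool.map(lambda a: read_block(*a), assignments):
            blocks[index] = data
    return b"".join(blocks)           # reassembled object

devices = [FakeTapeDevice(), FakeTapeDevice(), FakeTapeDevice()]
data = striped_read(devices, num_blocks=8)
print(len(data))                      # 8 blocks of 4 MiB each
```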

    Scheduling Design and Performance Analysis of Carrier Aggregation in Satellite Communication Systems

    Carrier Aggregation is one of the vital approaches to achieve several orders of magnitude increase in peak data rates. While carrier aggregation benefits have been extensively studied in cellular networks, its application to satellite systems has not been thoroughly explored yet. Carrier aggregation can offer an enhanced and more consistent quality of service for users throughout the satellite coverage via combining multiple carriers, utilizing the unused capacity at other carriers, and enabling effective interference management. Furthermore, carrier aggregation can be a prominent solution to address the issue of spatially heterogeneous satellite traffic demand. This paper investigates introducing carrier aggregation to satellite systems from a link layer perspective. Deployment of carrier aggregation in satellite systems with the combination of multiple carriers that have different characteristics requires effective scheduling schemes for reliable communications. To this end, a novel load balancing scheduling algorithm has been proposed to distribute data packets across the aggregated carriers based on channel capacities and to utilize spectrum efficiently. Moreover, in order to ensure that the received data packets are delivered without perturbing the original transmission order, a perceptive scheduling algorithm has been developed that takes into consideration channel properties along with the instantaneous available resources at the aggregated carriers. The proposed modifications have been carefully designed to make carrier aggregation transparent above the medium access control (MAC) layer. Additionally, the complexity analysis of the proposed algorithms has been conducted in terms of the computational loads. Simulation results are provided to validate our analysis, demonstrate the design tradeoffs, and highlight the potential of carrier aggregation applied to satellite communication systems.
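
    The load-balancing part of the scheduling problem can be sketched as capacity-proportional packet distribution, as below. This illustrates the general idea of assigning packets across aggregated carriers in proportion to their channel capacities; it is not the specific algorithm proposed in the paper, and the in-order delivery concern handled by the perceptive scheduler is out of scope for this sketch. Capacities and packet counts are example values.

```python
# Hedged sketch of capacity-proportional packet distribution across aggregated
# carriers, using a credit (deficit-style) weighted round-robin.
def distribute_packets(num_packets, carrier_capacities_mbps):
    """Assign packet indices to carriers in proportion to their capacities,
    preserving the original packet order within each carrier's queue."""
    total = sum(carrier_capacities_mbps)
    queues = [[] for _ in carrier_capacities_mbps]
    credit = [0.0] * len(carrier_capacities_mbps)
    for pkt in range(num_packets):
        # Grant each carrier credit proportional to its share of capacity,
        # then send the packet on the carrier with the most accumulated credit.
        for i, cap in enumerate(carrier_capacities_mbps):
            credit[i] += cap / total
        best = max(range(len(credit)), key=lambda i: credit[i])
        credit[best] -= 1.0
        queues[best].append(pkt)
    return queues

# Example: three aggregated carriers with asymmetric capacities (Mbps).
print(distribute_packets(10, [50.0, 30.0, 20.0]))   # roughly a 5/3/2 split
```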

    Measuring freshwater aquatic ecosystems: The need for a hyperspectral global mapping satellite mission

    Freshwater ecosystems underpin global water and food security, yet are some of the most endangered ecosystems in the world because they are particularly vulnerable to land management change and climate variability. The US National Research Council's guidance to NASA regarding missions for the coming decade includes a polar orbiting, global mapping hyperspectral satellite remote sensing mission, the Hyperspectral Infrared Imager (HyspIRI), to make quantitative measurements of ecosystem change. Traditionally, freshwater ecosystems have been challenging to measure with satellite remote sensing because they are small and spatially complex, require high fidelity spectroradiometry, and are best described with biophysical variables derived from high spectral resolution data. In this study, we evaluate the contribution of a hyperspectral global mapping satellite mission to measuring freshwater ecosystems. We demonstrate the need for such a mission, and evaluate the suitability and gaps, through an examination of the measurement resolution issues impacting freshwater ecosystem measurements (spatial, temporal, spectral and radiometric). These are exemplified through three case studies that use remote sensing to characterize a component of freshwater ecosystems that drive primary productivity. The high radiometric quality proposed for the HyspIRI mission makes it uniquely well designed for measuring freshwater ecosystems accurately at moderate to high spatial resolutions. The spatial and spectral resolutions of the HyspIRI mission are well suited for the retrieval of multiple biophysical variables, such as phycocyanin and chlorophyll-a. The effective temporal resolution is suitable for characterizing growing season wetland phenology in temperate regions, but may not be appropriate for tracking algal bloom dynamics, or ecosystem responses to extreme events in monsoonal regions. Global mapping missions provide the systematic, repeated measurements necessary to measure the drivers of freshwater biodiversity change. Archival global mapping missions with open access and free data policies increase end user uptake globally. Overall, an archival, hyperspectral global mapping mission uniquely meets the measurement requirements of multiple end users for freshwater ecosystem science and management.

    A Survey on Communication Networks for Electric System Automation

    Published in Computer Networks 50 (2006) 877–897, an Elsevier journal. The definitive version of this publication is available from Science Direct. Digital Object Identifier: 10.1016/j.comnet.2006.01.005. In today's competitive electric utility marketplace, reliable and real-time information becomes the key factor for reliable delivery of power to the end-users, profitability of the electric utility, and customer satisfaction. The operational and commercial demands of electric utilities require a high-performance data communication network that supports both existing functionalities and future operational requirements. In this respect, since such a communication network constitutes the core of electric system automation applications, the design of a cost-effective and reliable network architecture is crucial. In this paper, the opportunities and challenges of a hybrid network architecture are discussed for electric system automation. More specifically, Internet-based Virtual Private Networks, power line communications, satellite communications, and wireless communications (wireless sensor networks, WiMAX, and wireless mesh networks) are described in detail. The motivation of this paper is to provide a better understanding of the hybrid network architecture that can satisfy heterogeneous electric system automation application requirements. In this regard, our aim is to present a structured framework for electric utilities who plan to utilize new communication technologies for automation and hence to make the decision-making process more effective and direct. This work was supported by NEETRAC under Project #04-157.