
    Application of ESE Data and Tools to Air Quality Management: Services for Helping the Air Quality Community use ESE Data (SHAirED)

    The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructure were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.
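The service-chaining pattern described above can be sketched as follows; the service names, station identifiers and readings are hypothetical stand-ins for DataFed's actual web services, emulated here as plain functions so the chaining idea is visible:

```python
# Illustrative sketch of service-oriented chaining, in the spirit of the
# DataFed approach described above. All names and values are hypothetical;
# real DataFed services are web endpoints, not local functions.

def access_service(station_ids):
    """Stand-in for a data-access web service returning raw PM2.5 readings."""
    raw = {"S1": [12.0, 35.5, 18.2], "S2": [9.1, 40.3]}
    return {sid: raw[sid] for sid in station_ids if sid in raw}

def filter_service(readings, threshold):
    """Stand-in for a processing service keeping exceedances only."""
    return {sid: [v for v in vals if v > threshold]
            for sid, vals in readings.items()}

def report_service(readings):
    """Stand-in for a presentation service counting exceedances per station."""
    return {sid: len(vals) for sid, vals in readings.items()}

# Chain the services into a small "application", as an SOA workflow would.
report = report_service(filter_service(access_service(["S1", "S2"]), 30.0))
print(report)  # {'S1': 1, 'S2': 1}
```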

    CID Survey Report Satellite Imagery and Associated Services used by the JRC. Current Status and Future Needs

    The Agriculture and Fisheries Unit (IPSC), together with the Informatics, Networks and Library Unit (ISD), has performed this inventory, called the Community Image Data portal Survey (the CID Survey); 20 Actions from 4 different Institutes (ISD, IPSC, IES, and IHCP) were interviewed. The objectives of the survey were to make an inventory of existing satellite data and future requirements; to obtain an overview of how data is acquired, used and stored; to quantify the human and financial resources engaged in this process; to quantify storage needs; and to query the staff involved in image acquisition and management on their needs and ideas for improvements, in view of defining a single JRC portal through which imaging requests could be addressed. Within the JRC there are (including 2006) more than 700 000 low resolution (LR) and 50 000 medium resolution (MR) images, with time series as far back as 1981 for the LR data. There are more than 10 000 high resolution (HR) images and over 500 000 km2 of very high resolution (VHR) images. For the LR and MR data, cyclic global or continental coverage dominates, while the majority of HR and VHR data is acquired over Europe. The expected data purchases for the near future (2007, 2008) are known, which enables good planning. Most purchases of VHR and HR data are made using the established FCs with common licensing terms. Otherwise, multiple types of licensing govern data usage, which emphasizes the need for CID to establish adequate means of data access. The total amount of image data stored (2006 inclusive) is 55 TB, with an expected increase of 80% in 2 years. Most of the image data is stored on internal network storage inside the corporate network, which implies that the data is accessible from the JRC, but difficulties arise when access is to be made by external users via the Internet.
In principle, current storage capacity at the JRC could be sufficient, but the available space is fragmented between Actions, which implies that a storage deficit could arise. One solution to this issue is the sharing of a central storage service. Data reception is dominated by FTP transfer, which requires reliable and fast Internet bandwidth. The high total volume for backup requires a thorough definition of the backup strategy. The user groups at the JRC are heterogeneous, which places requirements on CID to provide flexible authentication mechanisms. There is a requirement for a detailed analysis of all metadata standards needed for reference in a catalogue. There is priority interest in such a Catalogue Service and also in centralized storage. The services to implement for data hosted on central storage should be WCS, WMS, and file system access. During the analysis of the results mentioned above, some major areas could be identified as a base for common services to be provided to interested Actions, such as: provision of a centralized data storage facility with file serving functionality including an authentication service, image catalogue services, and data visualization and dissemination services. Specialized data services that require highly customized functionality with respect to certain properties of the different image types will usually remain the sole responsibility of the individual Actions. An orthorectification service for semi-automated orthorectification of HR and VHR data will be provided to certain Actions. At the end of the report some priorities and an implementation schedule for the Community Image Data portal (CID) are given.
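As an illustration of the WMS visualization service named above, the following sketch builds an OGC WMS 1.3.0 GetMap request; the endpoint URL and layer name are hypothetical, not actual JRC identifiers:

```python
from urllib.parse import urlencode

# Sketch of an OGC WMS 1.3.0 GetMap request such as a CID visualization
# service might answer. Endpoint and layer name are hypothetical.
def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # WMS 1.3.0 with EPSG:4326 uses lat,lon axis order.
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://cid.example.org/wms", "vhr_mosaic",
                     (34.0, -6.0, 72.0, 40.0))
print(url)
```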

    Big Data Analytics for Earth Sciences: the EarthServer approach

    Big Data Analytics is an emerging field, since massive storage and computing capabilities have been made available by advanced e-infrastructures. Earth and environmental sciences are likely to benefit from Big Data Analytics techniques supporting the processing of the large number of Earth Observation datasets currently acquired and generated through observations and simulations. However, Earth science data and applications present specificities in terms of the relevance of geospatial information, the wide heterogeneity of data models and formats, and the complexity of processing. Therefore, Big Earth Data Analytics requires specifically tailored techniques and tools. The EarthServer Big Earth Data Analytics engine offers a solution for coverage-type datasets, built around a high performance array database technology and the adoption and enhancement of standards for service interaction (OGC WCS and WCPS). The EarthServer solution, guided by requirements collected from scientific communities and international initiatives, provides a holistic approach that ranges from query languages and scalability up to mobile access and visualization. The result is demonstrated and validated through the development of lighthouse applications in the Marine, Geology, Atmospheric, Planetary and Cryospheric science domains.
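A minimal sketch of the kind of WCPS request an engine like EarthServer's evaluates is shown below; the endpoint and coverage name are hypothetical assumptions, not actual EarthServer identifiers. WCPS queries are typically submitted to a WCS endpoint as a ProcessCoverages request:

```python
from urllib.parse import urlencode

# Hypothetical WCPS query: shift a temperature coverage from Celsius to
# Kelvin server-side and encode the result as GeoTIFF.
query = (
    'for c in (mean_summer_airtemp) '
    'return encode(c + 273.15, "image/tiff")'
)

# Wrap the query in a WCS 2.0.1 ProcessCoverages request (endpoint assumed).
url = "https://earthserver.example.org/rasdaman/ows?" + urlencode({
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": query,
})
print(url)
```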

    Earth Observation Open Science and Innovation

    geospatial analytics; social observatory; big earth data; open data; citizen science; open innovation; earth system science; crowdsourced geospatial data; science in society; data science

    Evolution of the Earth Observing System (EOS) Data and Information System (EOSDIS)

    One of the strategic goals of the U.S. National Aeronautics and Space Administration (NASA) is to "Develop a balanced overall program of science, exploration, and aeronautics consistent with the redirection of the human spaceflight program to focus on exploration". An important sub-goal of this goal is to "Study Earth from space to advance scientific understanding and meet societal needs." NASA meets this sub-goal in partnership with other U.S. agencies and international organizations through its Earth science program. A major component of NASA's Earth science program is the Earth Observing System (EOS). The EOS program was started in 1990 with the primary purpose of modeling global climate change. This program consists of a set of space-borne instruments, science teams, and a data system. The instruments are designed to obtain highly accurate, frequent and global measurements of geophysical properties of land, oceans and atmosphere. The science teams are responsible for designing the instruments as well as the scientific algorithms to derive information from the instrument measurements. The data system, called the EOS Data and Information System (EOSDIS), produces data products using those algorithms, and archives and distributes such products. The first of the EOS instruments were launched in November 1997 on the Japanese satellite called the Tropical Rainfall Measuring Mission (TRMM), and the last, on the U.S. satellite Aura, were launched in July 2004. The instrument science teams have been active since the inception of the program in 1990 and have participation from Brazil, Canada, France, Japan, the Netherlands, the United Kingdom and the U.S. The development of EOSDIS was initiated in 1990, and this data system has been serving the user community since 1994. The purpose of this chapter is to discuss the history and evolution of EOSDIS from its beginnings to the present, and to indicate how it continues to evolve into the future. This chapter is organized as follows.
Sect. 7.2 provides a discussion of EOSDIS, its elements and their functions. Sect. 7.3 provides details regarding the move towards more distributed systems for supporting both the core and community needs to be served by NASA Earth science data systems. Sect. 7.4 discusses the use of standards and interfaces and their importance in EOSDIS. Sect. 7.5 provides details about the EOSDIS Evolution Study. Sect. 7.6 presents the implementation of the EOSDIS Evolution plan. Sect. 7.7 briefly outlines the progress that the implementation has made towards the 2015 Vision, followed by a summary in Sect. 7.8.

    Grid Enabled Geospatial Catalogue Web Service

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas, over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services in a Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service for the Web (CSW), and draws on the geospatial data metadata standards ISO 19115, FGDC and NASA EOS Core System, and the service metadata standard ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation in a Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also lets researchers focus on science, and not on issues of computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.
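For illustration, a discovery query against a CSW-style catalogue such as GCWS builds on could be formed as below; the endpoint and keyword are hypothetical, and this sketches only the standard CSW 2.0.2 KVP binding, not GCWS's Grid-secured extensions:

```python
from urllib.parse import urlencode

# Sketch of an OGC CSW 2.0.2 GetRecords discovery request of the kind a
# catalogue service answers. The endpoint and search term are hypothetical.
def csw_getrecords_url(base_url, keyword, max_records=10):
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "maxRecords": str(max_records),
        "constraintLanguage": "CQL_TEXT",
        # Full-text constraint on the catalogue's AnyText queryable.
        "constraint": f"AnyText LIKE '%{keyword}%'",
    }
    return base_url + "?" + urlencode(params)

url = csw_getrecords_url("https://gcws.example.org/csw", "MODIS")
print(url)
```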

    Developing Data Extraction and Dynamic Data Visualization (Styling) Modules for Web GIS Risk Assessment System (WGRAS)

    Interactive web-GIS tools play an important role in disaster risk assessment, which ultimately results in reduced unexpected damage and cost and saves lives. Disaster management practitioners benefit greatly from information at their disposal about locations where incidents are imminent, helping them anticipate the impact, project possible outcomes, and organize a proper response. It is also important to note that accurate and timely information is critical for coherent coordination in response to disasters. All the above can be achieved through proper data collection combined with computer-assisted modelling, analysis, production and timely dissemination of spatial information. This Master’s thesis aims to extend features of the Web GIS for Risk Assessment (WGRAS) project conducted at the Department of Physical Geography and Ecosystem Science at Lund University. The work includes the development of tools for geospatial data acquisition and extraction from freely available, open, non-commercial external sources, and dynamic, user-oriented map visualization allowing user-defined symbolization and coloring, resulting in flexible visual portrayal of geospatial data in the web environment. In this regard, solutions are based on open source software and open data, and the implementation strictly complies with open web standard protocols and web services. As a result, WGRAS is furnished with easy, user-driven raw geospatial data extracts for an area of interest from OpenStreetMap (OSM). Extracted data is automatically stored for later use in different spatial modelling and analysis. The second major contribution of this thesis addresses visualization of geographic information through a map server, where maps are otherwise generated with a pre-defined style that limits the user's visual needs.
The visualization module enables dynamic definition of style (symbolization and coloring), which assists non-GIS experts in producing instant and meaningful presentation of maps to the end user. The implementation provides an intuitive user interface which allows extracts for an area by location name(s) or by an area defined by two latitude and two longitude values. As a recommendation, visualization of geographic data in the web environment should be examined further; in particular, the map servers in use should integrate powerful and meaningful dynamic styling on top of existing pre-defined styles. In conclusion, this thesis adds value to disaster management and analysis in terms of easy provision of geographic data and enabling clear dissection of disaster-prone areas using an effective visualization mechanism.
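The dynamic-styling idea above can be sketched as generating an OGC SLD document from the user's choices, which map servers such as GeoServer accept alongside a WMS request; the layer name, attribute-free rule and template here are a minimal hypothetical example, not the thesis's actual implementation:

```python
# Minimal SLD 1.0.0 document built from a user-chosen fill colour.
# Layer name and styling rule are hypothetical.
SLD_TEMPLATE = """<StyledLayerDescriptor version="1.0.0"
    xmlns="http://www.opengis.net/sld">
  <NamedLayer>
    <Name>{layer}</Name>
    <UserStyle>
      <FeatureTypeStyle>
        <Rule>
          <PolygonSymbolizer>
            <Fill>
              <CssParameter name="fill">{fill}</CssParameter>
            </Fill>
          </PolygonSymbolizer>
        </Rule>
      </FeatureTypeStyle>
    </UserStyle>
  </NamedLayer>
</StyledLayerDescriptor>"""

def make_sld(layer, fill_color):
    """Build an SLD body reflecting the user's colour choice."""
    return SLD_TEMPLATE.format(layer=layer, fill=fill_color)

sld = make_sld("flood_zones", "#ff0000")
print(sld)
```

A map server applying this document would render the (hypothetical) flood_zones layer with the user's chosen fill instead of its pre-defined style.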

    The Analysis of Open Source Software and Data for Establishment of GIS Services Throughout the Network in a Mapping Organization at National or International Level

    Federal agencies and their partners collect and manage large amounts of geospatial data, but it is often not easily found when needed, and sometimes data is collected or purchased multiple times. In short, the best government data is not always organized and managed efficiently to support decision making in a timely and cost-effective manner. National mapping agencies, and the various departments responsible for collecting different types of geospatial data, cannot for much longer continue to operate as they did a few years ago, like people living on an island. Leaders need to look at what is now possible that was not possible before, considering capabilities such as cloud computing, crowd-sourced data collection, freely available open source remotely sensed data, multi-source information vital to decision-making, and new Web-accessible services that are sometimes provided at no cost. Many of these services could previously be obtained only from local GIS experts. These authorities need to consider the available solutions, gather information about new capabilities, reconsider agency missions and goals, review and revise policies, make budget and human resource decisions, and evaluate new products, cloud services, and cloud service providers. To do so, we need to choose the right tools to reach the above-mentioned goals. Data collection is the most costly part of mapping and of establishing a geographic information system: not only because of the cost of the data collection task itself, but also because of the damage caused by delay, and the time it takes to deliver to the user the information needed for decision making, from the field to the user's hand. In fact, the time a project consumes for data collection, processing, and presentation of geospatial information has an even greater effect on the cost of a larger project such as disaster management, construction, city planning, environment, etc.
This of course assumes that we deliver all the necessary information from the existing sources directly to the user's computer. The best description of good GIS project optimization or improvement is finding a methodology that reduces time and cost and increases data and service quality (meaning accuracy, currency, completeness, consistency, suitability, information content, integrity, and integration capability, as well as fitness for use and the user's specific needs and conditions, which must be addressed with special attention). Each of the above-mentioned issues must be addressed individually, and at the same time the whole solution must be provided in a global manner considering all the criteria. In this thesis we first discuss the problem we are facing and what needs to be done to establish a National Spatial Data Infrastructure (NSDI), including its definition and related components. We then look for available open source software solutions covering the whole process: data collection, database management, data processing, and finally data services and presentation. The first distinction among software is whether it is open source and free, or commercial and proprietary. To make distinctions among software packages it is necessary to define a clear specification for this categorization: it is often difficult to determine, from a legal point of view, to which class a piece of software belongs, which makes it necessary to clarify what is meant by the various terms. With reference to this concept there are two global distinctions; then, inside each group, we distinguish a further classification by the functionalities and applications the software is made for in GIScience.
Building on the outcome of the second chapter, which sets out the technical process for selecting suitable and reliable software according to the characteristics of the users' needs and the required components, chapter 3 elaborates on the details of the GeoNode software as our best candidate tool to take on the responsibilities stated before. Chapter 4 discusses the open source data available globally, against predefined data quality criteria (such as theme, data content, scale, licensing, and coverage) according to the metadata statements inside the datasets, by means of bibliographic review, technical documentation and web search engines. Chapter 5 discusses further data quality concepts and consequently defines a set of protocols for the evaluation of all datasets, according to the tasks for which a mapping organization is, in general, responsible to its probable users in different disciplines, such as reconnaissance, city planning, topographic mapping, transportation, environmental control, disaster management, etc. In chapter 6, all the data quality assessments and protocols are applied to the pre-filtered, proposed datasets. In the final scores and ranking, each dataset receives a value corresponding to its quality according to the rules defined in the previous chapter. In the last steps, a vector of weights is derived from questions answered by the user with reference to the project at hand, in order to finalize the most appropriate selection of free and open source data. This data quality preference is defined by identifying a weight vector, which is then applied to the quality matrix to obtain the final quality scores and ranking. At the end of this chapter a section presents the use of the datasets in various projects, such as "Early Impact Analysis" and "Extreme Rainfall Detection System (ERDS) - version 2", performed by ITHACA. Finally, the conclusion discusses the important criteria, as well as future trends in GIS software, and presents recommendations.
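The weight-vector scoring described above can be sketched as follows; the criteria, scores and weights are hypothetical examples, not values from the thesis:

```python
# Sketch of weighted data-quality ranking: a weight vector derived from the
# user's answers is applied to a per-dataset quality matrix. All values are
# hypothetical illustrations.
criteria = ["accuracy", "currency", "completeness", "licensing"]

# Quality matrix: one row of per-criterion scores (0-10) per candidate dataset.
quality = {
    "OSM_roads": [7, 9, 8, 10],
    "NatMap_roads": [9, 5, 9, 4],
}

# Weight vector from the user's questionnaire, normalized to sum to 1.
weights = [0.4, 0.3, 0.2, 0.1]

def rank(quality, weights):
    """Apply the weight vector to the quality matrix; return datasets best-first."""
    scores = {
        name: sum(w * s for w, s in zip(weights, row))
        for name, row in quality.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank(quality, weights))  # [('OSM_roads', 8.1), ('NatMap_roads', 7.3)]
```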

    Workshop sensing a changing world : proceedings workshop November 19-21, 2008


    CIRA annual report FY 2011/2012
