
    A grid-enabled Web Map server

    Today, Geographic Information Systems (GIS) provide many tools for studying and analyzing varied human and natural phenomena, and as a result the use of GIS and geospatial data has grown considerably in both public and private organizations. A key challenge is the integration of these data to obtain innovative and comprehensive knowledge about topics of interest. In this paper we describe the design of an OGC-compliant Web Map Service (WMS) built on grid computing technology, and demonstrate how this approach can improve the integration of multi-source geospatial data with respect to security, performance, efficiency and scalability. End users, with a single sign-on, securely and transparently obtain maps whose data are distributed across heterogeneous data sources belonging to one or more Virtual Organizations, via distributed queries in a grid computing environment.
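    As an illustration of the kind of request such a server answers, the sketch below builds a standard OGC WMS GetMap URL in Python. The endpoint and layer names are hypothetical; only the KVP parameter names (SERVICE, VERSION, REQUEST, LAYERS, BBOX, etc.) come from the WMS specification.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width, height,
                   srs="EPSG:4326", fmt="image/png", version="1.1.1"):
    """Build an OGC WMS 1.1.1 GetMap request URL (KVP encoding)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),              # comma-separated layer names
        "STYLES": "",                            # default styles
        "SRS": srs,                              # spatial reference system
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical grid-enabled WMS endpoint and layer names:
url = wms_getmap_url("https://example.org/grid-wms",
                     ["population", "landuse"],
                     (6.6, 35.3, 18.8, 47.1), 800, 600)
```

    In a grid setting, the same request shape applies; the grid middleware behind the endpoint is what resolves the distributed sources transparently.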

    Grid Enabled Geospatial Catalogue Web Service

    A Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed, heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services in a Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and draws on the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, in particular query on-demand data in the virtual community and retrieve it through data-related services that provide functions such as subsetting, reformatting and reprojection. This work facilitates the sharing and interoperation of geospatial resources in a Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also lets researchers focus on science rather than on issues of computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.
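    Since GCWS builds on the OGC Catalogue Service (CSW), a discovery query against such a catalogue can be sketched as a standard CSW 2.0.2 GetRecords request. The endpoint and the CQL constraint below are hypothetical; the KVP parameter names follow the CSW specification.

```python
from urllib.parse import urlencode

def csw_getrecords_url(base_url, constraint=None, max_records=10):
    """Build an OGC CSW 2.0.2 GetRecords request URL (KVP encoding)."""
    params = {
        "SERVICE": "CSW",
        "VERSION": "2.0.2",
        "REQUEST": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "summary",   # brief | summary | full
        "resultType": "results",
        "maxRecords": str(max_records),
    }
    if constraint:                     # plain-text CQL constraint
        params["constraintLanguage"] = "CQL_TEXT"
        params["constraint"] = constraint
    return base_url + "?" + urlencode(params)

# Hypothetical catalogue endpoint; find records whose text mentions "landsat":
url = csw_getrecords_url("https://example.org/gcws/csw",
                         constraint="AnyText LIKE '%landsat%'")
```

    In the Grid-enabled case described above, the records returned would additionally resolve, via RLS/MDS, to the closest available replica of the data.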

    Standardized Web Processing of Hydro-Engineering Operations

    Mini-Symposium: Data Management in Hydro-Engineering

    Establishing a persistent interoperability test-bed for European geospatial research

    The development of standards for geospatial web services has been spearheaded by the Open Geospatial Consortium (OGC) - a group of over 370 private, public and academic organisations (OGC, 1999-2009). The OGC aims to facilitate interoperability between geospatial technologies through education, standards and other initiatives. The OGC Service Architecture, described in the international standard ISO 19119, offers an abstract specification for web services covering data dissemination, processing, portrayal, workflows and other areas. The development of specifications covering each of these categories of web services has led to a significant number of geospatial data and computational services available on the World Wide Web (the Web). A project to establish a persistent geospatial interoperability test-bed (PTB) was commissioned in 2007 by the Association of Geographic Information Laboratories in Europe (AGILE), Commission 5 (Networks) of the European Spatial Data Research (EuroSDR) organisation and the OGC.
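    A rough sketch of how the ISO 19119 service categories named above are commonly instantiated by OGC interface specifications. The mapping is illustrative, not normative; several interfaces arguably span more than one category.

```python
# Illustrative pairing of ISO 19119 service categories with representative
# OGC web service interfaces (a common, not exhaustive, mapping).
SERVICE_CATEGORIES = {
    "data dissemination": ["WFS", "WCS"],   # features, coverages
    "processing":         ["WPS"],          # Web Processing Service
    "portrayal":          ["WMS", "SLD"],   # map rendering / styling
    "workflows":          ["WPS chaining"], # service orchestration
    "catalogue":          ["CSW"],          # discovery
}

def services_for(category):
    """Return the example OGC interfaces for an ISO 19119 category."""
    return SERVICE_CATEGORIES.get(category.lower(), [])
```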

    The PREVIEW Global Risk Data Platform: a geoportal to serve and share global data on risk to natural hazards

    With a growing world population concentrated in urban and coastal areas, exposure to natural hazards is increasing, resulting in a higher risk of human and economic losses. Improving the identification of areas, populations and assets potentially exposed to natural hazards is essential to reduce the consequences of such events. Disaster risk is a function of hazard, exposure and vulnerability. Modelling risk at the global level requires accessing and processing a large number of data sets from numerous collaborating centres.

    These data need to be easily updated, and there is a need to centralize access to this information and to simplify its use for non-GIS specialists. The Hyogo Framework for Action provides the mandate for data sharing, so that governments and international development agencies can take appropriate decisions for disaster risk reduction.

    Timely access to and easy integration of geospatial data are essential to support efforts in Disaster Risk Reduction. However, various issues in data availability, accessibility and integration limit the use of such data. Consequently, a framework that facilitates the sharing and exchange of geospatial data on natural hazards should improve the decision-making process. The PREVIEW Global Risk Data Platform is a highly interactive web-based GIS portal, supported by a Spatial Data Infrastructure, that offers free and interoperable access to more than 60 global data sets on nine types of natural hazards (tropical cyclones and related storm surges, drought, earthquakes, biomass fires, floods, landslides, tsunamis and volcanic eruptions) and related exposure and risk. The application provides an easy-to-use online interactive mapping interface, so that users can work with the data easily and integrate them seamlessly into their own workflows using fully compliant OGC Web Services (OWS).
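    Programmatic integration via OWS typically starts with a GetCapabilities request, the standard entry point for discovering what layers or feature types an endpoint offers. The sketch below builds such requests generically; the endpoint URL is hypothetical, and only the SERVICE/REQUEST/VERSION parameters come from the OGC common specification.

```python
from urllib.parse import urlencode

def ows_capabilities_url(base_url, service, version=None):
    """Build an OGC GetCapabilities request URL for any OWS endpoint."""
    params = {"SERVICE": service, "REQUEST": "GetCapabilities"}
    if version:                 # omit to let the server pick its default
        params["VERSION"] = version
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint serving hazard layers:
wms = ows_capabilities_url("https://example.org/ows", "WMS", "1.3.0")
wfs = ows_capabilities_url("https://example.org/ows", "WFS")
```

    The XML capabilities document returned by such a request is what generic GIS clients parse to let users pull the platform's layers into their own data flow.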

    Constructing Geo-Information Sharing GRID Architecture


    The Analysis of Open Source Software and Data for Establishment of GIS Services Throughout the Network in a Mapping Organization at National or International Level

    Federal agencies and their partners collect and manage large amounts of geospatial data, but those data are often not easily found when needed, and are sometimes collected or purchased multiple times. In short, the best government data are not always organized and managed efficiently enough to support decision-making in a timely and cost-effective manner. National mapping agencies, and the various departments and authorities responsible for collecting different types of geospatial data, cannot continue for long to operate as they did a few years ago, as if living on an island. Leaders need to look at what is now possible that was not possible before, considering capabilities such as cloud computing, crowd-sourced data collection, freely available open-source remotely sensed data, multi-source information vital to decision-making, and new web-accessible services that are sometimes provided at no cost. Many of these services could previously be obtained only from local GIS experts. These authorities need to consider the available solutions, gather information about new capabilities, reconsider agency missions and goals, review and revise policies, make budget and human-resource decisions, and evaluate new products, cloud services, and cloud service providers. To do so, we need to choose the right tools to reach the above-mentioned goals. Data collection is the most costly part of mapping and of establishing a Geographic Information System: not only because of the cost of the data collection task itself, but also because of the damage caused by delay, and the time it takes to deliver the information needed for decision-making from the field to the user's hands. In fact, the time a project consumes for the collection, processing and presentation of geospatial information has an even larger effect on the cost of bigger projects such as disaster management, construction, city planning, environmental monitoring, etc.
This, of course, assumes that we deliver all the necessary information from the existing sources directly to the user's computer. The best description of a good GIS project optimization or improvement is finding a methodology that reduces time and cost while increasing data and service quality (meaning accuracy, currency, completeness, consistency, suitability, information content, integrity, integration capability, and fitness for use, as well as the user's specific needs and conditions, which must be addressed with special attention). Each of these issues must be addressed individually, and at the same time the whole solution must be provided in a global manner that considers all the criteria. In this thesis we first discuss the problem we are facing and what needs to be done to establish a National Spatial Data Infrastructure (NSDI), including its definition and related components. We then look for available open-source software solutions to cover the whole process: data collection, database management, data processing and, finally, data services and presentation. The first distinction among software packages is whether they are open source and free, or commercial and proprietary. To make this distinction, it is necessary to define a clear specification for the categorization: from a legal point of view it can be quite difficult to determine which class a piece of software belongs to, so the various terms must be clarified. With reference to this concept there are two global distinctions; within each group, we then distinguish a further classification according to the functionalities and applications the software is made for in GIScience.
The second chapter presents the technical process for selecting suitable and reliable software according to the characteristics of the users' needs and the required components. In Chapter 3, we go into the details of the GeoNode software as our best candidate tool to take on the responsibilities stated above. In Chapter 4, we discuss the globally available open-source data against predefined data quality criteria (such as theme, data content, scale, licensing, and coverage) according to the metadata statements inside the datasets, by means of bibliographic review, technical documentation and web search engines. In Chapter 5, we discuss further data quality concepts and define a set of protocols for evaluating all datasets according to the tasks for which a mapping organization is, in general, responsible to its probable users in different disciplines, such as reconnaissance, city planning, topographic mapping, transportation, environmental control, disaster management, etc. In Chapter 6, all the data quality assessments and protocols are applied to the pre-filtered, proposed datasets. In the final scores and ranking, each dataset receives a value corresponding to its quality according to the set of rules defined in the previous chapter. In the last step, a weight vector is derived from questions the user answers with reference to the project at hand, in order to finalize the most appropriate selection of free and open-source data. This data quality preference is defined by identifying a weight vector, which is then applied to the quality matrix to obtain the final quality scores and ranking.
The chapter closes with a section presenting the use of the data sets in various projects, such as the "Early Impact Analysis" and the "Extreme Rainfall Detection System (ERDS) - version 2" performed by ITHACA. Finally, in the conclusion, the important criteria and future trends in GIS software are discussed, and recommendations are presented.
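    The weighted scoring scheme described above (a quality matrix combined with a user-supplied weight vector to produce final scores and a ranking) can be sketched as follows. The criteria names, datasets and numbers are illustrative only, not values from the thesis.

```python
# Each candidate dataset gets one quality score per criterion (0..1); the user
# supplies one weight per criterion; datasets are ranked by weighted sum.
CRITERIA = ["accuracy", "currency", "completeness", "coverage", "licensing"]

datasets = {
    "OSM extract":      [0.7, 0.9, 0.8, 1.0, 1.0],
    "SRTM DEM":         [0.8, 0.4, 1.0, 1.0, 1.0],
    "National archive": [0.9, 0.6, 0.7, 0.5, 0.3],
}

def rank(datasets, weights):
    """Return (name, score) pairs sorted best-first by weighted score."""
    assert len(weights) == len(CRITERIA)
    scores = {name: sum(q * w for q, w in zip(qs, weights))
              for name, qs in datasets.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A user prioritizing currency and licensing (e.g. disaster management):
ranking = rank(datasets, [0.15, 0.35, 0.15, 0.15, 0.20])
```

    Deriving the weight vector from a project questionnaire, as the thesis proposes, only changes how the `weights` argument is produced; the ranking step itself stays this simple.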

    SwissEnvEO: A FAIR National Environmental Data Repository for Earth Observation Open Science

    Environmental scientific research is becoming increasingly data-driven and dependent on high-performance computing infrastructures to process ever-increasing volumes of diverse data sets. Consequently, there is growing recognition of the need to share data, methods, algorithms, and infrastructure to make scientific research more effective, efficient, open, transparent, reproducible, accessible, and usable by different users. However, Earth Observation (EO) Open Science is still undervalued, and several challenges remain in achieving the vision of transforming EO data into actionable knowledge by lowering the entry barrier to massive-use Big Earth Data analysis and derived information products. Currently, FAIR-compliant digital repositories cannot fully satisfy the needs of EO users, while Spatial Data Infrastructures (SDI) are not fully FAIR-compliant and have difficulties handling Big Earth Data. In response to these issues, and to strengthen Open and Reproducible EO science, this paper presents SwissEnvEO, a Spatial Data Infrastructure complemented with digital repository capabilities to facilitate the publication of ready-to-use information products, at the national scale, derived from satellite EO data available in an EO Data Cube, in full compliance with FAIR principles.
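    A minimal sketch of what "FAIR-compliant" can mean operationally: a completeness check over a product metadata record, with required fields mapped loosely onto the Findable/Accessible/Interoperable/Reusable principles. The field names and the record below are illustrative assumptions, not SwissEnvEO's actual schema.

```python
# Required metadata fields and the FAIR principle each loosely supports.
FAIR_REQUIRED = {
    "identifier": "Findable",       # e.g. a persistent DOI
    "access_url": "Accessible",
    "format":     "Interoperable",  # e.g. GeoTIFF, NetCDF
    "license":    "Reusable",
    "provenance": "Reusable",
}

def missing_fair_fields(record):
    """Return the FAIR-relevant fields absent or empty in a metadata record."""
    return [f for f in FAIR_REQUIRED if not record.get(f)]

# Hypothetical record for a national-scale EO-derived product:
record = {
    "identifier": "doi:10.0000/example-product",  # hypothetical DOI
    "access_url": "https://example.org/products/snow-cover.tif",
    "format": "GeoTIFF",
    "license": "CC-BY-4.0",
}
gaps = missing_fair_fields(record)
```

    A repository layered on an SDI, as described above, would run this kind of check at publication time, alongside the OWS interfaces that make the product accessible.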