102 research outputs found

    Global-Scale Resource Survey and Performance Monitoring of Public OGC Web Map Services

    One of the most widely implemented service standards provided by the Open Geospatial Consortium (OGC) to the user community is the Web Map Service (WMS). WMS is widely employed globally, but there is limited knowledge of the global distribution, adoption status or service quality of these online WMS resources. To fill this void, we investigated global WMS resources and performed distributed performance monitoring of these services. This paper explicates a crawling method used to discover these WMSs and a distributed monitoring framework that was used to monitor 46,296 WMSs continuously for over one year. We analyzed server locations, provider types, themes, the spatiotemporal coverage of map layers and the service versions for 41,703 valid WMSs. Furthermore, we appraised the stability and performance of basic operations (i.e., GetCapabilities and GetMap) for 1210 selected WMSs. We discuss the major reasons for request errors and performance issues, as well as the relationship between service response times and the spatiotemporal distribution of client monitoring sites. This paper will help service providers, end users and developers of standards to grasp the status of global WMS resources, as well as to understand the adoption status of OGC standards. The conclusions drawn in this paper can benefit geospatial resource discovery and service performance evaluation, and can guide service performance improvements.
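
    As a rough illustration of the kind of monitoring described above, the sketch below times a GetCapabilities and a GetMap request against a single service. The endpoint URL and layer name are hypothetical placeholders, and the paper's distributed, year-long framework is of course far more elaborate than this.

```python
# Minimal sketch of timing WMS GetCapabilities and GetMap requests.
# The endpoint URL and layer name are hypothetical placeholders,
# not services from the study.
import requests

WMS_URL = "https://example.org/geoserver/wms"  # hypothetical endpoint

def time_request(params, timeout=30):
    """Return (status_code, seconds) for one WMS request, or (None, None) on error."""
    try:
        r = requests.get(WMS_URL, params=params, timeout=timeout)
        return r.status_code, r.elapsed.total_seconds()
    except requests.RequestException:
        return None, None

# GetCapabilities: lists layers, versions and metadata offered by the service
caps_status, caps_time = time_request(
    {"service": "WMS", "request": "GetCapabilities", "version": "1.3.0"}
)

# GetMap: renders a small test image for one layer
map_status, map_time = time_request({
    "service": "WMS", "request": "GetMap", "version": "1.3.0",
    "layers": "example_layer",          # hypothetical layer name
    "styles": "", "crs": "EPSG:4326",
    "bbox": "-90,-180,90,180",          # WMS 1.3.0 axis order for EPSG:4326
    "width": "256", "height": "256", "format": "image/png",
})

print("GetCapabilities:", caps_status, caps_time)
print("GetMap:", map_status, map_time)
```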

    Automatic integration of spatial data in viewing services

    Geoportals are increasingly used for searching, viewing and downloading spatial data. This study concerns methods to improve the visual presentation in viewing services. When spatial data in a viewing service are taken from more than one source, there are often syntactic, semantic, topological and geometrical conflicts that prevent maps from being fully consistent. In this study we extend a standard viewing service with methods to solve these conflicts. The methods are based on: (1) semantic labels of data in the basic services, (2) a rule base in the portal layer, and (3) integration methods in the portal layer. To evaluate the methodology we use a case study of adding historical borders on top of a base map. The results show that the borders are overlaid on top of the map without conflicts and that a consistent map is generated automatically as an output. The methodology can be generalized to add other types of data on top of a base map.
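
    The abstract does not specify the rule syntax, so the sketch below is only an assumed illustration of the idea: a portal-layer rule base that maps the semantic label of an overlay dataset to integration methods. The labels and method names are hypothetical, not those of the paper.

```python
# Illustrative sketch of a portal-layer rule base: semantic labels of
# overlay data are mapped to integration methods applied before rendering.
# Labels, rules and methods are hypothetical placeholders.

def snap_to_basemap(features, basemap, tolerance=10.0):
    # Placeholder for a geometric integration method, e.g. snapping
    # border vertices to the base-map coastline within a tolerance (m).
    return features

def harmonize_style(features, basemap):
    # Placeholder for resolving symbology conflicts with the base map.
    return features

RULE_BASE = {
    "historical_border": [snap_to_basemap, harmonize_style],
    "default": [harmonize_style],
}

def integrate(features, semantic_label, basemap):
    """Apply the integration methods registered for a semantic label."""
    for method in RULE_BASE.get(semantic_label, RULE_BASE["default"]):
        features = method(features, basemap)
    return features
```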

    Geovisualization Using HTML5: a case study to improve animations of historical geographic data

    Popular science summary (Visualize geographic data using HTML5): The Scanian Economic-Demographic Database (SEDD) has been assembled by the Centre for Economic Demography (CED), Lund University. It contains demographic and economic information about Scania from the 17th century until the present. The SEDD database has been integrated with geographic data digitized from four independent historical maps. To help users understand these data, a web mapping application called SEDD Map has been developed and tested. The previous version of SEDD Map was built with the Silverlight plugin and cannot run on most popular portable devices. As the Hypertext Markup Language (HTML) continues to develop, a recent version, HTML5, was published in 2012; it aims to support the latest multimedia formats and to reduce the need for plugins. To improve the compatibility of SEDD Map, this work therefore uses HTML5 to develop a new version of SEDD Map. Before constructing the new version, a set of web mapping applications and programs was evaluated. From this evaluation and comparison, we found that SEDD Map could be improved in many areas, such as the animation of historical geographic data. Animation is a useful tool when presenting historical data. The geographic data in SEDD Map are taken from four independent historical maps. To visualize the geographic data as an animation, a time-sequential dataset is needed. In this study, we used linear interpolation, with the four historical maps as start and end years, to simulate 159 maps and visualize the geographic data as animations. From this study, we found that: the commonly used web mapping applications for investigating demographic data contain functions such as interactive visualization, statistical graphics, basic map tools and animations; HTML5 can replace (and improve on) the use of Silverlight for web mapping; and animations can be generated (what remains is to improve the datasets).

The Scanian Economic-Demographic Database (SEDD) has been assembled by the Centre for Economic Demography (CED), Lund University. It contains information about the demographic and economic conditions of people who have lived in five parishes in Scania from the 17th century until the present. The SEDD database has been integrated with geographic data digitized from four independent historical maps. To visualize and analyse these data, a GIS-based web mapping application called SEDD Map has been developed and tested. The previous version of SEDD Map was built with Silverlight and can therefore only be used on computers with the Silverlight plugin installed. As the Hypertext Markup Language (HTML) continues to develop, a recent version, HTML5, was published in 2012; it aims to support the latest multimedia formats and to reduce the need for plugins. In this study, we use HTML5, Cascading Style Sheets (CSS3), JavaScript and the ArcGIS API for JavaScript to create a new version of SEDD Map for visualizing data stored in the SEDD database. Before constructing the new version, a set of web mapping applications and programs was evaluated against the requirements for the new version of SEDD Map. From this evaluation and comparison, we found that SEDD Map could be improved in many areas, such as the animation of historical geographic data. Animation is a useful tool when presenting historical data. The geographic data in SEDD Map are taken from four independent historical maps. To visualize the geographic data as an animation, a time-sequential dataset is needed; this is created in a parallel project. In this study, we evaluate techniques for data animation. We used linear interpolation, with the four historical maps as start and end years, to simulate 159 maps and visualize the geographic data as animations. The conclusions are as follows: 1) the commonly used web mapping applications for investigating demographic data contain functions such as interactive visualization, statistical graphics, basic map tools and animations; 2) HTML5 can replace (and improve on) the use of Silverlight for web mapping; 3) animations can be generated (what remains is to improve the datasets).
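
    As an illustration of the interpolation step, the sketch below linearly interpolates boundary coordinates between two reference years, assuming the vertices of the two source maps have already been matched; the years and coordinates are hypothetical examples.

```python
# Minimal sketch of linearly interpolating boundary coordinates between
# two reference years to produce per-year maps for an animation.
# Assumes vertices have already been matched between the two source maps.

def interpolate_year(start_coords, end_coords, start_year, end_year, year):
    """Linearly interpolate matched (x, y) vertex lists for a given year."""
    t = (year - start_year) / (end_year - start_year)
    return [
        (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        for (x0, y0), (x1, y1) in zip(start_coords, end_coords)
    ]

# Hypothetical example: a parish boundary digitized for 1820 and 1860
coords_1820 = [(0.0, 0.0), (1.0, 0.2), (1.1, 1.0)]
coords_1860 = [(0.1, 0.0), (1.2, 0.3), (1.0, 1.1)]
frames = {y: interpolate_year(coords_1820, coords_1860, 1820, 1860, y)
          for y in range(1820, 1861)}
```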

    Geovisualization

    Geovisualization involves the depiction of spatial data in an attempt to facilitate the interpretation of observational and simulated datasets through which Earth's surface and solid Earth processes may be understood. Numerous techniques can be applied to imagery, digital elevation models, and other geographic information system data layers to explore for patterns and depict landscape characteristics. Given the rapid proliferation of remotely sensed data and high-resolution digital elevation models, the focus is on the visualization of satellite imagery and terrain morphology, where manual human interpretation plays a fundamental role in the study of geomorphic processes and the mapping of landforms. A treatment of some techniques is provided that can be used to enhance satellite imagery and the visualization of the topography to improve landform identification as part of geomorphological mapping. Visual interaction with spatial data is an important part of exploring and understanding geomorphological datasets, and a variety of methods exist ranging across simple overlay, panning and zooming, 2.5D, 3D, and temporal analyses. Specific visualization outputs are also covered that focus on static and interactive methods of dissemination. Geomorphological mapping legends and the cartographic principles for map design are discussed, followed by details of dynamic web-based mapping systems that allow for greater immersive use by end users and the effective dissemination of data.
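
    As a concrete instance of one such terrain-enhancement technique, a minimal hillshade sketch is given below. The illumination angles, cell size and random elevation grid are placeholder assumptions, not data from the chapter.

```python
# Minimal hillshade sketch for a DEM stored as a 2-D NumPy array.
# Cell size and illumination angles are assumed example values.
import numpy as np

def hillshade(dem, cellsize=30.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Return a 0-255 hillshade image from an elevation grid."""
    az = np.radians(360.0 - azimuth_deg + 90.0)   # convert to math convention
    alt = np.radians(altitude_deg)
    dz_dy, dz_dx = np.gradient(dem, cellsize)     # elevation gradients
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    shaded = (np.sin(alt) * np.cos(slope) +
              np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return (255 * np.clip(shaded, 0, 1)).astype(np.uint8)

dem = np.random.rand(100, 100) * 500.0            # placeholder elevation data
img = hillshade(dem)
```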

    Web-based public participation GIS application: a case study on flood emergency management

    Scientific summary: The increasing prevalence of natural disasters is driving people to pay more and more attention to emergency management. Progress in catastrophe analysis capabilities based on Geographical Information Systems (GIS) may allow the needs of public participation to be considered. Synchronous data sharing between citizens and emergency workers could effectively support the decision-making process. This thesis introduces an interactive web-based application that deals mainly with flood risk management in Kamloops, Canada. The application is built for citizens and emergency workers using three layers: (1) the client side, developed in HTML and JavaScript; (2) the web server layer, which connects the users and the database and is implemented in PHP; and (3) the data layer, which comprises PostgreSQL, GeoServer and OpenStreetMap (OSM). Except for the city map, the spatial information is stored in PostgreSQL with OpenGIS support. Generally, the application meets the initial objectives. Citizens can access current shelter information and register their own shelter requirements, while emergency workers can manage all shelters and warehouses based on the available flood information and work out a supply allocation solution based on the responses from the public. The application also provides useful routing functions for both citizens and emergency workers, such as finding the shortest available path to a shelter and computing optimized allocation routes between all shelters and warehouses. This practical study showed that Public Participation GIS (PPGIS), combined with IT knowledge, can provide very useful tools for decision making when facing a flood risk.

Popularized summary: Nowadays, the growing prevalence of natural disasters is driving people to pay more and more attention to emergency management. Progress in catastrophe analysis capabilities based on Geographical Information Systems (GIS) may allow the needs of public participation to be considered. Synchronous data sharing between citizens and emergency workers could effectively support the decision-making process. This thesis introduces an interactive web-based application that deals mainly with flood risk management in Kamloops, Canada. The application combines various data sources and uses a spatial database. Citizens can access current shelter information and register their own shelter requirements, while emergency workers can manage all shelters and warehouses based on the available flood information and work out a supply allocation solution based on the responses from the public. The application also provides useful routing functions for both citizens and emergency workers, such as finding the shortest available path to a shelter and computing optimized allocation routes between all shelters and warehouses. This practical study showed that Public Participation GIS (PPGIS), combined with IT knowledge, can provide very useful tools for decision making when facing a flood risk.
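
    The abstract does not describe how the routing functions are implemented. The sketch below assumes a pgRouting-enabled PostgreSQL/PostGIS database with a hypothetical road-network table, and illustrates one way a shortest path to a shelter could be queried; connection parameters and vertex ids are placeholders.

```python
# Minimal sketch of a shortest-path query for the routing functions,
# assuming a pgRouting-enabled database with a hypothetical "ways" table
# (id, source, target, cost). Connection details and ids are placeholders.
import psycopg2

conn = psycopg2.connect(dbname="flood_db", user="gis", password="secret")
cur = conn.cursor()

start_vertex, shelter_vertex = 101, 2045  # hypothetical network vertex ids
cur.execute(
    """
    SELECT seq, node, edge, cost
    FROM pgr_dijkstra(
        'SELECT id, source, target, cost FROM ways',
        %s, %s, directed := false);
    """,
    (start_vertex, shelter_vertex),
)
for seq, node, edge, cost in cur.fetchall():
    print(seq, node, edge, cost)

cur.close()
conn.close()
```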

    Implementing Web GIS for Monitoring Carbon Sequestration in Sustainable Agroforestry Projects

    This project implemented an internet-based GIS to support effective monitoring and evaluation of the carbon sequestration of agroforestry systems on small-scale farms in East Africa. Small-scale agriculture is one of the main economic activities practiced by farmers in East Africa. The demand for more farm produce from diminishing land exerts pressure on existing farms, resulting in land degradation and consequently environmental degeneration in the region. Striking a balance between conservation goals and agricultural needs is not easy. There is a need to utilize technological advancements like GIS to establish appropriate farming practices that ensure improved and sustained farm productivity, as well as to conserve the environment. Quantifying and monitoring sequestered carbon not only provides revenue through certified carbon credits, but also a means of evaluating the impact of agroforestry methods on the environment. This project was undertaken to support the Vi Agroforestry Programme's implementation of agroforestry projects, and to assess the amount of carbon sequestered. An internet-based GIS system was designed to share spatial data and information with project stakeholders and other audiences. The system primarily supports decision making in adopting sustainable farming practices, and provides a reliable means of keeping track of agroforestry techniques and quantifying the amount of sequestered carbon for each project.
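
    The abstract does not state how sequestered carbon is quantified. The sketch below only illustrates the general idea of converting field tree measurements into a CO2 estimate, using a generic allometric form whose coefficients are hypothetical placeholders; a real project would use species- and region-specific models.

```python
# Illustrative sketch of converting field tree measurements into a CO2
# estimate for a farm. The allometric coefficients (a, b) are hypothetical
# placeholders, not values used by the project described above.

CARBON_FRACTION = 0.47   # IPCC default fraction of carbon in dry biomass
CO2_PER_C = 44.0 / 12.0  # molecular-weight ratio of CO2 to carbon

def tree_co2_kg(dbh_cm, a=0.1, b=2.4):
    """Estimate CO2 equivalent (kg) for one tree from its diameter at breast height."""
    biomass_kg = a * dbh_cm ** b          # generic allometric form: AGB = a * DBH^b
    return biomass_kg * CARBON_FRACTION * CO2_PER_C

# Hypothetical farm inventory: list of DBH measurements in centimetres
farm_dbh = [12.5, 18.0, 22.3, 9.8]
total_co2_tonnes = sum(tree_co2_kg(d) for d in farm_dbh) / 1000.0
print(f"Estimated sequestration: {total_co2_tonnes:.2f} t CO2")
```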

    The Analysis of Open Source Software and Data for Establishment of GIS Services Throughout the Network in a Mapping Organization at National or International Level

    Federal agencies and their partners collect and manage large amounts of geospatial data, but these data are often not easy to find when needed, and sometimes data are collected or purchased multiple times. In short, the best government data are not always organized and managed efficiently enough to support decision making in a timely and cost-effective manner. National mapping agencies, and the various departments and authorities responsible for collecting different types of geospatial data, cannot continue for long to operate as they did a few years ago, like people living on an island. Leaders need to look at what is now possible that was not possible before, considering capabilities such as cloud computing, crowd-sourced data collection, available open-source remotely sensed data, multi-source information vital for decision making, and new web-accessible services that provide, sometimes at no cost, capabilities that previously could be obtained only from local GIS experts. These authorities need to consider the available solutions, gather information about new capabilities, reconsider agency missions and goals, review and revise policies, make budget and human resource decisions, and evaluate new products, cloud services and cloud service providers. To do so, we need to choose the right tools to reach the above-mentioned goals.

As we know, data collection is the most costly part of mapping and of establishing a geographic information system. This is not only because of the cost of the data collection task itself, but also because of the damage caused by delays and by the time it takes to provide the user with the information needed for decision making, from the field all the way to the user's hands. In fact, the time consumed by the collection, processing and presentation of geospatial information has an even greater effect on the cost of larger projects such as disaster management, construction, city planning and environmental management, assuming, of course, that all the necessary information from existing sources is delivered to the user's computer. The best description of a good GIS project optimization or improvement is a methodology that reduces time and cost and increases data and service quality (meaning accuracy, currency, completeness, consistency, suitability, information content, integrity, integration capability and fitness for use, as well as the user's specific needs and conditions, which must be addressed with special attention). Each of these issues must be addressed individually, and at the same time the overall solution must be provided in a global manner that considers all the criteria.

In this thesis we first discuss the problem we are facing and what needs to be done to establish a National Spatial Data Infrastructure (NSDI), including its definition and related components. We then look for available open-source software solutions to cover the whole process: data collection, database management, data processing, and finally data services and presentation. The first distinction among software packages is whether they are open source and free, or commercial and proprietary. To make this distinction it is necessary to define a clear specification for the categorization, since from a legal point of view it can be difficult to determine which class a software package belongs to, and the various terms therefore need to be clarified. Within each of these two global groups we distinguish a further classification according to the functionalities and the GIScience applications the software is made for. Based on the outcome of the second chapter, which is the technical process for selecting suitable and reliable software according to the characteristics of the users' needs and the required components, we move to the next chapter. In Chapter 3, we elaborate on the details of the GeoNode software as the best candidate tool to take on the responsibilities stated above. In Chapter 4, we discuss the globally available open-source data with respect to predefined data quality criteria (such as theme, data content, scale, licensing and coverage), according to the metadata statements inside the datasets, by means of bibliographic review, technical documentation and web search engines. In Chapter 5, we discuss further data quality concepts and consequently define a set of protocols for evaluating all datasets according to the tasks for which a mapping organization, in general, is responsible to its potential users in different disciplines, such as reconnaissance, city planning, topographic mapping, transportation, environmental monitoring and disaster management.

In Chapter 6, all the data quality assessments and protocols are applied to the pre-filtered, proposed datasets. In the final scores and ranking, each dataset receives a value corresponding to its quality according to the rules defined in the previous chapter. In the last step, a weight vector is derived from questions that have to be answered by the user with reference to the project at hand, in order to finalize the most appropriate selection of free and open-source data. This data quality preference is defined by identifying a weight vector, which is then applied to the quality matrix to obtain the final quality scores and ranking. At the end of this chapter there is a section presenting the utilization of the datasets in various projects, such as the "Early Impact Analysis" and the "Extreme Rainfall Detection System (ERDS), version 2" performed by ITHACA. Finally, in the conclusion, the important criteria as well as future trends in GIS software are discussed, and recommendations are presented.
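
    As an illustration of the scoring and ranking step, the sketch below applies a user-derived weight vector to a quality matrix. The criteria, scores and weights are hypothetical placeholders, not the protocols defined in the thesis.

```python
# Minimal sketch of ranking candidate datasets by weighted quality scores.
# Criteria, scores (0-1), dataset names and weights are hypothetical.
import numpy as np

criteria = ["accuracy", "currency", "completeness", "licensing", "coverage"]
datasets = ["Dataset A", "Dataset B", "Dataset C"]

# Quality matrix: one row per dataset, one column per criterion
quality = np.array([
    [0.8, 0.9, 0.7, 1.0, 0.9],
    [0.9, 0.6, 0.8, 0.7, 0.6],
    [0.7, 0.8, 0.6, 0.9, 0.8],
])

# Weight vector derived from the user's answers, normalized to sum to 1
weights = np.array([0.3, 0.2, 0.2, 0.15, 0.15])

scores = quality @ weights                    # weighted sum per dataset
ranking = sorted(zip(datasets, scores), key=lambda p: p[1], reverse=True)
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```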

    Development of an open-source mobile application for emergency data collection

    This Master's degree project identified disasters and emergencies as a global humanitarian and technological challenge. Emergency management organizations' need for access to accurate and up-to-date information about the emergency situation, to help respond to, recover from and mitigate the effects of disasters and emergencies, presents a challenge to the field of Geomatics. Today, remote sensing technologies provide an increasing number of solutions. There are, however, types of spatial data, e.g. submerged, non-visual or otherwise hidden features, that still require emergency field personnel and volunteers to interpret and record them. By utilizing the increasing ubiquity and computational power of modern smartphones, in order to reach a large number of potential users and volunteers, a mobile application for emergency field data collection was developed. It was developed as a component of a system that, in order to be as collaborative, adaptable and accessible as possible, also to resource-poor organizations, was, with a minor exception, completely open-source licensed. Field trials were held that, due to low participation, could not conclusively evaluate the application and its general applicability to emergency field data collection. They did, however, provide an adequate proof of concept and showed that it was possible to apply the application and the implemented system to a specific emergency field data collection task. The system has great collaborative potential, achieved through openness, mobility, standards compliance, multi-source capability and adaptability. Its administrators are given a high degree of control that lets them adapt the system to suit the current users and situation, and its flexibility makes it widely applicable, not only for emergency management. From the literature, the field trials and the experience gained while developing and using the application, some ideas for improving the application and the system were discussed and some future research topics were suggested.

During and after disasters and emergencies, many different organizations come together to help those affected: for example the police, fire brigades, medical services, or power companies that have to repair their networks. In larger disasters, government authorities and international aid organizations may also need to come to the rescue. For these organizations to be able to help effectively, they must have access to up-to-date and accurate information about the crisis situation. A large part of this information is tied to a specific place; it is geographic. Today, organizations working with emergency management get much of their geographic information from satellites and aerial imagery, but some types of information cannot be seen from satellites, for example cables buried under the ground surface, or human injuries and needs. Therefore, some form of system is also needed that personnel and volunteers in the field can use to report to emergency command centres in an efficient way. Historically, many such systems have been expensive to acquire because they required advanced computer programs and expensive technical equipment for field personnel. Because they have also often been difficult to use, it has been hard for emergency management organizations to gather enough people to help. This project aimed to develop a mobile app, i.e. a program for modern mobile phones (so-called smartphones). The goal of the app was that anyone who owns a smartphone of the right type should be able to contribute to collecting important geographic information for the emergency command centre. By letting the app be part of a system that is completely free to use and open source, even organizations with few resources and little money can use it. Because so many people already own smartphones that they are used to, it may become easier to get more people involved. The development of the app succeeded, and the whole system is free to use and released, with a minor exception, entirely as open source. The app was tested, but with too few participants to draw any definitive conclusions about whether the system is suitable for emergency management. However, during the tests the app and the system showed good potential and demonstrated that the app could be used to collect information in a disaster situation.
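
    The abstract does not document the application's data format or server interface. The sketch below only illustrates the general pattern of a field client posting a geotagged observation; the GeoJSON payload, attribute names and endpoint URL are hypothetical, not the interface of the thesis system.

```python
# Illustrative sketch of a field client submitting a geotagged observation
# as GeoJSON to a server. The endpoint URL, attribute names and categories
# are hypothetical placeholders.
import requests

REPORT_URL = "https://example.org/emergency/api/reports"  # hypothetical endpoint

report = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [13.1910, 55.7047]},  # lon, lat
    "properties": {
        "category": "blocked_road",       # hypothetical category code
        "description": "Road flooded, impassable for vehicles",
        "observed_at": "2014-05-01T12:30:00Z",
        "reporter": "volunteer-042",
    },
}

resp = requests.post(REPORT_URL, json=report, timeout=10)
print(resp.status_code)
```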

    A geo-database for potentially polluting marine sites and associated risk index

    The increasing availability of geospatial marine data provides an opportunity for hydrographic offices to contribute to the identification of Potentially Polluting Marine Sites (PPMS). To manage these sites adequately, a PPMS Geospatial Database (GeoDB) application was developed to collect and store relevant information suitable for site inventory and geospatial analysis. The benefits of structuring the data to conform to the Universal Hydrographic Data Model (IHO S-100) and of using the Geography Markup Language (GML) for encoding are presented. A storage solution is proposed using a GML-enabled spatial relational database management system (RDBMS). In addition, an example of a risk index methodology is provided based on the defined data structure. The implementation of this example was performed using scripts containing SQL statements. These procedures were implemented in a cross-platform C++ application, based on open-source libraries, called PPMS GeoDB Manager.
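
    As an illustration of the risk-index idea, the sketch below computes a simple weighted-sum index with an SQL statement over a hypothetical site table. The table, columns and weights are placeholders; the paper's actual S-100-based data structure and index definition are not reproduced here.

```python
# Illustrative sketch of computing a simple risk index as a weighted sum of
# normalized site factors stored in a spatial RDBMS. Table, column names and
# weights are hypothetical; sqlite3 stands in for the GML-enabled RDBMS.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ppms_site (
        site_id INTEGER PRIMARY KEY,
        cargo_hazard REAL,      -- 0..1, normalized hazard of remaining cargo/fuel
        hull_condition REAL,    -- 0..1, 1 = severely degraded
        env_sensitivity REAL    -- 0..1, sensitivity of surrounding environment
    );
    INSERT INTO ppms_site VALUES (1, 0.8, 0.6, 0.9), (2, 0.3, 0.2, 0.4);
""")

rows = conn.execute("""
    SELECT site_id,
           0.5 * cargo_hazard + 0.3 * hull_condition + 0.2 * env_sensitivity
               AS risk_index
    FROM ppms_site
    ORDER BY risk_index DESC;
""").fetchall()
print(rows)
```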

    Design and development of a prototype mobile geographic information system for real-time collection and storage of traffic accident data

    Today, Swedish police authorities nationwide collect traffic accident data. The information is stored in a national database managed by the Swedish Transport Agency and is an important resource in the process of analysing and improving road safety. Literature studies in this thesis, together with earlier work by the author, suggest that the data collection process is in need of an update: a digital tool for this data collection is necessary. Problems with varying data quality and long submission times for reports have been attributed to the current method, which involves a paper form. A digital method based on a handheld device is expected to improve data quality and shorten overall submission times. The aim of the thesis has thus been to design and develop a mobile GIS system for the collection and management of traffic accident information for police authorities. The project has utilized mainly open-source tools. The result is a system consisting of an Android application for data collection, a database server with a database for storage, an application server/web server hosting a map server, and a web server that handles requests and hosts a web service for viewing and retrieving data. The created system can collect all of the information that the currently used analogue method does, as well as new media such as GPS coordinates, photographs and audio. The functionality of the web service demonstrates that data are collected and stored in suitable formats, in a database schema that is flexible enough to facilitate a wide range of queries relevant to the field of road safety.

The thesis describes the design and development of an Android mobile application for collecting and reporting information about traffic accidents. Additional components such as servers and a web service with a map are also included; together with the mobile application, they form a system for reporting, storage and analysis of information about traffic accidents. The tools and components that have been chosen are mainly open source, which means that they are accessible and free for everyone to use to create their own system. Swedish police authorities nationwide are currently collecting traffic accident data. The information, which is stored in a national database managed by the Swedish Transport Agency, is an important resource in the process of analysing and improving road safety. The data collection by the police has, since the beginning, been intended to be conducted in the field on a handheld device; however, this is not the case. Consequently, the national database consists of information that has been collected by filling out paper forms that are later digitized. Varying data quality and long overall submission times have been noted as problems that can be attributed to this analogue step of the reporting process. It has been suggested that the data collection process is in need of an update: a new digital tool is necessary. The Android application, which is the main contribution, is intended as a suggested digital replacement for the paper form. The functionality of the application is based on the requirements of the current paper form used by the police. The created system can collect all of the information that the currently used analogue method does, as well as new media such as GPS coordinates, photographs and audio. The functionality of the web service demonstrates that data are collected and stored in suitable formats, in a database schema that is flexible enough to facilitate a wide range of queries relevant to the field of road safety.
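
    The thesis's actual database schema is not given in the abstract. The sketch below shows one assumed way the collected records, including GPS coordinates and media references, could be stored, assuming a PostGIS-enabled PostgreSQL database; table and column names are hypothetical.

```python
# Illustrative sketch of a storage table for collected accident reports,
# assuming a PostGIS-enabled PostgreSQL database. Table and column names
# are hypothetical, not the schema developed in the thesis.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS accident_report (
    report_id     SERIAL PRIMARY KEY,
    reported_at   TIMESTAMPTZ NOT NULL,
    severity      TEXT,                          -- e.g. coded injury severity
    description   TEXT,
    location      geometry(Point, 4326),         -- GPS position from the device
    photo_uri     TEXT,                          -- reference to stored photograph
    audio_uri     TEXT                           -- reference to stored audio note
);
CREATE INDEX IF NOT EXISTS accident_report_gix
    ON accident_report USING GIST (location);
"""

conn = psycopg2.connect(dbname="traffic_db", user="gis", password="secret")
with conn, conn.cursor() as cur:
    cur.execute(DDL)
conn.close()
```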