
    Identifying success factors in crowdsourced geographic information use in government

    Crowdsourcing geographic information in government focuses on projects that engage people who are not government officials or employees in collecting, editing and sharing information with governmental bodies. This type of project emerged in the past decade due to technological and societal changes, such as the increased use of smartphones combined with citizens' growing levels of education and technical ability to use them. Such projects also flourished because of the need for up-to-date data within a relatively short time when financial resources are low. They range from recording the experience of feeling an earthquake to recording the locations of businesses during the summer. Fifty cases of projects in which crowdsourced geographic information was used by governmental bodies across the world are analysed. About 60% of the cases were examined both in 2014 and in 2017, to allow for comparison and the identification of success and failure. The analysis looked at different aspects and their relationship to success: the drivers to start a project; scope and aims; stakeholders and relationships; inputs into the project; technical and organisational aspects; and problems encountered. The key factors of the case studies were analysed using Qualitative Comparative Analysis (QCA), an analytical method that combines quantitative and qualitative tools in sociological research. From the analysis, we conclude that there is no “magic bullet” or perfect methodology for a successful crowdsourcing-in-government project. Unless the organisation has reached maturity in the area of crowdsourcing, identifying a champion and starting a project that does not address authoritative datasets directly is a good way to ensure early success and begin the process of organisational learning on how to run such projects. The importance of governmental support and trust is undisputed. If the choice is to use new technologies, this should be accompanied by an investment of appropriate resources within the organisation to ensure that the investment bears fruit. Alternatively, using an existing technology that has been successful elsewhere and investing in training and capacity building is another path to success. We also identified the importance of partnering with intermediary Non-Governmental Organizations (NGOs) that have experience and knowledge of working with crowdsourcing. These organisations have the knowledge and skills to implement projects at the boundary between government and the crowd, and can therefore offer the experience needed to ensure better implementation. Changes and improvements to public services, or a focus on environmental monitoring, can be a good basis for a project. Capturing base mapping is also a good starting point. The recommendations of the report address organisational issues, resources, and legal aspects.
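    As a rough illustration of the QCA step mentioned above, the sketch below shows how binarised case conditions could be grouped into a crisp-set truth table. The condition names and values are hypothetical examples, not data from the 50 cases analysed in the study.

```python
# Illustrative sketch: building a crisp-set QCA truth table from
# binarised case conditions. Condition names and values are hypothetical,
# not taken from the study's 50 cases.
import pandas as pd

cases = pd.DataFrame(
    {
        "champion_present":    [1, 1, 0, 1, 0],
        "ngo_intermediary":    [1, 0, 0, 1, 1],
        "existing_technology": [0, 1, 1, 1, 0],
        "success":             [1, 1, 0, 1, 0],
    },
    index=["case_a", "case_b", "case_c", "case_d", "case_e"],
)

conditions = ["champion_present", "ngo_intermediary", "existing_technology"]

# Group identical condition configurations and compute, for each row of the
# truth table, how many cases fall into it and the share that were successful.
truth_table = (
    cases.groupby(conditions)["success"]
    .agg(n_cases="count", consistency="mean")
    .reset_index()
)
print(truth_table)
```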

    GMES-service for assessing and monitoring subsidence hazards in coastal lowland areas around Europe. SubCoast D3.5.1

    This document is version two of the user requirements for SubCoast work package 3.5; it is SubCoast deliverable 3.5.1. Work package 3.5 aims to provide a European integrated GIS product on subsidence and relative sea level rise (RSLR). The first step of this process was to contact the European Environment Agency (EEA), as the main user, to discover their user requirements. This document presents these requirements, the outline methodology that will be used to carry out the integration, and the datasets that will be used. In outline, the main user requirements of the EEA are:
    1. A gridded approach using an INSPIRE-compliant grid.
    2. The grid would hold data on:
       a. Likely rate of subsidence
       b. RSLR
       c. Impact (vulnerability)
       d. Certainty (confidence map)
       e. Contribution of ground motion to RSLR
       f. A measure of certainty in the data provided
       g. Metadata
    3. Spatial coverage: ideally the entire coastline of all 37 member states, at a spatial resolution of 1 km.
    4. A measure of the degree of contribution of ground motion to RSLR.
    The European integration will be based around a GIS methodology. Datasets will be integrated and interpreted to provide information on the data values above, the main value being the likelihood of subsidence. This product will initially be developed at its lowest level of detail for the London area. BGS have a wealth of data for London; this will enable the less detailed product to be validated and also enable the generation of a more detailed product using the best data available. Once the methodology has been developed, it will be rolled out to other areas of the European coastline. The initial input data that have been reviewed for their suitability for the European integration are listed below. These are the datasets with Europe-wide availability; it is expected that more detailed datasets will be used in areas where they are available.
    1. Terrafirma data
    2. OneGeology
    3. OneGeology-Europe
    4. Population density (Geoland2)
    5. The Urban Atlas (Geoland2)
    6. Elevation data:
       a. SRTM
       b. GDEM
       c. GTOPO30
       d. NextMap Europe
    7. MyOcean sea level data
    8. Storm surge locations
    9. European Environment Agency:
       a. Elevation breakdown 1 km
       b. Corine Land Cover 2000 (CLC2000) coastline
       c. Sediment discharges
       d. Shoreline
       e. Maritime boundaries
       f. Hydrodynamics and sea level rise
       g. Geomorphology, geology, erosion trends and coastal defence works
       h. Corine Land Cover 1990
       i. Five metre elevation contour line
    10. FutureCoast
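    A minimal sketch of the gridded integration idea follows: input layers are resampled onto a common 1 km grid and combined so that ground motion contributes to the relative sea level rise estimate. File names, sign conventions, and the simple arithmetic are assumptions for illustration, not the deliverable's actual methodology.

```python
# Sketch: resample subsidence and sea level rise layers to a common 1 km
# grid and derive RSLR plus the ground-motion contribution per grid cell.
# File names and the positive-down subsidence convention are assumptions.
import numpy as np
import rasterio
from rasterio.warp import reproject, Resampling

def read_to_grid(path, template):
    """Reproject a single-band raster onto the template 1 km grid."""
    with rasterio.open(path) as src, rasterio.open(template) as tpl:
        out = np.full((tpl.height, tpl.width), np.nan, dtype="float32")
        reproject(
            rasterio.band(src, 1), out,
            dst_transform=tpl.transform, dst_crs=tpl.crs,
            resampling=Resampling.bilinear,
        )
    return out

subsidence = read_to_grid("subsidence_mm_yr.tif", "grid_template_1km.tif")
sea_level  = read_to_grid("sea_level_rise_mm_yr.tif", "grid_template_1km.tif")

# Subsiding ground (positive values here) adds to the effective sea level
# rise experienced at the coast.
ground_motion = np.clip(subsidence, 0, None)
rslr = sea_level + ground_motion

# Fractional contribution of ground motion to RSLR per cell.
with np.errstate(divide="ignore", invalid="ignore"):
    contribution = np.where(rslr > 0, ground_motion / rslr, 0.0)
```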

    CID Survey Report Satellite Imagery and Associated Services used by the JRC. Current Status and Future Needs

    The Agriculture and Fisheries Unit (IPSC), together with the Informatics, Networks and Library Unit (ISD), performed this inventory, called the Community Image Data portal Survey (the CID Survey); 20 Actions from 4 different Institutes (ISD, IPSC, IES and IHCP) were interviewed. The objectives of the survey were to make an inventory of existing satellite data and future requirements; to obtain an overview of how data is acquired, used and stored; to quantify the human and financial resources engaged in this process; to quantify storage needs; and to query the staff involved in image acquisition and management on their needs and ideas for improvements, in view of defining a single JRC portal through which imaging requests could be addressed. Within the JRC there are (up to and including 2006) more than 700 000 low resolution (LR) and 50 000 medium resolution (MR) images, with time series as far back as 1981 for the LR data. There are more than 10 000 high resolution (HR) images and over 500 000 km² of very high resolution (VHR) images. For the LR and MR data, cyclic global or continental coverage dominates, while the majority of HR and VHR data is acquired over Europe. The expected data purchases for the coming years (2007, 2008) are known, which enables good planning. Most purchases of VHR and HR data are made using the established FCs with common licensing terms. Otherwise, multiple types of licensing govern data usage, which emphasizes the need for CID to establish adequate means of data access. The total amount of image data stored (2006 inclusive) is 55 TB, with an expected increase of 80% in 2 years. Most of the image data is stored on internal network storage inside the corporate network, which means the data is accessible from within the JRC, but difficulties arise when external users need access via the Internet. In principle, the current storage capacity in the JRC could be sufficient, but the available space is fragmented between Actions, which implies that a storage deficit could arise. One solution to this issue is the sharing of a central storage service. Data reception is dominated by FTP transfer, which requires reliable and fast Internet bandwidth. The high total volume to be backed up requires a thorough definition of the backup strategy. The user groups at JRC are heterogeneous, which requires CID to provide flexible authentication mechanisms. There is a requirement for a detailed analysis of all metadata standards needed for reference in a catalogue. There is a priority interest in such a Catalogue Service and also in centralized storage. The services to implement for data hosted on central storage should be WCS, WMS, and file system access. During the analysis of the results mentioned above, some major areas could be identified as a basis for common services to be provided to interested Actions, such as: provision of a centralized data storage facility with file serving functionality including an authentication service, image catalogue services, and data visualization and dissemination services. Specialized data services that require highly customized functionality with respect to certain properties of the different image types will usually remain the sole responsibility of the individual Actions. An orthorectification service for semi-automated orthorectification of HR and VHR data will be provided to certain Actions. At the end of the report, some priorities and an implementation schedule for the Community Image Data portal (CID) are given.
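    To make the recommended service interfaces concrete, the sketch below shows how imagery hosted on central storage could be retrieved through a WMS endpoint using OWSLib. The endpoint URL and layer name are placeholders, not real CID services.

```python
# Minimal sketch of accessing imagery through WMS, one of the services the
# survey recommends for centrally stored data. The URL and layer name are
# placeholders; no actual CID endpoint is implied.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/cid/wms", version="1.3.0")
print(list(wms.contents))  # layers advertised by the service

img = wms.getmap(
    layers=["vhr_mosaic_europe"],       # hypothetical layer name
    srs="EPSG:4326",
    bbox=(-10.0, 35.0, 30.0, 60.0),
    size=(1024, 768),
    format="image/png",
)
with open("preview.png", "wb") as f:
    f.write(img.read())
```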

    The Analysis of Open Source Software and Data for Establishment of GIS Services Throughout the Network in a Mapping Organization at National or International Level

    Federal agencies and their partners collect and manage large amounts of geospatial data, but these are often not easily found when needed, and sometimes data are collected or purchased multiple times. In short, the best government data is not always organized and managed efficiently enough to support decision making in a timely and cost-effective manner. National mapping agencies, and the various departments responsible for collecting different types of geospatial data, cannot continue for much longer to operate as they did a few years ago, as if living on an island. Leaders need to look at what is now possible that was not possible before, considering capabilities such as cloud computing, crowdsourced data collection, available open source remotely sensed data, and multi-source information vital to decision making, as well as new web-accessible services that are sometimes provided at no cost. Many of these services could previously be obtained only from local GIS experts. These authorities need to consider the available solutions, gather information about new capabilities, reconsider agency missions and goals, review and revise policies, make budget and human resource decisions, and evaluate new products, cloud services, and cloud service providers. To do so, we need to choose the right tools to reach the above-mentioned goals. As we know, data collection is the most costly part of mapping and of establishing a geographic information system. This is not only because of the cost of the data collection task itself, but also because of the damage caused by delays and by the time it takes to deliver to the user, from the field, the information needed for decision making. In fact, the time a project consumes for the collection, processing, and presentation of geospatial information has an even greater effect on the cost of larger projects such as disaster management, construction, city planning, environmental management, etc., assuming that all the necessary information from existing sources is delivered directly to the user's computer. The best description of a good GIS project optimization or improvement is a methodology that reduces time and cost while increasing data and service quality (meaning accuracy, currency, completeness, consistency, suitability, information content, integrity, integration capability, and fitness for use, as well as the user's specific needs and conditions, which must be addressed with special attention). Each of these issues must be addressed individually, and at the same time the whole solution must be provided in a global manner considering all the criteria. In this thesis we first discuss the problem we are facing and what needs to be done to establish a National Spatial Data Infrastructure (NSDI), including its definition and related components. We then look for available open source software solutions covering the whole process: data collection, database management, data processing, and finally data services and presentation. The first distinction among software is whether it is open source and free, or commercial and proprietary. It is important to note that, in order to make this distinction, a clear specification for the categorization is necessary.
    It is often difficult to determine from a legal point of view which class a piece of software belongs to, which makes it necessary to clarify what is meant by the various terms. With reference to this concept there are two global distinctions; then, inside each group, we distinguish a further classification according to the functionalities and applications they are built for in GIScience. Based on the outcome of the second chapter, which describes the technical process for selecting suitable and reliable software according to the characteristics of the users' needs and the required components, we move to the next chapter. In Chapter 3, we elaborate on the details of the GeoNode software as our best candidate tool to take on the responsibilities stated before. In Chapter 4, we discuss the globally available open source data against predefined data quality criteria (such as theme, data content, scale, licensing, and coverage) according to the metadata statements inside the datasets, by means of bibliographic review, technical documentation, and web search engines. In Chapter 5, we discuss further data quality concepts and consequently define a set of protocols for evaluating all datasets according to the tasks for which a mapping organization is, in general, responsible towards its probable users in different disciplines, such as reconnaissance, city planning, topographic mapping, transportation, environmental control, disaster management, etc. In Chapter 6, all the data quality assessments and protocols are applied to the pre-filtered, proposed datasets. In the final scores and ranking, each dataset has a value corresponding to its quality according to the set of rules defined in the previous chapter. In the last step, a weight vector is derived from questions answered by the user with reference to the project at hand, in order to finalize the most appropriate selection of free and open source data; this data quality preference is defined by identifying a set of weight vectors, which are then applied to the quality matrix to obtain the final quality scores and ranking, as sketched below. At the end of this chapter there is a section presenting the use of these datasets in various projects, such as “Early Impact Analysis” and “Extreme Rainfall Detection System (ERDS) - version 2”, performed by ITHACA. Finally, in the conclusion, the important criteria as well as future trends in GIS software are discussed, and recommendations are presented.
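    A minimal sketch of that scoring and ranking step, assuming a normalised quality matrix (datasets × criteria) and a user-defined weight vector, is shown below. The criteria, scores, and weights are made-up examples, not values from the thesis.

```python
# Illustrative sketch of weighted quality scoring: a quality matrix
# (datasets x criteria) combined with a user-defined weight vector to rank
# candidate open datasets. All numbers are invented for illustration.
import numpy as np

criteria = ["accuracy", "currency", "completeness", "licensing", "coverage"]
datasets = ["dataset_a", "dataset_b", "dataset_c"]

# Normalised quality scores in [0, 1], one row per dataset.
quality = np.array([
    [0.8, 0.6, 0.9, 1.0, 0.7],
    [0.9, 0.9, 0.6, 0.5, 0.8],
    [0.6, 0.7, 0.8, 1.0, 0.9],
])

# Weight vector expressing the user's priorities for the project at hand.
weights = np.array([0.3, 0.2, 0.2, 0.1, 0.2])

# Final score per dataset and the resulting ranking.
scores = quality @ weights
ranking = sorted(zip(datasets, scores), key=lambda p: p[1], reverse=True)
for name, score in ranking:
    print(f"{name}: {score:.2f}")
```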

    A reference data access service in support of emergency management

    In the field of natural disaster recovery and reduction, and of emergency management, georeferenced information is strongly needed. In my personal experience, gained during the three-year period spent at ITHACA, during the shorter period at GFDRR Labs, and through the work done indirectly with UN-WFP, after a natural disaster occurs the experts in geomatics are often asked to provide answers to questions such as: where did it occur? How many people have been affected? How many infrastructures have been damaged, and to what extent? How large is the economic loss? Geomatics can answer all these questions, or give significant help in directing operations so as to obtain the answers. The goal can be reached both through the use of base reference data, of the kind usually contained in classic cartography, and by exploiting value-added information coming from satellite and aerial data processing, classic surveys, and GPS acquisition in the field.

    Challenges in Arctic Navigation and Geospatial Data : User Perspective and Solutions Roadmap

    Navigation and location-based applications, including businesses such as transport, tourism, and mining, face a variety of specific challenges in Arctic areas. In fact, these challenges concern not only the area above the Arctic Circle but certain other areas as well, such as the Gulf of Bothnia. This report provides a review of these challenges, which concern a variety of technologies ranging from satellite navigation to telecommunications and mapping. In order to find out end-users' views on the significance of the Arctic challenges, an online survey was conducted. The 77 respondents, representing all Arctic countries, with the majority from Finland, highlighted the challenges in telecommunications as well as accuracy concerns for emerging applications dealing with precise navigation. The report also reviews possible technologies for addressing the Arctic challenges, based on which a roadmap for solving them is developed. The roadmap also uses the results of the expert working groups from the Challenges in Arctic Navigation workshop arranged in April 2018 in Olos, Muonio, Finland. This report was produced within the ARKKI project, which was funded by the Finnish Ministry of Foreign Affairs under the Baltic Sea, Barents and Arctic cooperation programme and implemented by the Finnish Geospatial Research Institute in collaboration with the Finnish Ministry of Transport and Communications.

    Applications of Satellite Earth Observations section - NEODAAS: Providing satellite data for efficient research

    The NERC Earth Observation Data Acquisition and Analysis Service (NEODAAS) provides a central point of Earth Observation (EO) satellite data access and expertise for UK researchers. The service is tailored to individual users’ requirements to ensure that researchers can focus effort on their science, rather than struggling with correct use of unfamiliar satellite data

    Satellite monitoring of harmful algal blooms (HABs) to protect the aquaculture industry

    Harmful algal blooms (HABs) can cause sudden and considerable losses to fish farms, for example the loss of 500,000 salmon during one bloom in Shetland, and also present a threat to human health. Early warning allows the industry to take protective measures. PML's satellite monitoring of HABs is now funded by the Scottish aquaculture industry. The service involves processing EO ocean colour data from NASA and ESA in near-real time, and applying novel techniques for discriminating certain harmful blooms from harmless algae. Within the AQUA-USERS project we are extending this capability to further HAB species within several European countries.
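    As a rough illustration of how ocean colour data can be screened for potential blooms, the sketch below combines a chlorophyll-a threshold with a simple band-ratio test. The variable names, thresholds, and input file are assumptions for illustration, not the discrimination technique actually used in the PML service.

```python
# Sketch: flag potential high-biomass bloom pixels in an ocean colour scene
# using a chlorophyll-a threshold plus a band-ratio test. Variable names,
# thresholds, and the input file are illustrative assumptions only.
import xarray as xr

ds = xr.open_dataset("ocean_colour_scene.nc")   # hypothetical Level-2 product
chl = ds["chlor_a"]                             # chlorophyll-a, mg m-3
rrs_488 = ds["Rrs_488"]                         # remote-sensing reflectance (blue)
rrs_555 = ds["Rrs_555"]                         # remote-sensing reflectance (green)

high_biomass = chl > 10.0                 # elevated biomass threshold
ratio_flag = (rrs_488 / rrs_555) < 0.7    # spectral shape typical of dense blooms

potential_bloom = high_biomass & ratio_flag
print(f"Flagged pixels: {int(potential_bloom.sum())}")
```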