
    Merging Special Collections with GIS Technology to Enhance the User Experience

    This analysis evaluates how PhillyHistory.org merged its unique special collection materials with progressive, geospatial-based technology to challenge and educate the global community. A new generation of technologically savvy researchers has emerged that expects a richer user experience than earlier generations did. To meet these needs, collection managers are collaborating with community and local institutions to increase online access to materials; mixing best metadata practices with custom elements to create map mashups; and merging progressive GIS technology and geospatial-based applications with their collections to enhance the user experience. The PhillyHistory.org website was analyzed to explore how it used various geospatial technologies to create a new type of digital content management system based on geographic information and to make its collections accessible via online software and mobile applications.
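    As a rough illustration of the map-mashup approach described above, the following Python sketch converts a geocoded collection record into a GeoJSON feature that a web map or mobile client could consume; the record fields are assumptions for illustration, not PhillyHistory.org's actual metadata schema.

```python
import json

def record_to_feature(record):
    """Turn a geocoded collection record (hypothetical fields) into a GeoJSON feature."""
    return {
        "type": "Feature",
        "geometry": {
            "type": "Point",
            # GeoJSON uses [longitude, latitude] ordering
            "coordinates": [record["longitude"], record["latitude"]],
        },
        "properties": {
            "title": record["title"],
            "date": record["date"],
            "image_url": record["image_url"],
        },
    }

# Example record; the field names are assumptions, not an actual schema
record = {
    "title": "Market Street, looking east",
    "date": "1913",
    "image_url": "https://example.org/items/1234.jpg",
    "longitude": -75.1652,
    "latitude": 39.9526,
}
print(json.dumps(record_to_feature(record), indent=2))
```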

    A Survey of Volunteered Open Geo-Knowledge Bases in the Semantic Web

    Over the past decade, rapid advances in web technologies, coupled with innovative models of spatial data collection and consumption, have generated a robust growth in geo-referenced information, resulting in spatial information overload. Increasing 'geographic intelligence' in traditional text-based information retrieval has become a prominent approach to respond to this issue and to fulfill users' spatial information needs. Numerous efforts in the Semantic Geospatial Web, Volunteered Geographic Information (VGI), and the Linking Open Data initiative have converged in a constellation of open knowledge bases, freely available online. In this article, we survey these open knowledge bases, focusing on their geospatial dimension. Particular attention is devoted to the crucial issue of the quality of geo-knowledge bases, as well as of crowdsourced data. A new knowledge base, the OpenStreetMap Semantic Network, is outlined as our contribution to this area. Research directions in information integration and Geographic Information Retrieval (GIR) are then reviewed, with a critical discussion of their current limitations and future prospects.
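    As a generic illustration of retrieving geo-referenced facts from an open knowledge base in the Linked Open Data cloud, the following Python sketch runs a SPARQL query against Wikidata's public endpoint; the choice of endpoint, entity and property is an illustrative assumption, not specific to the knowledge bases surveyed in the article.

```python
import requests

# Generic example of pulling geo-referenced facts out of an open knowledge
# base via SPARQL; Wikidata's public endpoint is used only for convenience.
ENDPOINT = "https://query.wikidata.org/sparql"
QUERY = """
SELECT ?item ?itemLabel ?coord WHERE {
  ?item rdfs:label "Philadelphia"@en ;
        wdt:P625 ?coord .                      # P625 = coordinate location
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
"""

resp = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"},
                    headers={"User-Agent": "geo-kb-demo/0.1"}, timeout=30)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["itemLabel"]["value"], row["coord"]["value"])
```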

    Global-Scale Resource Survey and Performance Monitoring of Public OGC Web Map Services

    One of the most widely-implemented service standards provided by the Open Geospatial Consortium (OGC) to the user community is the Web Map Service (WMS). WMS is widely employed globally, but there is limited knowledge of the global distribution, adoption status or service quality of these online WMS resources. To fill this void, we investigated global WMS resources and performed distributed performance monitoring of these services. This paper explicates the crawling method used to discover these WMSs and a distributed monitoring framework that monitored 46,296 WMSs continuously for over one year. We analyzed server locations, provider types, themes, the spatiotemporal coverage of map layers and the service versions for 41,703 valid WMSs. Furthermore, we appraised the stability and performance of the basic operations (i.e., GetCapabilities and GetMap) for 1,210 selected WMSs. We discuss the major reasons for request errors and performance issues, as well as the relationship between service response times and the spatiotemporal distribution of client monitoring sites. This paper will help service providers, end users and developers of standards to grasp the status of global WMS resources, as well as to understand the adoption status of OGC standards. The conclusions drawn in this paper can benefit geospatial resource discovery and service performance evaluation, and can guide service performance improvements.
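    A minimal Python sketch of the two basic operations monitored in the study, GetCapabilities and GetMap, timed against a placeholder WMS endpoint; the base URL, layer name and bounding box are assumptions, not services from the survey.

```python
import time
import requests

# Placeholder endpoint; substitute any public WMS base URL.
WMS_URL = "https://example.org/geoserver/wms"

def get_capabilities(url):
    """Fetch the service metadata document (layers, versions, formats) and time it."""
    params = {"SERVICE": "WMS", "REQUEST": "GetCapabilities", "VERSION": "1.3.0"}
    t0 = time.perf_counter()
    resp = requests.get(url, params=params, timeout=30)
    return resp.status_code, time.perf_counter() - t0

def get_map(url, layer):
    """Request a small map image and time the response, as a basic health probe."""
    params = {
        "SERVICE": "WMS", "REQUEST": "GetMap", "VERSION": "1.3.0",
        "LAYERS": layer, "STYLES": "",
        # WMS 1.3.0 with EPSG:4326 uses lat/lon axis order in BBOX
        "CRS": "EPSG:4326", "BBOX": "-90,-180,90,180",
        "WIDTH": "256", "HEIGHT": "256", "FORMAT": "image/png",
    }
    t0 = time.perf_counter()
    resp = requests.get(url, params=params, timeout=30)
    return resp.status_code, time.perf_counter() - t0

status, seconds = get_capabilities(WMS_URL)
print(f"GetCapabilities: HTTP {status} in {seconds:.2f}s")
```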

    Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors

    Remote-sensing technology and data-storage capabilities have advanced over the last decade to the point of commercial multi-sensor data collection. There is a constant need to characterize, quantify and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization that uses hyperspectral imagery (HSI). The HSI data allows the operator to extend seafloor characterization from multibeam backscatter towards land and thus creates a seamless ocean-to-land characterization of the littoral zone.

    Internet of things

    Manual of Digital Earth / Editors: Huadong Guo, Michael F. Goodchild, Alessandro Annoni. Springer, 2020. ISBN: 978-981-32-9915-3. Digital Earth was born with the aim of replicating the real world within the digital world. Many efforts have been made to observe and sense the Earth, both from space (remote sensing) and by using in situ sensors. Focusing on the latter, advances in Digital Earth have established vital bridges to exploit these sensors and their networks by taking location as a key element. The current era of connectivity envisions that everything is connected to everything. The concept of the Internet of Things (IoT) emerged as a holistic proposal to enable an ecosystem of varied, heterogeneous networked objects and devices to speak to and interact with each other. To make the IoT ecosystem a reality, it is necessary to understand the electronic components, communication protocols, real-time analysis techniques, and the location of the objects and devices. The IoT ecosystem and the Digital Earth (DE) jointly form interrelated infrastructures for addressing today’s pressing issues and complex challenges. In this chapter, we explore the synergies and frictions in establishing an efficient and permanent collaboration between the two infrastructures, in order to adequately address multidisciplinary and increasingly complex real-world problems. Although there are still some pending issues, the identified synergies generate optimism for a true collaboration between the Internet of Things and the Digital Earth.
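    A minimal sketch, in Python, of the kind of geo-referenced observation an in situ IoT device might push to a Digital Earth service; the ingestion endpoint, device identifier and payload schema are illustrative assumptions rather than any particular IoT standard.

```python
import json
import time
import requests

# Hypothetical ingestion endpoint; the point is that every observation
# carries its own location so Digital Earth services can index it spatially.
INGEST_URL = "https://iot.example.org/observations"

observation = {
    "sensor_id": "station-042",                 # hypothetical device id
    "observed_property": "air_temperature",
    "value": 18.7,
    "unit": "degC",
    "time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    "location": {"type": "Point", "coordinates": [7.4474, 46.9480]},  # lon, lat
}

resp = requests.post(INGEST_URL, data=json.dumps(observation),
                     headers={"Content-Type": "application/json"}, timeout=10)
print(resp.status_code)
```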

    Technology Integration around the Geographic Information: A State of the Art

    One of the elements that has popularized and facilitated the use of geographic information in a variety of computational applications has been the use of Web maps; this has opened new research challenges on different subjects, from locating places and people, to the study of social behavior, to analyzing the hidden structure of the terms used in a natural language query for locating a place. However, the technological use of geographic information is not new; rather, it has been part of an ongoing process of development and technological integration. This paper presents a state-of-the-art review of the application of geographic information under different approaches: its use in location-based services, collaborative user participation around it, its context awareness, its use in the Semantic Web, and the challenges of its use in natural language queries. Finally, a prototype that integrates most of these areas is presented.
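    As one concrete illustration of resolving a natural-language place query to coordinates, the following Python sketch calls OpenStreetMap's Nominatim geocoder; the query string is an arbitrary example and the User-Agent value is a placeholder.

```python
import requests

# Free-text place lookup against OpenStreetMap's Nominatim geocoder,
# one concrete way to resolve a natural-language query to coordinates.
query = "cafes near the old town hall, Toledo"
resp = requests.get(
    "https://nominatim.openstreetmap.org/search",
    params={"q": query, "format": "json", "limit": 3},
    headers={"User-Agent": "geo-ir-demo/0.1"},  # Nominatim requires an identifying UA
    timeout=30,
)
resp.raise_for_status()
for place in resp.json():
    print(place["display_name"], place["lat"], place["lon"])
```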

    Historical collaborative geocoding

    The latest digital developments have provided large data sets that can increasingly easily be accessed and used. These data sets often contain indirect localisation information, such as historical addresses. Historical geocoding is the process of transforming indirect localisation information into direct localisation that can be placed on a map, which enables spatial analysis and cross-referencing. Many efficient geocoders exist for current addresses, but they do not deal with the temporal aspect and are based on a strict hierarchy (..., city, street, house number) that is hard or impossible to use with historical data. Indeed, historical data are full of uncertainties (temporal aspect, semantic aspect, spatial precision, confidence in the historical source, ...) that cannot be resolved, as there is no way to go back in time to check. We propose an open source, open data, extensible solution for geocoding that is based on building gazetteers composed of geohistorical objects extracted from historical topographical maps. Once the gazetteers are available, geocoding a historical address is a matter of finding the geohistorical object in the gazetteers that best matches the historical address. The matching criteria are customisable and include several dimensions (fuzzy semantic, fuzzy temporal, scale, spatial precision, ...). As the goal is to facilitate historical work, we also propose web-based user interfaces that help geocode (a single address or in batch mode) and display the results over current or historical topographical maps, so that they can be checked and collaboratively edited. The system is tested on the city of Paris for the 19th-20th centuries, shows a high return rate, and is fast enough to be used interactively.
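    The matching criteria in the paper are customisable and multidimensional; the following Python sketch only illustrates the general idea with a toy score combining fuzzy name similarity, temporal overlap with the source map's validity interval, and spatial precision. The fields, weights and decay constants are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from difflib import SequenceMatcher

@dataclass
class GeohistoricalObject:
    name: str           # label extracted from a historical map
    start_year: int     # validity interval of the source map
    end_year: int
    precision_m: float  # spatial precision of the digitised geometry

def match_score(address: str, year: int, obj: GeohistoricalObject) -> float:
    """Toy score in [0, 1]: fuzzy name similarity, damped by temporal
    distance from the object's validity interval and by coarse precision."""
    semantic = SequenceMatcher(None, address.lower(), obj.name.lower()).ratio()
    if obj.start_year <= year <= obj.end_year:
        temporal = 1.0
    else:
        gap = min(abs(year - obj.start_year), abs(year - obj.end_year))
        temporal = max(0.0, 1.0 - gap / 50.0)   # fades out over ~50 years
    spatial = 1.0 / (1.0 + obj.precision_m / 100.0)
    return 0.5 * semantic + 0.3 * temporal + 0.2 * spatial

candidates = [
    GeohistoricalObject("rue de Rivoli", 1849, 1860, 5.0),
    GeohistoricalObject("rue Rivoly", 1790, 1810, 30.0),
]
query_address, query_year = "rue de Rivoli", 1855
best = max(candidates, key=lambda o: match_score(query_address, query_year, o))
print(best.name)
```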

    RAPID WEBGIS DEVELOPMENT FOR EMERGENCY MANAGEMENT

    The use of spatial data during emergency response and management helps to make faster and better decisions. Moreover, spatial data should be as up to date as possible and easy to access. To face the challenge of rapid and updated data sharing, the use of the internet, where the field of web mapping is constantly evolving, is widely considered the most efficient solution. ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) is a non-profit association founded by Politecnico di Torino and SITI (Higher Institute for the Environmental Systems) as a joint project with the WFP (World Food Programme). The collaboration with the WFP drives projects related to Early Warning Systems (i.e., flood and drought monitoring) and Early Impact Systems (e.g., rapid mapping and assessment through remote sensing systems). The Web GIS team has built, and is continuously improving, a complex architecture based entirely on Open Source tools. This architecture is composed of three main areas: the database environment, the server-side logic and the client-side logic. Each of them is implemented following the MVC (Model-View-Controller) pattern, which means separating the different logic layers (database interaction, business logic and presentation). The MVC architecture allows a Web GIS application for data viewing and exploration to be built easily and quickly. In case of emergency, data publication can be performed almost immediately, as soon as data production is completed. The server-side system is based on the Python language and the Django web development framework, while the client side is based on OpenLayers, GeoExt and Ext.js, which manage data retrieval and the user interface. The MVC pattern applied to JavaScript keeps the interface generation and data retrieval logic separated from the general application configuration, so the server-side environment can take care of generating the configuration file. The web application building process is data driven and can be considered a view of the current architecture, composed of data and data interaction tools. Once completely automated, the Web GIS application building process can be performed directly by the final user, who can customize data layers and controls to interact with the data.
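    A minimal sketch of the data-driven, server-generated configuration idea described above: a Django view returns the layer configuration as JSON, from which an OpenLayers/GeoExt client could build its layer tree and controls. The view, URL and layer entries are illustrative assumptions, not ITHACA's actual code.

```python
# views/urls excerpt -- a minimal sketch of a server-generated client
# configuration; the layer list and URL patterns are assumptions.
from django.http import JsonResponse
from django.urls import path

def layer_config(request):
    """Return the list of published layers; an OpenLayers/GeoExt client
    can build its layer tree and controls from this response."""
    layers = [
        {"name": "flood_extent", "type": "WMS",
         "url": "/geoserver/wms", "visible": True},
        {"name": "affected_population", "type": "WMS",
         "url": "/geoserver/wms", "visible": False},
    ]
    return JsonResponse({"layers": layers})

urlpatterns = [
    path("config/layers/", layer_config),
]
```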

    Development of Distributed Research Center for analysis of regional climatic and environmental changes

    We present an approach and the first results of a collaborative project being carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, UNH, USA. Its main objective is the development of a hardware and software platform prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic and environmental changes in Northern extratropical areas. The DRC should provide specialists working in climate-related sciences, as well as decision-makers, with accurate and detailed climatic characteristics for a selected area, together with reliable and affordable tools for in-depth statistical analysis and studies of the effects of climate change. Within the framework of the project, new approaches to cloud processing and analysis of large geospatial datasets (big geospatial data) inherent to climate change studies are being developed and deployed on the technical platforms of both institutions. We discuss here the state of the art in this domain, describe the web-based information-computational systems developed by the partners, justify the methods chosen to reach the project goal, and briefly list the results obtained so far.
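    A minimal sketch of the kind of regional climatic characteristic such a platform serves, computed with xarray over a gridded dataset; the file name, variable name and regional window are assumptions for illustration.

```python
import xarray as xr

# Hypothetical file and variable names; any CF-compliant NetCDF file with
# (time, lat, lon) dimensions would work the same way.
ds = xr.open_dataset("air_temperature_daily.nc")

# Select a Northern extratropical window (assumes ascending latitude values)
region = ds["t2m"].sel(lat=slice(50, 70), lon=slice(60, 90))

regional_mean = region.mean(dim=["lat", "lon"])           # spatial average
annual_mean = regional_mean.resample(time="1YS").mean()   # annual aggregation
print(annual_mean.to_series())
```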