
    Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors

    The advance of remote-sensing technology and data-storage capabilities over the last decade has made commercial multi-sensor data collection practical. There is a constant need to characterize, quantify and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization that uses hyperspectral imagery (HSI). The HSI data allow the operator to extend seafloor characterization from multibeam backscatter towards land, creating a seamless ocean-to-land characterization of the littoral zone.

    Geospatial information infrastructures

    Manual of Digital Earth / Editors: Huadong Guo, Michael F. Goodchild, Alessandro Annoni. Springer, 2020. ISBN: 978-981-32-9915-3.
    Geospatial information infrastructures (GIIs) provide the technological, semantic, organizational and legal structure that allows for the discovery, sharing, and use of geospatial information (GI). In this chapter, we introduce the overall concept and surrounding notions such as geographic information systems (GIS) and spatial data infrastructures (SDI). We outline the history of GIIs in terms of the organizational and technological developments as well as the current state of the art, and reflect on some of the central challenges and possible future trajectories. We focus on the tension between increased needs for standardization and the ever-accelerating technological changes. We conclude that GIIs evolved as a strong underpinning contribution to implementation of the Digital Earth vision. In the future, these infrastructures are challenged to become flexible and robust enough to absorb and embrace technological transformations and the accompanying societal and organizational implications. With this contribution, we present the reader a comprehensive overview of the field and a solid basis for reflections about future developments.

    EcoGIS – GIS tools for ecosystem approaches to fisheries management

    Executive Summary: The EcoGIS project was launched in September 2004 to investigate how Geographic Information Systems (GIS), marine data, and custom analysis tools can better enable fisheries scientists and managers to adopt Ecosystem Approaches to Fisheries Management (EAFM). EcoGIS is a collaborative effort between NOAA's National Ocean Service (NOS) and National Marine Fisheries Service (NMFS), and four regional Fishery Management Councils. The project has focused on four priority areas: Fishing Catch and Effort Analysis, Area Characterization, Bycatch Analysis, and Habitat Interactions. Of these four functional areas, the project team first focused on developing a working prototype for catch and effort analysis: the Fishery Mapper Tool. This ArcGIS extension creates time-and-area summarized maps of fishing catch and effort from logbook, observer, or fishery-independent survey data sets. Source data may come from Oracle, Microsoft Access, or other file formats. Feedback from beta-testers of the Fishery Mapper was used to debug the prototype, enhance performance, and add features. This report describes the four priority functional areas, the development of the Fishery Mapper tool, and several themes that emerged through the parallel evolution of the EcoGIS project, the concept and implementation of the broader field of Ecosystem Approaches to Management (EAM), data management practices, and other EAM toolsets. In addition, a set of six succinct recommendations is proposed on page 29. One major conclusion from this work is that there is no single "super-tool" to enable Ecosystem Approaches to Management; as such, tools should be developed for specific purposes with attention given to interoperability and automation. Future work should be coordinated with other GIS development projects in order to provide "value added" and minimize duplication of efforts.
In addition to custom tools, the development of cross-cutting Regional Ecosystem Spatial Databases will enable access to quality data to support the analyses required by EAM. GIS tools will be useful in developing Integrated Ecosystem Assessments (IEAs) and providing pre- and post-processing capabilities for spatially-explicit ecosystem models. Continued funding will enable the EcoGIS project to develop GIS tools that are immediately applicable to today's needs. These tools will enable simplified and efficient data query, the ability to visualize data over time, and ways to synthesize multidimensional data from diverse sources. These capabilities will provide new information for analyzing issues from an ecosystem perspective, which will ultimately result in better understanding of fisheries and better support for decision-making. (PDF file contains 45 pages.)
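The core operation the Fishery Mapper performs, time-and-area summarization of catch records, can be sketched in a few lines. This is an illustrative stand-in, not the tool's actual ArcGIS code; the record fields and the 1-degree grid size are assumptions.

```python
from collections import defaultdict

def summarize_catch(records, cell_deg=1.0):
    """Sum catch by (year, grid cell); each record is (year, lat, lon, catch_kg)."""
    totals = defaultdict(float)
    for year, lat, lon, catch_kg in records:
        # Snap the position to a grid cell by flooring to the cell size.
        cell = (int(lat // cell_deg), int(lon // cell_deg))
        totals[(year, cell)] += catch_kg
    return dict(totals)

records = [
    (2004, 42.3, -70.8, 120.0),
    (2004, 42.7, -70.2, 80.0),   # falls in the same 1-degree cell as above
    (2005, 42.3, -70.8, 50.0),
]
summary = summarize_catch(records)  # keyed by (year, (lat_cell, lon_cell))
```

A real implementation would read from the Oracle or Access sources the report mentions and write the summaries out as map layers, but the grouping logic is the same.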

    Assessing the utility of geospatial technologies to investigate environmental change within lake systems

    Over 50% of the world's population live within 3 km of rivers and lakes, highlighting the ongoing importance of freshwater resources to human health and societal well-being. Whilst covering c. 3.5% of the Earth's non-glaciated land mass, trends in the environmental quality of the world's standing waters (natural lakes and reservoirs) are poorly understood, at least in comparison with rivers, and so evaluation of their current condition and sensitivity to change are global priorities. Here it is argued that a geospatial approach harnessing existing global datasets, along with new generation remote sensing products, offers the basis to characterise trajectories of change in lake properties, e.g. water quality, physical structure, hydrological regime and ecological behaviour. This approach furthermore provides the evidence base to understand the relative importance of climatic forcing and/or changing catchment processes, e.g. land cover and soil moisture data, which coupled with climate data provide the basis to model regional water balance and runoff estimates over time. Using examples derived primarily from the Danube Basin but also other parts of the world, we demonstrate the power of the approach and its utility to assess the sensitivity of lake systems to environmental change, and hence better manage these key resources in the future.
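A "trajectory of change" in a lake property is, at its simplest, a trend fitted to a time series of observations. The sketch below estimates such a trend as an ordinary least-squares slope; the chlorophyll-a series is invented for illustration, whereas the paper itself works from satellite-derived and global datasets.

```python
def trend_slope(years, values):
    """Ordinary least-squares slope of values regressed against years."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

years = [2000, 2005, 2010, 2015, 2020]
chl_a = [3.0, 3.4, 3.9, 4.1, 4.8]   # hypothetical annual chlorophyll-a, mg/m^3
slope = trend_slope(years, chl_a)    # a positive slope suggests declining quality
```

In practice the same slope would be computed per lake across a whole basin, turning a stack of remote-sensing products into a map of where conditions are improving or deteriorating.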

    Architecture of Environmental Risk Modelling: for a faster and more robust response to natural disasters

    Demands on the disaster response capacity of the European Union are likely to increase, as the impacts of disasters continue to grow both in size and frequency. This has resulted in intensive research on issues concerning spatially-explicit information and modelling and their multiple sources of uncertainty. Geospatial support is one of the forms of assistance frequently required by emergency response centres, along with hazard forecast and event management assessment. Robust modelling of natural hazards requires dynamic simulations under an array of multiple inputs from different sources. Uncertainty is associated with the meteorological forecast and with calibration of the model parameters. Software uncertainty also derives from the data transformation models (D-TM) needed for predicting hazard behaviour and its consequences. On the other hand, social contributions have recently been recognized as valuable in raw-data collection and mapping efforts traditionally dominated by professional organizations. Here an architecture overview is proposed for adaptive and robust modelling of natural hazards, following the Semantic Array Programming paradigm to also include the distributed array of social contributors, called Citizen Sensor, in a semantically-enhanced strategy for D-TM modelling. The modelling architecture proposes a multicriteria approach for assessing the array of potential impacts, with qualitative rapid assessment methods based on a Partial Open Loop Feedback Control (POLFC) schema complementing more traditional and accurate a-posteriori assessment. We discuss the computational aspect of environmental risk modelling using array-based parallel paradigms on High Performance Computing (HPC) platforms, in order for the implications of urgency to be introduced into the systems (Urgent-HPC).
    Comment: 12 pages, 1 figure, 1 text box; presented at the 3rd Conference of Computational Interdisciplinary Sciences (CCIS 2014), Asuncion, Paraguay

    Republishing OpenStreetMap’s roads as linked routable tiles

    Route planning providers manually integrate different geo-spatial datasets before offering a Web service to developers, thus creating a closed world view. In contrast, combining open datasets at runtime can provide more information for user-specific route planning needs. For example, an extra dataset of bike sharing availabilities may provide more relevant information to the occasional cyclist. A strategy for automating the adoption of open geo-spatial datasets is needed to allow an ecosystem of route planners able to answer more specific and complex queries. This raises new challenges, such as (i) how open geo-spatial datasets should be published on the Web to improve interoperability, and (ii) how route planners can discover and integrate relevant data for a certain query on the fly. We republished OpenStreetMap's road network as "Routable Tiles" to facilitate its integration into open route planners. To achieve this, we use a Linked Data strategy and follow an approach similar to vector tiles. In a demo, we show how client-side code can automatically discover tiles and perform a shortest path algorithm. We provide four contributions: (i) we launched an open geo-spatial dataset that is available for everyone to reuse at no cost, (ii) we published a Linked Data version of the OpenStreetMap ontology, (iii) we introduced a hypermedia specification for vector tiles that extends the Hydra ontology, and (iv) we released the mapping scripts, demo and routing scripts as open source software.
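The client-side idea described above, discovering road-network tiles and running a shortest-path search over the merged graph, can be sketched as follows. This is a hedged illustration, not the demo's actual code: the tile identifiers, edge format, and `fetch_tile` stub are assumptions standing in for HTTP dereferencing of tile URLs.

```python
import heapq

# Stand-in for tiles fetched over HTTP: tile id -> edges (from, to, length_m).
TILES = {
    "14/8411/5485": [("a", "b", 100.0), ("b", "c", 50.0)],
    "14/8412/5485": [("c", "d", 70.0), ("a", "d", 400.0)],
}

def fetch_tile(tile_id):
    return TILES[tile_id]  # a real client would dereference the tile's URL here

def shortest_path_length(tile_ids, source, target):
    """Merge the requested tiles into one graph, then run Dijkstra."""
    graph = {}
    for tid in tile_ids:
        for u, v, w in fetch_tile(tid):
            graph.setdefault(u, []).append((v, w))
            graph.setdefault(v, []).append((u, w))  # roads assumed bidirectional
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")
```

The point of the tile design is that the client decides which tiles to fetch as the search front expands, rather than a server pre-integrating all datasets.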

    Collection and integration of local knowledge and experience through a collective spatial analysis

    This article discusses the convenience of adopting a Collective Spatial Analysis approach in P/PGIS processes, with the aim of improving the collection and integration of local knowledge and expertise in decision-making, mainly in the fields of planning and territorial policy. Based on empirical evidence from a review of scientific articles in the Web of Science database, which shows how the knowledge and experience of people involved in P/PGIS-supported decision-making are collected and used, a prototype WEB-GSDSS application has been developed. This prototype allows a group of people to participate anonymously, in an asynchronous and distributed way, in a decision-making process to locate goods, services, or events through the convergence of their views. Via this application, two case studies for planning services in districts of Ecuador and Italy were carried out. Early results suggest that in P/PGIS, local and external actors contribute their knowledge and experience to generate information that is afterwards integrated and analysed in the decision-making process. In a Collective Spatial Analysis, by contrast, these actors analyse and generate information together with their knowledge and experience during the decision-making process. We conclude that, although the Collective Spatial Analysis approach presented is at a subjective and initial stage, it does drive improvements in the collection and integration of local knowledge and experience, foremost among them an interdisciplinary geo-consensus.

    Integrative geospatial modeling: combining local and indigenous knowledge with geospatial applications for adaptive governance of invasive species and ecosystem services

    Includes bibliographical references. 2015 Summer.
    With an unprecedented rate of global change, diverse anthropogenic disturbances present growing challenges for coupled social-ecological systems. Biological invasions are one such disturbance, known to cause negative impacts on biodiversity, ecosystem functioning and an array of other natural processes and human activities. Maps facilitated by advanced geospatial applications play a major role in resource management and conservation planning. However, local and indigenous knowledge are overwhelmingly left out of these conversations, despite the wealth of observational data held by resource-dependent communities and the potential negative impacts biological invasions have on local livelihoods. My integrative geospatial modeling research applied adaptive governance mechanisms of knowledge integration and co-production processes in concert with species distribution modeling tools to explore the potential threat of invasive plants to community-defined ecosystem services. Knowledge integration at the landscape scale in Alaska provided an important opportunity for re-framing risk assessment mapping to include Native Alaskan community concerns, and revealed the growing potential threat posed by invasive aquatic Elodea spp. to Chinook salmon (Oncorhynchus tshawytscha) and whitefish (Coregonus nelsonii) subsistence under current and future climate conditions. Knowledge integration and co-production at the local scale in northeastern Ethiopia facilitated shared learning between pastoral communities and researchers, leading to the discovery of invasive rubber vine (Cryptostegia grandiflora), previously unknown to my research team and to a number of government and aid organizations working in the region, thus providing a potentially robust early detection and monitoring approach for an invasive plant that has acute negative impacts on a number of endemic, ecosystem service-providing trees.
This work revealed knowledge integration and co-production processes and species distribution modeling tools to be complementary, with invasive species acting as a useful boundary-spanning issue for bringing together diverse knowledge sources. Moreover, bridging and boundary-spanning organizations and individuals enhanced this rapid appraisal process by providing access to local and indigenous communities and fostered a level of built-in trust and legitimacy with them. Challenges to this work still remain, including effectively working at broad spatial and governance scales, sustaining iterative processes that involve communities in validating and critiquing model outputs, and addressing underlying power disparities between stakeholder groups. Top-down, discipline-specific approaches fail to adequately address the complexity of ecosystems or the needs of resource-dependent communities. My work lends evidence to the power of integrative geospatial modeling as a flexible transdisciplinary methodology for addressing conservation efforts in rural regions with mounting anthropogenic pressures at different spatial and governance scales.
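Species distribution modeling, the tool family the dissertation pairs with knowledge co-production, can be illustrated in its simplest form as a BIOCLIM-style climate envelope: a site is considered potentially suitable if every climate variable falls within the range observed at known presence locations. This toy sketch is not the dissertation's actual model (which addresses far more variables and future climate scenarios); the occurrence values and variable names below are invented.

```python
def fit_envelope(occurrences):
    """occurrences: list of dicts of climate variables at known presence sites.
    Returns the per-variable (min, max) range observed across those sites."""
    keys = occurrences[0].keys()
    return {k: (min(o[k] for o in occurrences),
                max(o[k] for o in occurrences)) for k in keys}

def suitable(site, envelope):
    """A site is 'suitable' if every variable lies inside the fitted range."""
    return all(lo <= site[k] <= hi for k, (lo, hi) in envelope.items())

presences = [
    {"summer_temp_c": 14.0, "annual_precip_mm": 600},
    {"summer_temp_c": 18.0, "annual_precip_mm": 900},
]
env = fit_envelope(presences)
suitable({"summer_temp_c": 16.0, "annual_precip_mm": 700}, env)  # inside the envelope
```

Community observations slot naturally into this workflow as additional presence records, which is precisely the knowledge-integration point the abstract makes.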

    Software tools for conducting bibliometric analysis in science: An up-to-date review

    Bibliometrics has become an essential tool for assessing and analyzing the output of scientists, cooperation between universities, the effect of state-owned science funding on national research and development performance, and educational efficiency, among other applications. Therefore, professionals and scientists need a range of theoretical and practical tools to measure experimental data. This review aims to provide an up-to-date overview of the various tools available for conducting bibliometric and scientometric analyses, including the sources of data acquisition, performance analysis and visualization tools. The included tools were divided into three categories: general bibliometric and performance analysis, science mapping analysis, and libraries; a description of each is provided. A comparative analysis of database source support, pre-processing capabilities, and analysis and visualization options was also conducted in order to facilitate their understanding. Although there are numerous bibliometric databases from which to obtain data for bibliometric and scientometric analysis, each has been developed for a different purpose. The number of exportable records ranges between 500 and 50,000, and the coverage of the different science fields is unequal across databases. Concerning the analyzed tools, Bibliometrix contains the most extensive set of techniques and is suitable for practitioners through Biblioshiny. VOSviewer has excellent visualization capabilities and can load and export information from many sources. SciMAT is the tool with the most powerful pre-processing and export capabilities. In view of this variability of features, users need to decide on the desired analysis output and choose the option that best fits their aims.
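A concrete example of the performance indicators these tools compute is the h-index, derived from a list of per-paper citation counts. The sketch below is a minimal stand-alone illustration, not taken from Bibliometrix, VOSviewer or SciMAT; the citation counts are made up.

```python
def h_index(citations):
    """Largest h such that at least h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:   # the i-th most-cited paper still has >= i citations
            h = i
        else:
            break
    return h

h_index([10, 8, 5, 4, 3])   # -> 4: four papers have at least 4 citations each
```

The tools reviewed here automate exactly this kind of computation at scale, after first cleaning and de-duplicating records exported from databases such as Web of Science or Scopus.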