    Probabilistic latent semantic analysis as a potential method for integrating spatial data concepts

    In this paper we explore the use of Probabilistic Latent Semantic Analysis (PLSA) as a method for quantifying semantic differences between land cover classes. The results are promising, revealing ‘hidden’ or not easily discernible data concepts. PLSA provides a ‘bottom-up’ approach to interoperability problems for users, as an alternative to the ‘top-down’ solutions provided by formal ontologies. We note the potential for a meta-problem of how to interpret the discovered concepts, and the need for further research to reconcile the top-down and bottom-up approaches.
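
    The abstract does not give the model details, so the following is only a minimal sketch of the EM updates behind PLSA, assuming a document-word co-occurrence matrix (for land cover, e.g. map units by class labels); the variable names and topic count are illustrative, not taken from the paper.

        import numpy as np

        def plsa(counts, n_topics, n_iter=100, seed=0):
            """Fit PLSA by EM on a (D, W) co-occurrence matrix.

            Returns P(w|z) with shape (n_topics, W) and P(z|d) with shape (D, n_topics).
            """
            rng = np.random.default_rng(seed)
            D, W = counts.shape
            p_w_z = rng.random((n_topics, W))
            p_w_z /= p_w_z.sum(axis=1, keepdims=True)
            p_z_d = rng.random((D, n_topics))
            p_z_d /= p_z_d.sum(axis=1, keepdims=True)
            for _ in range(n_iter):
                # E-step: responsibilities P(z|d,w) proportional to P(z|d) * P(w|z)
                post = p_z_d[:, :, None] * p_w_z[None, :, :]      # (D, Z, W)
                post /= post.sum(axis=1, keepdims=True) + 1e-12
                # M-step: re-estimate both distributions from expected counts
                expected = counts[:, None, :] * post              # (D, Z, W)
                p_w_z = expected.sum(axis=0)
                p_w_z /= p_w_z.sum(axis=1, keepdims=True) + 1e-12
                p_z_d = expected.sum(axis=2)
                p_z_d /= p_z_d.sum(axis=1, keepdims=True) + 1e-12
            return p_w_z, p_z_d

    The learned P(w|z) rows could then be compared across classification schemes to quantify how strongly two land-cover classes share latent concepts.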

    Artificial neural networks in geospatial analysis

    Artificial neural networks are computational models widely used in geospatial analysis for data classification, change detection, clustering, function approximation, and forecasting or prediction. There are many types of neural networks, distinguished by learning paradigm and network architecture. Their use is expected to grow with the increasing availability of massive data from remote sensing and mobile platforms.
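
    As a minimal illustration of the classification use case mentioned above (the data, band count and class labels below are invented, and scikit-learn's MLPClassifier is just one convenient implementation):

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.random((500, 6))            # e.g. six spectral bands per pixel
        y = rng.integers(0, 4, size=500)    # four hypothetical land-cover classes
        clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
        clf.fit(X, y)
        print(clf.predict(X[:5]))           # predicted class labels for five pixels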

    Assessment of check dams’ role in flood hazard mapping in a semi-arid environment

    This study aimed to examine flood hazard zoning and assess the role of check dams as effective hydraulic structures in reducing flood hazards. To this end, factors associated with topographic, hydrologic and human characteristics were used to develop indices for flood mapping and assessment. These indices and their components were weighted for flood hazard zoning using two methods: (i) a multi-criterion decision-making model in fuzzy logic and (ii) entropy weighting. After preparing the flood hazard map using the above indices and methods, change-point characteristics were used to assess the role of the check dams in reducing flood risk. The method was applied in the Ilanlu catchment, located in the northwest of Hamadan province, Iran, which is prone to frequent flood events. The results showed that the areas of the ‘very low’, ‘low’ and ‘moderate’ flood hazard zones increased from about 2.2% to 7.3%, 8.6% to 19.6% and 22.7% to 31.2%, respectively, after the construction of check dams. Moreover, the areas of the ‘high’ and ‘very high’ flood hazard zones decreased from 39.8% to 29.6% and from 26.7% to 12.2%, respectively.
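
    The paper's exact index construction is not given in the abstract; the snippet below sketches only the standard entropy-weight computation it refers to, for an (alternatives x criteria) decision matrix with positive, benefit-type entries (the example values are invented):

        import numpy as np

        def entropy_weights(X):
            """Entropy-weight method for an (m alternatives x n criteria) matrix."""
            P = X / X.sum(axis=0)                      # column-wise proportions
            m = X.shape[0]
            with np.errstate(divide="ignore", invalid="ignore"):
                logs = np.where(P > 0, np.log(P), 0.0)
            e = -(P * logs).sum(axis=0) / np.log(m)    # entropy of each criterion
            d = 1.0 - e                                # degree of diversification
            return d / d.sum()                         # normalised weights

        X = np.array([[0.2, 30.0, 5.0],
                      [0.4, 25.0, 9.0],
                      [0.3, 40.0, 2.0]])
        print(entropy_weights(X))                      # one weight per criterion

    Criteria whose values vary more across alternatives get lower entropy and hence higher weight.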

    A Review on the Application of Natural Computing in Environmental Informatics

    Natural computing offers new opportunities to understand, model and analyze the complexity of the physical and human-created environment. This paper examines the application of natural computing in environmental informatics by investigating related work in this research field. Various nature-inspired techniques are presented, which have been employed to solve different relevant problems. Advantages and disadvantages of these techniques are discussed, together with an analysis of how natural computing is generally used in environmental research.
    Comment: Proc. of EnviroInfo 201
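
    As a concrete example of the kind of nature-inspired technique such a review covers (this particular algorithm and test function are our choice, not necessarily one from the paper), here is a minimal particle swarm optimisation sketch:

        import numpy as np

        def pso(f, dim, n_particles=30, n_iter=200, seed=0):
            """Minimise f over [-5, 5]^dim with a basic particle swarm."""
            rng = np.random.default_rng(seed)
            x = rng.uniform(-5, 5, (n_particles, dim))       # positions
            v = np.zeros_like(x)                             # velocities
            pbest = x.copy()
            pbest_val = np.apply_along_axis(f, 1, x)
            gbest = pbest[pbest_val.argmin()].copy()
            for _ in range(n_iter):
                r1, r2 = rng.random((2, n_particles, dim))
                v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
                x = x + v
                vals = np.apply_along_axis(f, 1, x)
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                gbest = pbest[pbest_val.argmin()].copy()
            return gbest

        print(pso(lambda p: np.sum(p ** 2), dim=3))          # converges near the origin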

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". The project aims to document environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, inscribed on the World Heritage List since 1997. The area was affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    Historical collaborative geocoding

    The latest developments in digital technology have provided large data sets that can increasingly easily be accessed and used. These data sets often contain indirect localisation information, such as historical addresses. Historical geocoding is the process of transforming indirect localisation information into direct localisation that can be placed on a map, which enables spatial analysis and cross-referencing. Many efficient geocoders exist for current addresses, but they do not deal with the temporal aspect and are based on a strict hierarchy (..., city, street, house number) that is hard or impossible to use with historical data. Indeed, historical data are full of uncertainties (temporal aspect, semantic aspect, spatial precision, confidence in the historical source, ...) that cannot be resolved, as there is no way to go back in time to check. We propose an open source, open data, extensible solution for geocoding that is based on building gazetteers composed of geohistorical objects extracted from historical topographical maps. Once the gazetteers are available, geocoding a historical address is a matter of finding the geohistorical object in the gazetteers that best matches the historical address. The matching criteria are customisable and include several dimensions (fuzzy semantic, fuzzy temporal, scale, spatial precision, ...). As the goal is to facilitate historical work, we also propose web-based user interfaces that help geocode (one address or in batch mode) and display results over current or historical topographical maps, so that they can be checked and collaboratively edited. The system was tested on the city of Paris for the 19th and 20th centuries; it shows a high return rate and is fast enough to be used interactively.
    Comment: WORKING PAPER
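
    The paper's actual matching engine is not reproduced here; the toy sketch below only illustrates the idea of scoring gazetteer entries along two of the dimensions mentioned (fuzzy semantic and fuzzy temporal). The record schema, scoring formula and example entries are all invented:

        import difflib

        def match_score(query_name, entry, query_year, tolerance=50):
            """Toy score: fuzzy string similarity damped by temporal distance."""
            name_sim = difflib.SequenceMatcher(
                None, query_name.lower(), entry["name"].lower()).ratio()
            start, end = entry["valid_from"], entry["valid_to"]
            if start <= query_year <= end:
                temporal = 1.0
            else:
                gap = min(abs(query_year - start), abs(query_year - end))
                temporal = max(0.0, 1.0 - gap / tolerance)
            return name_sim * temporal

        gazetteer = [
            {"name": "Rue de la Paix", "valid_from": 1814, "valid_to": 2000, "geom": (48.869, 2.331)},
            {"name": "Rue Napoleon", "valid_from": 1806, "valid_to": 1814, "geom": (48.869, 2.331)},
        ]
        best = max(gazetteer, key=lambda e: match_score("rue de la paix", e, 1890))
        print(best["name"], best["geom"])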

    An Introduction to Ontology

    Analytical philosophy of the last one hundred years has been heavily influenced by a doctrine to the effect that one can arrive at a correct ontology by paying attention to certain superficial (syntactic) features of first-order predicate logic as conceived by Frege and Russell. More specifically, it is a doctrine to the effect that the key to the ontological structure of reality is captured syntactically in the ‘Fa’ (or, in more sophisticated versions, in the ‘Rab’) of first-order logic, where ‘F’ stands for what is general in reality and ‘a’ for what is individual. Hence “f(a)ntology”. Because predicate logic has exactly two syntactically different kinds of referring expressions—‘F’, ‘G’, ‘R’, etc., and ‘a’, ‘b’, ‘c’, etc.—reality must consist of exactly two correspondingly different kinds of entity: the general (properties, concepts) and the particular (things, objects), the relation between these two kinds of entity being revealed in the predicate-argument structure of atomic formulas in first-order logic.

    A Cognitive Model of an Epistemic Community: Mapping the Dynamics of Shallow Lake Ecosystems

    We used fuzzy cognitive mapping (FCM) to develop a generic shallow lake ecosystem model by augmenting the individual cognitive maps drawn by 8 scientists working in the area of shallow lake ecology. We calculated graph theoretical indices of the individual cognitive maps and of the collective cognitive map produced by augmentation. The graph theoretical indices revealed internal cycles showing non-linear dynamics in the shallow lake ecosystem. The ecological processes were organized democratically, without a top-down hierarchical structure. The steady state condition of the generic model was a characteristic turbid shallow lake ecosystem, since there were no dynamic environmental changes that could cause shifts between a turbid and a clearwater state, and the generic model indicated that only a dynamic disturbance regime could maintain the clearwater state. The model developed herein captured the empirical behavior of shallow lakes and contained the basic model of the Alternative Stable States Theory, which it extended by quantifying the relative effects of connections. In our expanded model we ran 4 simulations: harvesting submerged plants, nutrient reduction, fish removal without nutrient reduction, and biomanipulation. Only biomanipulation, which included fish removal and nutrient reduction, had the potential to shift the turbid state into a clearwater state. The structure and relationships in the generic model, as well as the outcomes of the management simulations, were supported by actual field studies in shallow lake ecosystems. Thus, fuzzy cognitive mapping methodology enabled us to understand the complex structure of shallow lake ecosystems as a whole and to obtain a valid generic model based on the tacit knowledge of experts in the field.
    Comment: 24 pages, 5 Figures
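
    As a rough sketch of how an FCM of this kind is iterated to a steady state (the update rule is a common variant that retains each concept's previous activation; the concepts and weights below are invented, not the augmented map from the paper):

        import numpy as np

        def fcm_run(W, state, n_iter=100, lam=2.0):
            """Iterate a fuzzy cognitive map; W[i, j] is the influence of concept i on j."""
            squash = lambda x: 1.0 / (1.0 + np.exp(-lam * x))  # keep activations in (0, 1)
            for _ in range(n_iter):
                state = squash(state @ W + state)              # modified Kosko update
            return state

        # Toy 4-concept map: nutrients, fish, plants, turbidity (weights invented)
        W = np.array([[0.0, 0.0,  0.0,  0.7],   # nutrients raise turbidity
                      [0.0, 0.0, -0.4,  0.5],   # fish suppress plants, raise turbidity
                      [0.0, 0.0,  0.0, -0.6],   # plants reduce turbidity
                      [0.0, 0.0,  0.0,  0.0]])
        print(fcm_run(W, np.array([0.8, 0.6, 0.4, 0.5])))      # steady-state activations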