
    Historical collaborative geocoding

    The latest developments in digital technologies have provided large data sets that can be accessed and used increasingly easily. These data sets often contain indirect localisation information, such as historical addresses. Historical geocoding is the process of transforming indirect localisation information into direct localisation that can be placed on a map, which enables spatial analysis and cross-referencing. Many efficient geocoders exist for current addresses, but they do not deal with the temporal aspect and are based on a strict hierarchy (..., city, street, house number) that is hard or impossible to use with historical data. Indeed, historical data are full of uncertainties (temporal aspect, semantic aspect, spatial precision, confidence in the historical source, ...) that cannot be resolved, as there is no way to go back in time to check. We propose an open-source, open-data, extensible geocoding solution based on building gazetteers composed of geohistorical objects extracted from historical topographical maps. Once the gazetteers are available, geocoding a historical address is a matter of finding the geohistorical object in the gazetteers that best matches that address. The matching criteria are customisable and cover several dimensions (fuzzy semantic, fuzzy temporal, scale, spatial precision, ...). As the goal is to facilitate historical work, we also propose web-based user interfaces that help geocode addresses (individually or in batch mode) and display them over current or historical topographical maps, so that results can be checked and collaboratively edited. The system is tested on the city of Paris for the 19th-20th centuries, shows a high return rate, and is fast enough to be used interactively. Comment: working paper.
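    The matching step can be illustrated with a minimal sketch (not the authors' implementation): each gazetteer entry is scored against a queried address and date on a fuzzy semantic dimension and a fuzzy temporal dimension, and the best-scoring geohistorical object wins. The `GeohistoricalObject` fields, weights, and tolerance below are illustrative assumptions.

    ```python
    from dataclasses import dataclass
    from difflib import SequenceMatcher

    @dataclass
    class GeohistoricalObject:
        name: str          # normalised historical name, e.g. "rue St Honoré"
        valid_from: int    # first year the object is attested on a source map
        valid_to: int      # last year the object is attested
        x: float           # representative projected coordinate
        y: float
        precision_m: float # spatial precision of the source map, in metres

    def semantic_score(query: str, candidate: str) -> float:
        """Fuzzy string similarity in [0, 1]; a real system would normalise spellings first."""
        return SequenceMatcher(None, query.lower(), candidate.lower()).ratio()

    def temporal_score(year: int, obj: GeohistoricalObject, tolerance: int = 10) -> float:
        """1.0 inside the validity interval, decaying linearly within +/- tolerance years."""
        if obj.valid_from <= year <= obj.valid_to:
            return 1.0
        gap = min(abs(year - obj.valid_from), abs(year - obj.valid_to))
        return max(0.0, 1.0 - gap / tolerance)

    def geocode(address: str, year: int, gazetteer: list[GeohistoricalObject],
                w_sem: float = 0.6, w_time: float = 0.4):
        """Return (score, object) for the best-matching geohistorical object."""
        scored = [(w_sem * semantic_score(address, o.name) +
                   w_time * temporal_score(year, o), o) for o in gazetteer]
        return max(scored, key=lambda s: s[0])

    # Example: match an 1850 address against a tiny two-entry gazetteer (coordinates invented).
    gaz = [
        GeohistoricalObject("rue Saint-Honoré", 1780, 1900, 600245.0, 128345.0, 10.0),
        GeohistoricalObject("rue de Rivoli", 1835, 1950, 600512.0, 128210.0, 5.0),
    ]
    score, best = geocode("rue St Honoré", 1850, gaz)
    print(best.name, round(score, 2))
    ```

    A real match would also weigh scale and spatial precision, as the abstract lists; the weighted-sum structure stays the same.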

    Mining Temporal Association Rules with Temporal Soft Sets

    This work was partially supported by the National Natural Science Foundation of China (grant no. 11301415), the Shaanxi Provincial Key Research and Development Program (grant no. 2021SF-480), and the Natural Science Basic Research Plan in Shaanxi Province of China (grant no. 2018JM1054). Traditional association rule extraction may run into difficulties because it ignores the temporal aspect of the collected data. In particular, it often happens that some item sets are frequent during specific time periods even though they are not frequent in the whole data set. In this study, we enhance conventional rule mining by introducing temporal soft sets. We define temporal granulation mappings to induce granular structures for temporal transaction data. Using this notion, we define temporal soft sets and their Q-clip soft sets to establish a novel framework for mining temporal association rules. A number of useful characterizations and results are obtained, including a necessary and sufficient condition for fast identification of strong temporal association rules. By combining temporal soft sets with NegNodeset-based frequent item set mining techniques, we develop the negFIN-based soft temporal association rule mining (negFIN-STARM) method to extract strong temporal association rules. Numerical experiments are conducted on commonly used data sets to show the feasibility of our approach. Moreover, comparative analysis demonstrates that the newly proposed method achieves higher execution efficiency than three well-known approaches in the literature.
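    As a rough illustration of the temporal granulation idea (a hedged sketch, not the negFIN-STARM algorithm): transactions are grouped into time granules, and item sets are counted per granule so that sets frequent only in a specific period still surface. The month-based granulation, brute-force counting, and threshold are illustrative assumptions.

    ```python
    from collections import defaultdict
    from itertools import combinations

    def granulate(transactions, granule_of):
        """Group (timestamp, items) transactions by the granule returned by granule_of."""
        granules = defaultdict(list)
        for ts, items in transactions:
            granules[granule_of(ts)].append(frozenset(items))
        return granules

    def frequent_in_granule(baskets, min_support=0.5, max_len=2):
        """Brute-force item set counting; fine for a sketch, not for large data."""
        counts = defaultdict(int)
        for basket in baskets:
            for k in range(1, max_len + 1):
                for itemset in combinations(sorted(basket), k):
                    counts[itemset] += 1
        n = len(baskets)
        return {s: c / n for s, c in counts.items() if c / n >= min_support}

    # Example: item sets that are frequent within one month only.
    data = [
        ("2021-01-03", {"umbrella", "boots"}),
        ("2021-01-17", {"umbrella", "boots"}),
        ("2021-07-05", {"sunscreen"}),
        ("2021-07-22", {"sunscreen", "hat"}),
    ]
    by_month = granulate(data, granule_of=lambda ts: ts[:7])  # "YYYY-MM"
    for month, baskets in sorted(by_month.items()):
        print(month, frequent_in_granule(baskets))
    ```

    The published method replaces the brute-force counting here with NegNodeset-based mining for efficiency; the granulation step is what makes the extracted rules temporal.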

    A GIS-based multi-criteria evaluation framework for uncertainty reduction in earthquake disaster management using granular computing

    One of the most important steps in earthquake disaster management is the prediction of probable damage, which is called earthquake vulnerability assessment. Earthquake vulnerability assessment is a multi-criteria problem, and a number of multi-criteria decision-making models have been proposed for it. Two main sources of uncertainty exist in the earthquake vulnerability assessment problem: uncertainty associated with experts' points of view and uncertainty associated with attribute values. If the uncertainty from these two sources is not handled properly, the resulting seismic vulnerability map will be unreliable. The main objective of this research is to propose a reliable model for earthquake vulnerability assessment that is able to manage the uncertainty associated with experts' opinions. Granular Computing (GrC) is able to extract a set of if-then rules with minimum incompatibility from an information table. An integration of Dempster-Shafer Theory (DST) and GrC is applied in this research to minimize the entropy in experts' opinions. The accuracy of the model based on the integration of DST and GrC is 83%, while the accuracy of the single-expert model is 62%, which indicates the importance of uncertainty management in the seismic vulnerability assessment problem. Due to limited access to current data, only six criteria are used in this model. However, the model is able to take into account both qualitative and quantitative criteria.
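    The DST side of the integration can be sketched with Dempster's rule of combination, which fuses two experts' mass functions over the vulnerability classes and renormalises away their conflict. The class labels and mass values below are illustrative assumptions, not data from the study.

    ```python
    from itertools import product

    def combine(m1, m2):
        """Dempster's rule: fuse two mass functions whose focal elements are frozensets."""
        fused = {}
        conflict = 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to contradictory combinations
        if conflict >= 1.0:
            raise ValueError("Total conflict: sources cannot be combined")
        return {s: v / (1.0 - conflict) for s, v in fused.items()}

    # Two experts rating the seismic vulnerability of one urban block (values invented).
    FRAME = frozenset({"low", "moderate", "high"})
    expert1 = {frozenset({"high"}): 0.6, frozenset({"moderate", "high"}): 0.3, FRAME: 0.1}
    expert2 = {frozenset({"moderate"}): 0.5, frozenset({"moderate", "high"}): 0.4, FRAME: 0.1}
    print(combine(expert1, expert2))
    ```

    GrC would then extract if-then rules from the fused, lower-entropy assessments rather than from any single expert's table.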

    A Two-Stage Optimization Strategy for Fuzzy Object-Based Analysis Using Airborne LiDAR and High-Resolution Orthophotos for Urban Road Extraction

    Copyright © 2017 Maher Ibrahim Sameen and Biswajeet Pradhan. In the last decade, object-based image analysis (OBIA) has been widely recognized as an effective classification method for very high spatial resolution images or integrated data from different sources. In this study, a two-stage optimization strategy for fuzzy object-based analysis using airborne LiDAR is proposed for urban road extraction. The method optimizes the two basic steps of OBIA, namely segmentation and classification, to achieve accurate land cover mapping and urban road extraction. This objective was achieved by selecting the optimum scale parameter to maximize class separability and the optimum shape and compactness parameters to optimize the final image segments. Class separability was maximized using the Bhattacharyya distance algorithm, whereas image segmentation was optimized using the Taguchi method. The proposed fuzzy rules were created based on integrated data and expert knowledge. Spectral, spatial, and texture features were used under fuzzy rules by implementing the particle swarm optimization technique. The proposed fuzzy rules were easy to implement and transferable to other areas. An overall accuracy of 82% and a kappa index of agreement (KIA) of 0.79 were achieved in the study area when results were compared with reference objects created via manual digitization in a geographic information system. The accuracy of road extraction using the developed fuzzy rules was 0.76 (producer), 0.85 (user), and 0.72 (KIA). Meanwhile, overall accuracy decreased by approximately 6% when the rules were applied to a test site. A KIA of 0.70 was achieved on the test site using the same rules without any changes. The accuracy of the extracted urban roads from the test site was 0.72 (KIA), which decreased to approximately 0.16. Spatial information (i.e., elongation) and intensity from LiDAR were the most useful properties for urban road extraction. The proposed method can be applied to a wide range of real applications in remote sensing by transferring object-based rules to other areas using optimization techniques.
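    A hedged sketch of the class-separability criterion used in the first optimization stage: the Bhattacharyya distance between two classes, each modelled as a multivariate Gaussian over per-segment features; segmentation parameters that maximise this separability would be preferred. The feature set and synthetic samples below are illustrative assumptions.

    ```python
    import numpy as np

    def bhattacharyya_distance(x1: np.ndarray, x2: np.ndarray) -> float:
        """Bhattacharyya distance between two classes of feature vectors (rows = samples),
        assuming each class is approximately multivariate Gaussian."""
        m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
        c1 = np.cov(x1, rowvar=False)
        c2 = np.cov(x2, rowvar=False)
        c = (c1 + c2) / 2.0
        diff = m1 - m2
        term_mean = 0.125 * diff @ np.linalg.solve(c, diff)
        _, logdet_c = np.linalg.slogdet(c)
        _, logdet_c1 = np.linalg.slogdet(c1)
        _, logdet_c2 = np.linalg.slogdet(c2)
        term_cov = 0.5 * (logdet_c - 0.5 * (logdet_c1 + logdet_c2))
        return float(term_mean + term_cov)

    # Illustrative use: compare road vs. building segments described by
    # (mean intensity, elongation, mean height) features.
    rng = np.random.default_rng(0)
    roads = rng.normal([40.0, 0.9, 0.5], [5.0, 0.05, 0.2], size=(50, 3))
    buildings = rng.normal([60.0, 0.4, 8.0], [8.0, 0.10, 2.0], size=(50, 3))
    print(f"Separability (roads vs. buildings): {bhattacharyya_distance(roads, buildings):.2f}")
    ```

    In the workflow described by the abstract, this kind of separability measure guides the choice of scale parameter, while the Taguchi method tunes the shape and compactness parameters.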