3,376 research outputs found

    A Survey of Volunteered Open Geo-Knowledge Bases in the Semantic Web

    Over the past decade, rapid advances in web technologies, coupled with innovative models of spatial data collection and consumption, have generated a robust growth in geo-referenced information, resulting in spatial information overload. Increasing 'geographic intelligence' in traditional text-based information retrieval has become a prominent approach to respond to this issue and to fulfill users' spatial information needs. Numerous efforts in the Semantic Geospatial Web, Volunteered Geographic Information (VGI), and the Linking Open Data initiative have converged in a constellation of open knowledge bases, freely available online. In this article, we survey these open knowledge bases, focusing on their geospatial dimension. Particular attention is devoted to the crucial issue of the quality of geo-knowledge bases, as well as of crowdsourced data. A new knowledge base, the OpenStreetMap Semantic Network, is outlined as our contribution to this area. Research directions in information integration and Geographic Information Retrieval (GIR) are then reviewed, with a critical discussion of their current limitations and future prospects.

    Semantically-Enriched Search Engine for Geoportals: A Case Study with ArcGIS Online

    Many geoportals such as ArcGIS Online are established with the goal of improving geospatial data reusability and achieving intelligent knowledge discovery. However, according to previous research, most of the existing geoportals adopt Lucene-based techniques to achieve their core search functionality, which has a limited ability to capture the user's search intentions. To better understand a user's search intention, query expansion can be used to enrich the user's query by adding semantically similar terms. In the context of geoportals and geographic information retrieval, we advocate the idea of semantically enriching a user's query from both geospatial and thematic perspectives. In the geospatial aspect, we propose to enrich a query by using both place partonomy and distance decay. In terms of the thematic aspect, concept expansion and embedding-based document similarity are used to infer the implicit information hidden in a user's query. This semantic query expansion framework is implemented as a semantically-enriched search engine using ArcGIS Online as a case study. A benchmark dataset is constructed to evaluate the proposed framework. Our evaluation results show that the proposed semantic query expansion framework is very effective in capturing a user's search intention and significantly outperforms a well-established baseline (Lucene's practical scoring function) with improvements of more than 3.0 in DCG@K (K=3,5,10). Comment: 18 pages; accepted to AGILE 2020 as a full paper. GitHub code repository: https://github.com/gengchenmai/arcgis-online-search-engin
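    The DCG@K metric reported in the evaluation above can be computed as follows. This is a minimal sketch of the standard metric, not the authors' code, and the graded relevance judgments are invented for illustration.

    ```python
    import math

    def dcg_at_k(relevances, k):
        """Discounted Cumulative Gain over the top-k ranked results.

        Uses the common formulation rel_i / log2(i + 1) with 1-based ranks,
        so the first result is undiscounted and later ranks count less.
        """
        return sum(rel / math.log2(i + 1)
                   for i, rel in enumerate(relevances[:k], start=1))

    # Hypothetical graded relevance scores for one query's ranked results.
    ranking = [3, 2, 3, 0, 1, 2]
    for k in (3, 5, 10):
        print(f"DCG@{k} = {dcg_at_k(ranking, k):.3f}")
    ```

    An improvement of "more than 3.0" in DCG@K is thus roughly equivalent to placing one additional highly relevant document at the top of the ranking.
    
    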

    Towards automated knowledge-based mapping between individual conceptualisations to empower personalisation of Geospatial Semantic Web

    The geospatial domain is characterised by vagueness, especially in the semantic disambiguation of the concepts in the domain, which makes defining a universally accepted geo-ontology an onerous task. This is compounded by the lack of appropriate methods and techniques with which individual semantic conceptualisations can be captured and compared to each other. With multiple user conceptualisations, efforts towards a reliable Geospatial Semantic Web therefore require personalisation where user diversity can be incorporated. The work presented in this paper is part of our ongoing research on applying commonsense reasoning to elicit and maintain models that represent users' conceptualisations. Such user models will enable taking into account the users' perspective of the real world and will empower personalisation algorithms for the Semantic Web. Intelligent information processing over the Semantic Web can be achieved if different conceptualisations can be integrated in a semantic environment and mismatches between different conceptualisations can be outlined. In this paper, a formal approach for detecting mismatches between a user's and an expert's conceptual model is outlined. The formalisation is used as the basis to develop algorithms to compare models defined in OWL. The algorithms are illustrated in a geographical domain using concepts from the SPACE ontology developed as part of the SWEET suite of ontologies for the Semantic Web by NASA, and are evaluated by comparing test cases of possible user misconceptions.
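    The mismatch detection described above can be illustrated with a toy example. The sketch below compares a user's and an expert's conceptualisation of a concept as sets of defining properties; the concept and property names are hypothetical, and the paper's actual algorithms operate on full OWL definitions rather than flat property sets.

    ```python
    # Hypothetical property sets for the concept "River" in two conceptual
    # models (expert vs. user); real OWL models carry richer axioms.
    expert_model = {"River": {"flows", "hasSource", "hasMouth", "isNatural"}}
    user_model   = {"River": {"flows", "hasSource", "isNavigable"}}

    def mismatches(expert, user, concept):
        """Classify property-level differences between two conceptualisations."""
        e, u = expert[concept], user[concept]
        return {
            "missing_in_user": sorted(e - u),  # expert properties the user lacks
            "extra_in_user": sorted(u - e),    # user properties absent from expert
            "shared": sorted(e & u),           # agreed-upon properties
        }

    print(mismatches(expert_model, user_model, "River"))
    ```

    A non-empty "extra_in_user" or "missing_in_user" set flags a potential misconception of the kind the paper's test cases evaluate.
    
    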

    An Ontology-Based Method for Semantic Integration of Business Components

    Building new business information systems from reusable components is today a widely adopted approach. Using this approach in the analysis and design phases is of great interest and requires the use of a particular class of components called Business Components (BC). Business Components are today developed by several manufacturers and are available in many repositories. However, reusing and integrating them in a new Information System requires detection and resolution of semantic conflicts. Moreover, most integration and semantic conflict resolution systems rely on ontology alignment methods based on a domain ontology. This work is positioned at the intersection of two research areas: integration of reusable Business Components and alignment of ontologies for semantic conflict resolution. Our contribution concerns both the proposal of a BC integration solution based on ontology alignment and a method for enriching the domain ontology used as a support for alignment. Comment: IEEE New Technologies of Distributed Systems (NOTERE), 2011 11th Annual International Conference; ISSN: 2162-1896 Print ISBN: 978-1-4577-0729-2 INSPEC Accession Number: 12122775 201

    Software tools for conducting bibliometric analysis in science: An up-to-date review

    Bibliometrics has become an essential tool for assessing and analyzing the output of scientists, cooperation between universities, the effect of state-owned science funding on national research and development performance, and educational efficiency, among other applications. Therefore, professionals and scientists need a range of theoretical and practical tools to measure experimental data. This review aims to provide an up-to-date overview of the various tools available for conducting bibliometric and scientometric analyses, including the sources of data acquisition, performance analysis and visualization tools. The included tools were divided into three categories: general bibliometric and performance analysis, science mapping analysis, and libraries; a description of all of them is provided. A comparative analysis of database source support, pre-processing capabilities, and analysis and visualization options is also provided to facilitate understanding. Although there are numerous bibliometric databases from which to obtain data for bibliometric and scientometric analysis, each has been developed for a different purpose. The number of exportable records is between 500 and 50,000, and the coverage of the different science fields is unequal in each database. Concerning the analyzed tools, Bibliometrix contains the most extensive set of techniques and is suitable for practitioners through Biblioshiny. VOSviewer provides excellent visualization and is capable of loading and exporting information from many sources. SciMAT is the tool with the most powerful pre-processing and export capabilities. In view of the variability of features, users need to decide on the desired analysis output and choose the option that best fits their aims.

    Join operation for semantic data enrichment of asynchronous time series data

    In this paper, we present a novel framework for enriching time series data in smart cities by supplementing it with information from external sources via semantic data enrichment. Our methodology effectively merges multiple data sources into a uniform time series, while addressing difficulties such as data quality, contextual information, and time lapses. We demonstrate the efficacy of our method through a case study in Barcelona, which permitted the use of advanced analysis methods such as windowed cross-correlation and peak picking. The resulting time series data can be used to determine traffic patterns and has potential uses in other smart city sectors, such as air quality, energy efficiency, and public safety. Interactive dashboards enable stakeholders to visualize and summarize key insights and patterns.
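    The core join operation described above — aligning asynchronous series on mismatched timestamps before correlating them — can be sketched with a nearest-timestamp merge in pandas. This is an illustrative sketch only: the sensor values are invented, and the paper's Barcelona pipeline is not reproduced here.

    ```python
    import pandas as pd

    # Two asynchronous feeds with misaligned timestamps (hypothetical data).
    traffic = pd.DataFrame({
        "ts": pd.to_datetime(["2023-05-01 10:00:05", "2023-05-01 10:05:12",
                              "2023-05-01 10:10:03"]),
        "vehicles": [120, 180, 150],
    })
    air = pd.DataFrame({
        "ts": pd.to_datetime(["2023-05-01 10:00:00", "2023-05-01 10:04:58",
                              "2023-05-01 10:09:30"]),
        "no2": [41.0, 55.5, 48.2],
    })

    # Nearest-timestamp join with a tolerance window: each traffic reading is
    # enriched with the closest air-quality reading within 60 seconds; readings
    # with no match inside the window come back as NaN instead of a bad pair.
    merged = pd.merge_asof(traffic.sort_values("ts"), air.sort_values("ts"),
                           on="ts", direction="nearest",
                           tolerance=pd.Timedelta("60s"))

    # Lag-0 cross-correlation between the two aligned signals; a windowed
    # variant would repeat this over shifted sub-windows.
    print(merged["vehicles"].corr(merged["no2"]))
    ```

    The tolerance parameter is where the "time lapses" difficulty mentioned in the abstract surfaces: too small a window drops valid pairs, too large a window pairs unrelated readings.
    
    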

    Microtheories for SDI - Accounting for diversity of local conceptualisations at a global level

    Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies. The categorization and conceptualization of geographic features is fundamental to cartography, geographic information retrieval, routing applications, spatial decision support and data sharing in general. However, there is no standard conceptualization of the world. Humans conceptualize features based on numerous factors including cultural background, knowledge, motivation and particularly space and time. Thus, geographic features are prone to multiple, context-dependent conceptualizations reflecting local conditions. This creates semantic heterogeneity and undermines interoperability. Standardization of a shared definition is often employed to overcome semantic heterogeneity. However, this approach loses important local diversity in feature conceptualizations and may result in feature definitions which are too broad or too specific. This work proposes the use of microtheories in Spatial Data Infrastructures, such as INSPIRE, to account for diversity of local conceptualizations while maintaining interoperability at a global level. It introduces a novel method of structuring microtheories based on space and time, represented by administrative boundaries, to reflect variations in feature conceptualization. A bottom-up approach, based on non-standard inference, is used to create an appropriate global-level feature definition from the local definitions. Conceptualizations of rivers, forests and estuaries throughout Europe are used to demonstrate how the approach can improve the INSPIRE data model and ease its adoption by European member states.