
    Integration of linked open data in case-based reasoning systems

    This paper discusses the opportunities of integrating Linked Open Data (LOD) resources into Case-Based Reasoning (CBR) systems. Using travel medicine as the application domain, we exemplify how LOD can be used to fill three of the four knowledge containers a CBR system is based on. The paper also presents the techniques applied for the realization and demonstrates the gain in knowledge-acquisition performance achieved through the use of LOD.
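    As a purely illustrative sketch of this kind of LOD-based knowledge acquisition (not the paper's actual implementation), the following Python snippet uses SPARQLWrapper to pull disease labels from DBpedia into a simple vocabulary container; the endpoint, query, and container shape are assumptions:

        # Illustrative sketch: fill a CBR vocabulary container with terms
        # retrieved from a Linked Open Data endpoint (DBpedia). Endpoint,
        # query, and class are assumptions for demonstration only.
        from SPARQLWrapper import SPARQLWrapper, JSON

        endpoint = SPARQLWrapper("https://dbpedia.org/sparql")
        endpoint.setQuery("""
            PREFIX dbo: <http://dbpedia.org/ontology/>
            PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
            SELECT DISTINCT ?disease ?label WHERE {
                ?disease a dbo:Disease ;
                         rdfs:label ?label .
                FILTER (lang(?label) = "en")
            } LIMIT 100
        """)
        endpoint.setReturnFormat(JSON)

        results = endpoint.query().convert()
        vocabulary = {r["disease"]["value"]: r["label"]["value"]
                      for r in results["results"]["bindings"]}
        print(f"Fetched {len(vocabulary)} vocabulary entries from the LOD cloud")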

    Semantic data mining and linked data for a recommender system in the AEC industry

    Even though it can provide design teams with valuable performance insights and enhance decision-making, monitored building data is rarely reused in an effective feedback loop from operation to design. Data mining allows users to obtain such insights from the large datasets generated throughout the building life cycle. Furthermore, semantic web technologies make it possible to formally represent the built environment and retrieve knowledge in response to domain-specific requirements. Both approaches have independently established themselves as powerful aids in decision-making. Combining them can enrich data mining processes with domain knowledge and facilitate knowledge discovery, representation and reuse. In this article, we look into the available data mining techniques and investigate to what extent they can be fused with semantic web technologies to provide recommendations to the end user in performance-oriented design. We demonstrate an initial implementation of a linked data-based system for the generation of recommendations.
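    A minimal sketch of how such a coupling might look, assuming Python with rdflib and scikit-learn; the file name, the ex: vocabulary, and the clustering step are illustrative assumptions, not the system described in the abstract:

        # Illustrative sketch: retrieve monitored building data from an RDF
        # graph via SPARQL, then apply a simple data mining step (clustering)
        # to derive a design recommendation. Dataset and terms are assumed.
        from rdflib import Graph
        from sklearn.cluster import KMeans
        import numpy as np

        g = Graph()
        g.parse("monitored_building_data.ttl", format="turtle")  # hypothetical dataset

        rows = g.query("""
            PREFIX ex: <http://example.org/building#>
            SELECT ?zone ?energy WHERE { ?zone ex:annualEnergyUse ?energy . }
        """)
        zones, energy = zip(*[(str(z), float(e)) for z, e in rows])

        # Cluster zones by consumption; the highest-consuming cluster becomes
        # the candidate set for design recommendations.
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(
            np.array(energy).reshape(-1, 1))
        worst = max(range(3), key=lambda c: np.mean(
            [e for e, l in zip(energy, labels) if l == c]))
        print("Recommend design review for:",
              [z for z, l in zip(zones, labels) if l == worst])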

    LODE: Linking Digital Humanities Content to the Web of Data

    Numerous digital humanities projects maintain their data collections in the form of text, images, and metadata. While data may be stored in many formats, from plain text to XML to relational databases, the use of the Resource Description Framework (RDF) as a standardized representation has gained considerable traction during the last five years. Almost every digital humanities meeting has at least one session concerned with the topic of digital humanities, RDF, and linked data. While most existing work in linked data has focused on improving algorithms for entity matching, the aim of the LinkedHumanities project is to build digital humanities tools that work "out of the box," enabling their use by humanities scholars, computer scientists, librarians, and information scientists alike. With this paper, we report on the Linked Open Data Enhancer (LODE) framework developed as part of the LinkedHumanities project. With LODE, we support non-technical users in enriching a local RDF repository with high-quality data from the Linked Open Data cloud. LODE links and enhances the local RDF repository without compromising the quality of the data. In particular, LODE supports the user in the enhancement and linking process by providing intuitive user interfaces and by suggesting high-quality linking candidates using tailored matching algorithms. We hope that the LODE framework will prove useful to digital humanities scholars, complementing other digital humanities tools.
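    The abstract does not spell out LODE's matching algorithms, but the general idea of suggesting high-quality linking candidates can be sketched as follows; the similarity measure, threshold, and candidate list are assumptions for illustration only:

        # Illustrative sketch: rank Linked Open Data resources against a
        # local label using simple string similarity. Not LODE's actual
        # matching algorithm; measure and threshold are assumptions.
        from difflib import SequenceMatcher

        def rank_candidates(local_label, lod_candidates, threshold=0.7):
            """Return (URI, score) pairs whose labels resemble the local label."""
            scored = [(uri, SequenceMatcher(None, local_label.lower(),
                                            label.lower()).ratio())
                      for uri, label in lod_candidates]
            return sorted([(u, s) for u, s in scored if s >= threshold],
                          key=lambda x: x[1], reverse=True)

        candidates = [
            ("http://dbpedia.org/resource/Johann_Wolfgang_von_Goethe",
             "Johann Wolfgang von Goethe"),
            ("http://dbpedia.org/resource/Goethe-Institut", "Goethe-Institut"),
        ]
        print(rank_candidates("Johann Wolfgang Goethe", candidates))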

    From the Elaboration Process of Point Cloud to Information Systems both for Planning and Design Management of Cultural Heritage

    Nowadays we are able to produce geometric models of historical buildings at different scales of detail using photos and measurements. Increasingly, we are faced with a lack of preservation actions and maintenance activities, poorly conceived policies, and unexpected natural events, which force professionals and researchers to operate without the usual data. In these cases, we need consistent repositories to collect and distribute data and produce information. Furthermore, we need to "give intelligence" to these repositories so that they can be queried with respect to geometrical instances, topological issues, and historical features. We have tons of xyz points at our disposal: how can we pass from the point cloud to a building information model, and then to a geographic information system, not necessarily in this order? Simple Scan-to-BIM-to-GIS and Scan-to-GIS-to-BIM processes were tested in order to evaluate, for the purposes of preservation and of enhancing resilience, practices that could become best practices, also in terms of time and cost savings. The work we propose is part of ongoing research on the application of the H-BIM approach to the management of historical building heritage, extended to district management (H-DIM, at an urban level). In particular, with regard to the resilience theme, both the acquisition phase and the archival research process are of great importance for protecting our undefended building heritage. Regarding the case study of the paper, UNESCO sites represent areas of collective interest for humanity. This contribution proposes a possible solution, applying a digital cultural heritage approach to the historical part of the Municipality of Serralunga d'Alba, which belongs to the UNESCO site called Vineyard Landscape of Langhe-Roero and Monferrato.
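    One early step of such a Scan-to-BIM/GIS workflow, reducing a raw xyz point cloud before modelling, might look like the following sketch; the file name and voxel size are assumptions, and the paper's actual pipeline relies on dedicated photogrammetry, BIM, and GIS tooling:

        # Illustrative sketch: load raw xyz points and voxel-downsample them
        # before passing the reduced cloud to BIM/GIS modelling. File name
        # and voxel size are hypothetical.
        import numpy as np
        from collections import defaultdict

        points = np.loadtxt("serralunga_scan.xyz")[:, :3]  # hypothetical xyz export

        def voxel_downsample(pts, voxel=0.05):
            """Keep one representative point (the voxel centroid) per 5 cm voxel."""
            buckets = defaultdict(list)
            for p in pts:
                buckets[tuple(np.floor(p / voxel).astype(int))].append(p)
            return np.array([np.mean(b, axis=0) for b in buckets.values()])

        reduced = voxel_downsample(points)
        print(f"{len(points)} points reduced to {len(reduced)} "
              "for downstream BIM/GIS modelling")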

    Partout: A Distributed Engine for Efficient RDF Processing

    The increasing interest in Semantic Web technologies has led not only to a rapid growth of semantic data on the Web but also to an increasing number of backend applications, some of which already manage more than a trillion triples. Confronted with such huge amounts of data and the expected future growth, existing state-of-the-art systems for storing RDF and processing SPARQL queries are no longer sufficient. In this paper, we introduce Partout, a distributed engine for efficient RDF processing in a cluster of machines. We propose an effective approach for fragmenting RDF data sets based on a query log, allocating the fragments to nodes in a cluster, and finding the optimal configuration. Partout can efficiently handle updates, and its query optimizer produces efficient query execution plans for ad-hoc SPARQL queries. Our experiments show the superiority of our approach over state-of-the-art approaches for partitioning and distributed SPARQL query processing.
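    A toy reconstruction of the query-log-driven fragmentation idea (not Partout's actual algorithm) could look like this: triples are grouped by predicate, and fragments are placed greedily on the least-loaded node, with workload frequency deciding placement order:

        # Illustrative sketch of query-log-driven fragmentation and fragment
        # allocation, loosely in the spirit of Partout. This toy version
        # fragments by predicate only and balances load greedily.
        from collections import Counter, defaultdict

        def fragment_by_predicate(triples, query_log_predicates, n_nodes):
            freq = Counter(query_log_predicates)     # workload frequency per predicate
            fragments = defaultdict(list)
            for s, p, o in triples:
                fragments[p].append((s, p, o))
            load = [0] * n_nodes
            placement = {}
            # Place the most frequently queried fragments first,
            # each on the currently least-loaded node.
            for pred, frag in sorted(fragments.items(), key=lambda kv: -freq[kv[0]]):
                node = min(range(n_nodes), key=lambda i: load[i])
                placement[pred] = node
                load[node] += len(frag)
            return fragments, placement

        triples = [("ex:a", "ex:knows", "ex:b"), ("ex:a", "ex:name", '"Alice"'),
                   ("ex:b", "ex:knows", "ex:c")]
        frags, where = fragment_by_predicate(
            triples, ["ex:knows", "ex:knows", "ex:name"], n_nodes=2)
        print(where)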

    Stigmergic hyperlinks' contributions to web search

    Stigmergic hyperlinks are hyperlinks with a "heartbeat": if used, they stay healthy and online; if neglected, they fade and are eventually replaced. Their life attribute is a relative usage measure that regular hyperlinks do not provide; as a result, PageRank-like measures have historically been well informed about the structure of webs of documents but unaware of what users actually do with the links. This paper elaborates on how to feed the users' perspective into Google's original, structure-centric PageRank metric. The discussion then bridges to the Deep Web, some search challenges, and how stigmergic hyperlinks could help decentralize the search experience, facilitating user-generated search solutions and supporting new related business models.
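    A rough sketch of folding link usage into a PageRank-style computation, in the spirit of the discussion above; the usage matrix and weighting scheme are assumptions, not the authors' exact formulation:

        # Illustrative sketch: a PageRank-style power iteration where the
        # transition probabilities are weighted by how often each link is
        # actually followed. Usage counts and weighting are assumed.
        import numpy as np

        def usage_weighted_pagerank(usage, alpha=0.85, iters=100):
            """usage[i][j] = number of times the link from page i to page j was followed."""
            n = len(usage)
            M = np.array(usage, dtype=float)
            row_sums = M.sum(axis=1, keepdims=True)
            # Normalize rows; pages with no used outgoing links behave as dangling nodes.
            M = np.where(row_sums > 0,
                         M / np.where(row_sums == 0, 1, row_sums),
                         1.0 / n)
            rank = np.full(n, 1.0 / n)
            for _ in range(iters):
                rank = (1 - alpha) / n + alpha * rank @ M
            return rank

        # Three pages; page 0's link to page 2 is used far more than its link to page 1.
        usage = [[0, 1, 9],
                 [5, 0, 0],
                 [2, 3, 0]]
        print(usage_weighted_pagerank(usage))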