2,905 research outputs found

    ARIADNE: A Research Infrastructure for Archaeology

    Research e-infrastructures, digital archives, and data services have become important pillars of a scientific enterprise that has in recent decades become ever more collaborative, distributed, and data intensive. The archaeological research community has been an early adopter of digital tools for data acquisition, organization, analysis, and presentation of the research results of individual projects. However, the provision of e-infrastructure and services for data sharing, discovery, access, and (re)use has lagged behind. This situation is being addressed by ARIADNE, the Advanced Research Infrastructure for Archaeological Dataset Networking in Europe. This EU-funded network has developed an e-infrastructure that enables data providers to register and provide access to their resources (datasets, collections) through the ARIADNE data portal, facilitating discovery, access, and other services across the integrated resources. This article describes the current landscape of data repositories and services for archaeologists in Europe, and the issues that make interoperability between them difficult to realize. The results of the ARIADNE surveys on users’ expectations and requirements are also presented. The main section of the article describes the architecture of the e-infrastructure, core services (data registration, discovery, and access), and various other extant or experimental services. The ongoing evaluation of the data integration and services is also discussed. Finally, the article summarizes lessons learned and outlines the prospects for the wider engagement of the archaeological research community in the sharing of data through ARIADNE.
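
    As a rough illustration of the registration-and-discovery pattern described above, the following Python sketch models a minimal catalogue record and a keyword search over registered records. All field names, values, and the search logic are invented for illustration; they are not the ARIADNE catalogue model or portal API.

```python
# Hypothetical sketch: field names and search logic are assumptions,
# not the actual ARIADNE catalogue model or portal API.
from dataclasses import dataclass, field


@dataclass
class DatasetRecord:
    """Minimal metadata a provider might register for a dataset."""
    identifier: str
    title: str
    publisher: str
    subjects: list[str] = field(default_factory=list)
    spatial_coverage: str = ""   # e.g. a place name or bounding box
    temporal_coverage: str = ""  # e.g. "1200 BCE - 800 BCE"


def discover(registry: list[DatasetRecord], keyword: str) -> list[DatasetRecord]:
    """Naive keyword discovery across titles and subject terms."""
    k = keyword.lower()
    return [r for r in registry
            if k in r.title.lower() or any(k in s.lower() for s in r.subjects)]


registry = [
    DatasetRecord("ds-001", "Iron Age settlement survey", "Example Archive",
                  subjects=["settlement", "survey"], spatial_coverage="Tuscany"),
]
print([r.identifier for r in discover(registry, "survey")])  # ['ds-001']
```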

    Geospatial Data Management Research: Progress and Future Directions

    Without geospatial data management, today's challenges in big data applications such as earth observation, geographic information system/building information modeling (GIS/BIM) integration, and 3D/4D city planning cannot be solved. Furthermore, geospatial data management plays a connecting role between data acquisition, data modelling, data visualization, and data analysis. It enables the continuous availability of geospatial data and the replicability of geospatial data analysis. In the first part of this article, five milestones of geospatial data management research are presented that were achieved during the last decade. The first one reflects advancements in BIM/GIS integration at the data, process, and application levels. The second milestone presents theoretical progress by introducing topology as a key concept of geospatial data management. In the third milestone, 3D/4D geospatial data management is described as a key concept for city modelling, including subsurface models. Progress in the modelling and visualization of massive geospatial features on web platforms is the fourth milestone, which includes discrete global grid systems as an alternative geospatial reference framework. The intensive use of geosensor data sources is the fifth milestone, which opens the way to parallel data storage platforms supporting data analysis on geosensors. In the second part of this article, five future directions of geospatial data management research are presented that have the potential to become key research fields of geospatial data management in the next decade. Geo-data science will have the task of extracting knowledge from unstructured and structured geospatial data and of bridging the gap between modern information technology concepts and the geo-related sciences. Topology is presented as a powerful and general concept for analyzing GIS and BIM data structures and spatial relations that will be of great importance in emerging applications such as smart cities and digital twins. Data-streaming libraries and “in-situ” geo-computing on objects executed directly on the sensors will revolutionize geo-information science and bridge geo-computing with geospatial data management. Advanced geospatial data visualization on web platforms will enable the representation of dynamically changing geospatial features or moving objects’ trajectories. Finally, geospatial data management will support big geospatial data analysis, and graph databases are expected to experience a revival on top of parallel and distributed data stores supporting big geospatial data analysis.
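
    To make the topology milestone concrete, the sketch below uses the shapely library to evaluate topological relations between invented parcel and building footprints; it illustrates the concept only, not any specific system discussed above.

```python
# A minimal sketch of topology as an analysis concept, using shapely;
# the footprint coordinates are invented for illustration.
from shapely.geometry import Polygon

parcel_a = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
parcel_b = Polygon([(4, 0), (8, 0), (8, 4), (4, 4)])  # shares an edge with parcel_a
building = Polygon([(1, 1), (2, 1), (2, 2), (1, 2)])

# Topological predicates answer spatial questions without raw coordinates
# leaking into application logic, the kind of relation BIM/GIS integration uses.
print(parcel_a.touches(parcel_b))     # True: adjacent, interiors disjoint
print(building.within(parcel_a))      # True: containment
print(parcel_a.intersects(building))  # True: at least one shared point
```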

    CHALLENGES AND OPPORTUNITIES FOR THE IMPLEMENTATION OF H-BIM WITH REGARDS TO HISTORICAL INFRASTRUCTURES: A CASE STUDY OF THE PONTE GIORGINI IN CASTIGLIONE DELLA PESCAIA (GROSSETO – ITALY)

    Historical Building Information Modeling (H-BIM) has been widely documented in the literature and is becoming more popular with government bodies, who are increasingly choosing to make its use mandatory in public procurements and contracts. Although the system seems to be one of the best approaches for managing data and driving the decision-making process, several difficulties arise from the amount of effort required in the initial phases, when the data derived from a geometrical survey must be converted into parametric elements. Moreover, users must decide on a “level of geometrical simplification” a long time in advance, and this inevitably leads to a loss of geometrical data. From this perspective, our research describes a procedure to optimize the workflow of information for existing artefacts, in order to achieve a “lean” H-BIM. In this article, we analyse two aspects: the first relates to the level of accuracy of a digital model created from two different point clouds, one obtained from a laser scanner and one from images, while the second concerns the conversion of this information into parametric elements (Building Object Models, BOMs) that need to have specific characteristics. The case study we present is the “Ponte Giorgini” (“Giorgini Bridge”) in Castiglione della Pescaia (Grosseto – Italy).
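
    One way to quantify the loss of geometrical data that the abstract mentions is a cloud-to-model deviation analysis. The following sketch uses randomly generated stand-in arrays rather than real survey clouds, and computes nearest-neighbour distances between a dense survey cloud and points sampled on a simplified model.

```python
# A hedged sketch of one step the abstract describes: measuring how far a
# simplified parametric model deviates from the survey point cloud.
# Random arrays stand in for real laser-scanner and photogrammetric data.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
survey_cloud = rng.random((10_000, 3))  # dense points from the survey
model_points = rng.random((500, 3))     # points sampled on the simplified model

# Nearest-neighbour distance from every survey point to the model surface
tree = cKDTree(model_points)
distances, _ = tree.query(survey_cloud, k=1)

# Summary statistics indicate the geometric cost of the chosen simplification
print(f"mean deviation:  {distances.mean():.4f}")
print(f"95th percentile: {np.percentile(distances, 95):.4f}")
```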

    Ontology-Based Consistent Specification of Sensor Data Acquisition Plans in Cross-Domain IoT Platforms

    Nowadays there is a high number of IoT applications that can seldom interact with each other, because they are developed within different Vertical IoT Platforms that adopt different standards. Several efforts are devoted to the construction of cross-layered frameworks that facilitate interoperability among cross-domain IoT platforms for the development of horizontal applications. Although their realization poses different challenges across all layers of the network stack, in this paper we focus on the interoperability issues that arise at the data management layer. Specifically, starting from a flexible multi-granular Spatio-Temporal-Thematic data model according to which events generated by different kinds of sensors can be represented, we propose a Semantic Virtualization approach in which the sensors belonging to different IoT platforms, and the schemas of the event streams they produce, are described in a Domain Ontology obtained by extending the well-known Semantic Sensor Network ontology. These sensors can then be exploited for the creation of Data Acquisition Plans, by means of which the streams of events can be filtered, merged, and aggregated in a meaningful way. A notion of consistency is introduced to bind the output streams of the services contained in a Data Acquisition Plan to the Domain Ontology, in order to provide a semantic description of the plan's final output. When a plan meets the consistency constraints, the data it handles are well described at the ontological level, and the data acquisition process thus overcomes the interoperability barriers of the original sources. The facilities of the StreamLoader prototype are finally presented, supporting the user in the Semantic Virtualization process and in the construction of meaningful Data Acquisition Plans.
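
    The consistency notion described above can be illustrated with a small sketch. The class names, ontology terms (the "ssn-ext:" IRIs), and merge rule below are invented, not the StreamLoader API: the idea shown is that each service's output schema carries ontology bindings, and a plan is consistent only when overlapping bindings agree.

```python
# Hypothetical sketch of schema-level consistency checking for a
# Data Acquisition Plan; names and ontology terms are invented.
from dataclasses import dataclass


@dataclass
class StreamSchema:
    # attribute name -> ontology term describing it (hypothetical IRIs)
    attributes: dict[str, str]


def filter_service(schema: StreamSchema) -> StreamSchema:
    # Filtering drops events, not attributes, so the schema is preserved.
    return schema


def merge_service(a: StreamSchema, b: StreamSchema) -> StreamSchema:
    # Overlapping attributes must carry identical ontology bindings;
    # otherwise the merged stream has no consistent semantic description.
    for name in set(a.attributes) & set(b.attributes):
        if a.attributes[name] != b.attributes[name]:
            raise ValueError(f"inconsistent ontology binding for '{name}'")
    return StreamSchema({**a.attributes, **b.attributes})


t1 = StreamSchema({"temp": "ssn-ext:AirTemperature", "ts": "ssn-ext:Instant"})
t2 = StreamSchema({"hum": "ssn-ext:Humidity", "ts": "ssn-ext:Instant"})
merged = merge_service(filter_service(t1), t2)  # consistent: shared 'ts' agrees
print(sorted(merged.attributes))                # ['hum', 'temp', 'ts']
```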

    Spatial ontologies for architectural heritage

    Informatics and artificial intelligence have generated new requirements for digital archiving, information, and documentation. Semantic interoperability has become fundamental for the management and sharing of information. The constraints on data interpretation enable both database interoperability, for the sharing and reuse of data and schemas, and information retrieval in large datasets. Another challenging issue is the exploitation of automated reasoning possibilities. The solution is the use of domain ontologies as a reference for data modelling in information systems. The architectural heritage (AH) domain is considered in this thesis. The documentation in this field, particularly complex and multifaceted, is well known to be critical for the preservation, knowledge, and promotion of monuments. For these reasons, digital inventories, also exploiting standards and new semantic technologies, are developed by international organisations (Getty Institute, UN, European Union). Geometric and geographic information is an essential part of a monument. It is composed of a number of aspects (spatial, topological, and mereological relations; accuracy; multi-scale representation; time; etc.). Currently, geomatics permits the production of very accurate and dense 3D models (possibly enriched with textures) and derived products, in both raster and vector format. Many standards have been published for the geographic field or the cultural heritage domain. However, the former are limited in the representation scales they foresee (the maximum is achieved by OGC CityGML), and their semantic values do not capture the full semantic richness of AH. The latter (especially the core ontology CIDOC CRM, the Conceptual Reference Model of the Documentation Committee of the International Council of Museums) were employed to document museums’ objects. Even though CIDOC CRM was recently extended to standing buildings and a spatial extension was included, the integration of complex 3D models has not yet been achieved. In this thesis, the aspects (especially spatial issues) to consider in the documentation of monuments are analysed. In light of them, OGC CityGML is extended for the management of AH complexity. An approach ‘from the landscape to the detail’ is used, to consider the monument within a wider system, which is essential for analysis and reasoning about such complex objects. An implementation test is conducted on a case study, preferring open source applications.
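
    A minimal sketch of the kind of semantic modelling the thesis describes follows, using the rdflib library and assuming a hypothetical extension namespace ("ahx") alongside CIDOC CRM; the CRM class E22 is real, but the property names and URIs are illustrative, not part of CityGML or the thesis's actual schema.

```python
# Hypothetical sketch: "ahx" namespace and its properties are invented.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

CRM = Namespace("http://www.cidoc-crm.org/cidoc-crm/")
AHX = Namespace("http://example.org/ah-extension#")  # hypothetical extension

g = Graph()
g.bind("crm", CRM)
g.bind("ahx", AHX)

monument = URIRef("http://example.org/monuments/monument-001")
g.add((monument, RDF.type, CRM["E22_Man-Made_Object"]))      # CIDOC CRM class
g.add((monument, AHX.hasRepresentationScale, Literal("1:50")))
g.add((monument, AHX.withinLandscapeUnit,
       URIRef("http://example.org/landscape/historic-centre")))

print(g.serialize(format="turtle"))
```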

    Smart data management with BIM for Architectural Heritage

    In recent years the smart buildings topic has received much attention, as have Building Information Modelling (BIM) and interoperability as independent fields. Linking these topics is an essential research target to help designers and stakeholders run processes more efficiently. Working on a smart building requires the use of Information and Communication Technology (ICT) to optimize design, construction, and management. In these terms, several technologies such as sensors for remote monitoring and control, building equipment, management software, etc. are available on the market. As BIM provides an enormous amount of information in its database and is theoretically able to work with all kinds of data sources through interoperability, it is essential to define standards for both data content and exchange formats. In this way, one possibility for aligning research activity with Horizon 2020 is the investigation of energy saving using ICT. Unfortunately, a comparison of the Architecture, Engineering and Construction (AEC) industry with other sectors makes it clear that advanced information technology applications have not yet been adopted in the building field. In recent years, however, the adoption of new methods for data management has been investigated by many researchers. Based on the above considerations, the main purpose of this thesis is to investigate the use of BIM methodology for existing buildings, focusing on three main topics:
    • Smart data management for architectural heritage preservation;
    • District data management for energy reduction;
    • The maintenance of high-rises.
    For these reasons, data management acquires a very important value in relation to the optimization of the building process, and it is considered the most important goal of this research. Taking into account different kinds of architectural heritage, the attention is focused on existing and historical buildings, which are usually characterized by several constraints. Starting from data collection, a BIM model was developed and customized according to its objectives, providing information for different simulation tests. Finally, data visualization was investigated through Virtual Reality (VR) and Augmented Reality (AR). Certainly, the creation of a 3D parametric model implies that data are organized according to the needs of the individual users involved in the building process. This means that each 3D model can be developed with different Levels of Detail/Development (LODs), depending on the goal of the data source. Throughout this thesis the importance of LODs is considered in relation to the kind of information filled into a BIM model. In fact, depending on the objectives of each project, a BIM model can be developed in different ways to facilitate querying the data for the simulation tests. The three topics were compared considering each step of the building process workflow, highlighting the main differences and evaluating the strengths and weaknesses of the BIM methodology. In these terms, the importance of setting up a BIM template before the modelling step was pointed out, because it provides the possibility to manage information so that it can be collected and extracted for different purposes and by specific users. Moreover, based on the results obtained in terms of the 3D parametric model and in terms of process, a proper BIM maturity level was determined for each topic. Finally, the value of interoperability emerged from these tests, considering that it provided the opportunity to develop a framework for collaboration involving all parties in the building industry.
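
    As a rough illustration of querying a BIM model for a specific purpose, the sketch below reads an IFC file (the open exchange format underpinning BIM interoperability) with the ifcopenshell library and lists wall elements. The file name is a placeholder, and the query is a generic example rather than one taken from the thesis.

```python
# A hedged sketch of purpose-driven querying of a BIM model via IFC;
# requires the ifcopenshell package, and the path is a placeholder.
import ifcopenshell

model = ifcopenshell.open("heritage_building.ifc")  # placeholder file

# Different users query the same model for different purposes;
# here we simply pull the wall elements.
walls = model.by_type("IfcWall")
print(f"{len(walls)} wall elements found")

for wall in walls[:5]:
    # GlobalId and Name are standard IFC attributes on rooted elements
    print(wall.GlobalId, wall.Name)
```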

    Report of the Stanford Linked Data Workshop

    The Stanford University Libraries and Academic Information Resources (SULAIR), with the Council on Library and Information Resources (CLIR), conducted a week-long workshop on the prospects for a large-scale, multi-national, multi-institutional prototype of a Linked Data environment for discovery of and navigation among the rapidly, chaotically expanding array of academic information resources. As preparation for the workshop, CLIR sponsored a survey by Jerry Persons, Chief Information Architect emeritus of SULAIR, that was published originally for workshop participants as background to the workshop and is now publicly available. The original intention of the workshop was to devise a plan for such a prototype. However, such was the diversity of knowledge, experience, and views of the potential of Linked Data approaches that the workshop participants turned to two more fundamental goals: building common understanding and enthusiasm on the one hand, and identifying opportunities and challenges to be confronted in the preparation of the intended prototype and its operation on the other. In pursuit of those objectives, the workshop participants produced:
    1. a value statement addressing the question of why a Linked Data approach is worth prototyping;
    2. a manifesto for Linked Libraries (and Museums and Archives and …);
    3. an outline of the phases in a life cycle of Linked Data approaches;
    4. a prioritized list of known issues in generating, harvesting, and using Linked Data;
    5. a workflow with notes for converting library bibliographic records and other academic metadata to URIs;
    6. examples of potential “killer apps” using Linked Data; and
    7. a list of next steps and potential projects.
    This report includes a summary of the workshop agenda, a chart showing the use of Linked Data in cultural heritage venues, and short biographies and statements from each of the participants.
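
    Item 5 in the list above can be illustrated with a small sketch: converting a bibliographic record to a URI-identified RDF resource. The record, URI pattern, and choice of Dublin Core terms below are assumptions for illustration, not the workflow the workshop actually produced.

```python
# Hypothetical sketch of record-to-URI conversion; the record dict and
# URI pattern are invented, with Dublin Core standing in for whatever
# vocabulary a real workflow would choose.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCTERMS

record = {"id": "12345", "title": "An Example Monograph", "creator": "Doe, Jane"}

g = Graph()
g.bind("dcterms", DCTERMS)

work = URIRef(f"http://example.org/catalog/{record['id']}")  # minted URI
g.add((work, DCTERMS.title, Literal(record["title"])))
g.add((work, DCTERMS.creator, Literal(record["creator"])))

print(g.serialize(format="turtle"))
```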

    The INCF Digital Atlasing Program: Report on Digital Atlasing Standards in the Rodent Brain

    Get PDF
    The goal of the INCF Digital Atlasing Program is to provide the vision and direction necessary to make the rapidly growing collection of multidimensional data of the rodent brain (images, gene expression, etc.) widely accessible and usable to the international research community. The Digital Brain Atlasing Standards Task Force was formed in May 2008 to investigate the state of rodent brain digital atlasing and to formulate standards, guidelines, and policy recommendations.

    Our first objective has been the preparation of a detailed document that includes the vision and a specific description of an infrastructure, systems, and methods capable of serving the scientific goals of the community, as well as practical issues for achieving those goals. This report builds on the 1st INCF Workshop on Mouse and Rat Brain Digital Atlasing Systems (Boline et al., 2007, _Nature Precedings_, doi:10.1038/npre.2007.1046.1) and includes a more detailed analysis of both the current state and the desired state of digital atlasing, along with specific recommendations for achieving these goals.
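
    A shared spatial framework of the kind the task force recommends ultimately rests on well-defined coordinate transformations between atlas spaces. The sketch below shows that basic operation with an invented affine matrix; it is illustrative only, not an INCF specification.

```python
# Illustrative only: mapping a point between two hypothetical atlas spaces
# with an affine transform; the matrix values are invented.
import numpy as np

# 4x4 affine: rotation/scale in the upper-left 3x3, translation in the last column
atlas_a_to_b = np.array([
    [1.0, 0.0, 0.0,  2.5],
    [0.0, 1.0, 0.0, -1.0],
    [0.0, 0.0, 1.0,  0.3],
    [0.0, 0.0, 0.0,  1.0],
])

point_in_a = np.array([10.0, 4.0, 7.0, 1.0])  # homogeneous coordinates
point_in_b = atlas_a_to_b @ point_in_a
print(point_in_b[:3])                          # [12.5  3.   7.3]
```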