
    Metainformation scenarios in Digital Humanities: Characterization and conceptual modelling strategies

    Requirements for the analysis, interpretation and reuse of information are becoming increasingly ambitious as we generate larger and more complex datasets. This is leading to the development and widespread use of information about information, often called metainformation (or metadata) in most disciplines. The Digital Humanities are no exception. We often assume that metainformation helps us document information for future reference by recording who created it, when and how, among other aspects. We also assume that recording metainformation will facilitate the task of interpreting information at later stages. However, some works have identified issues with existing metadata approaches, related to 1) the proliferation of too many “standards” and the difficulty of choosing among them; 2) the generalized assumption that metadata and data (or metainformation and information) are essentially different, and the subsequent development of separate sets of languages and tools for each (introducing redundant models); and 3) the combination of conceptual and implementation concerns within most approaches, violating basic engineering principles of modularity and separation of concerns. Some of these problems are especially relevant in Digital Humanities. In addition, we argue here that the lack of characterization of the scenarios in which metainformation plays a relevant role in humanistic projects often results in metainformation being recorded and managed without a specific purpose in mind. In turn, this hinders decision making on issues such as what metainformation must be recorded in a specific project, and how it must be conceptualized, stored and managed. This paper presents a review of the metadata approaches most used in Digital Humanities and, taking a conceptual modelling perspective, analyses their major issues as outlined above. It also describes the most common scenarios for the use of metainformation in Digital Humanities, presenting a characterization that can assist in setting goals for metainformation recording and management in each case. Based on these two aspects, a new approach is proposed for the conceptualization, recording and management of metainformation in the Digital Humanities, using the ConML conceptual modelling language and adopting the overall view that metainformation is not essentially different from information. The proposal is validated in Digital Humanities scenarios through case studies employing real-world datasets. This work was partially supported by the Spanish Ministry of Economy, Industry and Competitiveness under its Competitive Juan de la Cierva Postdoctoral Research Programme (FJCI-2016-28032).
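    As a rough illustration of that overall view, the sketch below (minimal Python, not the paper's ConML models; every class and field name is hypothetical) expresses metainformation with exactly the same constructs as information, so one model and one toolset serve both levels.

        # Minimal sketch, assuming metainformation is just information
        # that happens to be *about* another record.
        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class Record:
            """A generic information object; metadata records reuse this class."""
            id: str
            attributes: dict = field(default_factory=dict)
            about: Optional["Record"] = None  # set only for metainformation

        # Ordinary (domain) information: a digitized manuscript.
        manuscript = Record("ms-042", {"title": "Codex A", "language": "Latin"})

        # Metainformation as a plain Record pointing at the manuscript:
        # who created the description, when, and how.
        provenance = Record(
            "meta-001",
            {"created_by": "J. Doe", "created_on": "2017-03-01", "method": "manual"},
            about=manuscript,
        )

        # One representation for both levels means generic queries work on
        # either: e.g., find every record that documents `manuscript`.
        def describing(target, records):
            return [r for r in records if r.about is target]

        print([r.id for r in describing(manuscript, [manuscript, provenance])])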

    Searching for a Time Ontology for Semantic Web Applications

    We present our experience of reusing time ontologies in Fund Finder, a semantic web application of the EU Esperonto project. On the one hand, we show a set of time ontologies implemented in different machine-readable languages; these ontologies are analyzed against a series of features typical of time models (e.g., whether they support different granularities or different time zones). On the other hand, we present the specification of the time modeling needs of Fund Finder. Finally, we choose the ontology that best fits these specifications.
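    The selection step can be pictured as a simple feature-coverage comparison, as in the sketch below (illustrative Python; the feature names, candidate ontologies and requirements are invented for the example, not taken from the Esperonto specification).

        # Hypothetical feature matrix: which time-model features each
        # candidate ontology advertises.
        CANDIDATES = {
            "ontology_A": {"intervals", "instants"},
            "ontology_B": {"granularities", "time_zones", "intervals", "instants"},
            "ontology_C": {"granularities", "intervals"},
        }

        # Hypothetical application requirements for time modeling.
        REQUIREMENTS = {"granularities", "time_zones", "intervals"}

        def coverage(features, required):
            """Fraction of the required features an ontology offers."""
            return len(features & required) / len(required)

        # Choose the ontology that best fits the specification.
        best = max(CANDIDATES, key=lambda n: coverage(CANDIDATES[n], REQUIREMENTS))
        print(best)  # ontology_B covers all three requirements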

    An ontological model for the reality-based 3D annotation of heritage building conservation state

    The conservation and restoration of historical monuments require a diagnostic analysis carried out by a multidisciplinary team. The results of the diagnosis include data produced by different techniques and protocols, which are used by conservation scientists to assess the built heritage. Nowadays, together with the aforementioned data, a great deal of heterogeneous information is also available, including descriptive and contextual information, as well as 2D/3D geometrical restitutions of the studied object. However, the integration of these diverse data into a unique information model capable of fully describing the building conservation state, as well as integrating future data, is still an open issue within the Cultural Heritage community. It is of paramount importance to correlate these data and spatialize them in order to provide scientists in charge of our heritage with a practical and easy means to explore the information used during their assessment, as well as a way to record their scientific observations and share them within their community of practice. In order to resolve this issue, we developed a correlation pipeline for the integration of the semantic, spatial and morphological dimensions of a built heritage. The pipeline uses an ontological model for recording and integrating multidisciplinary observations of the conservation state into structured data spatialized onto a semantic-aware 3D representation. The pipeline was successfully tested on the Saint Maurice church of Caromb in the south of France, integrating into a unique spatial representation information about materials and alteration phenomena, providing users with a means to correlate and, more importantly, retrieve several types of information.
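    The correlation idea can be sketched schematically (Python below; the class and attribute names are hypothetical, not the paper's ontology): each observation is tied both to a concept, such as a material or an alteration phenomenon, and to a region of the reality-based 3D model, so retrieval can go in either direction.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Region3D:
            """A set of faces of the reality-based mesh (hypothetical ids)."""
            mesh_id: str
            face_ids: tuple

        @dataclass(frozen=True)
        class Observation:
            observer: str
            concept: str        # e.g. "limestone", "biological colonization"
            region: Region3D

        annotations = [
            Observation("conservator-1", "biological colonization",
                        Region3D("nave-mesh", (12, 13, 17))),
            Observation("conservator-2", "limestone",
                        Region3D("nave-mesh", (12, 13))),
        ]

        # Spatial-to-semantic retrieval: what has been observed on face 12?
        hits = [o.concept for o in annotations if 12 in o.region.face_ids]
        print(hits)  # both the material and the alteration phenomenon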

    GEOMATICS FOR EMERGENCY MANAGEMENT PURPOSES: DESIGN OF A GLOBAL GEODATABASE

    Nowadays, the world is facing disasters on an unprecedented scale: millions of people are affected by natural disasters globally each year and, in the last decade alone, more than 80% of all disaster-related deaths were caused by natural hazards. Scientific predictions and evidence indicate that global climate change is increasing the number of extreme events, creating more frequent and intensified natural hazards such as floods and windstorms. Population growth, urbanization and the inability of poor populations to escape the vicious cycle of poverty make it likely that the number of people vulnerable to natural hazards will increase, with a resulting rise in natural disasters and environmental emergencies. In recent years, international concern over disasters and their impacts has intensified and risen closer to the top of the development agenda. For many years, response to disasters was largely confined to emergency relief and short-term life-saving actions, but over the last two decades the critical importance of disaster preparedness, mitigation and prevention has been widely recognized. The humanitarian and United Nations systems are therefore called to intensify their efforts to improve their capacity to support the countries in need and to be better prepared to intervene; such requests have come, among others, from the UN Secretary-General on various occasions. In the frame of humanitarian operations, the World Food Programme (WFP) of the United Nations is on the front line. The WFP is the biggest UN agency and responds to more than 120 emergencies per year worldwide. Under the UN reform, WFP also leads logistics for the UN and international bodies during emergency response operations. WFP initiated a process to reinforce its capacity to be a leading force in the area of emergency response, improving its Information Management capacity in support of emergency preparedness and response. To do so, an agreement of collaboration with the recently formed Information Technology for Humanitarian Assistance, Cooperation and Action (ITHACA) Association was signed and a joint collaboration started in February 2007. One of the objectives of the collaboration concerns the use of Geomatics and Information Technology instruments in the field of Early Warning and Early Impact analysis. Many worldwide experiences in this area show that the use of remote sensing and Geographic Information Systems (GIS) technologies, combined with up-to-date, reliable and easily accessible reference base geographic datasets, constitutes the key factor for the success of emergency operations and for developing valuable natural disaster preparedness, mitigation and prevention systems. As a matter of fact, the unique characteristics of geographic, or geospatial, information technologies facilitate the integration of scientific, social and economic data through space and time, opening up interesting possibilities for monitoring, assessment and change detection activities, thus enabling better informed interventions in human and natural systems. Besides its proven value, geospatial information is an expensive resource and needs to be fully utilized to maximize the return on the investment required for its generation, management and use.

    Reuse and sharing of spatial information for multiple purposes is an important approach in countries where investment in spatial data collection and its appropriate management has advanced on the basis of its known asset value. Very substantial economic benefits have been estimated by countries that have moved towards optimizing data reuse. However, it is still relatively easy to find examples of projects and other development activities around the globe that required expensive recapture of essential spatial data because the data were originally captured in unique or non-standard file formats, or perhaps discarded after initial use. Recapture of data has also been undertaken in many cases simply because its prior existence was known only to its originators. The United Nations has not been immune to this problem, both within and between the multitude of entities that make up the Secretariat and its agencies, funds and programmes. Historically, the production and use of geospatial data within the UN has been accomplished by its component organizations according to their individual needs and expertise. This has resulted in duplicated efforts, reduced opportunities for sharing and reuse of data, and an unnecessary cost burden for the UN system as a whole. Thus, a framework data development approach has been considered necessary, resulting in the proposal to implement a UN Spatial Data Infrastructure (SDI). The term SDI denotes the relevant base collection of technologies, policies and institutional arrangements that facilitate the availability of and access to spatial data. An SDI hosts geographic data and attributes, sufficient documentation (metadata), a means to discover, visualize and evaluate the data (catalogues and Web mapping), and methods to provide access to the geographic data; beyond this, it may also host additional services or software to support applications of the data. The concept of developing a Spatial Data Infrastructure to fulfil UN data management needs was approved by United Nations Geographic Information Working Group (UNGIWG) members in 2005 at their 6th Plenary Meeting in Addis Ababa, in the context of a UN-specific SDI, or UNSDI. The WFP, like all other UN agencies, has been called to develop a Spatial Data Infrastructure according to the UNGIWG recommendations. Therefore, during the last year the different units of WFP involved in the use of geospatial data worked on defining and implementing a WFP SDI, with the aim of contributing to the wider UNSDI project. This effort was coordinated and supported by the ITHACA Association.

    Aim of the study. The objective of the research has been to investigate the best solution for collecting and organizing geospatial data within a suitable geodatabase, with two main purposes:
    - to support the WFP SDI effort: the development of consistent, reusable themes of base cartographic content, known as Framework, Fundamental or Core Data, is recognized as the first and main ingredient in the construction of an SDI. The definition of a geodatabase supporting all the WFP units dealing with GIS and geospatial data can therefore be considered a fundamental and necessary step in the whole complex process of developing the WFP SDI. Commonly used data provide the key for integration and, in the context of the SDI implementation, the definition of a Core Data geodatabase can be seen as one instrument for improving interoperability and reducing the expenses resulting from otherwise inevitable duplications. Moreover, the major aim of the planned geodatabase is to supply all WFP users with a "minimum spatial dataset" that supports valuable geographic analyses and mapping in support of decision makers during emergency operations;
    - to support all activities carried out by ITHACA: the planned geodatabase must constitute a suitable instrument for integrating and organizing the large amount of geospatial data needed by all ITHACA units in their activities, allowing their effective distribution, sharing and reuse while avoiding any duplication. Moreover, the implemented solution must guarantee the correct management and updating of the data, preserving their integrity. Finally, this instrument must also allow the easy and fast sharing of information produced by ITHACA during Early Impact activities with the WFP users engaged in emergency rescue operations.

    In conclusion, the major expected output of the study described in this thesis has been the design and development of a global database, and of related rules and procedures, to correctly store, manage and exchange the geospatial data needed by both WFP humanitarian workers and ITHACA users. The developed database solution allows integrating and updating globally consistent geographic data coming from different sources in many formats, providing each user with the latest datasets and thus avoiding duplications and mistakes. In methodological terms, the following procedure has been adopted:
    - defining requirements: identification of all activities to be supported by the geodatabase, analysis of the data flows expected in all supported activities, and examination of existing data sources and relevant standards (particularly those proposed by the UNGIWG);
    - development of the data model: the data model has been shaped according to the specific needs and demands of the involved user groups within the different interested organizations. The adopted design techniques do not depart from those proposed in the literature for general database design, even though it has been necessary, in some steps, to consider the specific features of geographic data;
    - geodatabase schema generation and implementation of the defined geographic database model as an ESRI ArcSDE Enterprise Geodatabase based on Oracle 10g as the DBMS.
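    The heart of the "minimum spatial dataset" idea, a single authoritative store in which reloading a feature updates the stored copy instead of duplicating it, can be illustrated with a toy sketch (Python with SQLite purely for illustration; the thesis implements this as an ESRI ArcSDE Enterprise Geodatabase on Oracle, and all table and column names here are invented).

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.execute("""
            CREATE TABLE core_features (
                layer      TEXT NOT NULL,   -- e.g. 'roads', 'admin_boundaries'
                feature_id TEXT NOT NULL,   -- stable id from the source dataset
                source     TEXT NOT NULL,   -- originating organization
                updated_on TEXT NOT NULL,
                geometry   TEXT NOT NULL,   -- WKT stand-in for the geometry
                PRIMARY KEY (layer, feature_id)
            )
        """)

        def load(layer, feature_id, source, updated_on, wkt):
            """Insert a feature, or replace the stored copy if it exists."""
            db.execute(
                "INSERT OR REPLACE INTO core_features VALUES (?, ?, ?, ?, ?)",
                (layer, feature_id, source, updated_on, wkt),
            )

        # The same road arriving twice (initial capture, then an update)
        # yields one row: every user sees only the latest version.
        load("roads", "r-001", "WFP", "2007-02-01", "LINESTRING(0 0, 1 1)")
        load("roads", "r-001", "ITHACA", "2007-06-15", "LINESTRING(0 0, 1 2)")
        print(db.execute("SELECT COUNT(*) FROM core_features").fetchone()[0])  # 1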

    Semantic location extraction from crowdsourced data

    Crowdsourced Data (CSD) has recently received increased attention in many application areas, including disaster management. Convenience of production and use, data currency and abundance are some of the key reasons for this high interest. Conversely, quality issues such as incompleteness, credibility and relevancy prevent the direct use of such data in important applications like disaster management. Moreover, the availability of location information in CSD is problematic, as it remains very low on many crowdsourced platforms such as Twitter. Also, the recorded location is mostly related to the mobile device or user location and often does not represent the event location. In CSD, the event location is discussed descriptively in the comments in addition to the recorded location (which is generated by means of the mobile device's GPS or the mobile communication network). This study attempts to semantically extract CSD location information with the help of an ontological gazetteer and other available resources. Tweets from the 2011 Queensland flood and Ushahidi Crowd Map data were semantically analysed to extract location information with the support of the Queensland Gazetteer, converted into an ontological gazetteer, and a global gazetteer. Preliminary results show that the use of ontologies and semantics can improve the accuracy of place name identification in CSD and the process of location information extraction.
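    The matching step can be sketched as follows (minimal Python; the gazetteer entries and the tweet are invented examples, not drawn from the Queensland data): scanning the message text for gazetteer place names lets the event location come from the content rather than from the device's GPS.

        # Hypothetical gazetteer: name -> (latitude, longitude, feature type).
        GAZETTEER = {
            "ipswich":  (-27.61, 152.76, "populated place"),
            "brisbane": (-27.47, 153.03, "populated place"),
            "bremer":   (-27.63, 152.75, "river"),
        }

        def extract_locations(message):
            """Return gazetteer entries whose names occur in the message."""
            tokens = message.lower().replace(",", " ").split()
            return {t: GAZETTEER[t] for t in tokens if t in GAZETTEER}

        tweet = "Flood water rising fast in Ipswich, Bremer about to break its banks"
        for name, (lat, lon, kind) in extract_locations(tweet).items():
            print(f"{name}: {kind} at ({lat}, {lon})")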

    Big Data in Bioeconomy

    This edited open access book presents the comprehensive outcome of the European DataBio project, which examined new data-driven methods to shape a bioeconomy. These methods are used to develop new and sustainable ways to use forest, farm and fishery resources. As a European initiative, the goal is to use these new findings to support decision-makers and producers, meaning farmers, land and forest owners and fishermen. Drawing on 27 pilot projects from 17 countries, the authors examine important sectors and highlight examples where modern data-driven methods were used to increase sustainability. How can farmers, foresters or fishermen use these insights in their daily lives? The authors answer this and other questions for our readers. The first four parts of this book give an overview of the big data technologies relevant for optimal raw material gathering. The next three parts put these technologies into perspective by showing usable applications from farming, forestry and fishery. The final part of this book gives a summary and a view of the future. With its broad outlook and variety of topics, this book is an enriching resource for students and scientists in bioeconomy, biodiversity and renewable resources.

    The Nexus Between Security Sector Governance/Reform and Sustainable Development Goal-16

    This Security Sector Reform (SSR) Paper offers a universal and analytical perspective on the linkages between Security Sector Governance (SSG)/SSR (SSG/R) and Sustainable Development Goal 16 (SDG-16), focusing on conflict and post-conflict settings as well as transitional and consolidated democracies. Against the background of the development and security literatures traditionally maintaining separate and compartmentalized presences in both academic and policymaking circles, it maintains that contemporary security- and development-related challenges are inextricably linked, and that effective measures require an accurate understanding of the nature of these challenges. In that sense, SDG-16 is surely a step in the right direction. After comparing and contrasting SSG/R and SDG-16, this SSR Paper argues that human security lies at the heart of the nexus between the United Nations (UN) 2030 Agenda and SSG/R. To do so, it first provides a brief overview of the scholarly and policymaking literature on the development-security nexus, setting the background for the adoption of the 2030 Agenda. Next, it reviews the literature on SSG/R and the SDGs, and how each concept evolved over time. It then identifies the puzzle this study seeks to address by comparing and contrasting SSG/R with SDG-16. After making the case that human security lies at the heart of the nexus between the UN's 2030 Agenda and SSG/R, it analyses the strengths and weaknesses of human security as a bridge between SSG/R and SDG-16 and makes policy recommendations on how SSG/R, bolstered by human security, may help achieve better results on the SDG-16 targets. It specifically emphasizes the importance of transparency, oversight and accountability on the one hand, and a participative approach and local ownership on the other. It concludes by arguing that a simultaneous emphasis on security and development is sorely needed to address the issues under the purview of SDG-16.

    Proceedings of the 10th International Conference on Ecological Informatics: translating ecological data into knowledge and decisions in a rapidly changing world: ICEI 2018

    The Conference Proceedings are an impressive display of the current scope of Ecological Informatics. Whilst Data Management, Analysis, Synthesis and Forecasting have been enduringly popular themes over the past nine biennial ICEI conferences, ICEI 2018 addresses distinctively novel developments in Data Acquisition enabled by cutting-edge in situ and remote sensing technology. The ICEI 2018 abstracts presented here capture well the current trends and challenges of Ecological Informatics towards:
    • regional, continental and global sharing of ecological data,
    • thorough integration of complementary monitoring technologies including DNA-barcoding,
    • sophisticated pattern recognition by deep learning,
    • advanced exploration of valuable information in ‘big data’ by means of machine learning and process modelling,
    • decision-informing solutions for biodiversity conservation and sustainable ecosystem management in light of global changes.