
    Schema Vacuuming in Temporal Databases

    Temporal databases support historical information by recording the intervals during which a tuple was applicable (along one or more temporal dimensions). Because data are never deleted, only superseded, temporal databases are inherently append-only, resulting, over time, in a large historical sequence of database states. Data vacuuming in temporal databases allows this sequence to be shortened by strategically, and irrevocably, deleting obsolete data. Schema versioning allows users to maintain a history of database schemata without compromising the semantics of the data or the ability to view data through historical schemata. While the techniques required for data vacuuming in temporal databases have been relatively well covered, the associated area of vacuuming schemata has received less attention. This paper discusses this issue and proposes a mechanism that fits well with existing methods for data vacuuming and schema versioning.
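The append-only discipline and the vacuuming cut-off described above can be sketched as follows. This is a minimal illustration assuming a single valid-time dimension; the class and method names are invented for the example and are not taken from the paper.

```python
from datetime import date

OPEN_END = date.max  # sentinel for "valid until changed"

class TemporalTable:
    """Append-only valid-time table: updates supersede, never overwrite."""

    def __init__(self):
        self.rows = []  # tuples of (key, value, valid_from, valid_to)

    def insert(self, key, value, valid_from):
        self.rows.append((key, value, valid_from, OPEN_END))

    def update(self, key, value, as_of):
        # Close the interval of the current version instead of deleting it.
        for i, (k, v, frm, to) in enumerate(self.rows):
            if k == key and to == OPEN_END:
                self.rows[i] = (k, v, frm, as_of)
        self.rows.append((key, value, as_of, OPEN_END))

    def vacuum(self, cutoff):
        # Irrevocably drop versions whose validity ended before the cutoff.
        self.rows = [r for r in self.rows if r[3] == OPEN_END or r[3] >= cutoff]

t = TemporalTable()
t.insert("acct-1", 100, date(2000, 1, 1))
t.update("acct-1", 150, date(2005, 6, 1))
t.update("acct-1", 200, date(2010, 3, 1))
t.vacuum(date(2006, 1, 1))  # removes only the version superseded in 2005
```

After the vacuum, the 2000-2005 version is gone, but the still-valid history from 2005 onward remains queryable.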

    Multitemporal conditional schema evolution

    Data Model of a GIS-Based Forest Information System (Paikkatietojärjestelmään perustuvan metsätietojärjestelmän tietomalli)

    Searching for a Time Ontology for Semantic Web Applications

    We present our experience of reusing time ontologies in Fund Finder, a semantic web application of the EU Esperonto project. On the one hand, we show a set of time ontologies implemented in different machine-readable languages. These ontologies are analyzed with respect to a series of features typical of time models (e.g., whether they support different granularities or different time zones). On the other hand, we present the specification of the time-modeling requirements of Fund Finder. Finally, we choose the ontology that best fits the specification.
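The selection step the abstract describes, matching candidate ontologies against an application's required features, can be sketched as a simple coverage score. The ontology names and feature sets below are illustrative placeholders, not the paper's actual data.

```python
# Features the application needs from a time model (assumed example set).
required = {"granularities", "intervals", "time_zones"}

# Candidate ontologies and the features each one models (placeholders).
candidates = {
    "ontology_A": {"intervals", "instants"},
    "ontology_B": {"granularities", "intervals", "time_zones", "instants"},
    "ontology_C": {"granularities", "intervals"},
}

def best_fit(required, candidates):
    # Pick the candidate covering the most required features.
    return max(candidates, key=lambda name: len(required & candidates[name]))

choice = best_fit(required, candidates)
```

In practice the comparison would also weigh qualitative criteria (implementation language, maturity, licensing), but a feature matrix like this makes the trade-offs explicit.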

    GEOMATICS FOR EMERGENCY MANAGEMENT PURPOSES: DESIGN OF A GLOBAL GEODATABASE

    Nowadays, the world is facing disasters on an unprecedented scale: millions of people are affected by natural disasters globally each year and, in the last decade alone, more than 80% of all disaster-related deaths were caused by natural hazards. Scientific predictions and evidence indicate that global climate change is increasing the number of extreme events, creating more frequent and intensified natural hazards such as floods and windstorms. Population growth, urbanization and the inability of poor populations to escape the vicious cycle of poverty make it foreseeable that the number of people vulnerable to natural hazards will most likely increase, with a resulting increase in natural disasters and environmental emergencies. In recent years, international concern about disasters and their impacts has intensified and risen closer to the top of the development agenda. For many years, response to disasters was largely confined to emergency relief and short-term life-saving actions, but over the last two decades the critical importance of disaster preparedness, mitigation, and prevention has been widely recognized. The humanitarian community and the United Nations system are therefore called to intensify their efforts to improve their capacity to provide support to countries in need and to be better prepared to intervene. Such requests have come, amongst others, from the UN Secretary-General on various occasions. In the framework of humanitarian operations, the World Food Programme (WFP) of the United Nations is in the front line. The WFP is the biggest UN agency and responds to more than 120 emergencies per year worldwide. Under the UN reform, WFP also leads logistics for the UN and international bodies during emergency response operations.
    WFP initiated a process to reinforce its capacity to be a leading force in the area of emergency response, improving its information management capacity in support of emergency preparedness and response. To do so, an agreement of collaboration with the recently formed Information Technology for Humanitarian Assistance Cooperation and Action (ITHACA) Association was signed and a joint collaboration started in February 2007. One of the objectives of the collaboration concerns the use of Geomatics and information technology instruments in the field of Early Warning and Early Impact analysis. Many worldwide experiences in this area show that the use of remote sensing and Geographic Information Systems (GIS) technologies, combined with up-to-date, reliable and easily accessible reference base geographic datasets, constitutes the key factor for the success of emergency operations and for developing valuable natural disaster preparedness, mitigation and prevention systems. As a matter of fact, the unique characteristics of geographic, or geospatial, information technologies facilitate the integration of scientific, social and economic data through space and time, opening up interesting possibilities for monitoring, assessment and change detection activities, thus enabling better informed interventions in human and natural systems. Despite its proven value, geospatial information is an expensive resource and needs to be fully utilized to maximize the return on the investment required for its generation, management and use. Reuse and sharing of spatial information for multiple purposes is an important approach applied in countries where investment in spatial data collection and in its appropriate management has advanced on the basis of its known asset value. Very substantial economic benefits have been estimated by countries that have moved in the direction of optimizing data reuse.
    However, it is still relatively easy to find examples of projects and other development activities from around the globe that required expensive recapture of essential spatial data because the data were originally captured in unique or non-standard file formats, or perhaps discarded after initial use. Recapture of data has also been undertaken in many cases simply because its prior existence was known only to its originators. The United Nations has not been immune to this problem, both within and between the multitude of entities that make up the Secretariat and its agencies, funds and programmes. Historically, the production and use of geospatial data within the UN entities has been carried out by its component organizations according to their individual needs and expertise. This has resulted in duplicated efforts, reduced opportunities for sharing and reuse of data, and an unnecessary cost burden for the UN system as a whole. Thus, a framework data development approach has been considered necessary, resulting in the proposal to implement a UN Spatial Data Infrastructure (SDI). The term SDI denotes the relevant base collection of technologies, policies and institutional arrangements that facilitate the availability of and access to spatial data. An SDI hosts geographic data and attributes, sufficient documentation (metadata), a means to discover, visualize and evaluate the data (catalogues and Web mapping), and methods to provide access to the geographic data. Beyond this, it will also host additional services or software to support applications of the data. The concept of developing a Spatial Data Infrastructure to fulfil UN data management needs was duly approved by United Nations Geographic Information Working Group (UNGIWG) members in 2005 at their 6th Plenary Meeting in Addis Ababa, in the context of a UN-specific SDI, or UNSDI.
    The WFP, like all other UN agencies, has been called to develop a Spatial Data Infrastructure according to the UNGIWG recommendations. Therefore, during the last year the different units of WFP involved in the use of geospatial data worked on defining and implementing a WFP SDI, with the aim of contributing to the whole UNSDI project. This effort was coordinated and supported by the ITHACA Association.
    Aim of the study. The objective of the research has been to investigate the best solution for collecting and organizing geospatial data within a suitable geodatabase, with two main purposes:
    - to support the WFP SDI effort: the development of consistent reusable themes of base cartographic content, known as Framework, Fundamental or Core Data, is recognized as a first and main ingredient in the construction of an SDI. Therefore, the definition of a geodatabase supporting all the WFP units dealing with GIS and geospatial data can be considered a fundamental and necessary step in the whole complex process of developing the WFP SDI. Commonly used data provide the key to integration and, in the context of the SDI implementation, the definition of a Core Data geodatabase can be seen as one instrument to help improve interoperability, reducing the expenses resulting from inevitable duplication. Moreover, the major aim of the planned geodatabase is to supply all WFP users with a "minimum spatial dataset" that supports valuable geographic analyses and mapping in support of decision makers during emergency operations;
    - to support all activities carried out by ITHACA: the planned geodatabase must constitute a suitable instrument that realizes the integration and organization of the large amount of geospatial data needed by all ITHACA units in their activities, allowing their effective distribution, sharing and reuse and avoiding any duplication.
    Moreover, the implemented solution must also guarantee the correct management and updating of the data, preserving their integrity. Finally, this instrument must also allow the easy and fast sharing of the information produced by ITHACA during Early Impact activities with the WFP users engaged in emergency rescue operations. In conclusion, the major expected output of the study described in this thesis has been the design and development of a global database, and of related rules and procedures, to correctly store, manage, and exchange the geospatial data needed by both WFP humanitarian workers and ITHACA users. The developed database solution allows integrating and updating globally consistent geographic data coming from different sources in many formats, providing each user with the latest datasets and thus avoiding duplications and mistakes. In methodological terms, the following procedure has been adopted:
    - definition of requirements: identification of all activities supported by the geodatabase, analysis of the data flows expected in all supported activities, and examination of existing data sources and relevant standards (particularly those proposed by the UNGIWG);
    - development of the data model: the data model has been shaped according to the specific needs and demands of the involved user groups within the different interested organizations. The adopted design techniques do not depart from those proposed in the literature for general database design, even if it has been necessary, in some steps, to consider the specific features of geographic data;
    - geodatabase schema generation and implementation of the defined geographic database model as an ESRI ArcSDE Enterprise Geodatabase based on Oracle 10g as the DBMS.
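The idea of a core-data geodatabase that tracks each dataset's theme, source, and update date (so data can be shared and reused rather than recaptured) can be sketched in miniature. The actual system is an ESRI ArcSDE Enterprise Geodatabase on Oracle 10g; this sketch uses SQLite purely for illustration, and all table and column names are assumptions, not the thesis's schema.

```python
import sqlite3

# In-memory stand-in for the enterprise geodatabase.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dataset (
    id INTEGER PRIMARY KEY,
    theme TEXT NOT NULL,          -- e.g. transport, hydrography, population
    source TEXT NOT NULL,         -- originating agency, to avoid recapture
    last_updated TEXT NOT NULL    -- ISO date, supports update workflows
);
CREATE TABLE feature (
    id INTEGER PRIMARY KEY,
    dataset_id INTEGER NOT NULL REFERENCES dataset(id),
    name TEXT,
    geometry_wkt TEXT             -- geometry stored as Well-Known Text
);
""")
conn.execute("INSERT INTO dataset VALUES (1, 'transport', 'WFP', '2007-02-01')")
conn.execute(
    "INSERT INTO feature VALUES (1, 1, 'airstrip', 'POINT(32.5 15.6)')"
)
# Every feature is traceable to a documented, dated source dataset.
themes = [row[0] for row in conn.execute("SELECT theme FROM dataset")]
```

Keying every feature to a documented dataset record is what lets users judge currency and provenance before reuse, which is the duplication-avoidance argument the abstract makes.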

    Design and Implementation of a Research Data Management System: The CRC/TR32 Project Database (TR32DB)

    Research data management (RDM) includes all processes and measures which ensure that research data are well-organised, documented, preserved, stored, backed up, accessible, available, and re-usable. Corresponding RDM systems or repositories form the technical framework to support the collection, accurate documentation, storage, back-up, sharing, and provision of research data created in a specific environment, such as a research group or institution. The measures required for the implementation of an RDM system vary according to the discipline or the purpose of data (re-)use. In the context of RDM, the documentation of research data is an essential duty. It has to be conducted with accurate, standardized, and interoperable metadata to ensure the interpretability, understandability, shareability, and long-lasting usability of the data. RDM is gaining importance as digital information increases: new technologies make it possible to create ever more digital data, often automatically. Consequently, the volume of digital data, including big data and small data, will approximately double in size every two years. With regard to e-science, this increase of data was predicted and termed the data deluge, and the paradigm change in science has led to data-intensive science. In particular, scientific data financed by public funding are increasingly required by policy makers, funding agencies, journals, and other institutions to be archived, documented, provided, or even made openly accessible. RDM can prevent the loss of data; otherwise around 80-90% of generated research data disappear and are not available for re-use or further studies, leading to empty archives or RDM systems. The reasons for this are well known and are of a technical, socio-cultural, and ethical nature, such as missing user participation and data-sharing knowledge, as well as a lack of time or resources.
    In addition, the fear of exploitation and the missing or limited reward for publishing and sharing data play an important role. This thesis presents an approach to handling the research data of the collaborative, multidisciplinary, long-term DFG-funded research project Collaborative Research Centre/Transregio 32 (CRC/TR32) “Patterns in Soil-Vegetation-Atmosphere Systems: Monitoring, Modelling, and Data Assimilation”. In this context, an RDM system, the so-called CRC/TR32 project database (TR32DB), was designed and implemented. The TR32DB considers the demands of the project participants (e.g. heterogeneous data from different disciplines with various file sizes) and the requirements of the DFG, as well as general challenges in RDM. For this purpose, an RDM system was established that comprises a well-described, self-designed metadata schema, a file-based data storage, a well-elaborated metadata database, and a corresponding user-friendly web interface. The whole system was developed in close cooperation with the local Regional Computing Centre of the University of Cologne (RRZK), where it is also hosted. The documentation of the research data with accurate metadata is of key importance. For this purpose, a specific TR32DB Metadata Schema was designed, consisting of multi-level metadata properties. It distinguishes between general and data-type-specific (e.g. data, publication, report) properties and was developed according to the project background, the demands of the various data types, and recent associated metadata standards and principles. Consequently, it is interoperable with recent metadata standards, such as Dublin Core, the DataCite Metadata Schema, and core elements of the ISO 19115:2003 Metadata Standard and the INSPIRE Directive. Furthermore, the schema supports optional, mandatory, and automatically generated metadata properties, and provides predefined, obligatory, and self-established controlled vocabulary lists.
    The integrated mapping to the DataCite Metadata Schema facilitates the simple application of a Digital Object Identifier (DOI) for a dataset. The file-based data storage is organized in a folder system corresponding to the structure of the CRC/TR32 and additionally distinguishes between several data types (e.g. data, publication, report). It is embedded in the Andrew File System hosted by the RRZK. The file system is capable of storing and backing up all data, is highly scalable, supports location independence, and enables easy administration by Access Control Lists. In addition, the relational database management system MySQL stores the metadata according to the previously mentioned TR32DB Metadata Schema, as well as further necessary administrative data. A user-friendly web-based graphical user interface enables access to the TR32DB system. The web interface provides metadata input, search, and download of data; the visualization of important geodata is handled by an internal WebGIS. This web interface, as well as the entire RDM system, is self-developed and adjusted to the specific demands. Overall, the TR32DB system is developed according to the needs and requirements of the CRC/TR32 scientists, fits the demands of the DFG, and considers general problems and challenges of RDM as well. With regard to the changing demands of the CRC/TR32 and to technological advances, the system is being and will continue to be further developed. The established TR32DB approach has already been successfully applied to another interdisciplinary research project. Thus, the approach is transferable and generally capable of archiving all data generated by the CRC/TR32 with accurate, interoperable metadata, ensuring the re-use of the data beyond the end of the project.
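The two mechanisms described above, multi-level metadata (general properties plus data-type-specific ones) and a mapping onto DataCite fields for DOI registration, can be sketched as follows. The property and field names are illustrative assumptions, not the published TR32DB schema.

```python
# General properties required of every entry (assumed example set).
GENERAL = {"title", "creator", "date"}

# Additional properties required per data type (assumed example sets).
TYPE_SPECIFIC = {
    "data":        {"variable", "instrument"},
    "publication": {"journal", "doi"},
    "report":      {"report_number"},
}

# Which local properties map onto DataCite Metadata Schema fields (assumed).
DATACITE_MAP = {"title": "titles", "creator": "creators", "date": "dates"}

def validate(entry_type, metadata):
    # An entry must supply all general and all type-specific properties.
    required = GENERAL | TYPE_SPECIFIC[entry_type]
    return required <= set(metadata)

def to_datacite(metadata):
    # Keep only the properties that have a DataCite counterpart.
    return {DATACITE_MAP[k]: v for k, v in metadata.items() if k in DATACITE_MAP}

entry = {"title": "Soil moisture 2012", "creator": "CRC/TR32",
         "date": "2012-05-01", "variable": "soil moisture",
         "instrument": "TDR probe"}
ok = validate("data", entry)
dc = to_datacite(entry)
```

Separating validation from export mirrors the abstract's design: the full multi-level schema serves internal documentation, while the DataCite projection is what a DOI registration actually needs.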

    Standardization Definition Document

    The objective of this document is the definition of a set of cartographic and technical standards and directions to be used, adapted or, in minor part, newly established for GMAP. The standards proposed and mentioned in the present document cover geologic and cartographic aspects. Some of the proposed directions and standards are initial ones that are planned to be refined and/or updated throughout the Europlanet 2024 RI project, to be used within the VA activities and for future sustainable European planetary-mapping efforts beyond the RI. The state of the art and relevant documents are included, as well as process-specific and body-specific best practice and exemplary published cases. The approaches for two-dimensional mapping and three-dimensional geologic mapping and modelling are introduced, as well as the range of non-standard map types that are envisaged within GMAP activities. Mapping review directions are indicated, as well as data sharing, distribution and discovery. Proposed standards, best practice, and tools are based on existing ones or on additional or new developments and adaptations. Appendices are included and point to either individual developments or external resources and tools that will be maintained throughout the duration of the research infrastructure, and beyond it, through sustainability. The present document is going to be a live document, permanently accessible on the GMAP wiki and periodically updated in the form of a deliverable.