    Digital Preservation Services: State of the Art Analysis

    Research report funded by the DC-NET project. An overview of the state of the art in service provision for digital preservation and curation, focusing on the areas where gaps need to be bridged between e-Infrastructures and efficient, forward-looking digital preservation services. Based on a desktop study and a rapid analysis of some 190 currently available tools and services for digital preservation, the deliverable provides a high-level view of the range of instruments currently on offer to support various functions within a preservation system. European Commission, FP7. Peer-reviewed.

    DASCH: Data and Service Center for the Humanities

    Research data in the humanities needs to be sustainable, and access to digital resources must be possible over a long period. Only if these prerequisites are fulfilled can research data be used as a source for other projects. In addition, reliability is a fundamental requirement so that digital sources can be cited, reused, and quoted. To address this problem, we present our solution: the Data and Service Center for the Humanities, located in Switzerland. The centralized infrastructure is based on flexible and extendable software built on modern technologies. Such an approach allows for the straightforward migration of existing humanities research project databases with limited life spans. We demonstrate the basic concepts behind this proposed solution and our first experiences in applying it.
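
    The migration idea lends itself to a brief illustration. The Python sketch below maps a project-specific database row onto a generic, citable resource with a persistent identifier; the field names, helper function, and ARK-style identifier are assumptions made for illustration only, not DASCH's actual data model or API.

        # Hypothetical illustration: a legacy project database row is
        # mapped onto a generic, citable resource with a persistent ID.
        # Field names and the ARK-style ID are assumptions, not DASCH's
        # actual data model.
        import uuid

        legacy_row = {"inv_no": "1887-42", "desc": "Letter, Basel, 1887"}

        def to_generic_resource(row: dict, project: str) -> dict:
            return {
                # 99999 is the ARK test namespace; a real service would
                # use its own registered NAAN.
                "id": f"ark:/99999/{project}-{uuid.uuid4().hex[:8]}",
                "project": project,
                "properties": dict(row),  # keep original fields for reuse
            }

        print(to_generic_resource(legacy_row, "letters"))

    Because the generic resource keeps the original fields verbatim, a migrated database remains citable and reusable even after the originating project ends.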

    The application of workflows to digital heritage systems

    Digital heritage systems usually handle a rich and varied mix of digital objects, accompanied by complex and intersecting workflows and processes. However, they usually lack effective workflow management within their components, as evidenced by the absence of integrated solutions that include workflow components. There are a number of reasons for this limited utilization of workflow management, including technical challenges, the unique nature of each digital resource, and the constraints imposed by the environments and infrastructure in which such systems operate. This thesis investigates the use of Workflow Management Systems (WfMS) within Digital Library Systems, and more specifically in online Digital Heritage Resources. The research involved the design and development of a novel experimental WfMS to test the viability of effective workflow management for the complex processes that exist in digital library and heritage resources. This rarely studied area is covered by analyzing evolving workflow management technologies and paradigms. The different operational and technological aspects of these systems are evaluated, focusing on the areas that traditional systems often fail to address. A digital heritage resource was created to test a novel concept called DISPLAYS (Digital Library Services for Playing with Antiquity and Shared Heritage), which provides creation, archival, exposition, presentation, and interaction services for digital heritage collections. Based on DISPLAYS, a specific digital heritage resource was created to validate the concept and, more importantly, to act as a test bed for validating workflow management for digital heritage resources. This DISPLAYS-type implementation was called the Reanimating Cultural Heritage resource; its three core components are the archival, retrieval, and presentation components. To validate workflow management and its concepts, a limited version of these components was implemented within a workflow management host to test whether workflow technology is a viable choice for managing control and data flow within a digital heritage system: this was successfully demonstrated.
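
    To make the control-flow and data-flow idea concrete, here is a minimal Python sketch of a workflow engine chaining archival, retrieval, and presentation steps. All class, function, and step names are hypothetical illustrations of the general technique, not the thesis's actual WfMS or DISPLAYS interfaces.

        # Minimal workflow sketch: control flow is the ordered step list;
        # data flow is the payload handed from one step to the next.
        from dataclasses import dataclass, field
        from typing import Callable

        @dataclass
        class WorkflowEngine:
            steps: list = field(default_factory=list)

            def add_step(self, name: str, task: Callable) -> None:
                self.steps.append((name, task))

            def run(self, payload: dict) -> dict:
                for name, task in self.steps:
                    print(f"running step: {name}")
                    payload = task(payload)  # output feeds the next step
                return payload

        def archive(obj: dict) -> dict:    # hypothetical archival component
            return {**obj, "archived": True}

        def retrieve(obj: dict) -> dict:   # hypothetical retrieval component
            return {**obj, "retrieved": True}

        def present(obj: dict) -> dict:    # hypothetical presentation component
            return {**obj, "presented": True}

        engine = WorkflowEngine()
        engine.add_step("archival", archive)
        engine.add_step("retrieval", retrieve)
        engine.add_step("presentation", present)
        print(engine.run({"id": "heritage-object-001"}))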

    Impliance: A Next Generation Information Management Appliance

    "[…]ably successful in building a large market and adapting to the changes of the last three decades, its impact on the broader market of information management is surprisingly limited. If we were to design an information management system from scratch, based upon today's requirements and hardware capabilities, would it look anything like today's database systems?" In this paper, we introduce Impliance, a next-generation information management system consisting of hardware and software components integrated to form an easy-to-administer appliance that can store, retrieve, and analyze all types of structured, semi-structured, and unstructured information. We first summarize the trends that will shape information management for the foreseeable future. Those trends imply three major requirements for Impliance: (1) to be able to store, manage, and uniformly query all data, not just structured records; (2) to be able to scale out as the volume of this data grows; and (3) to be simple and robust in operation. We then describe four key ideas that are uniquely combined in Impliance to address these requirements: (a) integrating software and off-the-shelf hardware into a generic information appliance; (b) automatically discovering, organizing, and managing all data, unstructured as well as structured, in a uniform way; (c) achieving scale-out by exploiting simple, massively parallel processing; and (d) virtualizing compute and storage resources to unify, simplify, and streamline the management of Impliance. Impliance is an ambitious, long-term effort to define simpler, more robust, and more scalable information systems for tomorrow's enterprises. Comment: This article is published under a Creative Commons License Agreement (http://creativecommons.org/licenses/by/2.5/). You may copy, distribute, display, and perform the work, make derivative works, and make commercial use of the work, but you must attribute the work to the author and CIDR 2007. 3rd Biennial Conference on Innovative Data Systems Research (CIDR), January 7-10, 2007, Asilomar, California, USA.
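
    Requirement (1) and idea (b), uniform management of structured and unstructured data, can be illustrated with a toy Python sketch. The records, schema, and matching logic below are assumptions invented for illustration; they are not the Impliance design.

        # Toy sketch: one query interface over structured records and
        # unstructured text. Real systems would use indexes, not scans.
        structured = [
            {"type": "record", "customer": "ACME", "order_total": 1200},
        ]
        unstructured = [
            {"type": "text", "body": "ACME filed a ticket about late delivery."},
        ]

        def uniform_query(term: str):
            """Yield any item, structured or not, that mentions the term."""
            for item in structured + unstructured:
                haystack = " ".join(str(v) for v in item.values())
                if term.lower() in haystack.lower():
                    yield item

        for hit in uniform_query("acme"):
            print(hit)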

    Helmholtz Portfolio Theme Large-Scale Data Management and Analysis (LSDMA)

    The Helmholtz Association funded the "Large-Scale Data Management and Analysis" (LSDMA) portfolio theme from 2012 to 2016. Four Helmholtz centres, six universities, and another research institution in Germany joined forces to enable data-intensive science by optimising data life cycles in selected scientific communities. In our Data Life Cycle Labs, data experts performed joint R&D together with scientific communities. The Data Services Integration Team focused on generic solutions applied by several communities.

    A Framework for Web Object Self-Preservation

    We propose and develop a framework based on emergent behavior principles for the long-term preservation of digital data using the web infrastructure. We present the development of the framework, called unsupervised small-world (USW), which sits at the nexus of emergent behavior, graph theory, and digital preservation. The USW algorithm creates graph-based structures on the Web that are used for the preservation of web objects (WOs). Emergent behavior activities, based on Craig Reynolds' "boids" concept, are used to preserve WOs without the need for a central archiving authority. Graph theory is extended by developing an algorithm that incrementally creates small-world graphs, and it provides a foundation for discussing the vulnerability of graphs to different types of failures and attack profiles. Investigation into the robustness and resilience of USW graphs led to the development of a metric that quantifies the effect of damage inflicted on a graph; the metric remains valid whether or not the graph is connected. Different USW preservation policies are explored within a simulation environment where preservation copies have to be spread across hosts. Spreading the copies across hosts helps to ensure that copies remain available even when there is a concerted effort to remove all copies of a USW component. A moderately aggressive preservation policy is the most effective at making the best use of host and network resources. Our efforts are directed at answering the following research questions:

    1. Can web objects (WOs) be constructed to outlive the people and institutions that created them? We have developed, analyzed, tested through simulations, and built a reference implementation of the USW algorithm, which we believe will create a connected network of WOs based on the web infrastructure (WI) that outlives the people and institutions that created the WOs. The USW graph will outlive its creators by being robust, continuing to operate when some of its WOs are lost, and resilient, recovering from such losses.

    2. Can we leverage aspects of naturally occurring networks and group behavior for preservation? We used Reynolds' tenets for "boids" to guide our analysis and development of the USW algorithm. The USW algorithm allows a WO to "explore" a portion of the USW graph before making connections to members of the graph and before making preservation copies across the "discovered" graph. Analysis and simulation show that the USW graph has average path length (L(G)) and clustering coefficient (C(G)) values comparable to small-world graphs. A high C(G) is important because it reflects how likely it is that a WO will be able to spread copies to other domains, thereby increasing its likelihood of long-term survival. A short L(G) is important because it means that a WO will not have to look far to identify new candidate preservation domains, if needed. Small-world graphs occur in nature and are thus believed to be robust and resilient. The USW algorithms use these small-world characteristics to spread preservation copies across as many hosts as needed and possible.

    USW graph creation, damage, repair, and preservation have been developed and tested in a simulation and a reference implementation.
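
    The two small-world metrics the abstract leans on, average path length L(G) and clustering coefficient C(G), are straightforward to reproduce. Below is a small Python sketch (assuming the networkx library) that measures both on a stand-in small-world graph and re-measures after random node loss; this is a toy damage experiment in the spirit of the robustness study, not the USW algorithm itself.

        # Measure L(G) and C(G) on a stand-in small-world graph, then
        # damage it and re-measure on the largest surviving component.
        import random
        import networkx as nx

        G = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1)
        print("C(G) =", nx.average_clustering(G))
        print("L(G) =", nx.average_shortest_path_length(G))

        # Remove 10% of nodes to mimic loss of web objects (WOs).
        G.remove_nodes_from(random.sample(list(G.nodes), 10))
        giant = G.subgraph(max(nx.connected_components(G), key=len))
        print("after damage: C =", nx.average_clustering(giant))
        print("after damage: L =", nx.average_shortest_path_length(giant))

    High C(G) and short L(G) surviving the damage step is exactly the behavior the USW preservation policies rely on when spreading copies across hosts.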

    Multi-disciplinary Green IT Archival Analysis: A Pathway for Future Studies

    With the growth of information technology (IT), there is growing global concern about the environmental impact of such technologies. As such, academics in several research disciplines consider research on green IT a vibrant theme. While the disparate knowledge in each discipline is gaining substantial momentum, a consolidated multi-disciplinary view of each discipline's salient findings is needed for green IT research to reach its full potential. We reviewed 390 papers published on green IT from 2007 to 2015 in three disciplines: computer science, information systems, and management. The prevailing literature demonstrates the value of this consolidated approach for advancing our understanding of the complex global issue of environmental sustainability. We provide an overarching theoretical perspective to consolidate multi-disciplinary findings and to encourage information systems researchers to develop an effective cumulative tradition of research.

    AXMEDIS 2008

    The AXMEDIS International Conference series aims to explore all subjects and topics related to cross-media and digital-media content production, processing, management, standards, representation, sharing, protection, and rights management, and to address the latest developments and future trends of the technologies and their applications, impacts, and exploitation. The AXMEDIS events offer venues for exchanging concepts, requirements, prototypes, research ideas, and findings that can contribute to academic research and also benefit business and industrial communities. In the Internet and digital era, cross-media production and distribution represent key developments and innovations, fostered by emergent technologies to ensure better value for money while optimising productivity and market coverage.

    The Inhuman Overhang: On Differential Heterogenesis and Multi-Scalar Modeling

    As a philosophical paradigm, differential heterogenesis offers us a novel descriptive vantage from which to inscribe Deleuze's virtuality within the terrain of "differential becoming," conjugating "pure saliences" so as to parse economies, microhistories, insurgencies, and epistemological evolutionary processes that can be conceived independently of their representational form. Unlike Gestalt theory's oppositional constructions, the advantage of this aperture is that it posits a dynamic context for both media and its analysis, rendering them functionally tractable and set in relation to other objects rather than as sedentary identities. Surveying the genealogy of differential heterogenesis, with particular interest in the legacy of Lautman's dialectic, I make the case for a reading of the Deleuzean virtual that departs from an event-oriented approach, galvanizing Sarti and Citti's dynamic a priori vis-à-vis Deleuze's philosophy of difference. Specifically, I posit differential heterogenesis as a frame with which to examine our contemporaneous epistemic shift as it relates to multi-scalar computational modeling, paying particular attention to neuro-inferential modes of inductive learning and homologous cognitive architecture. Carving a bricolage between Mark Wilson's work on the "greediness of scales" and Deleuze's "scales of reality," this project threads between static ecologies and active externalism vis-à-vis endocentric frames of reference and syntactical scaffolding.