Assessing digital preservation frameworks: the approach of the SHAMAN project
How can we deliver infrastructure capable of supporting the preservation of digital objects, as well as the services that can be applied to those digital objects, in ways that future unknown systems will understand? A critical problem in developing such systems is validating whether the delivered solution effectively reflects the validated requirements. This is a challenge also for the EU-funded SHAMAN project, which aims to develop an integrated preservation framework using grid technologies for distributed networks of digital preservation systems, for managing the storage, access, presentation, and manipulation of digital objects over time. Recognising this, the project team ensured that an assessment framework was developed alongside the user requirements. This paper presents the assessment of the SHAMAN demonstrators for the memory institution, industrial design and engineering, and eScience domains, from the point of view of users' needs and fitness for purpose. An innovative synergistic combination of TRAC criteria, the DRAMBORA risk registry and mitigation strategies, iRODS rules and information system model requirements has been designed, with the underlying goal of defining the associated policies, rules and state information, and making them wherever possible machine-encodable and enforceable. The described assessment framework can be valuable not only for the implementers of this project's preservation framework, but for the wider digital preservation community, because it provides a holistic approach to assessing and validating the preservation of digital libraries, digital repositories and data centres.
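To make the idea of a machine-encodable, enforceable preservation policy concrete, here is a minimal sketch in Python. The state fields and thresholds are hypothetical illustrations of this editor's choosing; SHAMAN itself expressed such rules in the iRODS rule language rather than in Python.

```python
# Minimal sketch of a machine-encodable preservation policy check.
# The state fields and thresholds below are hypothetical illustrations;
# SHAMAN expressed comparable rules in the iRODS rule language.
from dataclasses import dataclass

@dataclass
class ObjectState:
    """State information kept alongside a preserved digital object."""
    object_id: str
    replica_count: int
    days_since_fixity_check: int

def check_policy(state: ObjectState,
                 min_replicas: int = 2,
                 max_fixity_age_days: int = 90) -> list[str]:
    """Return a list of policy violations for one object."""
    violations = []
    if state.replica_count < min_replicas:
        violations.append(f"{state.object_id}: only {state.replica_count} replica(s)")
    if state.days_since_fixity_check > max_fixity_age_days:
        violations.append(f"{state.object_id}: fixity check overdue")
    return violations

# Example: an object with one replica and a stale fixity check fails both rules.
print(check_policy(ObjectState("obj-001", replica_count=1, days_since_fixity_check=120)))
```

Because each rule reads only recorded state information, checks like these can run automatically across a whole repository, which is the sense in which such policies become enforceable rather than merely documented.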
Education alignment
This essay reviews recent developments in embedding data management and curation skills into information technology, library and information science, and research-based postgraduate courses in various national contexts. The essay also investigates means of joining up formal education with professional development training opportunities more coherently. The potential for using professional internships as a means of improving communication and understanding between disciplines is also explored. A key aim of this essay is to identify what level of complementarity is needed across various disciplines to most effectively and efficiently support the entire data curation lifecycle.
Digital Preservation, Archival Science and Methodological Foundations for Digital Libraries
Digital libraries, whether commercial, public or personal, lie at the heart of the information society. Yet, research into their long-term viability and the meaningful accessibility of their contents remains in its infancy. In general, as we have pointed out elsewhere, 'after more than twenty years of research in digital curation and preservation the actual theories, methods and technologies that can either foster or ensure digital longevity remain startlingly limited.' Research led by DigitalPreservationEurope (DPE) and the Digital Preservation Cluster of DELOS has allowed us to refine the key research challenges – theoretical, methodological and technological – that need attention by researchers in digital libraries during the coming five to ten years, if we are to ensure that the materials held in our emerging digital libraries remain sustainable, authentic, accessible and understandable over time. Building on this work, and taking the theoretical framework of archival science as bedrock, this paper investigates digital preservation and its foundational role if digital libraries are to have long-term viability at the centre of the global information society.
STELLAR (Semantic Technologies Enhancing the Lifecycle of Learning Resources): Jisc Final Report
[Project Summary]
As one of the earliest distance learning providers, The Open University (OU) has a rich heritage of archived learning materials. An ever-increasing amount of that material is in digital form and is being deposited with the University Archive. This growth has been driven by digitisation activity from projects such as AVA (Access to Video Assets) and the Fedora-based Open University Digital Library, 'a place to discover digital and digitised archival content from the OU Library, from videos and images to digitised documents'. Other digital content is being captured from web archiving activities, such as work to preserve Moodle Virtual Learning Environment course websites. An evidence-based understanding is required to inform digital preservation policies, curation strategy and investment in digital library development.
Following the Pre-enhancement, Enhancement and Post-enhancement methodology set out by Jisc, STELLAR adopted the model of a balanced scorecard to ascertain the value ascribed to the non-current learning materials. Four aspects were considered: Personal and professional perspectives of value; Value to the Higher Educational and academic communities; Value to internal processes and cultures; Financial perspectives of value. The outcomes of the survey indicated that stakeholders place a high value on the materials, and that they perceived them to have value in all areas evaluated.
Three OU courses were chosen from the digital library for the transformation stage. These materials were enhanced and transformed into RDF, a process that required more extensive metadata expertise and effort than was expected. Following enhancement, the RDF was accessed through a tool called DiscOU, created by a member of the project team from the OU's Knowledge Media Institute. DiscOU uses both linked data and a semantic meaning engine to analyse the meaning of the text in a search query. This is matched against the meaning of the content derived from an index of the full-text of the digital library content.
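As an illustration of what "transforming materials into RDF" involves in practice, here is a minimal sketch using the Python rdflib library. The namespace, course identifier and property values are hypothetical placeholders, not the project's actual vocabulary or data.

```python
# Minimal sketch of enhancing a course record into RDF with rdflib.
# The namespace, identifiers and values are hypothetical placeholders;
# the project's real mapping required far more metadata expertise.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

OU = Namespace("http://example.org/ou/")  # placeholder namespace

g = Graph()
course = URIRef(OU["course/ex1"])           # hypothetical course identifier
g.add((course, RDF.type, OU.Course))
g.add((course, DCTERMS.title, Literal("Example Foundation Course")))
g.add((course, DCTERMS.date, Literal("1971")))
g.add((course, DCTERMS.subject, Literal("humanities")))

# Serialise the triples as Turtle, the form a tool like DiscOU could index.
print(g.serialize(format="turtle"))
```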
In the final stage, stakeholders were asked through a survey and a series of workshops to use the DiscOU proof-of-concept tool to assess their perception of the value of this transformation. This revealed that, overall, academics and other stakeholders in the university do believe that the value of the selected materials was positively impacted by the application of semantic technologies.
The Role of Evidence in Establishing Trust in Repositories
This article arises from work by the Digital Curation Centre (DCC) Working Group examining mechanisms to roll out audit and certification services for digital repositories in the United Kingdom. Our attempt to develop a program for applying audit and certification processes and tools took as its starting point the RLG-NARA Audit Checklist for Certifying Digital Repositories. Our intention was to appraise critically the checklist and conceive a means of applying its mechanics within a diverse range of repository environments. We were struck by the realization that while a great deal of effort has been invested in determining the characteristics of a 'trusted digital repository', far less effort has concentrated on the ways in which the presence of the attributes can be demonstrated and their qualities measured. With this in mind we sought to explore the role of evidence within the certification process, and to identify examples of the types of evidence (e.g., documentary, observational, and testimonial) that might be desirable during the course of a repository audit.
Bringing self assessment home: repository profiling and key lines of enquiry within DRAMBORA
Digital repositories are a manifestation of complex organizational, financial, legal, technological, procedural, and political interrelationships. Accompanying each of these are innate uncertainties, exacerbated by the relative immaturity of understanding prevalent within the digital preservation domain. Recent efforts have sought to identify core characteristics that must be demonstrable by successful digital repositories, expressed in the form of check-list documents, intended to support the processes of repository accreditation and certification. In isolation though, the available guidelines lack practical applicability; confusion over evidential requirements and difficulties associated with the diversity that exists among repositories (in terms of mandate, available resources, supported content and legal context) are particularly problematic. A gap exists between the available criteria and the ways and extent to which conformity can be demonstrated. The Digital Repository Audit Method Based on Risk Assessment (DRAMBORA) is a methodology for undertaking repository self assessment, developed jointly by the Digital Curation Centre (DCC) and DigitalPreservationEurope (DPE). DRAMBORA requires repositories to expose their organization, policies and infrastructures to rigorous scrutiny through a series of highly structured exercises, enabling them to build a comprehensive registry of their most pertinent risks, arranged into a structure that facilitates effective management. It draws on experiences accumulated throughout 18 evaluative pilot assessments undertaken in an internationally diverse selection of repositories, digital libraries and data centres (including institutions and services such as the UK National Digital Archive of Datasets, the National Archives of Scotland, Gallica at the National Library of France and the CERN Document Server). Other organizations, such as the British Library, have been using sections of DRAMBORA within their own risk assessment procedures.
Despite the attractive benefits of a bottom-up approach, there are implicit challenges posed by neglecting a more objective perspective. Following a sustained period of pilot audits undertaken by DPE, DCC and the DELOS Digital Preservation Cluster aimed at evaluating DRAMBORA, it was observed that, had project members not been present to facilitate each assessment and contribute their objective, external perspectives, the results might have been less useful. Consequently, DRAMBORA has developed in a number of ways: to enable knowledge transfer from the responses of comparable repositories, and to incorporate more opportunities for structured question sets, or key lines of enquiry, that provoke more comprehensive awareness of the applicability of particular threats and opportunities.
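To illustrate the kind of risk registry such a self-assessment produces, here is a minimal Python sketch. The field names, scales and example risks are hypothetical; scoring severity as probability times impact is a common risk-management convention, and DRAMBORA's own scales and registry fields are more detailed.

```python
# Sketch of a risk-registry entry of the kind a DRAMBORA-style self
# assessment produces. Fields, scales and example risks are hypothetical;
# severity = probability * impact is a common convention, not DRAMBORA's
# exact scheme.
from dataclasses import dataclass, field

@dataclass
class Risk:
    identifier: str
    description: str
    owner: str
    probability: int          # e.g. 1 (rare) .. 6 (frequent)
    impact: int               # e.g. 1 (negligible) .. 6 (catastrophic)
    mitigations: list = field(default_factory=list)

    @property
    def severity(self) -> int:
        return self.probability * self.impact

registry = [
    Risk("R01", "Loss of key technical staff", "Repository manager", 4, 5,
         ["Document procedures", "Cross-train staff"]),
    Risk("R02", "Storage media obsolescence", "Preservation officer", 3, 6,
         ["Scheduled migration", "Format monitoring"]),
]

# Rank the registry so the most severe risks surface first for management.
for r in sorted(registry, key=lambda r: r.severity, reverse=True):
    print(f"{r.identifier} severity={r.severity}: {r.description}")
```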
Medical Cyber-Physical Systems Development: A Forensics-Driven Approach
The synthesis of technology and the medical industry has partly contributed to the increasing interest in Medical Cyber-Physical Systems (MCPS). While these systems provide benefits to patients and professionals, they also introduce new attack vectors for malicious actors (e.g. financially- and/or criminally-motivated actors). A successful breach involving an MCPS can impact patient data and system availability. The complexity and operating requirements of an MCPS complicate digital investigations. Coupling this with the potentially vast amounts of information that an MCPS produces and/or has access to is generating discussion not only on how to compromise these systems but, more importantly, on how to investigate them. The paper proposes the integration of forensics principles and concepts into the design and development of an MCPS to strengthen an organization's investigative posture. The framework sets the foundation for future research into the refinement of specific solutions for MCPS investigations.
Comment: This is the pre-print version of a paper presented at the 2nd International Workshop on Security, Privacy, and Trustworthiness in Medical Cyber-Physical Systems (MedSPT 2017).
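As one concrete example of a forensics-by-design building block, here is a Python sketch of a hash-chained audit log, which makes post-incident tampering with recorded events detectable. This is a generic technique offered for illustration, not the specific framework the paper proposes; the device names and events are invented.

```python
# Illustrative sketch of one forensics-by-design building block: a
# hash-chained audit log that makes tampering evident after an incident.
# Generic technique for illustration; not the paper's specific framework.
import hashlib
import json
import time

def append_entry(log: list, event: dict) -> None:
    """Append an event, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"ts": time.time(), "event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})

def verify(log: list) -> bool:
    """Recompute the chain; any modified entry breaks every later hash."""
    prev = "0" * 64
    for entry in log:
        body = {k: entry[k] for k in ("ts", "event", "prev")}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or digest != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log: list = []
append_entry(log, {"device": "infusion-pump-01", "action": "dose_change"})  # hypothetical
append_entry(log, {"device": "infusion-pump-01", "action": "alarm_ack"})
print(verify(log))  # True; altering any earlier entry makes this False
```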
Transformative Effects of NDIIPP, the Case of the Henry A. Murray Archive
This article comprises reflections on the changes to the Henry A. Murray Research Archive, catalyzed by involvement with the National Digital Information Infrastructure and Preservation Program (NDIIPP) partnership, and the accompanying introduction of next generation digital library software.
Founded in 1976 at Radcliffe, the Henry A. Murray Research Archive is the endowed, permanent repository for quantitative and qualitative research data at the Institute for Quantitative Social Science at Harvard University. The Murray preserves in perpetuity all types of data of interest to the research community, including numerical data, video, audio, and interview notes. The center is unique among data archives in the United States in the extent of its holdings in quantitative, qualitative, and mixed quantitative-qualitative research.
The Murray took part in an NDIIPP-funded collaboration with four other archival partners, Data-PASS, for the purpose of identifying and acquiring data at risk, and jointly developing best practices with respect to shared stewardship, preservation, and exchange of these data. During this time, the Dataverse Network (DVN) software was introduced, facilitating the creation of virtual archives. The combination of institutional collaboration and new technology led the Murray to re-engineer its entire acquisition process; completely rewrite its ingest, dissemination, and other licensing agreements; and adopt a new model for ingest, discovery, access, and presentation of its collections.
Through the Data-PASS project, the Murray has acquired a number of important data collections. The resulting changes within the Murray have been dramatic, including a fourfold increase in its overall rate of acquisitions and far more rapid dissemination of acquisitions. Furthermore, the new licensing and processing procedures allow a previously undreamed-of level of interoperability and collaboration with partner archives, facilitating integrated discovery and presentation services, and joint stewardship of collections.
Cyber insurance of information systems: Security and privacy cyber insurance contracts for ICT and healthcare organizations
Nowadays, more and more aspects of our daily activities are digitalized. Data and assets in cyberspace, for both individuals and organizations, must be safeguarded. Thus, the insurance sector must face the challenge of digital transformation in the 5G era with the right set of tools. In this paper, we present CyberSure, an insurance framework for information systems. CyberSure investigates the interplay between certification, risk management, and insurance of cyber processes. It promotes continuous monitoring as the new building block for cyber insurance, in order to overcome the current obstacles of identifying contractual violations by the insured party in real time and receiving early warning notifications prior to a violation. Lightweight monitoring modules capture the status of the operating components and send data to the CyberSure backend system, which performs the core decision making. Therefore, an insured system is certified dynamically, with the risk and insurance perspectives being evaluated at runtime as the system operation evolves. As new data become available, the risk management and insurance policies are adjusted and fine-tuned. When an incident occurs, the insurance company possesses adequate information to assess the situation quickly, estimate accurately the level of a potential loss, and decrease the required period for compensating the insured customer. The framework is applied in the ICT and healthcare domains, assessing the systems of medium-sized organizations. GDPR implications are also considered, with the overall setting being effective and scalable.
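To make the monitoring idea concrete, here is a minimal Python sketch of a lightweight module that samples component status and posts it to a backend for runtime evaluation. The endpoint URL, component name and metric fields are hypothetical placeholders, not CyberSure's actual API.

```python
# Minimal sketch of a lightweight monitoring module of the kind described:
# sample component status, then POST it to a backend for runtime risk
# evaluation. The URL, component and metrics are hypothetical placeholders.
import json
import time
import urllib.request

BACKEND_URL = "https://cybersure.example.org/api/status"  # placeholder

def sample_status() -> dict:
    """Gather one status snapshot; a real module would read live components."""
    return {
        "component": "ehr-database",        # hypothetical monitored component
        "timestamp": time.time(),
        "uptime_ok": True,
        "failed_logins_last_hour": 3,
        "encryption_enabled": True,
    }

def report(status: dict) -> None:
    """POST one status snapshot to the backend as JSON."""
    req = urllib.request.Request(
        BACKEND_URL,
        data=json.dumps(status).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print("backend replied:", resp.status)

if __name__ == "__main__":
    report(sample_status())
```

A stream of such snapshots is what lets the backend re-evaluate risk and adjust insurance terms continuously, rather than only at an annual audit.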