Planning and managing the cost of compromise for AV retention and access
Long-term retention of and access to audiovisual (AV) assets as part of a preservation strategy inevitably involve some form of compromise in order to achieve acceptable levels of cost, throughput, quality, and many other parameters. Examples include quality control and throughput in media transfer chains; data safety and accessibility in digital storage systems; and service levels for ingest and access for archive functions delivered as services. We present new software tools and frameworks developed in the PrestoPRIME project that allow these compromises to be quantitatively assessed, planned, and managed for file-based AV assets. Our focus is on how to give an archive assurance that when it designs and operates a preservation strategy as a set of services, the strategy will function as expected and will cope with the inevitable and often unpredictable variations that occur in operation. This includes being able to do cost projections, sensitivity analysis, and simulation of “disaster scenarios,” and to govern preservation services using service-level agreements and policies.
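The kind of cost projection and "disaster scenario" simulation described above can be illustrated with a toy Monte Carlo model. This is a hypothetical sketch, not the PrestoPRIME tooling; the function name, parameters, and rates are all invented for illustration:

```python
import random

def project_storage_cost(n_files, years, cost_per_tb_year, tb_per_file,
                         annual_copy_loss_rate, copies=2, trials=500, seed=7):
    """Toy Monte Carlo projection for a replicated file-based AV archive.

    annual_copy_loss_rate is the assumed chance that one copy of a file
    is lost in a given year; a file is only lost for good if every copy
    fails in the same year. All figures are illustrative assumptions.
    """
    rng = random.Random(seed)
    storage_cost = n_files * tb_per_file * copies * cost_per_tb_year * years
    total_lost = 0
    for _ in range(trials):
        for _ in range(n_files):
            for _ in range(years):
                # "Disaster" for this file: all replicas fail in one year.
                if all(rng.random() < annual_copy_loss_rate
                       for _ in range(copies)):
                    total_lost += 1
                    break
    return storage_cost, total_lost / trials

cost, avg_files_lost = project_storage_cost(
    n_files=500, years=10, cost_per_tb_year=50.0,
    tb_per_file=0.1, annual_copy_loss_rate=0.01)
```

Sweeping `copies` or `annual_copy_loss_rate` in such a model is a crude form of the sensitivity analysis the abstract mentions: doubling the replica count multiplies storage cost linearly but shrinks the per-year loss probability quadratically.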
Managing system to supervise professional multimedia equipment
Integrated Master's thesis. Informatics and Computing Engineering. Faculdade de Engenharia, Universidade do Porto. 201
Digital Preservation Services: State of the Art Analysis
Research report funded by the DC-NET project. An overview of the state of the art in service provision for digital preservation and curation. Its focus is on the areas where gaps need to be bridged between e-Infrastructures and efficient, forward-looking digital preservation services. Based on a desktop study and a rapid analysis of some 190 currently available tools and services for digital preservation, the deliverable provides a high-level view of the range of instruments currently on offer to support various functions within a preservation system. European Commission, FP7. Peer-reviewed.
Many-Task Computing and Blue Waters
This report discusses many-task computing (MTC) generically and in the context of the proposed Blue Waters system, which is planned to be the largest NSF-funded supercomputer when it begins production use in 2012. The aim of this report is to inform the BW project about MTC, including understanding aspects of MTC applications that can be used to characterize the domain and understanding the implications of these aspects for middleware and policies. Many MTC applications do not neatly fit the stereotypes of high-performance computing (HPC) or high-throughput computing (HTC) applications. Like HTC applications, MTC applications are by definition structured as graphs of discrete tasks, with explicit input and output dependencies forming the graph edges. However, MTC applications have significant features that distinguish them from typical HTC applications; in particular, different engineering constraints for hardware and software must be met in order to support these applications. HTC applications have traditionally run on platforms such as grids and clusters, through either workflow systems or parallel programming systems. MTC applications, in contrast, will often demand a short time to solution, may be communication intensive or data intensive, and may comprise very short tasks. Therefore, hardware and software for MTC must be engineered to support the additional communication and I/O and must minimize task dispatch overheads. The hardware of large-scale HPC systems, with its high degree of parallelism and support for intensive communication, is well suited for MTC applications. However, HPC systems often lack a dynamic resource-provisioning feature, are not ideal for task communication via the file system, and have an I/O system that is not optimized for MTC-style applications. Hence, additional software support is likely to be required to gain full benefit from the HPC hardware.
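The task-graph structure described above, with explicit input and output dependencies forming the edges, can be sketched with a minimal dependency-ordered dispatcher. This is an illustrative example, not software from the Blue Waters project; Kahn's algorithm stands in here for what a real MTC scheduler would do at far larger scale:

```python
from collections import deque

def run_task_graph(tasks, deps):
    """Dispatch discrete tasks in dependency order (Kahn's algorithm).

    tasks: dict mapping task name -> zero-argument callable
    deps:  dict mapping task name -> list of prerequisite task names
    Returns the order in which tasks were dispatched.
    """
    indegree = {name: len(deps.get(name, [])) for name in tasks}
    dependents = {name: [] for name in tasks}
    for name, prereqs in deps.items():
        for p in prereqs:
            dependents[p].append(name)
    ready = deque(name for name, d in indegree.items() if d == 0)
    order = []
    while ready:
        name = ready.popleft()
        tasks[name]()          # dispatch: in real MTC middleware this is
        order.append(name)     # where dispatch overhead must stay small
        for nxt in dependents[name]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(tasks):
        raise ValueError("cycle detected in task graph")
    return order

# Example: "merge" must wait for both "fetch" and "decode".
log = []
order = run_task_graph(
    {"fetch": lambda: log.append("fetch"),
     "decode": lambda: log.append("decode"),
     "merge": lambda: log.append("merge")},
    {"decode": ["fetch"], "merge": ["fetch", "decode"]})
```

With many very short tasks, the per-dispatch cost (the body of the `while` loop) dominates total runtime, which is exactly the overhead the abstract says MTC systems must minimize.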
How metadata enables enriched file-based production workflows
As file-based production technology gains industry acceptance and commercial products become commonplace, many broadcasting and production facilities are beginning to re-engineer their operations towards file-based production workflows. Sufficient attention, however, should also be paid to the development and incorporation of standardized metadata in order to reach the full potential of such file-based production environments. Beyond its literal meaning, metadata and the underlying data models can represent much more than just meta-information about audiovisual media assets. In fact, properly modeled metadata can provide the structure that holds various media assets together and guides creative people through production workflows and complex media production tasks. Metadata should hence become a first-class citizen in tomorrow's file-based production facilities. The aim of this paper is to show how metadata standards and data models, complemented by custom metadata developments, can be employed practically in a file-based media production environment in order to construct a coherently integrated production platform. We discuss the types of metadata exchanged between different parts of the system, which enable the implementation of an entire production workflow and provide seamless integration between different components.
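The idea of metadata as the structure that holds assets together across a workflow can be sketched as a record with provenance links. The field names and the history helper below are invented for this sketch and are not drawn from any specific broadcast metadata standard:

```python
from dataclasses import dataclass, field

@dataclass
class AssetMetadata:
    """Minimal descriptive record; field names are illustrative only."""
    asset_id: str
    title: str
    container: str
    derived_from: list = field(default_factory=list)  # provenance links

def edit_history(assets, asset_id):
    """Follow derived_from links to reconstruct how an asset was produced."""
    by_id = {a.asset_id: a for a in assets}
    history = []
    for parent in by_id[asset_id].derived_from:
        history.extend(edit_history(assets, parent))
    history.append(asset_id)
    return history

# A raw capture, a browse proxy derived from it, and a cut using the proxy.
raw = AssetMetadata("cam01_raw", "Camera 1 raw capture", "MXF")
proxy = AssetMetadata("cam01_proxy", "Browse proxy", "MP4", ["cam01_raw"])
cut = AssetMetadata("ep01_cut", "Episode 1 rough cut", "MXF", ["cam01_proxy"])
history = edit_history([raw, proxy, cut], "ep01_cut")
```

The point of the sketch is that the links, not the essence files, are what let a production system walk from a finished programme back to its sources.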
Business Process Risk Management and Simulation Modelling for Digital Audio-Visual Media Preservation.
Digitised and born-digital Audio-Visual (AV) content presents new challenges for preservation and Quality Assurance (QA) to ensure that cultural heritage remains accessible for the long term. Digital archives have developed strategies for avoiding, mitigating, and recovering from digital AV loss using IT-based systems, involving QA tools before ingesting files into the archive and utilising file-based replication to repair files that may be damaged while in the archive. However, while existing strategies are effective for addressing issues related to media degradation, issues such as format obsolescence and failures in processes and people pose a significant risk to the long-term value of digital AV content. We present a Business Process Risk management framework (BPRisk) designed to support preservation experts in managing risks to long-term digital media preservation. This framework combines workflow and risk specification within a single risk management process designed to support continual improvement of workflows. A semantic model has been developed that allows the framework to incorporate expert knowledge from both preservation and security experts in order to intelligently aid workflow designers in creating and optimising workflows. The framework also provides workflow simulation functionality, allowing users to (a) understand the key vulnerabilities in the workflows, (b) target investments to address those vulnerabilities, and (c) minimise the economic consequences of risks. The application of the BPRisk framework is demonstrated on a use case with the Austrian Broadcasting Corporation (ORF), discussing simulation results and an evaluation against the outcomes of executing the planned workflow.
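The workflow simulation capability described above, estimating which steps are the key vulnerabilities and what they cost, can be illustrated with a toy Monte Carlo model. This is a hypothetical sketch, not the BPRisk implementation; the step names, probabilities, and costs are invented:

```python
import random

def simulate_workflow_risk(steps, trials=10000, seed=42):
    """Toy Monte Carlo estimate of expected loss per workflow run.

    steps: list of (step_name, failure_probability, failure_cost) tuples.
    Returns (expected loss per run, failure count per step).
    """
    rng = random.Random(seed)
    counts = {name: 0 for name, _, _ in steps}
    total_loss = 0.0
    for _ in range(trials):
        for name, p, cost in steps:
            if rng.random() < p:   # step fails independently on this run
                counts[name] += 1
                total_loss += cost
    return total_loss / trials, counts

expected_loss, failures = simulate_workflow_risk([
    ("ingest QA", 0.02, 500.0),      # illustrative numbers only
    ("transcode", 0.05, 200.0),
    ("fixity check", 0.01, 1000.0),
])
```

Ranking steps by failure count times cost is the crude analogue of identifying key vulnerabilities and targeting investment; with independent steps each term is analytically just probability times cost, so simulation mainly pays off once steps interact or include retries.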