
    10291 Abstracts Collection -- Automation in Digital Preservation

    Digital preservation has evolved into a specialized, interdisciplinary research discipline of its own, seeing significant increases in research capacity and results, but also in challenges. However, with this specialization and the subsequent formation of a dedicated subgroup of researchers active in this field, limitations in the challenges addressed can be observed. Digital preservation research may seem to react to problems as they arise, fixing problems that exist now, rather than proactively researching new solutions that may become applicable only after a few years of maturing. Recognising the benefits of bringing together researchers and practitioners with various professional backgrounds related to digital preservation, a seminar was organized at Schloss Dagstuhl, the Leibniz Center for Informatics (18-23 July 2010), with the aim of addressing current digital preservation challenges, with a specific focus on the automation aspects of this field. The main goal of the seminar was to outline research challenges in digital preservation, providing a number of “research questions” that could be tackled immediately, e.g., in doctoral theses. The seminar also intended to highlight the need for the digital preservation community to reach out to IT research and other research communities outside the immediate digital preservation domain, in order to jointly develop solutions.

    Automatic evaluation of migration quality in distributed networks of converters

    Migration has always played an important role in digital preservation. The most recent developments in this context introduce networks of remotely distributed converters. In this paper we propose an extension to current migration networks that enables users to: a) determine to what extent the essential characteristics of digital objects have been preserved during a migration; b) generate detailed migration reports for inclusion in the objects’ preservation metadata; and c) obtain advice on the available migration paths or target formats that will best suit their requirements. Fundação para a Ciência e a Tecnologia (FCT)
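    The first capability described above, checking which essential characteristics survive a migration, can be sketched as a property-by-property comparison. This is a minimal illustration, not the paper's actual evaluation method; the property names and the flat dictionary representation are assumptions.

    ```python
    # Hypothetical sketch: compare the significant properties of a digital
    # object before and after migration and report which essential
    # characteristics were preserved, plus an overall preservation score.

    def migration_report(original: dict, migrated: dict) -> dict:
        """Return per-property preservation status and an overall score."""
        status = {prop: migrated.get(prop) == value
                  for prop, value in original.items()}
        preserved = sum(status.values())
        return {
            "properties": status,
            "score": preserved / len(original) if original else 1.0,
        }

    # Example: a PDF-to-PDF/A migration that loses colour depth.
    original = {"page_count": 12, "color_depth": 24, "embedded_fonts": True}
    migrated = {"page_count": 12, "color_depth": 8, "embedded_fonts": True}
    report = migration_report(original, migrated)
    ```

    A report like this could feed directly into the objects’ preservation metadata, as the abstract suggests.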

    Toward Automating Web Protocol Configuration for a Programmable Logic Controller Emulator

    Industrial Control Systems (ICS) remain vulnerable through attack vectors that exist within programmable logic controllers (PLC). PLC emulators used as honeypots can provide insight into these vulnerabilities. Honeypots can sometimes deter attackers from real devices and log activity. A variety of PLC emulators exist, but they require manual configuration to change their PLC profile. This limits their flexibility for deployment. An automated process for configuring PLC emulators can open the door to emulation of many types of PLCs. This study investigates the feasibility of creating such a process. The research creates an automated process for configuring the web protocols of a Koyo DirectLogic PLC. The configuration process is a software program that collects information about the PLC and creates a behavior profile. A generic web server then references that profile in order to respond properly to requests. To measure the ability of the process, the resulting emulator is evaluated based on response accuracy and timing accuracy. In addition, the configuration time of the process itself is measured. For the accuracy measurements, a workload of 1000 GET requests is sent to the index.html page of the PLC, and then to the emulator. These requests are sent at two rates: Slow and PLC Break. The emulator responses are then compared to those of the PLC baseline. Results show that the process completes in 9.8 seconds, on average. The resulting emulator responds with 97.79% accuracy across all trials. It responds 1.3 times faster than the real PLC at the Slow response rate, and 1.4 times faster at the PLC Break rate. Results indicate that the automated process is able to create an emulator with an accuracy that is comparable to a manually configured emulator. This supports the hypothesis that creating an automated process for configuring a PLC emulator with a high level of accuracy is feasible.
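    The evaluation described above boils down to two simple post-processing steps over the recorded workload: comparing response bodies for accuracy and comparing mean response times for the speed ratio. The sketch below assumes the (body, elapsed-seconds) samples have already been captured; the study does not specify these function names or data shapes.

    ```python
    # Hypothetical post-processing of the 1000-request workload: compute
    # response accuracy (percentage of emulator bodies matching the PLC
    # baseline) and the timing ratio (>1.0 means the emulator is faster).

    def response_accuracy(plc_bodies, emu_bodies):
        """Percentage of responses where the emulator matched the PLC."""
        matches = sum(p == e for p, e in zip(plc_bodies, emu_bodies))
        return 100.0 * matches / len(plc_bodies)

    def speedup(plc_times, emu_times):
        """Ratio of mean PLC response time to mean emulator response time."""
        return (sum(plc_times) / len(plc_times)) / (sum(emu_times) / len(emu_times))

    # Illustrative data: 98 of 100 bodies match, emulator ~1.3x faster.
    plc_bodies = ["<html>ok</html>"] * 98 + ["<html>alt</html>"] * 2
    emu_bodies = ["<html>ok</html>"] * 100
    acc = response_accuracy(plc_bodies, emu_bodies)
    ratio = speedup([0.013] * 100, [0.010] * 100)
    ```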

    VXA: A Virtual Architecture for Durable Compressed Archives

    Full text link
    Data compression algorithms change frequently, and obsolete decoders do not always run on new hardware and operating systems, threatening the long-term usability of content archived using those algorithms. Re-encoding content into new formats is cumbersome, and highly undesirable when lossy compression is involved. Processor architectures, in contrast, have remained comparatively stable over recent decades. VXA, an archival storage system designed around this observation, archives executable decoders along with the encoded content it stores. VXA decoders run in a specialized virtual machine that implements an OS-independent execution environment based on the standard x86 architecture. The VXA virtual machine strictly limits access to host system services, making decoders safe to run even if an archive contains malicious code. VXA's adoption of a "native" processor architecture instead of type-safe language technology allows reuse of existing "hand-optimized" decoders in C and assembly language, and permits decoders access to performance-enhancing architecture features such as vector processing instructions. The performance cost of VXA's virtualization is typically less than 15% compared with the same decoders running natively. The storage cost of archived decoders, typically 30-130KB each, can be amortized across many archived files sharing the same compression method. Comment: 14 pages, 7 figures, 2 tables
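    The amortization claim at the end of the abstract is easy to sanity-check with arithmetic: a decoder of 30-130 KB shared by all files using the same compression method adds only a fraction of a kilobyte per file once the archive grows. The numbers below are illustrative, not from the paper.

    ```python
    # Back-of-the-envelope check: per-file overhead of archiving one
    # decoder (30-130 KB) alongside n files sharing its format.

    def per_file_overhead_kb(decoder_kb: float, n_files: int) -> float:
        """Decoder storage cost amortized over the files that share it."""
        return decoder_kb / n_files

    # Worst case (130 KB decoder) over a modest archive of 1000 files:
    worst = per_file_overhead_kb(130, 1000)
    # Best case (30 KB decoder) over the same archive:
    best = per_file_overhead_kb(30, 1000)
    ```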

    Software Emulation and the Video Game Community: A Web Content Analysis of User Needs

    In the past decade, archives have utilized emulation to preserve older pieces of software, including video games, and make them accessible to patrons. However, recent literature neglects user needs for emulated software in archives. Using web content analysis, this study examines the text of nearly 1,200 online comments, threads, and forum posts about software emulation from four different websites. The findings suggest that audiences are keenly aware of software emulation but not of emulation as a way to preserve video games. It also found that user needs are often unique or even contradictory, and much user attention is paid to the visual quality of emulation systems as well as the quality of the emulated experience. The author suggests a number of policy proposals for libraries and archives that argue greater public input is necessary in the creation and development of systems that preserve software through the process of emulation. Master of Science in Library Science

    QoE-aware inter-stream synchronization in open N-screens cloud

    The growing popularity and increasing performance of mobile devices is transforming the way in which media can be consumed, from single-device playback to orchestrated multi-stream experiences across multiple devices. One of the biggest challenges in realizing such immersive media experiences is the dynamic management of synchronicity between associated media streams. This is further complicated by the faceted aspects of user perception and the heterogeneity of user devices and networks. This paper introduces a QoE-aware open inter-stream media synchronization framework (IMSync). IMSync employs efficient monitoring and control mechanisms, as well as a bespoke QoE impact model derived from subjective user experiments. Given a current lag, IMSync uses the impact model to determine a catch-up strategy that minimizes the detrimental impact on QoE. The impact model balances the accumulative impact of re-synchronization processes against the degree of non-synchronicity in order to maintain QoE. Experimental results verify the run-time performance of the framework as a foundation for immersive media experiences in an open N-Screens cloud.
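    The trade-off at the heart of the impact model can be illustrated with a toy decision rule: re-synchronise only when the estimated QoE cost of the current lag exceeds the cost of the catch-up itself. This is purely illustrative; the abstract does not give IMSync's actual model, thresholds, or parameter names.

    ```python
    # Illustrative-only sketch of a QoE-aware catch-up decision: trade
    # off the disruption of a re-synchronization against the penalty of
    # leaving the streams out of sync. All parameters are assumptions.

    def should_resync(lag_ms: float, resync_cost: float,
                      lag_penalty_per_ms: float = 0.01) -> bool:
        """True when staying out of sync hurts QoE more than catching up."""
        return lag_ms * lag_penalty_per_ms > resync_cost

    # A small lag is tolerated; a large one triggers a catch-up.
    small_lag = should_resync(40, resync_cost=1.0)
    large_lag = should_resync(250, resync_cost=1.0)
    ```

    A real model would also accumulate the impact of repeated re-synchronizations over time, as the abstract notes, rather than scoring each decision in isolation.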

    Closing the gap: human factors in cross-device media synchronization

    The continuing growth in the mobile phone arena, particularly in terms of device capabilities and ownership, is having a transformational impact on media consumption. It is now possible to consider orchestrated multi-stream experiences delivered across many devices, rather than the playback of content from a single device. However, there are significant challenges in realising such a vision, particularly around the management of synchronicity between associated media streams. This is compounded by the heterogeneous nature of user devices, the networks upon which they operate, and the perceptions of users. This paper describes IMSync, an open inter-stream synchronisation framework that is QoE-aware. IMSync adopts efficient monitoring and control mechanisms, alongside a QoE perception model derived from a series of subjective user experiments. Based on an observation of lag, IMSync is able to use this impact model to determine an appropriate strategy for catching up with playback whilst minimising the potential detrimental impacts on a user's QoE. The impact model adopts a balanced approach: trading off the potential QoE impact of initiating a re-synchronisation process against that of retaining the current level of non-synchronicity, in order to maintain high levels of QoE. A series of experiments demonstrates the potential of the framework as a basis for enabling new, immersive media experiences.

    Report on the “Digital Preservation - The Planets Way” Workshop
