MoPS: A Modular Protection Scheme for Long-Term Storage
Current trends in technology, such as cloud computing, allow outsourcing the
storage, backup, and archiving of data. This provides efficiency and
flexibility, but also poses new risks for data security. In particular, it
has become crucial to develop protection schemes that ensure security even in
the long term, i.e., beyond the lifetime of keys, certificates, and
cryptographic primitives. However, all current solutions fail to provide optimal performance
for different application scenarios. Thus, in this work, we present MoPS, a
modular protection scheme to ensure authenticity and integrity for data stored
over long periods of time. MoPS does not come with any requirements regarding
the storage architecture and can therefore be used together with existing
archiving or storage systems. It supports a set of techniques which can be
plugged together, combined, and migrated in order to create customized
solutions that fulfill the requirements of different application scenarios in
the best possible way. As a proof of concept we implemented MoPS and provide
performance measurements. Furthermore, our implementation provides additional
features, such as guidance for non-expert users and export functionalities for
external verifiers.
Comment: Original Publication (in the same form): ASIACCS 201
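The abstract does not spell out MoPS's concrete techniques, but the core long-term problem it names (proofs outliving keys and hash algorithms) is commonly addressed by renewal chains in the spirit of RFC 4998 evidence records. A minimal, hypothetical sketch of that general idea (the class and method names are illustrative, not taken from MoPS):

```python
import hashlib


def digest(data: bytes, algo: str) -> str:
    """Hash data with the named algorithm (e.g. 'sha256')."""
    return hashlib.new(algo, data).hexdigest()


class EvidenceChain:
    """Toy sketch: renew integrity proofs before a hash algorithm weakens.

    Each renewal re-hashes the document together with all earlier proofs,
    so material protected by an aging algorithm stays covered by a newer one.
    """

    def __init__(self, document: bytes, algo: str = "sha256"):
        self.document = document
        self.proofs = [(algo, digest(document, algo))]

    def renew(self, new_algo: str) -> None:
        # Cover the document plus every earlier proof with the new algorithm.
        material = self.document + "".join(h for _, h in self.proofs).encode()
        self.proofs.append((new_algo, digest(material, new_algo)))

    def verify(self) -> bool:
        # Recompute the whole chain from scratch; every link must match.
        if self.proofs[0][1] != digest(self.document, self.proofs[0][0]):
            return False
        for i in range(1, len(self.proofs)):
            material = self.document + "".join(
                h for _, h in self.proofs[:i]).encode()
            if self.proofs[i][1] != digest(material, self.proofs[i][0]):
                return False
        return True


doc = b"archived record"
chain = EvidenceChain(doc)          # initial proof under SHA-256
chain.renew("sha3_256")             # later renewal under a newer algorithm
```

A production scheme would of course use signed timestamps from a trusted authority rather than bare hashes; this only illustrates why each renewal must cover the earlier proofs as well as the document.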
Predictions Regarding the Evolution Of Archiving Documents Generated by Business Entities
The documents of any organization are created, modified, distributed through various routes, archived, accessed, re-delivered, and re-archived. Because the legislative framework regulating the recording, inventory, selection, maintenance, and use of archives was issued in recent years in accordance with European regulations, companies face several problems related to:
• knowledge of the legal regulations in effect regarding archiving documents;
• methods of registering, recording, archiving, and accessing documents;
• predicting the evolution of the amounts of documents to be archived;
• predicting the space necessary for storing the archive;
• supervision and handling of the archive.
The purpose of this article is to support business entities in resolving these issues: to determine the amount of archive accumulated over a given time period (n) and to estimate the appropriate archiving space.
Keywords: mathematical models, management, data estimation
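The article's actual model is not reproduced in the abstract; as a purely illustrative sketch, a compound-growth projection of document volume and required shelving might look like the following (the growth rate and shelving density are invented parameters, not figures from the article):

```python
def estimate_archive(initial_docs: int, annual_growth_rate: float,
                     years: int, metres_per_1000_docs: float):
    """Project archive volume and storage space after a number of years.

    Hypothetical compound-growth model: documents grow by a fixed annual
    rate, and shelf space scales linearly with the document count.
    """
    docs = initial_docs * (1 + annual_growth_rate) ** years
    space_m = docs / 1000 * metres_per_1000_docs
    return round(docs), round(space_m, 1)


# e.g. 50,000 documents growing 8% per year over a 5-year period (n = 5),
# at 0.12 linear metres of shelving per 1,000 documents
docs, metres = estimate_archive(50_000, 0.08, 5, 0.12)
```

In practice the growth rate would itself be estimated from historical records, and regulatory retention periods would determine when documents leave the projection.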
The best practice for preparation of samples from FTA® cards for diagnosis of blood-borne infections using African trypanosomes as a model system
Background: Diagnosis of blood borne infectious diseases relies primarily on the detection of the causative agent
in the blood sample. Molecular techniques offer sensitive and specific tools
for this, although considerable difficulties exist when using these approaches
in the field environment. In large-scale epidemiological studies, FTA® cards
are becoming increasingly popular for the rapid collection and archiving of
large numbers of samples.
However, there are some difficulties in the downstream processing of these
cards, which is essential for the accurate diagnosis of infection. Here we
describe recommendations for a best-practice approach to sample processing
from FTA® cards for the molecular diagnosis of trypanosomiasis using PCR.
Results: A comparison of five techniques was made. Detection from directly
applied whole blood was less sensitive (35.6%) than from whole blood
subsequently eluted from the cards using Chelex® 100 (56.4%). Better apparent
sensitivity was achieved when blood was lysed prior to application on the
FTA® cards (73.3%), although this difference was not significant. Sensitivity
did not improve with subsequent elution using Chelex® 100 (73.3%) and was not
significantly different from direct DNA extraction from blood in the field
(68.3%).
Conclusions: Based on these results, the degree of effort required for each
technique, and the difficulty of DNA extraction under field conditions, we
recommend transferring whole blood onto FTA® cards followed by elution in
Chelex® 100 as the best approach.
The SDSS and e-science archiving at the University of Chicago Library
The Sloan Digital Sky Survey (SDSS) is a cooperative scientific project involving over 25 institutions worldwide and managed by the Astrophysical Research Consortium (ARC) to map one-quarter of the entire sky in detail, determining the positions and absolute brightnesses of hundreds of millions of celestial objects. The project was completed in October 2008 and produced over 100 terabytes of data comprising object catalogs, images, and spectra. While the project remained active, SDSS data was housed at Fermilab. As the project neared completion, the SDSS project director (and University of Chicago faculty member) Richard Kron considered options for long-term storage and preservation of the data, turning to the University of Chicago Library for assistance. In 2007-2008 the University of Chicago Library undertook a pilot project to investigate the feasibility of long-term storage and archiving of the project data and of providing ongoing access to the data by scientists and educators through the SkyServer user interface. In late 2008 the University of Chicago Library entered into a formal agreement with ARC, assuming responsibility for:
• Archiving of the survey data (long-term scientific data archiving)
• Serving up survey data to the public
• Managing the HelpDesk
• Preserving the SDSS Administrative Record
This paper outlines the various aspects of the project as well as its implementation.