856 research outputs found

    A secure archive for Voice-over-IP conversations

    Full text link
    An efficient archive securing the integrity of VoIP-based two-party conversations is presented. The solution is based on chains of hashes and continuously chained electronic signatures. Security is concentrated in a single, efficient component, allowing for a detailed analysis. Comment: 9 pages, 2 figures. (C) ACM, (2006). This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of VSW06, June, 2006, Berlin, Germany.
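The hash-chain idea behind such an archive can be sketched as follows; the chunking, seed value, and function names are illustrative assumptions, not the paper's actual construction.

```python
import hashlib

def chain_digests(chunks, seed=b"\x00" * 32):
    """Link each audio chunk to all previous ones via a hash chain
    (illustrative sketch, not the paper's exact scheme)."""
    digests, prev = [], seed
    for chunk in chunks:
        prev = hashlib.sha256(prev + chunk).digest()
        digests.append(prev)
    return digests

def verify_chain(chunks, digests, seed=b"\x00" * 32):
    """Recompute the chain; any altered, dropped, or reordered chunk
    changes every subsequent digest."""
    return digests == chain_digests(chunks, seed)
```

Periodically signing the latest chain digest then commits to the whole conversation recorded so far, which is the role the continuously chained signatures play in the described solution.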

    MoPS: A Modular Protection Scheme for Long-Term Storage

    Full text link
    Current trends in technology, such as cloud computing, allow outsourcing the storage, backup, and archiving of data. This provides efficiency and flexibility, but also poses new risks for data security. In particular, it has become crucial to develop protection schemes that ensure security even in the long term, i.e., beyond the lifetime of keys, certificates, and cryptographic primitives. However, all current solutions fail to provide optimal performance for different application scenarios. Thus, in this work, we present MoPS, a modular protection scheme to ensure authenticity and integrity for data stored over long periods of time. MoPS does not come with any requirements regarding the storage architecture and can therefore be used together with existing archiving or storage systems. It supports a set of techniques which can be plugged together, combined, and migrated in order to create customized solutions that fulfill the requirements of different application scenarios in the best possible way. As a proof of concept we implemented MoPS and provide performance measurements. Furthermore, our implementation provides additional features, such as guidance for non-expert users and export functionalities for external verifiers. Comment: Original Publication (in the same form): ASIACCS 201
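The plug-together design can be pictured as independent techniques sharing a minimal interface; the class names, record fields, and mocked time-stamp below are hypothetical, not MoPS's API.

```python
import hashlib
import time

class HashEvidence:
    """One pluggable technique: a per-document digest."""
    def protect(self, doc: bytes) -> dict:
        return {"type": "hash", "alg": "sha256",
                "value": hashlib.sha256(doc).hexdigest()}

class TimeEvidence:
    """Another technique: a mocked proof-of-existence time; a real
    deployment would contact a time-stamp authority here."""
    def protect(self, doc: bytes) -> dict:
        return {"type": "timestamp", "time": int(time.time())}

def protect(doc: bytes, techniques) -> list:
    """Combine any selection of techniques into one evidence record,
    independent of how or where the document itself is stored."""
    return [t.protect(doc) for t in techniques]
```

Because each technique only sees the document bytes, schemes can be swapped or migrated (e.g. to a stronger hash) without touching the storage layer, which mirrors the modularity claim above.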

    Secure Digital Asset Repository

    Get PDF
    As industries evolve into technology-driven businesses, an increasing number of companies are reaching a critical point in needing to control and manage their vast amounts of digital media assets. Technically speaking, a digital asset is any form of media that has been turned into a binary source. Digital assets, which for textile mills include everything from artwork, logos, photos, and text documents to email, are proving to be valuable in terms of both productivity and company valuation. Many organizations regard a secure digital asset repository as part of their business-critical strategy for managing and sharing digital assets, and they need a dedicated solution to help overcome the operational and organisational challenges unique to them. A secure digital asset repository allows a company to create a custom-branded, password-protected area where it can exchange business files with clients easily, securely, and professionally. Whether a company deals with files that are simply too large to transfer by email, needs secure file transfer, or needs to reveal only certain data, a secure digital asset repository can provide a solution. This paper focuses on several features of the secure digital asset repository application.

    Long-term Information Preservation and Access

    Get PDF
    An unprecedented amount of information encompassing almost every facet of human activities across the world is generated daily in the form of zeros and ones, and that is often the only form in which such information is recorded. A good fraction of this information needs to be preserved for periods of time ranging from a few years to centuries. Consequently, the problem of preserving digital information over the long term has attracted the attention of many organizations, including libraries, government agencies, scientific communities, and individual researchers. In this dissertation, we address three issues that are critical to ensuring long-term information preservation and access. The first concerns the core requirement of how to guarantee the integrity of preserved contents. Digital information is in general very fragile because of the many ways errors can be introduced, such as hardware and media degradation, hardware and software malfunction, operational errors, security breaches, and malicious alterations. To address this problem, we develop a new approach based on efficient and rigorous cryptographic techniques, which guarantees the integrity of preserved contents with extremely high probability even in the presence of malicious attacks. Our prototype implementation of this approach has been deployed and actively used over the past years in several organizations, including the San Diego Supercomputer Center, the Chronopolis Consortium, North Carolina State University, and more recently the Government Printing Office. Second, we consider another crucial component in any preservation system: searching and locating information. The ever-growing size of a long-term archive and the temporality of each preserved item introduce a new set of challenges to providing fast retrieval of content based on a temporal query. The widely used cataloguing scheme has serious scalability problems.
The standard full-text search approach has serious limitations since it does not deal appropriately with the temporal dimension and, in particular, is incapable of performing relevancy scoring according to the temporal context. To address these problems, we introduce two types of indexing schemes: a location indexing scheme and a full-text search indexing scheme. Our location indexing scheme provides optimal operations for inserting and locating a specific version of a preserved item given an item ID and a time point, and our full-text search indexing scheme efficiently handles the scalability problem while supporting relevancy scoring within the temporal context. Finally, we address the problem of organizing inter-related data so that future accesses and data exploration can be performed quickly. In particular, we consider web content, where we combine a link-analysis scheme with a graph partitioning scheme to place more closely related content together in the same standard web archive container. We conduct experiments that simulate random browsing of preserved contents, and show that our data organization scheme greatly reduces the number of containers that need to be accessed during a random browsing session. Our schemes have been tested against real-world data of significant scale and validated through extensive empirical evaluations.
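The location-index operation described above, locating the version of an item that was current at a given time point, reduces to a sorted lookup per item. The sketch below assumes versions are inserted in time order, a simplification; the dissertation's actual scheme and data layout are not specified here.

```python
import bisect
from collections import defaultdict

class LocationIndex:
    """Maps (item ID, time point) to the version current at that time.
    Illustrative sketch: assumes versions arrive in ingest-time order."""
    def __init__(self):
        self._times = defaultdict(list)   # item -> sorted ingest times
        self._locs = defaultdict(list)    # item -> storage locations

    def insert(self, item_id, ingest_time, location):
        self._times[item_id].append(ingest_time)
        self._locs[item_id].append(location)

    def locate(self, item_id, query_time):
        """Return the latest version ingested at or before query_time,
        or None if the item did not yet exist."""
        i = bisect.bisect_right(self._times[item_id], query_time) - 1
        return self._locs[item_id][i] if i >= 0 else None
```

Both operations are logarithmic (or amortized constant for append-only inserts), which is the kind of per-item optimality the abstract claims for insert and locate.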

    Rethinking the e-signatures Directive: on laws, trust services, and the digital single market

    Get PDF
    Hans Graux sets out his views on electronic identification, authentication, and electronic signatures in the light of the EU's decision to conduct a public consultation on these topics and related trusted services or credentials, and of the "Feasibility study on an electronic identification, authentication and signature policy (IAS)".

    ACE: A Novel Software Platform to Ensure the Integrity of Long Term Archives

    Get PDF
    We develop a new methodology to address the integrity of long-term archives using rigorous cryptographic techniques. A prototype system called ACE (Auditing Control Environment) was designed and developed based on this methodology. ACE creates a small integrity token for each digital object and some cryptographic summary information based on all the objects handled within a dynamic time period. ACE continuously audits the contents of the various objects according to the policy set by the archive, and provides mechanisms for an independent third-party auditor to certify the integrity of any object. In fact, our approach allows an independent auditor to verify the integrity of every version of an archived digital object as well as link the current version to the original form of the object when it was ingested into the archive. We show that ACE is very cost-effective and scalable while making no assumptions about the archive architecture. We include in this paper some preliminary results on the validation and performance of ACE on a large image collection.
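A per-period summary over integrity tokens can be sketched as a Merkle-style reduction; ACE's actual aggregation and witness formats are more involved, so treat the following as a minimal illustration under that assumption.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def round_summary(tokens):
    """Reduce all per-object integrity tokens of one time period to a
    single summary digest (Merkle-style; details are illustrative).
    Changing any one token changes the summary."""
    level = list(tokens)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate last node on odd levels
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

Publishing or widely witnessing only the small summary per period is what lets a third-party auditor later certify any individual object without trusting the archive itself.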

    Trustworthy and Efficient Protection Schemes for Digital Archiving

    Get PDF
    The amount of information produced in the last decades has grown notably. Much of this information exists only in the form of electronic documents and often has to be stored for long periods. Therefore, digital archives are increasingly needed. However, for the documents to remain trustworthy while they are archived, they need to be protected by the archivists. Important protection goals that must be guaranteed are integrity, authenticity, non-repudiation, and proof of existence. To address these goals, several protection schemes for digital archives have been designed. These schemes are usually based on cryptographic primitives, namely digital signatures and hash functions. However, since documents can be archived for decades or even indefinitely, the cryptographic primitives used can become insecure during the archival period. This is a serious issue because it can be exploited by attackers to compromise the protection goals of the archived documents. Therefore, a requirement for long-term protection schemes is to address the aging of cryptography, i.e., replacing the primitives in use before they become insecure. In this work we analyze and improve long-term protection schemes for digital archives. More precisely, we aim at answering three questions. (1) How do long-term protection schemes compare with respect to trustworthiness? (2) How do they differ in performance? (3) Can new schemes be designed that generate more efficient and trustworthy evidence for establishing the protection goals? Although several protection schemes can be found in the literature, many of them fail to address the aging of cryptography. Therefore, our first step is to identify which existing schemes provide long-term protection with respect to integrity, authenticity, non-repudiation, and proof of existence. Afterwards, to answer question (1), we analyze the trustworthiness of the long-term protection schemes using two approaches.
In the first approach, we initially identify the required trust assumptions. Then, based on these assumptions, we compare the protection schemes. In the second approach, we turn to quantifying the trustworthiness of the evidence generated by time-stamping and notarial schemes. To this end, we use a belief trust model and design a reputation system. This leads to two further, more detailed answers to question (1). First, trustworthiness depends on the reputation of the involved parties rather than on the protection schemes themselves. Second, the trustworthiness of evidence tends to degrade in the long term. Therefore, we propose to use the reputation system to create incentives for the involved parties to build good reputations. This raises the trustworthiness of generated evidence, thereby addressing question (3). Next, we address question (2) by analyzing how the schemes differ in performance, using an analytical evaluation and experiments. More precisely, we measure the times needed to create and verify evidence, the space required to store evidence, and the communication necessary to generate evidence. Moreover, this analysis shows that, when verifying evidence, most of the time is spent on checking certificate chains. The findings of the performance analysis give us directions for addressing question (3). We propose three new solutions that provide more efficient evidence. The first solution is a new notarial scheme that generates smaller evidence and communicates less data than the existing notarial scheme. Novelties in our scheme include balancing the number of signatures that users and notaries verify, and using notaries as time-stamp authorities to provide proof of existence. The second solution is based on the time-stamping scheme Content Integrity Service (CIS) and allows for faster evidence verification.
To the best of our knowledge, CIS is the only scheme designed for an archive where documents are submitted and time-stamped sequentially but share the same sequence of time-stamps. However, in this case the validity periods of several time-stamps in this sequence may overlap. Consequently, many of these time-stamps need not be checked when verifying the time-stamp sequence for one document. We address this issue in our new scheme by using a data structure called a skip list. The result is a time-stamp sequence where users can skip the time-stamps that are not necessary to guarantee the protection goals of one document. Using an analytical evaluation and experiments, we show that our scheme is notably faster than CIS. The third solution is intended to reduce the time spent on checking certificate chains when verifying evidence generated by time-stamping schemes. More precisely, we improve an existing public key infrastructure-based solution in which the root certification authority generates smaller verification information for time-stamps. This verification information can replace the certificate chains needed to verify time-stamps. However, the existing solution requires extra work from the time-stamp authorities and the root certification authority, especially when the number of time-stamps grows significantly. Our solution makes this extra work independent of the number of time-stamps. Using an analytical evaluation, we demonstrate the advantage of our solution. Finally, we provide our conclusions and future work. In this thesis we design new solutions that allow for more efficient and trustworthy evidence of protection for archived documents. As future work, we suggest further research into methods that address the decay of the trustworthiness of evidence over time.
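The skip-list idea, letting a verifier jump over time-stamps irrelevant to one document, can be illustrated with power-of-two skip links. The layout below is a generic skip structure chosen for illustration, not CIS's or the thesis's exact scheme.

```python
def skip_links(i):
    """Indices of earlier time-stamps that time-stamp i also commits to:
    its predecessor plus power-of-two skips (illustrative layout)."""
    links, k = [], 0
    while i - (1 << k) >= 0:
        links.append(i - (1 << k))
        k += 1
    return links

def verification_path(start, end):
    """Greedily follow the farthest-back available skip from `end` toward
    `start`, yielding only the time-stamps that must actually be checked
    to connect a document's time-stamp to the latest one."""
    path = [end]
    while path[-1] > start:
        path.append(min(l for l in skip_links(path[-1]) if l >= start))
    return list(reversed(path))
```

With skips of length 2^k, the verification path grows logarithmically in the distance between the document's time-stamp and the head of the sequence, instead of linearly as when every intermediate time-stamp is checked.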

    Techniques to Audit and Certify the Long Term Integrity of Digital Archives

    Get PDF
    A large portion of the government, business, cultural, and scientific digital data being created today needs to be archived and preserved for future use over periods ranging from a few years to decades and sometimes centuries. A fundamental requirement for a long-term archive is to set up mechanisms that will ensure the authenticity of the holdings of the archive. In this paper, we develop a new methodology to address the integrity of long-term archives using rigorous cryptographic techniques. Our approach involves the generation of a small integrity token for each digital object to be archived, and some cryptographic summary information based on all the objects handled within a dynamic time period. We present a framework that enables the continuous auditing of the holdings of the archive, as well as auditing upon access, depending on the policy set by the archive. Moreover, an independent auditor is able to verify the integrity of every version of an archived digital object as well as link the current version to the original form of the object when it was ingested into the archive. Using this approach, a prototype system called ACE (Auditing Control Environment) has been built and tested. ACE is scalable and cost-effective, and is completely independent of the archive's underlying architecture.
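Linking every archived version to its predecessor, so the current version can be traced back to the ingested original, can be sketched with chained tokens. The function names and token format here are illustrative assumptions, not ACE's actual design.

```python
import hashlib

def make_token(content: bytes, prev_token: bytes = b"") -> bytes:
    """Integrity token for a version, bound to the previous version's
    token so the history chains back to ingest (illustrative)."""
    return hashlib.sha256(prev_token + content).digest()

def audit(versions, tokens):
    """Continuous-audit step: re-derive every token from the stored
    versions and compare. A single altered version invalidates its
    token and every later one."""
    prev = b""
    for content, token in zip(versions, tokens):
        prev = make_token(content, prev)
        if prev != token:
            return False
    return True
```

An independent auditor holding only the tokens (or a commitment to them) can run `audit` without trusting the archive, which is the separation of duties the abstract describes.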

    GLOBOIDS for a Seamless Cross Border Mobility Experience

    Get PDF
    The Covid-19 pandemic has brought the global border management issue to the surface. Policymakers are deliberating on how to mitigate the situation through measures for immediate resolution and are working on future sustainability. The purpose of this research is to address the scale, scope, and complexity of border governance, which calls not only for minimal human intervention but also for global collaborative action. A design science research methodology was used to design the proposed artefacts, to enable a high-level understanding of global border affairs, and to introduce a potential solution for deliberation, discussion, and future digital. A global border management intelligent distributed system has been conceptualized to be contactless and concerted. It also envisions improving efficiency, performance, and seamless and secure cross-border mobility in the post-pandemic new normal.

    Post-Quantum Secure Time-Stamping

    Get PDF
    Cryptographic time-stamps are used as proof that one document existed before another. Post-quantum secure time-stamping examines whether these proofs can be forged using a quantum computer. The field is largely unexplored, as the primitives used in keyless time-stamping have shown no serious weakness against quantum computers. Until now, no effort has been made towards formally defining post-quantum secure time-stamping. In this work, we define the notion of post-quantum time-stamping and examine how contemporary classical results change in this new framework. A key difference in the post-quantum setting is that we cannot retrieve multiple separate executions of an arbitrary quantum adversary: currently known rewinding techniques allow an adversary to be run again only under very specific conditions.
We examine the possibility of combining existing rewinding techniques to prove a theorem for which there is currently no proof in the standard post-quantum model. We conjecture a rewinding construction that could make the theorem provable, and we pose a minimal open problem as a first step towards formally proving it.