
    Efficient integrity verification of replicated data in cloud

    Cloud computing is an emerging model in which computing infrastructure resources are provided as a service over the Internet. Data owners can outsource their data by storing it remotely in the cloud and enjoy on-demand, high-quality services from a shared pool of configurable computing resources. By using these storage services, data owners are relieved of the burden of local data storage and maintenance. However, since data owners and cloud servers are not in the same trusted domain, the outsourced data may be at risk, as the cloud server may no longer be fully trusted. Data integrity is therefore of critical importance in such a scenario. The cloud should let the owners, or a trusted third party, check the integrity of their data storage without demanding a local copy of the data. Owners often replicate their data on cloud servers across multiple data centers to achieve higher scalability, availability, and durability. When data owners ask the Cloud Service Provider (CSP) to replicate data, they are charged a higher storage fee by the CSP. The data owners therefore need to be strongly convinced that the CSP is storing all the data copies agreed on in the service level contract, and that data updates have been correctly executed on all the remotely stored copies. In this thesis, a Dynamic Multi-Replica Provable Data Possession scheme (DMR-PDP) is proposed that prevents the CSP from cheating, for example by maintaining fewer copies than paid for and/or tampering with data. In addition, we extend the scheme to support a basic file versioning system where, for privacy reasons, only the difference between the original file and the updated file is propagated rather than the operations themselves. DMR-PDP also supports efficient dynamic operations such as block modification, insertion, and deletion on replicas across the cloud servers. --Abstract, page iii
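    The abstract does not give the construction details, but the core replica-distinguishing idea can be illustrated with a minimal sketch: the owner blinds each block with a pseudorandom mask bound to the replica number, so the CSP cannot answer a challenge for one replica using a different copy. All names here (mask, make_replica, verify) are hypothetical, and the real DMR-PDP scheme uses cryptographic tags rather than owner-stored digests, as assumed below.

```python
import hashlib
import hmac

def mask(key: bytes, replica_id: int, index: int) -> bytes:
    """Pseudorandom 32-byte mask bound to a (replica, block index) pair."""
    msg = replica_id.to_bytes(4, "big") + index.to_bytes(8, "big")
    return hmac.new(key, msg, hashlib.sha256).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_replica(key: bytes, replica_id: int, blocks: list) -> list:
    """Blind every block with a replica-specific mask, so each stored
    replica is bitwise distinct even though the underlying file is one."""
    return [xor(blk, mask(key, replica_id, i)) for i, blk in enumerate(blocks)]

def verify(key: bytes, replica_id: int, index: int,
           returned_block: bytes, block_digests: list) -> bool:
    """Unmask the CSP's response and compare against the owner's digest."""
    plain = xor(returned_block, mask(key, replica_id, index))
    return hashlib.sha256(plain).digest() == block_digests[index]

# Toy run: four 32-byte blocks replicated three times.
key = b"\x01" * 32
blocks = [bytes([i]) * 32 for i in range(4)]
digests = [hashlib.sha256(b).digest() for b in blocks]
replicas = {r: make_replica(key, r, blocks) for r in range(3)}

assert verify(key, 2, 1, replicas[2][1], digests)      # honest answer passes
assert not verify(key, 2, 1, replicas[0][1], digests)  # copy 0 cannot pose as copy 2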

    Efficient Method Based on Blockchain Ensuring Data Integrity Auditing with Deduplication in Cloud

    With the rapid development of cloud storage, more and more cloud clients can store and access their data anytime, from anywhere, and on any device. Data deduplication may be considered an excellent choice for ensuring data storage efficiency. Although cloud technology offers many advantages for storage services, it also introduces security challenges, especially with regard to data integrity, which is one of the most critical elements in any system. A data owner should thus enable data integrity auditing mechanisms. Much research has recently been undertaken to deal with these issues. In this paper, we propose a novel blockchain-based method, which can preserve cloud data integrity checking with data deduplication. In our method, a mediator performs data deduplication on the client side, which reduces the amount of outsourced data as well as the computation time and the bandwidth used between the enterprise and the cloud service provider. The method supports both private and public auditability, and it ensures the confidentiality of a client's data against auditors during the auditing process.
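    As a rough illustration of the mediator's role (the paper's exact protocol is not given in the abstract), the sketch below deduplicates on the client side by content hash and appends each upload's metadata to a hash-chained log standing in for the blockchain. The Mediator class and record layout are hypothetical.

```python
import hashlib
import json
import time

class Mediator:
    """Hypothetical client-side deduplicator plus a hash-chained metadata
    log, standing in for the blockchain described in the paper."""

    def __init__(self):
        self.outsourced = {}   # content hash -> data actually sent to the CSP
        self.chain = []        # append-only metadata records

    def upload(self, owner: str, data: bytes) -> str:
        h = hashlib.sha256(data).hexdigest()
        if h not in self.outsourced:   # only the first copy leaves the client side
            self.outsourced[h] = data
        prev = self.chain[-1]["this"] if self.chain else "0" * 64
        record = {"owner": owner, "hash": h, "ts": time.time(), "prev": prev}
        record["this"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.chain.append(record)      # tamper-evident trail for auditors
        return h

m = Mediator()
a = m.upload("alice", b"quarterly report")
b = m.upload("bob", b"quarterly report")   # duplicate: no second transfer
assert a == b and len(m.outsourced) == 1 and len(m.chain) == 2
```

    Deduplicating before the data leaves the enterprise is what saves the bandwidth and computation the abstract mentions; the chained records are what an auditor later checks.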

    Blockchain & Multi-Agent System: A New Promising Approach for Cloud Data Integrity Auditing with Deduplication

    Data storage has recently become one of the most important services in cloud computing. The cloud provider should ensure two major requirements: data integrity and storage efficiency. The blockchain data structure and efficient data deduplication are possible solutions to these exigencies. Several approaches have been proposed; some of them implement deduplication on the cloud server side, which involves a great deal of computation to eliminate redundant data and becomes increasingly complex. This paper therefore proposes an efficient, reliable, and secure approach in which the authors use a Multi-Agent System to manage the deduplication technique, reducing data volumes and thereby storage overhead. On the other hand, the loss of physical control over data introduces security challenges such as data loss, data tampering, and data modification. To address these problems, the authors also propose a blockchain as a database for storing the metadata of client files. This database serves as a logging database that supports the data integrity auditing function.
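    The auditing function over such a hash-chained metadata log can be sketched as an end-to-end walk that flags the first broken record; the audit_chain function below is a hypothetical single-agent stand-in for the paper's Multi-Agent System, assuming records shaped like those in the previous sketch.

```python
import hashlib
import json

def audit_chain(chain):
    """Walk a hash-chained metadata log; return the index of the first
    record that fails verification, or None if the log is intact."""
    prev = "0" * 64
    for i, rec in enumerate(chain):
        if rec.get("prev") != prev:
            return i                       # back-link broken
        body = {k: v for k, v in rec.items() if k != "this"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != rec["this"]:
            return i                       # record contents altered
        prev = rec["this"]
    return None

# Build a two-record log, then tamper with the first record.
chain, prev = [], "0" * 64
for name in ("a.txt", "b.txt"):
    rec = {"file": name, "prev": prev}
    rec["this"] = hashlib.sha256(
        json.dumps(rec, sort_keys=True).encode()).hexdigest()
    chain.append(rec)
    prev = rec["this"]

assert audit_chain(chain) is None
chain[0]["file"] = "evil.txt"
assert audit_chain(chain) == 0
```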

    Double Encryption Based Auditing Protocol Using Dynamic Operation in Cloud Storage

    Using cloud storage, users can remotely store their data and enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources, without the burden of local data storage and maintenance. However, the fact that users no longer have physical possession of the outsourced data makes data integrity protection in cloud computing a formidable task, especially for users with constrained computing resources. From the users' perspective, including both individuals and IT systems, storing data remotely in the cloud in a flexible, on-demand manner brings tempting benefits: relief from the burden of storage management, universal data access independent of geographical location, and avoidance of capital expenditure on hardware, software, and personnel maintenance. To securely introduce an effective third party auditor (TPA), two fundamental requirements have to be met: 1) the TPA should be able to efficiently audit the cloud data storage without demanding a local copy of the data, and should introduce no additional online burden to the cloud user; 2) the third party auditing process should introduce no new vulnerabilities toward user data privacy. In this project, we utilize and uniquely combine public auditing protocols with a double encryption approach to achieve a privacy-preserving public cloud data auditing system, which supports integrity checking without any leakage of data. To support efficient handling of multiple auditing tasks, we further explore the technique of online signatures to extend our main result to a multi-user setting, where the TPA can perform multiple auditing tasks simultaneously. The double encryption algorithm encrypts the data twice before it is stored on the cloud server.
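    The abstract does not name the ciphers used, so the sketch below illustrates the layered idea with Fernet from the cryptography package as a stand-in: the data is encrypted twice under independent keys before being outsourced, so any auditing runs over ciphertext only. Key names and the two-layer split are assumptions, not the paper's construction.

```python
# pip install cryptography
from cryptography.fernet import Fernet

k_inner = Fernet.generate_key()   # stays with the data owner
k_outer = Fernet.generate_key()   # second, independent layer

def double_encrypt(data: bytes) -> bytes:
    """Encrypt twice under independent keys before outsourcing; the TPA
    audits the outer ciphertext and never touches the plaintext."""
    return Fernet(k_outer).encrypt(Fernet(k_inner).encrypt(data))

def double_decrypt(token: bytes) -> bytes:
    return Fernet(k_inner).decrypt(Fernet(k_outer).decrypt(token))

blob = double_encrypt(b"patient record #17")
assert double_decrypt(blob) == b"patient record #17"
```

    Even if one layer's key is compromised or peeled off during auditing, the inner layer keeps the plaintext confidential, which is the privacy-preserving property the abstract claims.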

    Distributed authorization in loosely coupled data federation

    The underlying data model of many integrated information systems is a collection of inter-operable and autonomous database systems, namely, a loosely coupled data federation. A challenging security issue in designing such a data federation is to ensure the integrity and confidentiality of data stored in remote databases through distributed authorization of users. Existing solutions for centralized databases are not directly applicable here due to the lack of a centralized authority, and most solutions designed for outsourced databases cannot easily support the frequent updates essential to a data federation. In this thesis, we provide a solution in three steps. First, we devise an architecture to support fully distributed, fine-grained, and data-dependent authorization in loosely coupled data federations. For this purpose, we adapt the integrity-lock architecture originally designed for multilevel secure databases to data federations. Second, we propose an integrity mechanism to detect, localize, and verify updates of data stored in remote databases while reducing communication overhead and limiting the impact of unauthorized updates. We realize the mechanism as a three-stage procedure based on a grid of Merkle Hash Trees built on relational tables. Third, we present a confidentiality mechanism to control remote users' access to sensitive data while allowing authorization policies to be frequently updated. We achieve this objective through a new over-encryption scheme based on secret sharing. Finally, we evaluate the proposed architecture and mechanisms through experiments.
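    As an illustration of the second step, a single Merkle Hash Tree over the rows of one relational table lets a verifying site detect an unauthorized update from a single stored root; the thesis builds a grid of such trees and compares subtree roots to localize changes in logarithmic time. The sketch below shows one tree only, with hypothetical row encoding.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def mht_root(rows: list) -> bytes:
    """Root of a Merkle Hash Tree built over the rows of one table."""
    level = [h(r) for r in rows]
    while len(level) > 1:
        if len(level) % 2:                       # duplicate a lone last node
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

rows = [b"alice|42", b"bob|17", b"carol|99", b"dave|3"]
root = mht_root(rows)              # the verifying site stores only this root

rows[2] = b"carol|1000000"         # unauthorized update at a remote database
assert mht_root(rows) != root      # detected with one root comparison
```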

    Survey on securing data storage in the cloud

    Cloud computing has become a well-known paradigm nowadays; many researchers and companies are embracing this fascinating technology with feverish haste. In the meantime, security and privacy challenges are brought forward as the number of cloud storage users increases rapidly. In this work, we conduct an in-depth survey of recent research activities on cloud storage security in association with cloud computing. After an overview of the cloud storage system and its security problems, we focus on the key security requirement triad, i.e., data integrity, data confidentiality, and availability. For each of the three security objectives, we discuss the new unique challenges faced by cloud storage services, summarize the key issues discussed in the current literature, examine and compare the existing and emerging approaches proposed to meet those new challenges, and point out possible extensions and future research opportunities. The goal of our paper is to provide state-of-the-art knowledge to new researchers who would like to join this exciting field.

    Enabling Public Verifiability and Data Dynamics for Storage Security

    Cloud Computing has been envisioned as the next-generation architecture of the IT enterprise. It moves application software and databases to centralized large data centers, where the management of the data and services may not be fully trustworthy. This unique paradigm brings about many new security challenges, which have not been well understood. This work studies the problem of ensuring the integrity of data storage in Cloud Computing. In particular, we consider the task of allowing a third party auditor (TPA), on behalf of the cloud client, to verify the integrity of the dynamic data stored in the cloud. The introduction of the TPA relieves the client of the burden of auditing whether the data stored in the cloud is indeed intact, which can be important in achieving economies of scale for Cloud Computing. The support for data dynamics via the most general forms of data operation, such as block modification, insertion, and deletion, is also a significant step toward practicality, since services in Cloud Computing are not limited to archive or backup data only. While prior works on ensuring remote data integrity often lack support for either public verifiability or dynamic data operations, this paper achieves both. We first identify the difficulties and potential security problems of directly extending prior works with fully dynamic data updates, and then show how to construct an elegant verification scheme for the seamless integration of these two salient features in our protocol design. In particular, to achieve efficient data dynamics, we improve the Proof of Retrievability model of Shacham and Waters (ASIACRYPT 2008) by manipulating the classic Merkle Hash Tree (MHT) construction for block tag authentication. Extensive security and performance analysis shows that the proposed scheme is highly efficient and provably secure.
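    The MHT verification step behind public verifiability can be sketched as follows: the server returns a challenged block together with its sibling path, and the TPA recomputes the published root without holding the file. This shows only the Merkle-proof portion; the paper's protocol authenticates block tags rather than raw blocks, and the function names below are hypothetical.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def build_levels(blocks: list) -> list:
    """All levels of the MHT, leaves first, root last."""
    levels = [[h(b) for b in blocks]]
    while len(levels[-1]) > 1:
        cur = list(levels[-1])
        if len(cur) % 2:
            cur.append(cur[-1])                    # pad odd levels
        levels.append([h(cur[i] + cur[i + 1]) for i in range(0, len(cur), 2)])
    return levels

def prove(levels: list, idx: int) -> list:
    """Sibling path the server returns alongside the challenged block."""
    path = []
    for level in levels[:-1]:
        nodes = list(level)
        if len(nodes) % 2:
            nodes.append(nodes[-1])
        path.append((nodes[idx ^ 1], idx % 2))     # (sibling, is-right-child)
        idx //= 2
    return path

def tpa_verify(root: bytes, block: bytes, path: list) -> bool:
    """The TPA recomputes the root from the block and its sibling path."""
    node = h(block)
    for sibling, is_right in path:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

blocks = [b"blk0", b"blk1", b"blk2", b"blk3", b"blk4"]
levels = build_levels(blocks)
root = levels[-1][0]                               # published for public verifiability
assert tpa_verify(root, b"blk2", prove(levels, 2))
assert not tpa_verify(root, b"tampered", prove(levels, 2))
```

    Because only the root needs to be authentic, block modification, insertion, or deletion requires updating just the logarithmic path from the affected leaf, which is what makes the dynamic operations efficient.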

    Best-effort authentication for opportunistic networks
