    Implementation of Deduplication on Encrypted Big-data using Signcryption for cloud storage applications

    As Big Data cloud storage servers become widespread, the shortage of disk space in the cloud becomes a major concern. Deduplication is the elimination of duplicate or redundant data, particularly in computer storage. Data deduplication is a method of controlling the explosive growth of data in cloud storage, and most storage providers are seeking more secure and efficient methods of handling their users' sensitive data. Recently, a noteworthy technique referred to as signcryption has been proposed, in which the properties of a signature (ownership) and encryption are achieved simultaneously with better performance. Building on deduplication, we introduce a method that can eliminate redundant encrypted data owned by different users. Furthermore, we generate a tag that serves as the key component of big data management, and we use digital signatures for ownership verification. Convergent encryption, also called a content hash key cryptosystem, is an encryption approach that supports deduplication: the encryption key is derived from a hash of the plaintext, so identical plaintexts produce the same ciphertext
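
    Since the abstract hinges on convergent encryption and tag generation, a minimal sketch may help. This is a toy Python illustration using only the standard library; the hash-based keystream stands in for a real deterministic cipher (e.g., AES-SIV) and is not the authors' implementation.

```python
import hashlib

def convergent_encrypt(plaintext: bytes):
    # Convergent (content hash key) encryption: the key is a hash of the
    # plaintext itself, so identical plaintexts always yield the same key.
    key = hashlib.sha256(plaintext).digest()
    # Toy deterministic keystream (SHA-256 in counter mode) for illustration;
    # a production system would use a vetted construction such as AES-SIV.
    blocks = (len(plaintext) + 31) // 32
    keystream = b"".join(
        hashlib.sha256(key + i.to_bytes(8, "big")).digest() for i in range(blocks)
    )
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    # The tag (hash of the ciphertext) is what the server indexes for
    # duplicate detection; it reveals neither the key nor the plaintext.
    tag = hashlib.sha256(ciphertext).hexdigest()
    return key, ciphertext, tag

# Two users encrypting the same file independently get identical
# ciphertexts and tags, so the server can deduplicate the encrypted data.
k1, c1, t1 = convergent_encrypt(b"the same file contents")
k2, c2, t2 = convergent_encrypt(b"the same file contents")
assert (c1, t1) == (c2, t2)
```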

    Reliable method for Authorized Deduplication by Using Hybrid Cloud Environment

    Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, convergent encryption has been proposed to encrypt the data before outsourcing. To better protect data security, this paper makes the first attempt to formally address the problem of authorized data deduplication. Differing from traditional deduplication systems, the proposed security model supports differential privileges of users in the duplicate check, in addition to the data itself. Several new deduplication constructions supporting authorized duplicate check in a hybrid cloud environment are presented. Security analysis shows that the scheme is secure in terms of the definitions specified in the proposed security model. As a proof of concept, a prototype of the proposed authorized duplicate check scheme is implemented, and testbed experiments with the prototype show that it incurs negligible overhead compared with normal operations
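
    A rough sketch of the differential-privilege duplicate check described above. The privilege names and keys are hypothetical; in the paper's hybrid cloud model, a private cloud would hold such keys and issue the tokens.

```python
import hashlib
import hmac

# Hypothetical per-privilege keys, held in memory here purely for
# illustration; a real deployment would keep these in the private cloud.
PRIVILEGE_KEYS = {"admin": b"k_admin", "staff": b"k_staff"}

def duplicate_check_token(file_data: bytes, privilege: str) -> bytes:
    # The token binds the file's tag to a privilege, so a stored file is
    # reported as a duplicate only to users holding a matching privilege.
    file_tag = hashlib.sha256(file_data).digest()
    return hmac.new(PRIVILEGE_KEYS[privilege], file_tag, hashlib.sha256).digest()

stored_tokens: set[bytes] = set()

def upload(file_data: bytes, privilege: str) -> str:
    token = duplicate_check_token(file_data, privilege)
    if token in stored_tokens:
        return "duplicate: link user to the existing copy"
    stored_tokens.add(token)
    return "new: store the (encrypted) file"
```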

    DATA COMPRESSION TECHNIQUE FOR CLOUD STORAGE IN MULTI-USER ENVIRONMENTS

    In many organizations, cloud storage contains duplicate copies of data. For example, the same file may be saved in several locations by different users, and many files that differ only slightly may still share much of the same data. This consumes a great deal of cloud storage space. Deduplication is a technique used to eliminate such duplicate entries in cloud storage: it removes the additional copies, keeping only one copy of the data and replacing the others with pointers that lead back to the retained copy, while granting ownership rights to the users who held the duplicates. Organizations frequently use this technique in cloud storage for many applications, but it can also be used effectively to free up space in critical situations. When a file already exists, the end user is granted ownership of the existing file and the upload to the cloud storage server is terminated
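
    As a concrete illustration of the pointer-based scheme above, here is a minimal in-memory sketch; the class and method names are ours, not the paper's.

```python
import hashlib

class DedupStore:
    """Toy single-copy store: one physical blob per content hash,
    with per-user ownership records acting as the 'pointers'."""

    def __init__(self):
        self.blobs = {}   # content hash -> file bytes (one physical copy)
        self.owners = {}  # content hash -> set of owning user ids

    def upload(self, user: str, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest in self.blobs:
            # File already exists: grant ownership and terminate the upload.
            self.owners[digest].add(user)
            return "ownership granted; upload skipped"
        self.blobs[digest] = data
        self.owners[digest] = {user}
        return "new copy stored"
```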

    A Dynamic Proxy Oriented Approach for Remote Data Integrity checking along with Secure Deduplication in Cloud

    In cloud computing, users store data on remote servers instead of on their computer's hard drive. This leads to several security problems, since the data is out of the user's control. To protect against security attacks and preserve data integrity in the cloud, Huaqun Wang et al. proposed proxy-oriented remote data integrity checking (RDIC). However, that scheme focuses only on one-way validation, i.e., clients learn whether their files are stored intact in the cloud. It does not address deduplication, which is essential given the increasing demand for cloud storage. And since users are untrusted from the server's perspective, there is a need to prove ownership of files. The proposed work considers the requirement of mutual validation. In this paper we propose a new construction of identity-based RDIC with secure deduplication. The proposed scheme avoids the burden of complex key management, is flexible in that it allows anyone besides the data owner to verify the contents of the data, and incurs less computation cost because token generation is done by the proxy instead of the user
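
    To make the auditing idea concrete, below is a minimal token-based sketch of remote integrity checking in which the expensive precomputation is offloaded to a proxy, loosely mirroring the division of labour the abstract describes. The identity-based cryptography and the deduplication layer are omitted, and all names are illustrative.

```python
import hashlib
import os

def response(file_bytes: bytes, nonce: bytes) -> bytes:
    # Only a party holding the complete file can compute this value.
    return hashlib.sha256(nonce + file_bytes).digest()

def proxy_precompute_tokens(file_bytes: bytes, n_audits: int):
    # The proxy, acting for the data owner, prepares challenge/response
    # pairs before the file is outsourced, sparing the user this cost.
    return [(nonce, response(file_bytes, nonce))
            for nonce in (os.urandom(16) for _ in range(n_audits))]

def audit(server_file: bytes, tokens: list) -> bool:
    # Any verifier holding a token can challenge the server, which
    # enables verification by parties other than the data owner.
    nonce, expected = tokens.pop()
    return response(server_file, nonce) == expected
```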

    Optimal Data Deduplication In Cloud With Authorization

    Cloud technology is widely used, as it allows sharing and centralized storage of data, shared data processing, and online access to computer services and resources from various types of devices. One of the critical challenges of cloud storage services is managing the ever-increasing volume of data. Data deduplication is a novel technique for addressing this: it removes, and prevents the creation of, duplicate copies of the same data. Although deduplication has several benefits, it raises concerns about users' privacy and security, as it can enable insider or outsider attacks. Achieving deduplication together with data security in a cloud environment is therefore a harder problem to solve. The objective of this paper on Optimal Authorized Data Deduplication in Cloud is to present the proposed system, an analysis of deduplication techniques, and optimal authorization measures for security alongside deduplication in a cloud environment. DOI: 10.17762/ijritcc2321-8169.15073

    A Novel Approach For Improve Reliability Using Secure Distributed De-duplication System In Cloud

    Data de-duplication is a strategy for eliminating duplicate copies of repeated data in cloud storage and reducing data duplication. The technique is used to improve storage utilization and can also be applied to network data transfers to reduce the number of bytes that must be sent. Instead of keeping multiple copies of data with the same content, de-duplication removes redundant data by keeping only one physical copy and referring other redundant data to that copy. Data de-duplication operates at the file level as well as the block level: file-level de-duplication eliminates duplicate copies of identical files, while block-level de-duplication eliminates duplicate blocks of data that occur in non-identical files. Although data deduplication brings many benefits, security and privacy concerns arise because users' sensitive data are exposed to both insider and outsider attacks. Traditional encryption, while providing data confidentiality, is incompatible with data de-duplication. To maintain integrity, we provide a Third Party Auditor scheme that audits the files stored in the cloud and informs the data owner about the status of files stored at the cloud server. The system addresses security challenges such as authorized duplicate check, integrity, data confidentiality and reliability. In this paper, new distributed de-duplication systems with higher reliability, in which the data chunks are distributed across multiple cloud servers, are proposed
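
    The file-level versus block-level distinction is easy to see in code. The following is a small fixed-size-block sketch; real systems often use content-defined chunking, and the distribution of chunks across multiple cloud servers is omitted.

```python
import hashlib

BLOCK_SIZE = 4096   # fixed-size blocks, for simplicity
block_store = {}    # block hash -> block bytes (one physical copy)

def store_file(data: bytes) -> list:
    """Block-level dedup: identical blocks that occur in non-identical
    files are stored once; a file becomes a recipe of block hashes."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        block_store.setdefault(digest, block)  # no-op if block already stored
        recipe.append(digest)
    return recipe

def read_file(recipe: list) -> bytes:
    # Reassemble the file from its recipe of block hashes.
    return b"".join(block_store[d] for d in recipe)
```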

    A secure privacy preserving deduplication scheme for cloud computing

    © 2019 Elsevier B.V. Data deduplication is a key technique for improving storage efficiency in cloud computing. By pointing redundant files to a single copy, cloud service providers greatly reduce their storage space as well as data transfer costs. Although the traditional deduplication approach has been widely adopted, it carries a high risk of losing data confidentiality because of the data storage models in cloud computing. To deal with this issue in cloud storage, we first propose a TEE (trusted execution environment) based secure deduplication scheme. In our scheme, each cloud user is assigned a privilege set; deduplication can be performed if and only if the cloud users have the correct privilege. Moreover, our scheme augments convergent encryption with users’ privileges and relies on the TEE to provide secure key management, which improves the ability of such a cryptosystem to resist chosen-plaintext and chosen-ciphertext attacks. A security analysis indicates that our scheme is secure enough to support data deduplication and to protect the confidentiality of sensitive data. Furthermore, we implement a prototype of our scheme and evaluate its performance; experiments show that the overhead of our scheme is practical in realistic environments
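
    One plausible reading of "augments convergent encryption with users' privileges" is that the key derivation mixes in the privilege, scoping deduplication to a privilege set. The sketch below is speculative; the actual scheme performs key management inside the TEE, which is not modelled here.

```python
import hashlib

def privileged_convergent_key(plaintext: bytes, privilege: str) -> bytes:
    # Speculative construction: the convergent key is derived from both the
    # content hash and the privilege, so only users with the same privilege
    # obtain the same key and ciphertext; dedup cannot cross privilege sets.
    content_hash = hashlib.sha256(plaintext).digest()
    return hashlib.sha256(privilege.encode() + content_hash).digest()

# Same file and same privilege: keys match, so deduplication is possible.
assert privileged_convergent_key(b"f", "staff") == privileged_convergent_key(b"f", "staff")
# Same file, different privilege: keys differ, so no cross-privilege dedup.
assert privileged_convergent_key(b"f", "staff") != privileged_convergent_key(b"f", "admin")
```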

    A Reliability-Aware Approach for Resource Efficient Virtual Network Function Deployment

    Network function virtualization (NFV) is a promising technique aimed at reducing capital expenditures (CAPEX) and operating expenditures (OPEX) and improving the flexibility and scalability of an entire network. In contrast to traditional dispatching, NFV can separate network functions from proprietary infrastructure and gather these functions into a resource pool that can efficiently modify and adjust service function chains (SFCs). However, this emerging technique has some challenges. A major problem is reliability: ensuring the availability of deployed SFCs, namely the probability of successfully chaining a series of virtual network functions (VNFs), while considering both feasibility and the specific requirements of clients, because the substrate network remains vulnerable to earthquakes, floods and other natural disasters. Based on users' demands for SFC requirements, we present an Ensure Reliability Cost Saving (ER_CS) algorithm to reduce the CAPEX and OPEX of telecommunication service providers (TSPs) while ensuring the reliability of the SFC deployments. The results of extensive experiments indicate that the proposed algorithm performs efficiently in terms of the blocking ratio, resource consumption, time consumption and the first block
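
    The availability notion in the abstract, the probability of successfully chaining every VNF in an SFC, has a simple closed form if VNF failures are assumed independent. This back-of-the-envelope sketch (not the ER_CS algorithm itself) shows why replicating a weak VNF raises end-to-end reliability at the cost of extra resources.

```python
from math import prod

def sfc_availability(vnf_availabilities):
    # Under independent failures, the chain succeeds only if every VNF does.
    return prod(vnf_availabilities)

def replicated(a, n):
    # An n-replica VNF fails only if all n replicas fail simultaneously.
    return 1 - (1 - a) ** n

chain = [0.99, 0.95, 0.999]
print(f"single instances:   {sfc_availability(chain):.4f}")  # ~0.9396
chain[1] = replicated(0.95, 2)  # back up the weakest VNF with one replica
print(f"weakest replicated: {sfc_availability(chain):.4f}")  # ~0.9865
```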