
    A comprehensive meta-analysis of cryptographic security mechanisms for cloud computing

    Cloud computing offers measurable computational and information resources as a service over the Internet. The main motivation behind the cloud setup is economic: it promises reduced operational and infrastructure expenditure. To turn this promise into reality, several impediments must be tackled, the most significant of which are security, privacy and reliability issues. Once user data is handed over to the cloud, it leaves the protection sphere of the data owner, which introduces partly new security and privacy concerns. This work focuses on these issues across the various cloud service and deployment models by spotlighting their major challenges. While classical cryptography is an ancient discipline, modern cryptography, developed mostly over the last few decades, is the body of knowledge that must be applied to ensure strong security and privacy mechanisms in today's real-world scenarios. The technological solutions and the short- and long-term research goals of cloud security are described and addressed using both classical and modern cryptographic mechanisms. This work explores new directions in cloud computing security while highlighting the correct selection of these fundamental technologies from a cryptographic point of view.

    Secure Cloud Storage with Client-Side Encryption Using a Trusted Execution Environment

    With the evolution of computer systems, the amount of sensitive data to be stored and the number of threats against that data keep growing, making data confidentiality increasingly important to computer users. With devices that are always connected to the Internet, cloud data storage services have become practical and common, allowing quick access to data wherever the user is. This convenience brings a concern: the confidentiality of data that is handed to third parties for storage. In the home environment, disk encryption tools have gained special attention from users, being used on personal computers and offered natively in some smartphone operating systems. The present work uses the data sealing feature provided by Intel Software Guard Extensions (Intel SGX) for file encryption. A virtual file system is created in which applications can store their data, preserving the security guarantees provided by Intel SGX, before the data are sent to a storage provider. This way, even if the storage provider is compromised, the data remain safe. To validate the proposal, Cryptomator, a free client-side encryption tool for cloud files, was integrated with an Intel SGX application (enclave) for data sealing. The results show that the solution is feasible in terms of performance and security, and can be expanded and refined for practical use and integration with cloud synchronization services.
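
    A minimal Python sketch of the client-side, encrypt-before-upload idea discussed above. The paper performs this step with Intel SGX data sealing inside an enclave, which is not reproduced here; all names below are illustrative assumptions, not Cryptomator's or the paper's API.

```python
# Hypothetical sketch: encrypt locally so the storage provider only ever sees ciphertext.
# (The paper keeps the key material sealed inside an SGX enclave; here it is a plain variable.)
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    nonce = os.urandom(12)                            # fresh nonce per file
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                         # store nonce alongside ciphertext

def decrypt_after_download(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)             # stays on the client side
blob = encrypt_for_upload(b"sensitive document", key) # blob is what gets uploaded
assert decrypt_after_download(blob, key) == b"sensitive document"
```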

    Implementation of Deduplication on Encrypted Big-data using Signcryption for cloud storage applications

    As Big Data cloud storage servers become widespread, the shortage of disk space in the cloud becomes a major concern. The elimination of duplicate or redundant data, particularly in computer storage, is called deduplication. Data deduplication is a method to control the explosive growth of information in cloud storage, and most storage providers are seeking more secure and efficient methods for handling sensitive data. Recently, a noteworthy technique called signcryption has been proposed, in which the properties of a signature (ownership) and encryption are achieved simultaneously with better performance. Building on deduplication, we introduce a method that can eliminate redundant encrypted data owned by different users. Furthermore, we generate a tag that serves as the key component of big data management, and we use digital signatures for ownership verification. Convergent encryption, also called a content-hash-keyed cryptosystem, is an encryption approach that supports deduplication: the encryption key is derived from a hash of the plaintext, so identical plaintexts produce the same ciphertext.
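
    A short sketch of convergent encryption and tag generation as described above: the key is derived from a hash of the plaintext, encryption is deterministic, and a hash of the ciphertext serves as the deduplication tag. The function names, nonce derivation and the use of AES-GCM are assumptions for illustration, not the paper's construction (which additionally uses signcryption for ownership).

```python
# Illustrative convergent encryption: identical plaintexts yield identical
# ciphertexts and tags, so duplicates can be detected without seeing plaintext.
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes, str]:
    key = hashlib.sha256(plaintext).digest()                 # content-derived key
    nonce = hashlib.sha256(b"nonce" + key).digest()[:12]     # deterministic nonce (assumption)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    tag = hashlib.sha256(ciphertext).hexdigest()             # deduplication tag / index key
    return key, ciphertext, tag

k1, c1, t1 = convergent_encrypt(b"same block of data")
k2, c2, t2 = convergent_encrypt(b"same block of data")
assert c1 == c2 and t1 == t2   # the storage server can deduplicate on the tag alone
```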

    Resumption of virtual machines after adaptive deduplication of virtual machine images in live migration

    In cloud computing, load balancing and energy utilization are critical problems solved by virtual machine (VM) migration. Live migration is the movement of running VMs from an overloaded or underloaded physical machine to a suitable one. During this process, transferring large disk image files takes considerable time, increasing migration time and downtime. In the proposed adaptive deduplication, the image file undergoes fixed- and variable-length deduplication depending on its size. The contribution of this paper is the resumption of VMs from the reunited, deduplicated disk image files. Performance is measured by the percentage reduction in VM image size after deduplication, the time taken to migrate the deduplicated file, and the time each VM takes to resume after migration. The results show an 83% reduction in overall image size and an 89.76% reduction in migration time. For a deduplication ratio of 92%, the overall time is 3.52 minutes, a 7% reduction in resumption time compared with the full-size original QCOW2 files. For VMDK files, resumption time is reduced by up to 17% (7.63 minutes) compared with the original files.
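
    The sketch below illustrates, under stated assumptions, hash-based chunk deduplication of a disk image with a size-based choice between fixed-length and a simplified content-defined (variable-length) chunking, followed by reassembly of the image before the VM resumes. The threshold, chunk sizes and rolling-hash rule are illustrative, not the paper's parameters.

```python
# Assumed sketch of adaptive chunk-level deduplication and image reassembly.
import hashlib

def fixed_chunks(data: bytes, size: int = 4096):
    for i in range(0, len(data), size):
        yield data[i:i + size]

def variable_chunks(data: bytes, window: int = 48, mask: int = 0x1FFF,
                    min_len: int = 2048, max_len: int = 65536):
    # Simplified content-defined chunking: cut when a rolling sum over the
    # last `window` bytes hits a boundary pattern, bounded by min/max length.
    start, rolling = 0, 0
    for i, b in enumerate(data):
        rolling += b
        if i - start >= window:
            rolling -= data[i - window]
        length = i - start + 1
        if (length >= min_len and (rolling & mask) == 0) or length >= max_len:
            yield data[start:i + 1]
            start, rolling = i + 1, 0
    if start < len(data):
        yield data[start:]

def deduplicate(image: bytes, small_threshold: int = 1 << 20):
    # Adaptive choice (assumption): small images use fixed-length chunking,
    # large images use content-defined chunking.
    chunker = fixed_chunks if len(image) < small_threshold else variable_chunks
    store, recipe = {}, []                      # unique chunks, ordered fingerprints
    for chunk in chunker(image):
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)             # keep one copy per fingerprint
        recipe.append(fp)
    return store, recipe

def reassemble(store, recipe) -> bytes:
    # "Reunite" the deduplicated image so the VM can resume from it.
    return b"".join(store[fp] for fp in recipe)
```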

    A Review on Deduplication-Cost Efficient Method to Store Data Over Cloud Using Convergent Encryption

    This paper reviews the techniques used to eliminate duplicate copies of repeated data; among them, data deduplication is the most important data compression technique. Convergent encryption has been used to encrypt data before outsourcing, from a privacy and security point of view, but in previous systems it had limitations. In the proposed system, we apply cryptographic tuning to make the encryption more secure and flexible. Data deduplication does not store repeated blocks; instead, a pointer to the existing block is kept, and the data owner retains the freedom to select which users may access the published file. Access control is provided in the application. The integrity of data outsourced to the cloud is managed by computing a hash of the content, following the proof-of-ownership module. The proposed system calculates the hash value of the data content on both sides, i.e., the source as well as the destination, and requests the hash from the cloud side to detect tampering of the data. The expected analysis shows an improvement in execution time and development cost.
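
    A minimal sketch, under assumed names, of the two-sided hash check described above: the owner computes a digest at the source before outsourcing, the cloud recomputes it over the stored content on request, and a mismatch signals tampering.

```python
# Illustrative integrity check via matching digests on source and destination sides.
import hashlib

def digest(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

# Source side: hash the content before outsourcing it.
original = b"outsourced file contents"
owner_digest = digest(original)

# Destination (cloud) side: hash whatever is actually stored when asked.
stored = original                     # in practice, the bytes the provider holds
cloud_digest = digest(stored)

if owner_digest != cloud_digest:
    raise ValueError("stored data does not match the owner's digest: possible tampering")
```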