
    Remote Data Integrity Checking in Cloud Computing

    Cloud computing is an internet-based computing model that enables the sharing of services. Keeping all the data that users' applications require safe in the cloud is very challenging, and storing data in the cloud may not be fully trustworthy: since the client does not keep a copy of all stored data, he has to depend on the Cloud Service Provider. This work studies the problem of ensuring the integrity and security of data storage in Cloud Computing. The paper proposes an effective and flexible batch audit scheme with dynamic data support to reduce computation overhead. To ensure the correctness of users' data, a third party auditor (TPA) is allowed, on behalf of the cloud client, to verify the integrity of the data stored in the cloud. The model uses symmetric encryption for effective utilization of outsourced cloud data and achieves storage security in multi-cloud data storage. The new scheme further supports secure and efficient dynamic operations on data blocks, including data insertion, update, deletion and replacement. Extensive security and performance analysis shows that the proposed scheme is highly efficient and resilient against Byzantine failure, malicious data modification attacks, and even server colluding attacks
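    A rough sketch of the kind of challenge-response batch audit with dynamic block operations described above is shown below; the per-block HMAC tags, block size, and class names are illustrative assumptions, not the paper's construction.

```python
# A minimal sketch of challenge-response block auditing with dynamic updates.
# Assumptions (not from the paper): per-block HMAC-SHA256 tags stand in for the
# scheme's symmetric-key integrity tags, and the TPA holds the verification key.
import hmac, hashlib, os, random

BLOCK_SIZE = 4096

def tag(key: bytes, index: int, block: bytes) -> bytes:
    # Bind the tag to the block index so blocks cannot be swapped undetected.
    return hmac.new(key, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

class CloudStore:
    """Untrusted server: stores data blocks and their integrity tags."""
    def __init__(self):
        self.blocks, self.tags = [], []
    def upload(self, blocks, tags):
        self.blocks, self.tags = list(blocks), list(tags)
    def respond(self, indices):
        return [(i, self.blocks[i], self.tags[i]) for i in indices]
    # Dynamic operations on data blocks (a real dynamic scheme would maintain an
    # authenticated index, e.g. a Merkle hash tree, so that insertions and
    # deletions do not invalidate the index-bound tags of later blocks).
    def update(self, i, block, t): self.blocks[i], self.tags[i] = block, t
    def insert(self, i, block, t): self.blocks.insert(i, block); self.tags.insert(i, t)
    def delete(self, i): del self.blocks[i]; del self.tags[i]

def audit(key: bytes, server: CloudStore, sample: int = 5) -> bool:
    """TPA-side batch audit: challenge a random subset of block indices."""
    indices = random.sample(range(len(server.blocks)), min(sample, len(server.blocks)))
    return all(hmac.compare_digest(t, tag(key, i, b)) for i, b, t in server.respond(indices))

if __name__ == "__main__":
    key = os.urandom(32)
    data = os.urandom(BLOCK_SIZE * 8)
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    server = CloudStore()
    server.upload(blocks, [tag(key, i, b) for i, b in enumerate(blocks)])
    print("audit after upload:", audit(key, server))        # True
    server.blocks[3] = os.urandom(BLOCK_SIZE)                # malicious data modification
    print("audit after tampering:", audit(key, server, 8))   # False: all blocks checked
```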

    Blockchain & Multi-Agent System: A New Promising Approach for Cloud Data Integrity Auditing with Deduplication

    Data storage has recently become one of the most important services in Cloud Computing. The cloud provider should ensure two major requirements: data integrity and storage efficiency. The blockchain data structure and efficient data deduplication are possible solutions to address these exigencies. Several approaches have been proposed; some of them implement deduplication on the Cloud server side, which requires a lot of computation to eliminate redundant data and becomes increasingly complex. Therefore, this paper proposes an efficient, reliable and secure approach in which the authors use a Multi-Agent System to manage the deduplication technique, reducing data volumes and thereby storage overhead. On the other side, the loss of physical control over data introduces security challenges such as data loss, data tampering and data modification. To address these problems, the authors also propose Blockchain as a database for storing the metadata of client files. This database serves as a logging database that supports the data integrity auditing function
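    As a rough illustration of client-side deduplication combined with an append-only metadata log of the sort described above, the sketch below uses SHA-256 chunk addressing and hash-chained records as illustrative stand-ins for the proposed Multi-Agent System and Blockchain database.

```python
# A minimal sketch of deduplication plus a hash-chained metadata log (illustrative only).
import hashlib, json, time

class DedupStore:
    """Keeps one copy of each unique chunk, addressed by its SHA-256 digest."""
    def __init__(self):
        self.chunks = {}
    def put(self, chunk: bytes) -> str:
        digest = hashlib.sha256(chunk).hexdigest()
        self.chunks.setdefault(digest, chunk)   # duplicate chunks are not stored twice
        return digest
    def get(self, digest: str) -> bytes:
        return self.chunks[digest]

class MetadataChain:
    """Append-only log; each record commits to the previous one, like a blockchain."""
    def __init__(self):
        self.records = []
    def append(self, owner: str, filename: str, chunk_ids: list) -> dict:
        prev = self.records[-1]["hash"] if self.records else "0" * 64
        body = {"owner": owner, "file": filename, "chunks": chunk_ids,
                "time": time.time(), "prev": prev}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.records.append(body)
        return body
    def audit(self, store: DedupStore) -> bool:
        # Integrity auditing: re-derive every chunk digest and check the chain links.
        prev = "0" * 64
        for rec in self.records:
            if rec["prev"] != prev:
                return False
            if any(hashlib.sha256(store.get(c)).hexdigest() != c for c in rec["chunks"]):
                return False
            prev = rec["hash"]
        return True

store, chain = DedupStore(), MetadataChain()
ids = [store.put(c) for c in (b"chunk-A", b"chunk-B", b"chunk-A")]   # "chunk-A" deduplicated
chain.append("alice", "report.txt", ids)
print(len(store.chunks), chain.audit(store))   # 2 unique chunks stored, audit passes -> True
```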

    Light-Weight Accountable Privacy Preserving Protocol in Cloud Computing Based on a Third-Party Auditor

    Cloud computing is emerging as the next disruptive utility paradigm [1]. It provides extensive storage capabilities and an environment for application developers through virtual machines. It is also the home of software and databases that are accessible on demand. Cloud computing has drastically transformed the way organizations and individual consumers access and interact with Information Technology. Despite significant advancements in this technology, concerns about security are holding back businesses from fully adopting this promising information technology trend. Third-party auditors (TPAs) are becoming more common in cloud computing implementations, yet involving auditors comes with its own issues, such as trust and processing overhead. To achieve productive auditing, we need to (1) accomplish efficient auditing without requesting the data location or introducing processing overhead to the cloud client; and (2) avoid introducing new security vulnerabilities during the auditing process. There are various security models for safeguarding the Cloud Client's (CC) data in the cloud. The TPA systematically examines the evidence of compliance with established security criteria in the connection between the CC and the Cloud Service Provider (CSP). The CSP provides clients with cloud storage and access to a database coupled with services. Many security models have been elaborated to make the TPA more reliable so that clients can trust the third-party auditor with their data. Our study shows that involving a TPA may come with shortcomings such as trust concerns, extra overhead, security and data manipulation breaches, and additional processing, which leads to the conclusion that a lightweight and secure protocol is paramount. As defined in [2], privacy preserving means making sure that the three cloud stakeholders are not involved in any malicious activities coming from insiders at the CSP level, remediating TPA vulnerabilities, and ensuring that the CC is not deceitfully affecting other clients. In our survey phase, we put into perspective the privacy-preserving solutions that fit the lightweight requirements in terms of processing and communication costs, and chose the most prominent ones against which to compare our simulation results. In this dissertation, we introduce a novel method that can detect a dishonest TPA: the Light-weight Accountable Privacy-Preserving (LAPP) Protocol. The lightweight characteristic has been proven through simulations, which show the minor impact of our protocol in terms of processing and communication costs. This protocol determines the malicious behavior of the TPA. To validate our proposed protocol's effectiveness, we conducted simulation experiments using the GreenCloud simulator. Based on our simulation results, we confirm that our proposed model provides better outcomes compared to the other known contending methods
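    A toy sketch of the general idea of holding a TPA accountable is given below; it is not the LAPP protocol itself, and the trap-block mechanism, digests and class names are assumptions introduced purely for illustration.

```python
# A minimal accountability check: the client plants one known corruption ("trap") and
# verifies that the auditor actually flags it, exposing a rubber-stamping TPA.
import hashlib, os, random

def sha(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

class Server:
    def __init__(self, blocks): self.blocks = list(blocks)
    def fetch_digest(self, i): return sha(self.blocks[i])

class TPA:
    """Audits blocks on behalf of the client; honest=False models a lazy/dishonest auditor."""
    def __init__(self, honest=True): self.honest = honest
    def audit(self, server, indices, expected):
        if not self.honest:
            return {i: True for i in indices}          # rubber-stamps everything
        return {i: server.fetch_digest(i) == expected[i] for i in indices}

class Client:
    def __init__(self, blocks):
        self.expected = {i: sha(b) for i, b in enumerate(blocks)}   # reference digests
        self.trap = random.choice(list(self.expected))              # index used as a decoy
    def verify_tpa(self, server, tpa) -> bool:
        # Plant a known corruption (in practice this would be a throwaway decoy block).
        server.blocks[self.trap] = os.urandom(16)
        report = tpa.audit(server, [self.trap], self.expected)
        return report[self.trap] is False              # an honest TPA must flag the trap

blocks = [os.urandom(64) for _ in range(10)]
print(Client(blocks).verify_tpa(Server(blocks), TPA(honest=False)))  # False: TPA caught
```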

    Remote Data Auditing in a Cloud Computing Environment

    In the current paradigms of information technology, cloud computing is the most essential kind of computer service. It satisfies the need for high-volume customers, flexible computing capabilities for a range of applications such as database archiving and business analytics, and the requirement for extra computing resources, providing financial value for cloud providers. The purpose of this investigation is to assess the viability of performing data audits remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses how to safeguard data that is outsourced and stored on cloud servers. Four different remote data auditing techniques for distributed cloud services are presented. There are several difficulties associated with data audit methods; however, these difficulties may be overcome by using a variety of techniques, such as the Boneh-Lynn-Shacham signature or the automated blocker protocol. Other difficulties associated with distributed remote data auditing solutions are also discussed, and a variety of approaches might be researched further in order to find answers to these impending problems
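    As one concrete example of remote auditing without a local data copy, the sketch below uses a Merkle-proof check; this is an illustrative stand-in for the surveyed schemes (e.g. BLS-signature-based auditing), not an implementation of any of them.

```python
# A minimal sketch of remote data auditing by Merkle proof: the verifier keeps only a
# 32-byte root, challenges a random block index, and checks the server's proof.
import hashlib, os, random

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2: level.append(level[-1])           # duplicate last node if odd
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    level, proof = [h(leaf) for leaf in leaves], []
    while len(level) > 1:
        if len(level) % 2: level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))      # (hash, sibling is on the left)
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(root, leaf, proof):
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling, node) if is_left else h(node, sibling)
    return node == root

blocks = [os.urandom(256) for _ in range(8)]   # data held remotely
root = merkle_root(blocks)                     # all the verifier has to store
i = random.randrange(len(blocks))              # random challenge
print(verify(root, blocks[i], merkle_proof(blocks, i)))   # True if the block is intact
```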

    Double Encryption Based Auditing Protocol Using Dynamic Operation in Cloud Storage

    Using cloud storage, users can remotely store their data and enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources, without the burden of local data storage and maintenance. However, the fact that users no longer have physical possession of the outsourced data makes data integrity protection in Cloud Computing a formidable task, especially for users with constrained computing resources. From users' perspective, including both individuals and IT systems, storing data remotely in the cloud in a flexible, on-demand manner brings tempting benefits: relief of the burden of storage management, universal data access independent of geographical location, and avoidance of capital expenditure on hardware, software, and personnel maintenance. To securely introduce an effective third party auditor (TPA), the following two fundamental requirements have to be met: 1) the TPA should be able to efficiently audit the cloud data storage without demanding the local copy of data, and introduce no additional on-line burden to the cloud user; 2) the third party auditing process should introduce no new vulnerabilities towards user data privacy. This project utilizes and uniquely combines public auditing protocols with a double encryption approach to achieve a privacy-preserving public cloud data auditing system, which supports integrity checking without any leakage of data. To support efficient handling of multiple auditing tasks, we further explore the technique of online signatures to extend our main result into a multi-user setting, where the TPA can perform multiple auditing tasks simultaneously. The double encryption algorithm encrypts the data twice before it is stored on the cloud server
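    A minimal sketch of the "encrypt twice, audit only the ciphertext" idea is given below; it assumes the third-party `cryptography` package, and the key handling and tag scheme are illustrative simplifications rather than the paper's protocol.

```python
# Illustrative sketch: data is wrapped under two independent symmetric keys before upload,
# and the TPA checks HMAC tags over the outer ciphertext, so auditing never sees plaintext.
# Requires: pip install cryptography
import hmac, hashlib
from cryptography.fernet import Fernet

def double_encrypt(plaintext: bytes, inner: Fernet, outer: Fernet) -> bytes:
    return outer.encrypt(inner.encrypt(plaintext))       # two layers, two keys

def double_decrypt(ciphertext: bytes, inner: Fernet, outer: Fernet) -> bytes:
    return inner.decrypt(outer.decrypt(ciphertext))

def audit_tag(audit_key: bytes, ciphertext: bytes) -> bytes:
    # The TPA holds only audit_key, never the decryption keys or the plaintext.
    return hmac.new(audit_key, ciphertext, hashlib.sha256).digest()

inner_key, outer_key, audit_key = Fernet.generate_key(), Fernet.generate_key(), b"tpa-demo-key"
inner, outer = Fernet(inner_key), Fernet(outer_key)

stored = double_encrypt(b"client record #42", inner, outer)   # what the cloud server keeps
tag = audit_tag(audit_key, stored)                            # what the TPA keeps

# TPA-side audit: fetch the ciphertext and recompute the tag; no plaintext is revealed.
assert hmac.compare_digest(tag, audit_tag(audit_key, stored))
assert double_decrypt(stored, inner, outer) == b"client record #42"
```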

    A Framework for Uncertain Cloud Data Security and Recovery Based on Hybrid Multi-User Medical Decision Learning Patterns

    Machine learning has been supporting real-time cloud-based medical computing systems. However, most computing servers lack a data security and recovery scheme across multiple virtual machines due to high computing cost and time, and these cloud-based medical applications rely on static security parameters for cloud data security. Cloud-based medical applications require multiple servers to store medical records or machine learning patterns for decision making. Due to high and uncertain computational memory and time requirements, these cloud systems need an efficient data security framework that provides strong data access control among multiple users. In this work, a hybrid cloud data security framework is developed to improve the security of large machine learning patterns in a real-time cloud computing environment. The work is implemented in two phases, i.e. a data replication phase and a multi-user data access security phase. Initially, machine decision patterns are replicated among multiple servers for the data recovery phase. In the multi-access cloud data security framework, a hybrid multi-access key based data encryption and decryption model is applied to the large machine learning medical patterns for the data recovery and security process. Experimental results show that the proposed two-phase data recovery and security framework has better computational efficiency than conventional approaches on large medical decision patterns
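    A simplified sketch of the two phases described above follows; the checksum-based replica recovery and the per-user key wrapping (via the `cryptography` package's Fernet) are illustrative assumptions, not the framework's actual construction.

```python
# Phase 1: replicate each encrypted record to several servers and recover from any
# replica whose checksum still matches. Phase 2: wrap the record's data key once per
# authorized user so access can be controlled per user. Illustrative sketch only.
# Requires: pip install cryptography
import hashlib
from cryptography.fernet import Fernet

class ReplicatedStore:
    def __init__(self, n_servers: int = 3):
        self.servers = [{} for _ in range(n_servers)]
    def put(self, record_id: str, blob: bytes):
        checksum = hashlib.sha256(blob).hexdigest()
        for s in self.servers:                        # phase 1: copy to every server
            s[record_id] = (blob, checksum)
    def recover(self, record_id: str) -> bytes:
        for s in self.servers:                        # return the first intact replica
            blob, checksum = s[record_id]
            if hashlib.sha256(blob).hexdigest() == checksum:
                return blob
        raise RuntimeError("all replicas corrupted")

# Phase 2: multi-user access control via per-user wrapping of the record's data key.
data_key = Fernet.generate_key()
user_keys = {"dr_a": Fernet.generate_key(), "nurse_b": Fernet.generate_key()}
wrapped = {u: Fernet(k).encrypt(data_key) for u, k in user_keys.items()}

store = ReplicatedStore()
store.put("pattern-7", Fernet(data_key).encrypt(b"learned decision pattern"))

# An authorized user unwraps the data key with their own key, then decrypts a replica.
key_for_a = Fernet(user_keys["dr_a"]).decrypt(wrapped["dr_a"])
print(Fernet(key_for_a).decrypt(store.recover("pattern-7")))
```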

    Secure Data Storage on Cloud through Networking

    Security, privacy, and data protection issues have always been among the major concerns that slow the growth and adoption of new technologies in the field of cloud computing. The advent of an advanced model should not compromise the functionalities and capabilities required of the current model; in the new model, risks and threats are reduced and features improved. In this paper, a survey of the different security risks that pose a threat to the cloud is presented. This paper is a survey more specific to the different security issues that have emanated due to the nature of the service delivery models of a cloud computing system