
    KBD-Share: Key Aggregation, Blockchain, and Differential Privacy based Secured Data Sharing for Multi-User Cloud Computing

    In today's era of widespread cloud computing and data sharing, the demand for secure and privacy-preserving techniques for multi-user data sharing is growing rapidly. Traditional approaches, however, struggle to meet the twin objectives of protecting privacy while preserving the utility of shared data. This problem matters because data sharing plays a pivotal role across diverse domains and applications, yet it also introduces significant privacy vulnerabilities. Innovative approaches are therefore needed to strike a balance between the utility of shared data and the protection of privacy in multi-user scenarios. This paper presents KBD-Share, a framework that addresses data security and privacy when sharing data among multiple users in cloud computing environments. By integrating key aggregation, blockchain technology, and differential privacy, KBD-Share offers an efficient and robust solution that protects sensitive data while still allowing it to be shared and used. Extensive experimental evaluations show that KBD-Share outperforms existing approaches in both privacy preservation and utility, achieving the highest R² value of 0.9969 and thus the best data utility, which is essential for multi-user data sharing in diverse cloud computing applications.
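The abstract does not specify which differential-privacy mechanism KBD-Share uses, so the following is only a minimal sketch of the standard Laplace mechanism, the most common way to privatize a numeric statistic before sharing it. The function name, the epsilon value, and the toy age data are illustrative assumptions, not details from the paper.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return an epsilon-differentially-private estimate of true_value.

    Noise is drawn from Laplace(0, sensitivity / epsilon), the standard
    mechanism for releasing numeric queries under differential privacy.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Hypothetical example: privatize the mean age of a shared dataset before release.
ages = np.array([34, 29, 41, 52, 38], dtype=float)
true_mean = ages.mean()
# Sensitivity of the mean when ages are bounded in [0, 100] and there are n records.
sensitivity = 100.0 / len(ages)
private_mean = laplace_mechanism(true_mean, sensitivity, epsilon=0.5)
print(f"true mean: {true_mean:.2f}, private mean: {private_mean:.2f}")
```

Smaller epsilon values add more noise, trading utility (e.g. the R² reported above) for stronger privacy guarantees.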

    End-to-End Privacy for Open Big Data Markets

    The idea of an open data market envisions a data trading model that facilitates the exchange of data between different parties in the Internet of Things (IoT) domain. The data collected by IoT products and solutions are expected to be traded in these markets: data owners collect data using IoT products and solutions, and interested data consumers negotiate with the data owners for access to that data. Data captured by IoT products allows data consumers to better understand the preferences and behaviours of data owners and to generate additional business value through techniques ranging from waste reduction to personalized service offerings. In open data markets, data consumers can return part of this additional value to the data owners. However, privacy becomes a significant issue when data that can be used to derive extremely personal information is traded. This paper discusses why privacy matters in the IoT domain in general, and in open data markets in particular, and surveys existing privacy-preserving strategies and design techniques that can be used to provide end-to-end privacy for open data markets. We also highlight some of the major research challenges that need to be addressed to make the vision of open data markets a reality while ensuring the privacy of stakeholders.
    Comment: Accepted to be published in IEEE Cloud Computing Magazine: Special Issue Cloud Computing and the La

    PrivFL: Practical Privacy-preserving Federated Regressions on High-dimensional Data over Mobile Networks

    Federated Learning (FL) enables a large number of users to jointly learn a shared machine learning (ML) model, coordinated by a centralized server, while the data remains distributed across multiple devices. This approach lets the server and users train an ML model using gradient descent while keeping all training data on the users' devices. We consider training an ML model over a mobile network where user dropout is a common phenomenon. Although federated learning aims to reduce data privacy risks, the privacy of the ML model itself has received far less attention. In this work, we present PrivFL, a privacy-preserving system for training (predictive) linear and logistic regression models and for oblivious prediction in the federated setting, which guarantees data and model privacy and is robust to users dropping out of the network. We design two privacy-preserving protocols for training linear and logistic regression models, built on an additive homomorphic encryption (HE) scheme and an aggregation protocol. At the core of our training protocols, which exploit the training algorithm of federated learning, is a secure multiparty computation of the global gradient over alive users' data. We analyze the security of our training protocols against semi-honest adversaries. As long as the aggregation protocol is secure under the aggregation privacy game and the additive HE scheme is semantically secure, PrivFL guarantees the users' data privacy against the server, and the server's regression model privacy against the users. We demonstrate the performance of PrivFL on real-world datasets and show its applicability in the federated learning system.
    Comment: In Proceedings of the 2019 ACM SIGSAC Conference on Cloud Computing Security Workshop (CCSW'19
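The following is a minimal sketch of the additive-HE idea behind secure gradient aggregation, using the python-paillier library (`pip install phe`) as a stand-in for the paper's additive HE scheme; the toy gradients, key length, and variable names are illustrative assumptions, not PrivFL's actual protocol, which additionally handles dropout and an aggregation privacy game.

```python
from phe import paillier
import numpy as np

# Key pair for the additively homomorphic scheme; in a PrivFL-style setting the
# decryption capability is arranged so that only the aggregate gradient, never
# any individual user's gradient, is ever revealed.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Each surviving ("alive") user encrypts its local gradient coordinate-wise.
local_gradients = [
    np.array([0.12, -0.05, 0.30]),
    np.array([0.08,  0.11, -0.02]),
    np.array([-0.04, 0.07, 0.21]),
]
encrypted_grads = [[public_key.encrypt(float(g)) for g in grad]
                   for grad in local_gradients]

# Homomorphic aggregation: ciphertexts are summed without decrypting them,
# so the aggregator never sees any individual gradient in the clear.
agg = encrypted_grads[0]
for enc_grad in encrypted_grads[1:]:
    agg = [a + b for a, b in zip(agg, enc_grad)]

# Only the summed (global) gradient is decrypted and averaged.
global_gradient = np.array([private_key.decrypt(c) for c in agg]) / len(local_gradients)
print("global gradient:", global_gradient)
```

The additive homomorphism (Enc(a) + Enc(b) = Enc(a + b)) is what allows the global gradient step of federated gradient descent to be computed over encrypted per-user contributions.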