
    Efficient and Privacy-Preserving Ride Sharing Organization for Transferable and Non-Transferable Services

    Ride-sharing allows multiple people to share a trip in one vehicle instead of using multiple vehicles. This can reduce the number of vehicles on the road, which in turn reduces air pollution, traffic congestion, and transportation cost. However, organizing shared rides requires passengers to report sensitive location information about their trips to a trip organizing server (TOS), which creates a serious privacy issue. In addition, existing ride-sharing schemes are inflexible, i.e., they require a driver and a rider to have exactly the same trip to share a ride. Moreover, they are non-scalable, i.e., inefficient when applied to large geographic areas. In this paper, we propose two efficient privacy-preserving ride-sharing organization schemes for Non-transferable Ride-sharing Services (NRS) and Transferable Ride-sharing Services (TRS). In the NRS scheme, a rider shares a ride from source to destination with only one driver, whereas in the TRS scheme, a rider can transfer between multiple drivers while en route until reaching the destination. In both schemes, the ride-sharing area is divided into small geographic areas, called cells, each with a unique identifier. Each driver/rider encrypts his trip data and sends an encrypted ride-sharing offer/request to the TOS. In the NRS scheme, Bloom filters are used to compactly represent the trip information before encryption. The TOS can then measure the similarity between the encrypted trip data to organize shared rides without revealing either the users' identities or the location information. In the TRS scheme, drivers report their encrypted routes, and then the TOS builds an encrypted directed graph that is passed to a modified version of Dijkstra's shortest path algorithm to search for an optimal path of rides that satisfies a set of preferences defined by the riders.
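    As a concrete illustration of the NRS matching step, the following is a minimal plaintext sketch: each trip is reduced to the set of cell identifiers it crosses, folded into a Bloom filter, and two filters are compared by counting shared bits. The cell names, filter length, and hash count are illustrative assumptions; the actual scheme performs this comparison over encrypted filters.

```python
# Plaintext sketch of the NRS-style trip matching idea: each trip is the set of
# grid-cell IDs it passes through, compactly represented as a Bloom filter.
# FILTER_BITS and NUM_HASHES are illustrative, not the paper's parameters,
# and the real scheme compares *encrypted* filters at the TOS.
import hashlib

FILTER_BITS = 256   # assumed Bloom filter length
NUM_HASHES = 3      # assumed number of hash functions

def bloom_insert(bits: int, item: str) -> int:
    """Set NUM_HASHES bit positions derived from the item's hash."""
    for i in range(NUM_HASHES):
        h = int(hashlib.sha256(f"{i}:{item}".encode()).hexdigest(), 16)
        bits |= 1 << (h % FILTER_BITS)
    return bits

def trip_filter(cell_ids) -> int:
    """Encode a trip (sequence of cell identifiers) as one Bloom filter."""
    bits = 0
    for cell in cell_ids:
        bits = bloom_insert(bits, cell)
    return bits

def overlap_score(driver_bits: int, rider_bits: int) -> int:
    """Similarity proxy: number of filter bits shared by driver and rider."""
    return bin(driver_bits & rider_bits).count("1")

driver = trip_filter(["cell_12", "cell_13", "cell_14", "cell_20"])
rider = trip_filter(["cell_13", "cell_14"])
print(overlap_score(driver, rider))  # higher score -> better ride match
```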

    Achieving Secure and Efficient Cloud Search Services: Cross-Lingual Multi-Keyword Rank Search over Encrypted Cloud Data

    Multi-user multi-keyword ranked search in arbitrary languages is a novel multi-keyword rank searchable encryption (MRSE) framework based on the Paillier Cryptosystem with Threshold Decryption (PCTD). Compared to previous MRSE schemes constructed on the k-nearest neighbor searchable encryption (KNN-SE) algorithm, it mitigates several drawbacks and achieves better performance in terms of functionality and efficiency. Additionally, it does not require a predefined keyword set and supports keywords in arbitrary languages. However, because the new MRSE scheme matches keywords exactly, multilingual search is confined to each language and cannot be performed across languages. In this paper, we propose a cross-lingual multi-keyword rank search (CLRSE) scheme which eliminates the language barrier and achieves semantic extension using the Open Multilingual Wordnet. Our CLRSE scheme also realizes intelligent and personalized search through flexible keyword and language preference settings. We evaluate the performance of our scheme in terms of security, functionality, precision, and efficiency via extensive experiments.
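    The semantic-extension step can be sketched with NLTK's interface to the Open Multilingual Wordnet: each query keyword is expanded into the lemmas of its synsets in other languages before the search index or trapdoor is built. The language list below is an assumption, and the actual CLRSE scheme additionally applies the user's keyword and language preference settings and then encrypts the expanded set.

```python
# Hedged sketch of keyword expansion via the Open Multilingual Wordnet (OMW).
# The chosen languages are illustrative; CLRSE's preference settings and the
# subsequent encryption of the expanded keywords are not shown here.
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)
nltk.download("omw-1.4", quiet=True)

def expand_keyword(keyword: str, langs=("fra", "spa", "jpn")) -> set:
    """Collect lemmas of every synset of `keyword` in the requested languages."""
    expanded = {keyword}
    for synset in wn.synsets(keyword):
        for lang in langs:
            expanded.update(synset.lemma_names(lang))
    return expanded

print(expand_keyword("privacy"))
```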

    EsPRESSo: Efficient Privacy-Preserving Evaluation of Sample Set Similarity

    Electronic information is increasingly often shared among entities without complete mutual trust. To address related security and privacy issues, a few cryptographic techniques have emerged that support privacy-preserving information sharing and retrieval. One interesting open problem in this context involves two parties that need to assess the similarity of their datasets, but are reluctant to disclose their actual content. This paper presents an efficient and provably-secure construction supporting the privacy-preserving evaluation of sample set similarity, where similarity is measured as the Jaccard index. We present two protocols: the first securely computes the (Jaccard) similarity of two sets, and the second approximates it, using MinHash techniques, with lower complexities. We show that our novel protocols are attractive in many compelling applications, including document/multimedia similarity, biometric authentication, and genetic tests. In the process, we demonstrate that our constructions are appreciably more efficient than prior work. Comment: A preliminary version of this paper was published in the Proceedings of the 7th ESORICS International Workshop on Digital Privacy Management (DPM 2012). This is the full version, appearing in the Journal of Computer Security.
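    The two quantities the protocols compute under encryption can be illustrated in the clear: the exact Jaccard index of two sets and its MinHash approximation, whose accuracy improves with the number of hash functions. The hash construction and signature length below are illustrative choices, not the parameters fixed by the protocols.

```python
# Plaintext illustration of the quantities EsPRESSo evaluates privately:
# the exact Jaccard index and its MinHash approximation.
import hashlib

def jaccard(a: set, b: set) -> float:
    """Exact Jaccard index |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b)

def minhash_signature(items: set, num_hashes: int = 128):
    """For each seeded hash function, keep the minimum hash value over the set."""
    return [
        min(int(hashlib.sha256(f"{seed}:{x}".encode()).hexdigest(), 16) for x in items)
        for seed in range(num_hashes)
    ]

def minhash_similarity(sig_a, sig_b) -> float:
    """Fraction of matching signature entries approximates the Jaccard index."""
    return sum(x == y for x, y in zip(sig_a, sig_b)) / len(sig_a)

A = {"doc", "privacy", "crypto", "set", "similarity"}
B = {"doc", "privacy", "crypto", "hashing"}
print(jaccard(A, B), minhash_similarity(minhash_signature(A), minhash_signature(B)))
```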

    Privacy-Preserving Blockchain-Based Registration Scheme for AV Parking System

    Autonomous Vehicles (AV) are a prime example of how innovation and automation are at the forefront of growing technology trends. Parking is becoming a growing concern as research into ways to increase the efficiency and cost-effectiveness of AVs continues. To guard against various internet attackers and secure users' sensitive information, an efficient AV parking system must have strong user privacy and cybersecurity capabilities. In my work, I present a blockchain-based, privacy-preserving registration scheme for AV parking systems that meets these requirements. The proposed scheme incorporates k-Nearest Neighbor (kNN), an efficient and lightweight algorithm, for encrypting and matching the available parking slots of participating AV parking lots against the parking spaces of interest to AV users using vector matrices. Additionally, the incorporated blockchain eliminates the need for financial third parties and ensures secure, fair, and transparent payment between the AV and the parking lot. The proposed approach is also shown to be robust and efficient, according to our security and privacy analysis. Keywords: Blockchain, Parking Reservation, Autonomous Vehicles (AV), k-Nearest Neighbor (kNN), Parking Cloud Server (PCS)
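    The matching primitive such kNN-based searchable encryption builds on rests on an inner-product-preserving transformation: a secret invertible matrix lets the server score an encrypted parking-lot attribute vector against an encrypted user preference vector without seeing either. The sketch below shows only that core trick under assumed attribute dimensions; the full construction additionally splits vectors and adds randomization.

```python
# Minimal sketch of the inner-product-preserving idea behind kNN-based
# searchable encryption (the matching primitive referenced above). The vector
# meaning (parking-slot attributes) and dimension are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
dim = 4                                   # e.g. zone, price band, EV charger, covered
M = rng.random((dim, dim)) + np.eye(dim)  # secret invertible key matrix

def encrypt_index(p):   # parking lot's attribute vector
    return M.T @ p

def encrypt_query(q):   # AV user's preference vector
    return np.linalg.inv(M) @ q

lot = np.array([1.0, 0.0, 1.0, 1.0])
query = np.array([1.0, 0.0, 1.0, 0.0])

# Server-side score equals the plaintext inner product: (M^T p)·(M^{-1} q) = p·q.
score = encrypt_index(lot) @ encrypt_query(query)
print(score, lot @ query)  # both ≈ 2.0
```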

    Confidential Boosting with Random Linear Classifiers for Outsourced User-generated Data

    User-generated data is crucial to predictive modeling in many applications. With a web/mobile/wearable interface, a data owner can continuously record data generated by distributed users and build various predictive models from the data to improve their operations, services, and revenue. Due to the large size and evolving nature of user data, data owners may rely on public cloud service providers (Cloud) for storage and computation scalability. Exposing sensitive user-generated data and advanced analytic models to the Cloud raises privacy concerns. We present a confidential learning framework, SecureBoost, for data owners that want to learn predictive models from aggregated user-generated data but offload the storage and computational burden to the Cloud without having to worry about protecting the sensitive data. SecureBoost allows users to submit encrypted or randomly masked data directly to the designated Cloud. Our framework utilizes random linear classifiers (RLCs) as the base classifiers in the boosting framework to dramatically simplify the design of the proposed confidential boosting protocols, yet still preserve the model quality. A Cryptographic Service Provider (CSP) is used to assist the Cloud's processing, reducing the complexity of the protocol constructions. We present two constructions of SecureBoost: HE+GC and SecSh+GC, using combinations of homomorphic encryption, garbled circuits, and random masking to achieve both security and efficiency. For a boosted model, the Cloud learns only the RLCs and the CSP learns only the weights of the RLCs. Finally, the data owner collects the two parts to get the complete model. We conduct extensive experiments to understand the quality of the RLC-based boosting and the cost distribution of the constructions. Our results show that SecureBoost can efficiently learn high-quality boosting models from protected user-generated data.
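    The reason random linear classifiers simplify the protocol is that the base learners need no training at all; only their weights must be learned. The following plaintext AdaBoost-style loop is a hedged illustration of why random hyperplanes can still yield a usable boosted model, not the confidential SecureBoost protocol itself.

```python
# Plaintext sketch of boosting with untrained, random linear classifiers (RLCs).
# The number of rounds and the AdaBoost-style weighting are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def fit_rlc_boost(X, y, rounds=50):
    """X: (n, d) features; y in {-1, +1}. Returns a list of (w, b, alpha)."""
    n, d = X.shape
    sample_w = np.full(n, 1.0 / n)
    model = []
    for _ in range(rounds):
        w, b = rng.normal(size=d), rng.normal()        # random hyperplane, no training
        pred = np.sign(X @ w + b)
        err = sample_w[pred != y].sum()
        if err > 0.5:                                  # flip a worse-than-random RLC
            w, b, pred, err = -w, -b, -pred, 1.0 - err
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)          # learner weight (learned part)
        sample_w *= np.exp(-alpha * y * pred)
        sample_w /= sample_w.sum()
        model.append((w, b, alpha))
    return model

def predict(model, X):
    """Weighted vote of all random linear classifiers."""
    return np.sign(sum(a * np.sign(X @ w + b) for w, b, a in model))
```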

    Homomorphic-Encrypted Volume Rendering

    Computationally demanding tasks are typically calculated in dedicated data centers, and real-time visualizations also follow this trend. Some rendering tasks, however, require the highest level of confidentiality so that no other party, besides the owner, can read or see the sensitive data. Here we present a direct volume rendering approach that operates directly on encrypted volume data by using the homomorphic Paillier encryption algorithm. This approach ensures that the volume data and rendered image are uninterpretable to the rendering server. Our volume rendering pipeline introduces novel approaches for encrypted-data compositing, interpolation, and opacity modulation, as well as simple transfer function design, where each of these routines maintains the highest level of privacy. We present the performance and memory overhead analysis associated with our privacy-preserving scheme. Our approach is open and secure by design, as opposed to secure through obscurity. Owners of the data only have to keep their secret key confidential to guarantee the privacy of their volume data and the rendered images. Our work is, to our knowledge, the first privacy-preserving remote volume-rendering approach that does not require any involved server to be trustworthy; even when the server is compromised, no sensitive data will be leaked to a foreign party. Comment: Accepted for presentation at IEEE VIS 202
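    The additive homomorphism of Paillier, which the encrypted interpolation routines build on, can be sketched with the python-paillier (phe) package: ciphertexts may be added together and scaled by plaintext constants. The key size and sample values below are illustrative assumptions; the paper's compositing and opacity modulation require further machinery on top of this primitive.

```python
# Hedged sketch of the Paillier primitive: ciphertexts support addition and
# multiplication by plaintext constants, enough to interpolate encrypted voxel
# samples with plaintext weights. Key length and values are illustrative.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Client encrypts two neighboring voxel intensities before upload.
v0 = public_key.encrypt(0.20)
v1 = public_key.encrypt(0.80)

# Rendering server interpolates at t = 0.25 without ever decrypting:
# E(v0)*(1-t) + E(v1)*t  ->  E(v0*(1-t) + v1*t)
t = 0.25
sample = v0 * (1 - t) + v1 * t

# Only the data owner can decrypt the interpolated value.
print(private_key.decrypt(sample))  # ≈ 0.35
```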

    Privacy Preserving Inference for Deep Neural Networks: Optimizing Homomorphic Encryption for Efficient and Secure Classification

    The application of machine learning in healthcare, finance, social media, and other sensitive sectors demands not only high accuracy but also privacy. With the emergence of the Cloud as a computation and one-to-many access paradigm, training and classification/inference tasks have been outsourced to the Cloud. However, its usage is limited by legal and ethical constraints regarding privacy. In this work, we propose a privacy-preserving neural-network classification model based on Homomorphic Encryption (HE) in which the user sends an encrypted instance to the cloud and receives an encrypted inference from it, preserving the user's query privacy. In contrast to existing works, we demonstrate the realistic limitations of HE for privacy-preserving machine learning by varying its parameters for enhanced security and accuracy. We showcase scenarios where the choice of HE parameters impedes accurate classification and present an optimized setting for achieving reliable classification. We present several results demonstrating its effectiveness on the MNIST dataset, with greatly improved per-query inference time compared to the state of the art.
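    As a hypothetical illustration of how HE parameters enter an encrypted inference pipeline (the abstract does not prescribe this library, scheme, or these values), the sketch below uses the TenSEAL CKKS scheme: the polynomial modulus degree, coefficient modulus sizes, and global scale jointly determine security, noise budget, and the numerical precision available to the classifier.

```python
# Illustrative (assumed) CKKS parameter choices for one encrypted linear layer.
# Larger poly_modulus_degree -> more security and noise budget, but slower;
# coeff_mod_bit_sizes and global_scale bound multiplicative depth and precision.
import tenseal as ts

context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40          # fixed-point scale: precision vs. overflow
context.generate_galois_keys()          # rotations needed for dot products

pixels = [0.1, 0.7, 0.3, 0.9]           # toy query instance (e.g. MNIST features)
weights = [0.5, -0.2, 0.8, 0.1]         # one plaintext neuron of the model

enc_query = ts.ckks_vector(context, pixels)   # user encrypts the instance
enc_score = enc_query.dot(weights)            # cloud computes on ciphertexts
print(enc_score.decrypt())                    # only the user can read the score
```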

    User-centric privacy preservation in Internet of Things Networks

    Recent trends show how the Internet of Things (IoT) and its services are becoming ever more ubiquitous and popular. The end-to-end IoT services that are extensively used include everything from neighborhood discovery to smart home security systems, wearable health monitors, and connected appliances and vehicles. IoT leverages different kinds of networks, such as location-based social networks, mobile edge systems, and digital twin networks, to realize these services. Many of these services rely on a constant feed of user information. Depending on the network being used, how this data is processed can vary significantly. The key thing to note is that a great deal of data is collected, and users have little to no control over how extensively their data is used and what information is being used. This causes many privacy concerns, especially for a naïve user who does not know the implications and consequences of severe privacy breaches. When designing privacy policies, we need to understand the different user data types used in these networks. These include user profile information, information from the queries used to obtain services (communication privacy), and location information, which is needed in many on-the-go services. Based on the context of the application and the service being provided, the user data at risk and the risks themselves vary. First, we dive deep into the networks and understand the different aspects of privacy for user data and the issues faced in each such aspect. We then propose different privacy policies for these networks and focus on two main aspects of designing privacy mechanisms: the quality of service the user expects and the private information from the user's perspective. The novel contribution here is to focus on what the user thinks and needs instead of fixating on designing privacy policies that only satisfy the third-party applications' requirement of quality of service.