
    EsPRESSo: Efficient Privacy-Preserving Evaluation of Sample Set Similarity

    Electronic information is increasingly shared among entities without complete mutual trust. To address the resulting security and privacy issues, a few cryptographic techniques have emerged that support privacy-preserving information sharing and retrieval. One interesting open problem in this context involves two parties that need to assess the similarity of their datasets, but are reluctant to disclose their actual content. This paper presents an efficient and provably-secure construction supporting the privacy-preserving evaluation of sample set similarity, where similarity is measured as the Jaccard index. We present two protocols: the first securely computes the (Jaccard) similarity of two sets, and the second approximates it, using MinHash techniques, with lower complexity. We show that our novel protocols are attractive in many compelling applications, including document/multimedia similarity, biometric authentication, and genetic tests. In the process, we demonstrate that our constructions are appreciably more efficient than prior work.

    Comment: A preliminary version of this paper was published in the Proceedings of the 7th ESORICS International Workshop on Digital Privacy Management (DPM 2012). This is the full version, appearing in the Journal of Computer Security.
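
    As background for the MinHash-based second protocol, here is a minimal Python sketch of the plaintext relationship it exploits: the probability that two sets' minimum hash values coincide equals their Jaccard index. It uses Python's built-in hash in place of proper random hash functions and involves no cryptography; the paper's contribution is evaluating these quantities without revealing the sets.

        import random

        def jaccard(a: set, b: set) -> float:
            # Exact Jaccard index |A intersect B| / |A union B|.
            return len(a & b) / len(a | b) if a | b else 1.0

        def minhash_signature(s: set, seeds: list) -> list:
            # One minimum per seeded hash function: a k-entry signature.
            # Built-in hash stands in for a proper universal hash family.
            return [min(hash((seed, x)) for x in s) for seed in seeds]

        def minhash_jaccard(a: set, b: set, k: int = 256) -> float:
            # Pr[signatures agree at a position] = Jaccard(a, b), so the
            # fraction of agreeing positions estimates the index.
            seeds = [random.random() for _ in range(k)]
            sig_a = minhash_signature(a, seeds)
            sig_b = minhash_signature(b, seeds)
            return sum(x == y for x, y in zip(sig_a, sig_b)) / k

        a = set("the quick brown fox".split())
        b = set("the quick brown dog".split())
        print(jaccard(a, b))          # exactly 0.6
        print(minhash_jaccard(a, b))  # close to 0.6 for large k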

    Compositional closure for Bayes Risk in probabilistic noninterference

    We give a sequential model for noninterference security including probability (but not demonic choice), thus supporting reasoning about the likelihood that high-security values might be revealed by observations of low-security activity. Our novel methodological contribution is the definition of a refinement order and its use to compare security measures between specifications and (their supposed) implementations. This contrasts with the more common practice of evaluating the security of individual programs in isolation. The appropriateness of our model and order is supported by showing that our refinement order is the greatest compositional relation (the compositional closure) with respect to our semantics and an "elementary" order based on Bayes Risk, a security measure already in widespread use. We also relate refinement to other measures such as Shannon Entropy. By applying the approach to a non-trivial example, the anonymous-majority Three-Judges protocol, we demonstrate that correctness arguments can be simplified by the sort of layered developments, through levels of increasing detail, that are allowed and encouraged by compositional semantics.
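
    For readers unfamiliar with the "elementary" measure underlying that order, the following minimal Python sketch computes Bayes Risk: the probability that an optimal adversary fails to guess the high-security input after seeing the low-security observation. The two-secret channel matrix below is an illustrative assumption, not an example from the paper.

        def bayes_risk(prior, channel):
            # prior[h]      = P(high input is h)
            # channel[h][o] = P(low observation o | high input h)
            # Bayes vulnerability: the optimal guesser picks, for each
            # observation o, the h maximizing the joint probability.
            observations = range(len(channel[0]))
            vulnerability = sum(
                max(prior[h] * channel[h][o] for h in range(len(prior)))
                for o in observations
            )
            return 1.0 - vulnerability  # risk = chance the guess is wrong

        prior = [0.5, 0.5]                 # two equally likely secrets
        channel = [[0.9, 0.1],             # secret 0 mostly yields observation 0
                   [0.2, 0.8]]             # secret 1 mostly yields observation 1
        print(bayes_risk(prior, channel))  # 0.15: observations leak a lot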

    Secure Two-Party Protocol for Privacy-Preserving Classification via Differential Privacy

    Privacy-preserving distributed data mining is the study of mining distributed data, owned by multiple data owners, in a non-secure environment, where the mining protocol does not reveal any sensitive information to the data owners, individual privacy is preserved, and the output mining model is practically useful. In this thesis, we propose a secure two-party protocol for building a privacy-preserving decision tree classifier over distributed data using differential privacy. We utilize secure multiparty computation to ensure that the protocol is privacy-preserving, and our algorithm applies parallel and sequential composition together with a distributed exponential mechanism to ensure that the output is differentially private. We implemented our protocol in a distributed environment on real-life data, and the experimental results show that it produces decision tree classifiers with high utility while being reasonably efficient and scalable.
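
    As context for the distributed exponential mechanism mentioned above, here is a minimal single-machine Python sketch of the standard exponential mechanism applied to choosing a decision-tree split attribute. The attribute names, utility scores, and sensitivity are illustrative assumptions, not the thesis's algorithm.

        import math
        import random

        def exponential_mechanism(candidates, utility, eps, sensitivity=1.0):
            # Sample a candidate with probability proportional to
            # exp(eps * utility / (2 * sensitivity)).
            weights = [math.exp(eps * utility(c) / (2 * sensitivity))
                       for c in candidates]
            r = random.uniform(0, sum(weights))
            acc = 0.0
            for c, w in zip(candidates, weights):
                acc += w
                if r <= acc:
                    return c
            return candidates[-1]  # guard against floating-point rounding

        # Hypothetical information-gain scores for three split attributes.
        scores = {"age": 0.30, "income": 0.25, "zip": 0.05}
        split = exponential_mechanism(list(scores), scores.get, eps=0.5)

    High-utility attributes are chosen most often, yet every attribute retains some probability, which is what makes the choice differentially private rather than deterministic.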

PRECEPT: a framework for ethical digital forensics investigations

    Purpose: Cyber-enabled crimes are on the increase, and law enforcement has had to expand many of its detection activities into the digital domain. The field of digital forensics has consequently become far more sophisticated over the years and can now uncover even more evidence to support the prosecution of cyber criminals in a court of law. Governments, too, have embraced the ability to track suspicious individuals online. Forensic investigators are driven to gather data exhaustively, being under pressure to provide law enforcement with sufficient evidence to secure a conviction. Yet there are concerns about the ethics and justice of untrammeled investigations on a number of levels. On an organizational level, unconstrained investigations could interfere with, and damage, an organization's right to control the disclosure of its intellectual capital. On an individual level, those being investigated could easily have their legal privacy rights violated by forensic investigations. On a societal level, there might be a sense of injustice at the perceived inequality of current practice in this domain. This paper argues the need for a practical, ethically grounded approach to digital forensic investigations, one that acknowledges and respects the privacy rights of individuals and the intellectual capital disclosure rights of organizations, as well as the needs of law enforcement. We derive a set of ethical guidelines and map them onto a forensic investigation framework. We subjected the framework to expert review in two stages, refining it after each stage, and we conclude by proposing the refined ethically grounded digital forensic investigation framework. Our treatise is primarily UK based, but the concepts presented here have international relevance and applicability.

    Design methodology: The lens of justice theory is used to explore the tension that exists between the needs of digital forensic investigations into cybercrimes on the one hand and, on the other, individuals' rights to privacy and organizations' rights to control intellectual capital disclosure.

    Findings: The investigation revealed a potential inequality between the practices of digital forensics investigators and the rights of other stakeholders. That being so, the need for a more ethically informed approach to digital forensic investigations is highlighted as a remedy, and a framework is proposed to provide this.

    Practical implications: The proposed ethically informed framework for guiding digital forensic investigations suggests a way of re-establishing the equality of the stakeholders in this arena and ensuring that the potential for a sense of injustice is reduced.

    Originality/value: Justice theory is used to highlight the difficulties in squaring the circle between the rights and expectations of all stakeholders in the digital forensics arena. The outcome is the forensic investigation guideline PRECEPT (Privacy-Respecting EthiCal framEwork), which provides the basis for re-aligning the balance between the requirements and expectations of digital forensic investigators on the one hand, and individual and organizational expectations and rights on the other.

    Distributed Private Heavy Hitters

    In this paper, we give efficient algorithms and lower bounds for solving the heavy hitters problem while preserving differential privacy in the fully distributed local model. In this model, there are n parties, each of which possesses a single element from a universe of size N. The heavy hitters problem is to find the identity of the most common element shared amongst the n parties. In the local model, there is no trusted database administrator, so the algorithm must interact with each of the n parties separately, using a differentially private protocol. We give tight information-theoretic upper and lower bounds on the accuracy to which this problem can be solved in the local model (giving a separation between the local model and the more common centralized model of privacy), as well as computationally efficient algorithms, even in the case where the data universe N may be exponentially large.
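
    To make the local model concrete, here is a minimal Python sketch of a simple randomized-response baseline: each party perturbs its own element before reporting it, and the untrusted aggregator debiases the counts to pick the heavy hitter. This baseline enumerates the whole universe, so it serves only as a point of contrast with the paper's algorithms, which stay efficient even when N is exponentially large.

        import math
        import random
        from collections import Counter

        def randomize(x, universe, eps):
            # Report the true element with probability e^eps / (e^eps + N - 1),
            # otherwise a uniformly random other element (eps-local DP).
            n = len(universe)
            if random.random() < math.exp(eps) / (math.exp(eps) + n - 1):
                return x
            return random.choice([u for u in universe if u != x])

        def heavy_hitter(reports, universe, eps):
            # Debias: E[count(u)] = true_u * p + (m - true_u) * q.
            n, m = len(universe), len(reports)
            p = math.exp(eps) / (math.exp(eps) + n - 1)
            q = (1 - p) / (n - 1)
            counts = Counter(reports)
            estimates = {u: (counts[u] - m * q) / (p - q) for u in universe}
            return max(estimates, key=estimates.get)

        universe = list(range(10))
        data = [3] * 3000 + [7] * 1250 + [1] * 750     # 3 is the heavy hitter
        reports = [randomize(x, universe, eps=1.0) for x in data]
        print(heavy_hitter(reports, universe, eps=1.0))  # 3, with high probability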