13 research outputs found

    Efficient and Privacy Preserving Group Signature for Federated Learning

    Federated Learning (FL) is a Machine Learning (ML) technique that aims to reduce threats to user data privacy. Training is performed on the raw data on the users' devices, called clients, and only the training results, called gradients, are sent to the server, where they are aggregated to produce an updated model. However, the server cannot be assumed to be trustworthy with private information, such as metadata related to the owner or source of the data, so hiding client information from the server helps reduce privacy-related attacks. The privacy of the client's identity, along with the privacy of the client's data, is therefore necessary to make such attacks more difficult. This paper proposes an efficient and privacy-preserving protocol for FL based on group signatures. A new group signature for federated learning, called GSFL, is designed not only to protect the privacy of the client's data and identity but also to significantly reduce computation and communication costs, given the iterative nature of federated learning. We show that GSFL outperforms existing approaches in terms of computation, communication, and signaling costs, and that the proposed protocol can handle various security attacks in the federated learning environment.
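
    To make the setting concrete, here is a minimal federated-averaging sketch (plain NumPy, with hypothetical least-squares clients): raw data stays on each client and only model updates reach the server. The GSFL group-signature layer that authenticates updates while hiding which client produced them is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1):
    """One gradient step of least-squares regression on a client's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

# Five clients, each holding its own (X, y) that never leaves the device.
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(5)]
global_w = np.zeros(3)

for _ in range(10):                                  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(updates, axis=0)              # server sees only the updates
print(global_w)
```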

    Hybrid biometric template protection: Resolving the agony of choice between Bloom filters and homomorphic encryption

    Bloom filters (BFs) and homomorphic encryption (HE) are prominent techniques used to design biometric template protection (BTP) schemes, which aim to protect sensitive biometric information during storage and biometric comparison. However, the pros and cons of BF- and HE-based BTPs are not well studied in the literature. We investigate the strengths and weaknesses of these two approaches, since both seem promising from a theoretical viewpoint. Our key insight is to extend our theoretical investigation to the practical case of iris recognition, on the grounds that the iris (1) benefits from the alignment-free property of BFs and (2) induces huge computational burdens when implemented in the HE-encrypted domain. BF-based BTPs can be implemented to be either fast with high recognition accuracy while missing the important privacy property of 'unlinkability', or fast with the unlinkability property while missing the high accuracy. HE-based BTPs, on the other hand, are highly secure, achieve good accuracy, and meet the unlinkability property, but they are much slower than BF-based approaches. As a synthesis, we propose a hybrid BTP scheme that combines the good properties of BFs and HE, ensuring unlinkability and high recognition accuracy while being about seven times faster than the traditional HE-based approach.
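
    As a rough illustration of the BF side of this comparison, the sketch below (toy parameters, heavily simplified from the column-to-index mapping used in BF-based iris BTPs) builds block-wise Bloom filters from a binary iris code and compares templates with a Hamming-style dissimilarity; the HE branch and the proposed hybrid scheme are not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

def bf_template(iris_code, block_width=16):
    """Map each column of every code block to one Bloom-filter index,
    so small horizontal shifts barely change the filter (alignment-free)."""
    rows, cols = iris_code.shape
    filters = []
    for start in range(0, cols, block_width):
        bf = np.zeros(2 ** rows, dtype=bool)
        for col in iris_code[:, start:start + block_width].T:
            bf[int(''.join(map(str, col)), 2)] = True   # column value -> BF index
        filters.append(bf)
    return filters

def dissimilarity(f1, f2):
    # mean Hamming-style dissimilarity over the block filters
    scores = [np.logical_xor(a, b).sum() / (a.sum() + b.sum()) for a, b in zip(f1, f2)]
    return float(np.mean(scores))

code = rng.integers(0, 2, size=(8, 64))                 # toy 8x64 binary iris code
noisy = code ^ (rng.random(code.shape) < 0.05)          # same eye, 5% bit noise
other = rng.integers(0, 2, size=(8, 64))                # different eye
print(dissimilarity(bf_template(code), bf_template(noisy)))   # small score
print(dissimilarity(bf_template(code), bf_template(other)))   # larger score
```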

    Privacy enhancing technologies: protocol verification, implementation and specification

    In this thesis, we present novel methods for verifying, implementing and specifying protocols. In particular, we focus on properties modeling data protection and the protection of privacy. In the first part of the thesis, the author introduces protocol verification and presents a verification model that encompasses so-called Zero-Knowledge (ZK) proofs. ZK proofs are a cryptographic primitive that is particularly suited to hiding information and hence serves the protection of privacy. The model presented here gives a list of criteria that allow verification results to be transferred from the model to an implementation, provided the implementation meets the criteria. In particular, the criteria are less demanding than those of previous work regarding ZK proofs. The second part of the thesis contributes to the area of protocol implementations: ZK proofs are used to improve multi-party computations. The third and last part of the thesis explains a novel approach to specifying data protection policies. Instead of relying on policies, this approach relies on actual legislation. The advantage of relying on legislation is that it often introduces a fair balancing which is typically not contained in regulations or policies.

    Zero Knowledge Protocols and Applications

    The historical goal of cryptography is to securely transmit or store a message in an insecure medium. In that era, before public key cryptography, there were two kinds of people: those who had the correct key and those who did not. Nowadays, however, we live in a complex world with equally complex goals and requirements: securely passing a note from Alice to Bob is not enough. We want Alice to use her smartphone to vote for Carol without Bob the tallier, or anyone else, learning her vote; we also want guarantees that Alice's ballot contains a single, valid vote and that Bob will tally the ballots properly. This is made possible by zero knowledge protocols. This thesis presents research in the area of zero knowledge protocols along the following threads. We relax the assumptions necessary for the Damgård, Fazio and Nicolosi (DFN) transformation, a technique which enables one to collapse a number of three-round protocols into a single message; this approach is motivated by showing how it could be used as part of a voting scheme. We then move on to a protocol that lets us prove that a given computation (modeled as an arithmetic circuit) was performed correctly; it improves upon the state of the art in the area by significantly reducing the communication cost. A second strand of research concerns multi-user signatures, which enable a signer to sign with respect to a set of users. We give new definitions for important primitives in the area as well as efficient instantiations using zero knowledge protocols. Finally, we present two possible answers to the question posed by voting receipts. One is to maximise privacy by building a voting system that provides receipt-freeness automatically. The other is to use receipts to enable convenient and privacy-preserving vote copying.
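
    For context on what "collapsing a three-round protocol into a single message" means, here is a toy Fiat-Shamir collapse of Schnorr's sigma protocol for knowledge of a discrete logarithm (random-oracle heuristic, hypothetical tiny parameters). The DFN transformation studied in the thesis achieves the collapse under different, relaxed assumptions and is not reproduced here.

```python
import hashlib
import secrets

p, q, g = 2039, 1019, 4                      # p = 2q + 1; g generates the order-q subgroup

def prove(x):
    """Non-interactive proof of knowledge of x such that y = g^x mod p."""
    y = pow(g, x, p)
    r = secrets.randbelow(q)
    a = pow(g, r, p)                          # round-1 commitment
    c = int(hashlib.sha256(f"{y}|{a}".encode()).hexdigest(), 16) % q   # hash replaces the verifier's challenge
    z = (r + c * x) % q                       # round-3 response
    return y, a, z

def verify(y, a, z):
    c = int(hashlib.sha256(f"{y}|{a}".encode()).hexdigest(), 16) % q
    return pow(g, z, p) == a * pow(y, c, p) % p

y, a, z = prove(secrets.randbelow(q))
print(verify(y, a, z))                        # True: g^z = a * y^c
```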

    Commuting Signatures and Verifiable Encryption and an Application to Non-Interactively Delegatable Credentials

    Verifiable encryption allows one to encrypt a signature and prove that the plaintext is valid. We introduce a new primitive called commuting signatures that extends verifiable encryption in multiple ways: a signer can encrypt both signature and message and prove validity; more importantly, given a ciphertext, a signer can create a verifiably encrypted signature on the encrypted message; thus signing and encrypting commute. We instantiate commuting signatures using the proof system by Groth and Sahai (EUROCRYPT '08) and the automorphic signatures by Fuchsbauer (ePrint report 2009/320). As an application, we give an instantiation of delegatable anonymous credentials, a powerful primitive introduced by Belenkiy et al. (CRYPTO '09). Our instantiation is arguably simpler than theirs and it is the first to provide non-interactive issuing and delegation, which is a standard requirement for non-anonymous credentials. Moreover, the size of our credentials and the cost of verification are less than half of those of the only previous construction, and the efficiency of issuing and delegation is improved even more significantly. All our constructions are proved secure in the standard model.

    Les preuves de protocoles cryptographiques revisitées (Proofs of cryptographic protocols revisited)

    With the rise of the Internet, the use of cryptographic protocols has become ubiquitous. Considering the criticality and complexity of these protocols, there is an important need for formal verification. In order to obtain formal proofs of cryptographic protocols, two main attacker models exist: the symbolic model and the computational model. The symbolic model defines the attacker's capabilities as a fixed set of rules, whereas the computational model describes only the attacker's limitations, by stating that it may not solve certain hard problems. While the former is quite abstract and convenient for automating proofs, the latter offers much stronger guarantees. There is a gap between the guarantees offered by these two models, due to the fact that the symbolic model defines what the adversary may do while the computational model describes what it may not do. In 2012, Bana and Comon devised a new symbolic model in which the attacker's limitations are axiomatised. In addition, provided that the computational semantics of the axioms follows from the cryptographic hypotheses, proving security in this symbolic model yields security in the computational model. The possibility of automating proofs in this model (and finding axioms general enough to prove a large class of protocols) was left open in the original paper. In this thesis we provide an efficient decision procedure for a general class of axioms. In addition, we propose a tool (SCARY) implementing this decision procedure. Experimental results show that the axioms we designed for modelling the security of encryption are general enough to prove a large class of protocols.
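
    As a small illustration of the symbolic side of this comparison, the sketch below implements a Dolev-Yao style deducibility check in which the attacker's capabilities are exactly a fixed set of rules (projection of pairs, decryption under known keys). The term representation is my own; the Bana-Comon model and the SCARY procedure are far more general and are not reproduced here.

```python
# Terms are nested tuples, atomic names are strings.  Only decomposition rules
# are applied, which suffices to decide deducibility of an atomic secret when
# all keys are atomic.
def deducible(secret, knowledge):
    know = set(knowledge)
    while True:
        new = set()
        for t in know:
            if isinstance(t, tuple) and t[0] == 'pair':
                new |= {t[1], t[2]}                    # projections
            if isinstance(t, tuple) and t[0] == 'enc' and t[2] in know:
                new.add(t[1])                          # decryption with a known key
        if new <= know:                                # saturation reached
            return secret in know
        know |= new

msg = ('enc', 'secret', 'k')                           # {secret}_k
print(deducible('secret', {msg, 'k2'}))                # False: the right key is unknown
print(deducible('secret', {msg, 'k'}))                 # True: the key is known
```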

    Machine-Checked Formalisation and Verification of Cryptographic Protocols

    Aiming for strong security assurance, researchers in academia and industry focus their interest on formal verification of cryptographic constructions. Automating formal verification has proved to be a very difficult task, where the main challenge is to support generic constructions and theorems and to carry out the mathematical proofs. This work focuses on machine-checked formalisation and automatic verification of cryptographic protocols. One aspect we cover is novel support for generic schemes and real-world constructions among old and new protocols: key exchange schemes (Simple Password Exponential Key Exchange, SPEKE), commitment schemes (with the popular Pedersen scheme), sigma protocols (with Schnorr's zero-knowledge proof of knowledge protocol), and searchable encryption protocols (Sophos). We also investigate aspects related to reasoning about simulation-based proofs, where indistinguishability of two different algorithms by any adversary is the crucial point in proving privacy-related properties. We embed information-flow techniques into the EasyCrypt core language and show that our effort not only makes some proofs easier and (sometimes) fewer, but is also more powerful than other existing techniques in particular situations.
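
    Since the Pedersen scheme is one of the formalised case studies, here is a toy Python version of Pedersen commitments over a small prime-order subgroup (hypothetical parameters, unrelated to the EasyCrypt development), just to fix intuitions about the hiding and binding properties the machine-checked proofs are about.

```python
import secrets

p, q = 2039, 1019          # p = 2q + 1, both prime; toy sizes only
g, h = 4, 9                # two generators of the order-q subgroup
                           # (log_g(h) must be unknown to the committer)

def commit(m, r=None):
    """Commit to m in Z_q with randomness r: c = g^m * h^r mod p."""
    r = secrets.randbelow(q) if r is None else r
    return pow(g, m, p) * pow(h, r, p) % p, r

def open_check(c, m, r):
    return c == pow(g, m, p) * pow(h, r, p) % p

c, r = commit(42)
assert open_check(c, 42, r)            # correct opening verifies
assert not open_check(c, 41, r)        # binding: a different message fails
# Hiding: c reveals nothing about 42, since h^r is uniform in the subgroup
# for random r.
print(c)
```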

    Data Privacy and Trust in Cloud Computing

    This open access book brings together perspectives from multiple disciplines, including psychology, law, information systems, and computer science, on data privacy and trust in the cloud. Cloud technology has fueled rapid, dramatic technological change, enabling a level of connectivity that has never been seen before in human history. However, this brave new world comes with problems. Several high-profile cases over the last few years have demonstrated cloud computing's uneasy relationship with data security and trust. This volume explores the numerous technological, process and regulatory solutions presented in the academic literature as mechanisms for building trust in the cloud, including the GDPR in Europe. The massive acceleration of digital adoption resulting from the COVID-19 pandemic is introducing new and significant security and privacy threats and concerns. Against this backdrop, the book provides a timely reference and organising framework for considering how we will assure privacy and build trust in such a hyper-connected, digitally dependent world. It presents a framework for assurance and accountability in the cloud and reviews the literature on trust, data privacy and protection, and ethics in cloud computing.

    Automated Analysis in Generic Groups

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas from symbolic cryptography. We define a broad class of generic and symbolic group models for different settings, namely symmetric or asymmetric (leveled) k-linear groups, and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems, identifying different classes of assumptions, and obtain decidability and undecidability results. We then develop automated procedures for verifying the conditions of our master theorem, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption and outputs either a proof of its generic hardness or an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies the optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings; however, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of the minimal number of pairing computations needed to verify structure-preserving signatures. We build an automated tool to search for structure-preserving signatures matching a template. Through exhaustive search we conjecture lower bounds on the number of pairings required in the Type II setting and prove the conjecture. Finally, our tool exhibits examples of structure-preserving signatures matching the lower bounds, which proves the tightness of our bounds and improves on previously known structure-preserving signature schemes.
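
    To give a flavour of how generic-group hardness reduces to polynomial algebra, the toy SymPy check below (a drastic simplification of the Generic Group Analyzer's master theorem, and only an assumed illustration) asks whether the DDH challenge exponent x*y is a linear combination of the exponent polynomials handed to the adversary; if it is not, no generic attack built from linear combinations of the given elements exists.

```python
import sympy as sp

x, y = sp.symbols('x y')
given = [sp.Integer(1), x, y]            # exponents of g, g^x, g^y
target = x * y                           # exponent of the DDH challenge g^(xy)

# Is target = a0*1 + a1*x + a2*y for some constants a0, a1, a2?
coeffs = sp.symbols(f'a0:{len(given)}')
residual = sum(c * t for c, t in zip(coeffs, given)) - target
equations = sp.Poly(residual, x, y).coeffs()    # every coefficient must vanish
solution = sp.solve(equations, coeffs)
print("generically easy" if solution else "no generic linear attack")   # prints the latter
```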

    Tidy: Symbolic Verification of Timed Cryptographic Protocols
