
    Some Theoretical Conditions for Menezes--Qu--Vanstone Key Agreement to Provide Implicit Key Authentication

    Menezes--Qu--Vanstone key agreement (MQV) is intended to provide implicit key authentication (IKA) and several other security objectives. MQV is approved and specified in five standards. This report focuses on the IKA of two-pass MQV, without key confirmation. Arguably, implicit key authentication is the most essential security objective in authenticated key agreement. The report examines various necessary or sufficient formal conditions under which MQV may provide IKA. Incidentally, this report defines, relies on, and interrelates various conditions on the key derivation function and Diffie--Hellman groups. While it should be expected that most such definitions and results are already well known, a reader interested in these topics may still find this report useful as a kind of review, even with no interest in MQV whatsoever.
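
    As a rough illustration of the two-pass MQV computation the report analyzes, the following sketch runs the shared-secret derivation in a multiplicative integer group rather than the elliptic-curve setting of the standards; the avf truncation, the toy parameters, and the SHA-256 key derivation function are simplifying assumptions, not the standardized choices.

```python
# Hedged sketch of two-pass MQV shared-secret derivation in a multiplicative
# group (toy parameters; the standards specify elliptic-curve groups).
import hashlib

p = 2**127 - 1        # Mersenne prime, far too small for real use
q = (p - 1) // 2      # order of the quadratic-residue subgroup used below
g = 4                 # 4 = 2^2 lies in that subgroup

def avf(u: int) -> int:
    """Simplified 'associate value' function: keep roughly half the bits of u
    and force the top bit, mimicking the half-size MQV exponent."""
    half = (q.bit_length() + 1) // 2
    return (u % (1 << half)) | (1 << half)

def kdf(z: int) -> bytes:
    """Placeholder key derivation function (the report's conditions on the KDF
    are exactly what this stand-in glosses over)."""
    return hashlib.sha256(z.to_bytes((z.bit_length() + 7) // 8, "big")).digest()

def mqv_shared(static_priv, eph_priv, own_eph_pub, peer_static_pub, peer_eph_pub):
    d = avf(own_eph_pub)                      # own implicit-signature exponent
    e = avf(peer_eph_pub)                     # peer's implicit-signature exponent
    s = (eph_priv + d * static_priv) % q
    z = pow(peer_eph_pub * pow(peer_static_pub, e, p) % p, s, p)
    return kdf(z)

# Alice: static pair (a, A), ephemeral (x, X); Bob likewise (b, B) and (y, Y).
a, x, b, y = 1234567, 7654321, 2468101, 1122334
A, X, B, Y = (pow(g, k, p) for k in (a, x, b, y))

assert mqv_shared(a, x, X, B, Y) == mqv_shared(b, y, Y, A, X)  # same session key
```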

    ID-based tripartite Authenticated Key Agreement Protocols from pairings

    This paper proposes ID-based tripartite authenticated key agreement protocols. The authenticated three-party key agreement protocols from pairings [15] and the ID-based two-party authenticated key agreement protocol [13] are studied. These two protocols are taken as the basis for designing three new ID-based tripartite authenticated key agreement protocols. The security properties of all these protocols are studied, listing the possible attacks on them. Further, these protocols are extended to provide key confirmation.
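
    To make the one-round tripartite setting concrete, the toy sketch below illustrates only the bilinearity identity e(aP, bP)^c = e(P, P)^{abc} that such protocols exploit; the "pairing" is computed over plain integers, so it is deliberately insecure and is not the pairing, curve, or hash setup of the protocols in [13] and [15].

```python
# Toy, INSECURE illustration of the bilinearity behind one-round tripartite
# key agreement.  The "curve group" is just integers mod q with P = 1, so
# discrete logs are trivially readable -- the opposite of a real pairing group.
q = 101                           # toy prime group order
m = 607                           # prime with q | m - 1
h = pow(3, (m - 1) // q, m)       # generator of the order-q target subgroup

def e(U, V):
    """Toy 'pairing': bilinear by construction, but only because the discrete
    logs U and V are readable, which defeats all security."""
    return pow(h, (U * V) % q, m)

# Each party broadcasts its single message a*P, b*P, c*P (one round in total).
a, b, c = 17, 42, 73
A, B, C = a % q, b % q, c % q     # with P = 1, "scalar multiplication" is trivial

k_a = pow(e(B, C), a, m)          # party A computes e(bP, cP)^a
k_b = pow(e(A, C), b, m)          # party B computes e(aP, cP)^b
k_c = pow(e(A, B), c, m)          # party C computes e(aP, bP)^c
assert k_a == k_b == k_c          # all equal e(P, P)^{abc}
```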

    Analyzing the secure simple pairing in Bluetooth v4.0

    This paper analyzes the security of Bluetooth v4.0’s Secure Simple Pairing (SSP) protocol, for both the Bluetooth Basic Rate / Enhanced Data Rate (BR/EDR) and Bluetooth Low Energy (LE) operational modes. Bluetooth v4.0 is the latest version of a wireless communication standard for low-speed and low-range data transfer among devices in a human’s personal area network (PAN). It allows increased network mobility among devices such as headsets, PDAs, wireless keyboards and mice. A pairing process is initiated when two devices desire to communicate, and this pairing needs to correctly authenticate the devices so that a secret link key is established for secure communication. What is interesting is that device authentication relies on humans to communicate verification information between devices via a human-aided out-of-band channel. Bluetooth v4.0’s SSP protocol is designed to offer security against passive eavesdropping and man-in-the-middle (MitM) attacks. We conduct the first known detailed analysis of SSP for all its MitM-secure models. We highlight some issues related to the exchange of public keys and the use of the passkey in its models and discuss how to treat them properly.
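
    The sketch below conveys the flavour of the human-aided check discussed here, in the style of SSP's numeric comparison: both devices derive a short decimal value from the exchanged public keys and nonces, and a human compares the two displays. The hash, truncation, and byte layout are illustrative stand-ins, not the f1/g functions of the Bluetooth specification.

```python
# Illustrative numeric-comparison-style check (not the exact Bluetooth SSP
# functions): a MitM who substitutes its own public key would, with high
# probability, cause the two displayed values to differ.
import hashlib, os

def commitment(pk_self: bytes, pk_peer: bytes, nonce: bytes) -> bytes:
    """Commitment sent before the nonce is revealed, so neither side (nor a
    MitM) can choose its nonce after seeing the other's."""
    return hashlib.sha256(pk_self + pk_peer + nonce).digest()

def confirm_value(pk_initiator: bytes, pk_responder: bytes,
                  n_initiator: bytes, n_responder: bytes) -> str:
    """Six decimal digits shown on both displays for the user to compare."""
    digest = hashlib.sha256(pk_initiator + pk_responder
                            + n_initiator + n_responder).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 10**6:06d}"

# Stand-ins for the ECDH public keys and nonces exchanged during pairing.
pk_a, pk_b = os.urandom(32), os.urandom(32)
na, nb = os.urandom(16), os.urandom(16)

c_b = commitment(pk_b, pk_a, nb)                 # responder commits first
shown_on_a = confirm_value(pk_a, pk_b, na, nb)
shown_on_b = confirm_value(pk_a, pk_b, na, nb)   # same inputs on an honest run
print(shown_on_a, shown_on_b)                    # the human checks these match
assert hashlib.sha256(pk_b + pk_a + nb).digest() == c_b   # A checks the commitment
```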

    Desired Features and Design Methodologies of Secure Authenticated Key Exchange Protocols in the Public-Key Infrastructure Setting

    The importance of an authenticated key exchange (AKE) protocol has long been known in the field of cryptography. Two of the questions still being asked today are (1) what properties or features does a secure AKE protocol possess, and (2) how does one, in a step-by-step fashion, create a secure AKE protocol? This thesis aims to answer these two questions. The thesis contains two parts: one is a survey of previous works on the desired features of the Station-to-Station (STS) protocol, and the other is a study of a previously proposed design methodology for secure AKE protocols, together with an original design methodology of our own. Descriptions and comparisons of the two design methodologies are included. The thesis surveys the literature, conducts a case study of the STS protocol, analyzes various known attacks on STS, and extracts from this case study the desired properties and features of a secure AKE protocol. This part of the thesis does not propose any new result, but summarizes a complete list of issues one should take into consideration while designing an AKE protocol. At the end of this part we also present a secure version of STS which possesses the desired features of an AKE protocol. The other major part of the thesis surveys a design methodology for creating a secure AKE protocol due to Bellare, Canetti, and Krawczyk; it is based on taking a secure key exchange protocol and adding (mutual) authentication to it. The thesis then proposes another, original design methodology: it starts with a secure mutual authentication protocol, then adds secure key exchange without changing the overhead or the number of flows of the original mutual authentication protocol. We show in this part that the "secure" AKE protocol developed through these two design approaches is identical to the secure version of STS described in the other part, and thus possesses the desired features of a secure AKE protocol. We also give a proof of security for the AKE protocol developed under our design methodology.
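
    For concreteness, here is a minimal sketch of the three-flow STS pattern the case study revolves around: an ephemeral Diffie-Hellman exchange, followed by each party signing both DH values and encrypting the signature under the freshly derived key. X25519, Ed25519, AES-GCM and HKDF (via the pyca/cryptography package) are modern stand-ins, not the primitives of the original STS paper, and certificate handling is omitted.

```python
# Sketch of the Station-to-Station (STS) flows with modern stand-in primitives.
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey)
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def raw(pub):                              # serialize a public key to raw bytes
    return pub.public_bytes(Encoding.Raw, PublicFormat.Raw)

def session_key(shared: bytes) -> bytes:   # derive the session key from g^xy
    return HKDF(hashes.SHA256(), 32, salt=None, info=b"sts-sketch").derive(shared)

# Long-term signature keys, assumed to be certified in the PKI setting.
sig_a, sig_b = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()

# Flow 1, A -> B: g^x
eph_a = X25519PrivateKey.generate()
gx = raw(eph_a.public_key())

# Flow 2, B -> A: g^y, Enc_K(Sig_B(g^y, g^x))
eph_b = X25519PrivateKey.generate()
gy = raw(eph_b.public_key())
k_b = session_key(eph_b.exchange(X25519PublicKey.from_public_bytes(gx)))
token_b = AESGCM(k_b).encrypt(b"\x00" * 12, sig_b.sign(gy + gx), None)

# Flow 3, A -> B: Enc_K(Sig_A(g^x, g^y)), after A verifies B's token
k_a = session_key(eph_a.exchange(X25519PublicKey.from_public_bytes(gy)))
sig_b.public_key().verify(AESGCM(k_a).decrypt(b"\x00" * 12, token_b, None), gy + gx)
token_a = AESGCM(k_a).encrypt(b"\x01" * 12, sig_a.sign(gx + gy), None)

# B verifies A's token; both sides now hold the same authenticated session key.
sig_a.public_key().verify(AESGCM(k_b).decrypt(b"\x01" * 12, token_a, None), gx + gy)
assert k_a == k_b
```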

    Automatic generation of high speed elliptic curve cryptography code

    Apparently, trust is a rare commodity when power, money or life itself are at stake. History is full of examples. Julius Caesar did not trust his generals, so that: ``If he had anything confidential to say, he wrote it in cipher, that is, by so changing the order of the letters of the alphabet, that not a word could be made out. If anyone wishes to decipher these, and get at their meaning, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others.'' And so the history of cryptography took its first steps. Nowadays, encryption is no longer an emperor's prerogative; it has become an everyday operation. Cryptography is pervasive, ubiquitous and, best of all, completely transparent to the unaware user. Each time we buy something on the Internet we use it. Each time we search something on Google we use it. All without (almost) realizing that it silently protects our privacy and our secrets. Encryption is a very interesting instrument in the "toolbox of security" because it has very few side effects, at least on the user side. A particularly important one is the intrinsic slowdown that its use imposes on communications. High speed cryptography is very important for the Internet, where busy servers proliferate. Being faster is a double advantage: more throughput and less server overhead. In this context, however, public key algorithms start with a big handicap: they perform far worse than their symmetric counterparts. For this reason their use is often reduced to the essential operations, most notably key exchanges and digital signatures. The high speed public key cryptography challenge is a very practical topic with serious repercussions in our technocentric world. Using weak algorithms with a reduced key length to increase the performance of a system can lead to catastrophic results. In 1985, Miller and Koblitz independently proposed to use the group of rational points of an elliptic curve over a finite field to create an asymmetric algorithm. Elliptic Curve Cryptography (ECC) is based on a problem known as the ECDLP (Elliptic Curve Discrete Logarithm Problem) and offers several advantages with respect to other more traditional encryption systems such as RSA and DSA. The main benefit is that it requires smaller keys to provide the same security level since breaking the ECDLP is much harder. In addition, a good ECC implementation can be very efficient both in time and memory consumption, thus being a good candidate for performing high speed public key cryptography. Moreover, some elliptic curve based techniques are known to be extremely resilient to quantum computing attacks, such as the SIDH (Supersingular Isogeny Diffie-Hellman). Traditional elliptic curve cryptography implementations are optimized by hand, taking into account the mathematical properties of the underlying algebraic structures, the target machine architecture and the compiler facilities. This process is time consuming, requires a high degree of expertise and is, ultimately, error prone. This dissertation's ultimate goal is to automate the whole optimization process of cryptographic code, with a special focus on ECC. The framework presented in this thesis is able to produce high speed cryptographic code by automatically choosing the best algorithms and applying a number of code-improving techniques inspired by compiler theory.
Its central component is a flexible and powerful compiler able to translate an algorithm written in a high level language and produce highly optimized C code for a particular algebraic structure and hardware platform. The system is generic enough to accommodate a wide array of number-theoretic algorithms; however, this document focuses only on optimizing primitives based on elliptic curves defined over binary fields.
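
    As an example of the kind of low-level primitive such a framework targets, the sketch below implements multiplication in a binary field GF(2^m) naively in Python, using the pentanomial of the 163-bit NIST binary curves; an optimizing backend of the sort described here would replace such generic code with platform-specific sequences (e.g., carry-less multiply instructions). The concrete field and test values are illustrative choices, not taken from the dissertation.

```python
# Naive GF(2^163) arithmetic: the reference point an optimizing code
# generator would be measured against.
M = 163
R = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1   # x^163 + x^7 + x^6 + x^3 + 1

def gf2m_mul(a: int, b: int) -> int:
    """Carry-less (polynomial) multiplication followed by reduction mod R."""
    acc = 0
    while b:
        if b & 1:
            acc ^= a
        a <<= 1
        b >>= 1
    for i in range(acc.bit_length() - 1, M - 1, -1):   # reduce degree below M
        if (acc >> i) & 1:
            acc ^= R << (i - M)
    return acc

def gf2m_sqr(a: int) -> int:
    return gf2m_mul(a, a)   # squaring is far cheaper in optimized implementations

# Sanity checks: algebraic identities any correct implementation must satisfy.
a, b, c = 0x123456789ABCDEF, 0xFEDCBA987654321, 0x1B2C3D4E5F
assert gf2m_mul(a, 1) == a
assert gf2m_mul(a, b ^ c) == gf2m_mul(a, b) ^ gf2m_mul(a, c)   # distributivity
assert gf2m_sqr(a ^ b) == gf2m_sqr(a) ^ gf2m_sqr(b)            # Frobenius is linear
```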

    Secure identity management in structured peer-to-peer (P2P) networks

    Structured Peer-to-Peer (P2P) networks were proposed to solve routing problems in large distributed infrastructures, but the research community has been questioning their security for years. Most prior work on security services has focused on secure routing, reputation systems, anonymity, etc. However, the proper management of identities is an important prerequisite for most of these security services. The existence of anonymous nodes and the lack of a centralized authority capable of monitoring (and/or punishing) nodes make these systems more vulnerable to selfish or malicious behavior. Moreover, such misbehavior cannot be addressed with data confidentiality, node authentication, non-repudiation, etc. alone. In particular, structured P2P networks should provide the following secure routing primitives: (1) secure maintenance of routing tables, (2) secure routing of messages, and (3) secure identity assignment to nodes. The first two problems depend, in some way, on the third. If node identifiers can be chosen by users without any control, these networks can suffer security and operational problems. Therefore, like any other network or service, structured P2P networks require robust access control to prevent potential attackers from joining the network and a robust identity assignment system to guarantee their proper operation. In this thesis, we first analyze how current structured P2P networks manage identities in order to identify which security problems are related to node identifiers within the overlay, and we propose a series of requirements that any generated node ID should satisfy to make a DHT-based structured P2P network more secure. Secondly, we propose the use of implicit certificates, both to provide more security and to exploit the improvements in bandwidth, storage and performance that these certificates offer compared to explicit certificates, and we design three protocols to assign node identifiers that avoid the identified problems while maintaining user anonymity and allowing user traceability. Finally, we analyze the most widely used mechanisms for distributing revocation data on the Internet, with special focus on systems proposed for P2P networks, and we design a new mechanism to distribute revocation data more efficiently in a structured P2P network.
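
    To illustrate why implicit certificates are attractive for node-ID assignment, the toy sketch below follows an ECQV-style flow in a multiplicative integer group: the TA returns only a small public-key reconstruction value, and any peer can rebuild the node's public key from it, the node ID, and the TA's public key. The group, the hash-based "randomness", and all names are simplifying assumptions; real implicit certificates use elliptic-curve groups.

```python
# Toy ECQV-style implicit certificate issuance (multiplicative group, not EC).
import hashlib

p = 2**127 - 1            # Mersenne prime; toy size only
q = (p - 1) // 2
g = 4                     # element of the order-q quadratic-residue subgroup

def H(*parts) -> int:
    h = hashlib.sha256()
    for part in parts:
        h.update(str(part).encode())
    return int.from_bytes(h.digest(), "big") % q

# TA key pair (the hash calls below are deterministic stand-ins for randomness).
d_ta = H("ta-master-secret")
Q_ta = pow(g, d_ta, p)

# 1. The joining node picks an ephemeral secret and sends R to the TA.
k_node = H("node-ephemeral-secret")
R = pow(g, k_node, p)

# 2. The TA builds the reconstruction value P (the implicit certificate)
#    and its private contribution r, bound to the assigned node ID.
node_id = "node-42"                       # hypothetical identifier
k_ta = H("ta-ephemeral-secret")
P = (R * pow(g, k_ta, p)) % p
e = H(P, node_id)
r = (e * k_ta + d_ta) % q

# 3. The node derives its key pair; any peer can recompute Q_node from
#    (P, node_id, Q_ta) alone -- no explicit certificate is transmitted.
d_node = (e * k_node + r) % q
Q_node = (pow(P, e, p) * Q_ta) % p
assert pow(g, d_node, p) == Q_node
```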

    Cryptographic Schemes based on Elliptic Curve Pairings

    This thesis introduces the concept of certificateless public key cryptography (CL-PKC). Elliptic curve pairings are then used to make concrete CL-PKC schemes and are also used to build other efficient key agreement protocols. CL-PKC can be viewed as a model for the use of public key cryptography that is intermediate between traditional certificate-based PKC and ID-PKC. This is because, in contrast to traditional public key cryptographic systems, CL-PKC does not require the use of certificates to guarantee the authenticity of public keys. It does rely on a trusted authority (TA) in possession of a master key. In this respect, CL-PKC is similar to identity-based public key cryptography (ID-PKC). On the other hand, CL-PKC does not suffer from the key escrow property that is inherent in ID-PKC. Applications for the new infrastructure are discussed. We exemplify how CL-PKC schemes can be constructed by building several certificateless public key encryption schemes and by modifying other existing ID-based schemes. The lack of certificates, and the desire to prove the schemes secure in the presence of an adversary who has access to the master key or can replace public keys, require the careful development of new security models. We prove that some of our schemes are secure, provided that the Bilinear Diffie-Hellman Problem is hard. We then examine Joux's protocol, a one-round, tripartite key agreement protocol that is more bandwidth-efficient than any previous three-party key agreement protocol. However, Joux's protocol is insecure, suffering from a simple man-in-the-middle attack. We show how to make Joux's protocol secure, presenting several tripartite, authenticated key agreement protocols that still require only one round of communication. The security properties of the new protocols are studied. Applications for the protocols are also discussed.
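
    The structural point of CL-PKC (a TA-issued partial private key combined with a user-chosen secret, and a public key that needs no certificate) can be sketched without the pairing machinery the actual schemes require. The toy below only mirrors that key shape in a multiplicative integer group; all parameters and identities are illustrative assumptions, not the construction from the thesis itself.

```python
# Toy sketch of the certificateless key shape: TA partial key + user secret.
import hashlib

p = 2**127 - 1                    # toy prime modulus
q = (p - 1) // 2
g = 4                             # generator of the quadratic-residue subgroup

def h1(identity: str) -> int:
    """Map an identity into the exponent space (stand-in for hashing to a
    curve point in the pairing-based schemes)."""
    return int.from_bytes(hashlib.sha256(identity.encode()).digest(), "big") % q

# TA master key pair.
s = 0x1F2E3D4C5B6A7988            # assumed random master secret (toy value)
ta_public = pow(g, s, p)

# Partial private key for an identity, issued by the TA.
identity = "alice@example.com"
d_partial = (s * h1(identity)) % q

# User-chosen secret: the TA never sees it, so there is no key escrow,
# yet the resulting public key is published without any certificate.
x = 0xA1B2C3D4E5F60718            # assumed random user secret (toy value)
full_private_key = (d_partial, x)
public_key = pow(g, x, p)

print("identity:", identity)
print("uncertificated public key:", hex(public_key))
```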

    Critical Perspectives on Provable Security: Fifteen Years of Another Look Papers

    We give an overview of our critiques of “proofs” of security and a guide to our papers on the subject that have appeared over the past decade and a half. We also provide numerous additional examples and a few updates and errata.

    On the Cryptographic Deniability of the Signal Protocol

    Offline deniability is the ability to a posteriori deny having participated in a particular communication session. This property has been widely assumed for the Signal messaging application, yet no formal proof has appeared in the literature. In this work, we present the first formal study of the offline deniability of the Signal protocol. Our analysis shows that building a deniability proof for Signal is non-trivial and requires strong assumptions on the underlying mathematical groups where the protocol is run. To do so, we study various implicitly authenticated key exchange protocols, including MQV, HMQV, and 3DH/X3DH, the latter being the core key agreement protocol in Signal. We first present examples of mathematical groups where running MQV results in a provably non-deniable interaction. While the concrete attack applies only to MQV, it also exemplifies the problems in attempting to prove the deniability of other implicitly authenticated protocols, such as 3DH. In particular, it shows that the intuition that the minimal transcript produced by these protocols suffices for ensuring deniability does not hold. We then provide a characterization of the groups where deniability holds, defined in terms of a knowledge assumption that extends the Knowledge of Exponent Assumption (KEA). We conclude our research by presenting additional results. First, we prove a general theorem that links the deniability of a communication session to the deniability of the key agreement protocol starting the session. This allows us to extend our results on the deniability of 3DH/X3DH to the entire Signal communication session. We show how our family of Knowledge of Diffie-Hellman (KDH) assumptions can be used to establish a deniability proof for other implicitly authenticated Diffie-Hellman protocols, specifically the OAKE family \cite{Yao13}. By examining the deniability of implicitly authenticated AKE protocols augmented with a confirmation step, we also demonstrate a counterintuitive result: although such a modification requires protocol users to exchange additional information during the session, deniability may be established for these protocols under weaker assumptions (compared to the implicitly authenticated version). Lastly, we discuss our observations on various attack scenarios that undermine offline deniability with the assistance of third-party services, and why these attacks should be placed in a different category than offline deniability.
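
    The "minimal transcript" intuition discussed above is easiest to see from the 3DH core itself: the session secret mixes Diffie-Hellman values involving both identity and ephemeral keys, and nothing signed over the transcript is sent. The sketch below uses X25519 via the pyca/cryptography package and omits the signed-prekey signature and one-time prekeys of full X3DH; the key names and KDF labels are illustrative assumptions.

```python
# Hedged sketch of the 3DH core underlying X3DH: three DH values mixed into
# one session secret, with only public DH values appearing on the wire.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def kdf(material: bytes) -> bytes:
    return HKDF(hashes.SHA256(), 32, salt=None, info=b"3dh-sketch").derive(material)

# Long-term identity keys and the keys used for one session.
ik_a, ik_b = X25519PrivateKey.generate(), X25519PrivateKey.generate()
ek_a = X25519PrivateKey.generate()            # Alice's ephemeral key
spk_b = X25519PrivateKey.generate()           # Bob's (signed) prekey

# Alice's view: she holds the private halves of IK_A and EK_A.
k_alice = kdf(ik_a.exchange(spk_b.public_key())
              + ek_a.exchange(ik_b.public_key())
              + ek_a.exchange(spk_b.public_key()))

# Bob's view: he holds the private halves of IK_B and SPK_B.
k_bob = kdf(spk_b.exchange(ik_a.public_key())
            + ik_b.exchange(ek_a.public_key())
            + spk_b.exchange(ek_a.public_key()))

assert k_alice == k_bob   # both sides derive the same implicitly authenticated key
```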