    Discrete Logarithm Cryptography

    The security of many cryptographic schemes relies on the intractability of the discrete logarithm problem (DLP) in groups. The groups most commonly used to deploy such schemes are the multiplicative (sub)groups of finite fields and (hyper)elliptic curve groups over finite fields. The elements of these groups can be represented compactly in a computer, and the group arithmetic can be implemented efficiently. In this thesis we first study certain subgroups of characteristic-two and characteristic-three finite field groups, with the goal of obtaining more efficient representations of elements and more efficient arithmetic in the corresponding groups. In particular, we propose new compression techniques and exponentiation algorithms, and discuss some potential benefits and applications. Although the intractability of the DLP is a basis for building cryptographic protocols, one must also take into consideration how a system is implemented. It has been shown that realistic (validation) attacks can be mounted against elliptic curve cryptosystems when group membership testing is omitted. In the second part of the thesis, we extend the notion of validation attacks from elliptic curves to hyperelliptic curves, and show that singular curves can be used effectively in such attacks. Finally, we tackle a specific location-privacy problem called the nearby friend problem. We formalize the security model and then propose a new protocol, together with extensions, that solves the problem in the proposed security model. An interesting feature of the protocol is that it does not depend on any cryptographic primitive and its security is based primarily on the intractability of the DLP. Our solution provides a new approach to the nearby friend problem and compares favorably with earlier solutions.
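    As a minimal illustration of the group-membership testing whose omission enables such validation attacks, the following Python sketch validates a received point against an elliptic curve over GF(p). The curve parameters and helper names are illustrative assumptions, not taken from the thesis.

        # Hypothetical short-Weierstrass curve y^2 = x^3 + a*x + b over GF(p).
        # Toy parameters for illustration only.
        p = 4294967291            # a 32-bit prime, 2**32 - 5 (assumption)
        a, b = 2, 3               # illustrative coefficients; 4a^3 + 27b^2 != 0 mod p

        def is_on_curve(point):
            """Check the curve equation; rejects points fed in from other curves."""
            if point is None:                  # the point at infinity
                return True
            x, y = point
            return (y * y - (x * x * x + a * x + b)) % p == 0

        def validate_point(point, n, scalar_mul):
            """Full validation of a point claimed to lie in the order-n subgroup.
            scalar_mul(k, P) may be any correct point-multiplication routine."""
            if point is None:
                return False                   # reject the identity element
            x, y = point
            if not (0 <= x < p and 0 <= y < p):
                return False                   # coordinates out of range
            if not is_on_curve(point):
                return False                   # e.g. a point on a singular curve or a twist
            return scalar_mul(n, point) is None    # n*P must be the identity

    Omitting the final subgroup check is precisely what small-subgroup and invalid-curve attacks exploit; the thesis extends this attack class from elliptic to hyperelliptic curves.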

    Key establishment --- security models, protocols and usage

    Key establishment is the process whereby two or more parties derive a shared secret, typically for use in subsequent confidential communication. However, identifying the exact security requirements for key establishment protocols is a non-trivial task. This thesis compares, extends and merges existing security definitions and models for key establishment protocols. The primary focus is on two-party key agreement schemes in the public-key setting. On the one hand, new protocols are proposed and analyzed in the existing Canetti-Krawczyk model. On the other hand, the thesis develops a security model and a novel definition that capture the essential security attributes of the standardized Unified Model key agreement protocol. These analyses lead to the development of a new security model and related definitions that combine and extend the Canetti-Krawczyk pre- and post-specified peer models in terms of the security assurances they provide. The thesis also provides a complete analysis of a one-pass key establishment scheme. There are security goals that no one-pass key establishment scheme can achieve, and hence the two-pass security models and definitions must be adapted for one-pass protocols. The analysis provided here includes a description of the required modifications to the underlying security model. A complete security argument meeting these altered conditions is then presented as evidence supporting the security of the one-pass scheme. Lastly, omitting validation steps and reusing short-lived key pairs are practices motivated by efficiency, a major objective in practice. The thesis considers the formal implications of both. The conclusions reached support the generally accepted cryptographic conventions that incoming messages should not be blindly trusted and that extra care should be taken when key pairs are reused.
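    A minimal Python sketch of the convention the thesis supports: a Diffie-Hellman style two-party key agreement that validates the peer's public key before use. The toy group parameters below are illustrative assumptions and far too small to be secure.

        import hashlib
        import secrets

        # Toy subgroup of GF(p)*: g generates the prime-order-q subgroup (assumption).
        p, q, g = 23, 11, 4

        def keygen():
            x = secrets.randbelow(q - 1) + 1      # private key in [1, q-1]
            return x, pow(g, x, p)                # (private, public)

        def validate(pub):
            """Reject keys outside the order-q subgroup (small-subgroup defence)."""
            return 1 < pub < p and pow(pub, q, p) == 1

        def shared_key(priv, peer_pub):
            if not validate(peer_pub):            # do not blindly trust incoming messages
                raise ValueError("invalid peer public key")
            z = pow(peer_pub, priv, p)            # shared secret g^(xy) mod p
            return hashlib.sha256(str(z).encode()).digest()   # derived session key

        # Both parties derive the same session key.
        xa, A = keygen()
        xb, B = keygen()
        assert shared_key(xa, B) == shared_key(xb, A)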

    Decryption Failure Attacks on Post-Quantum Cryptography

    This dissertation presents new cryptanalytic results on securely implementing the next generation of asymmetric, or public-key, cryptography (PKC). PKC, as deployed to date, depends heavily on the integer factorization and discrete logarithm problems. Unfortunately, it has been well known since the mid-90s that both problems can be solved in polynomial time on a quantum computer using Peter Shor's algorithm. The recently accelerated pace of R&D towards quantum computers, eventually of sufficient size and power to threaten cryptography, has led the crypto research community to a major shift of focus: the US standardization organization NIST launched a project to standardize Post-Quantum Cryptography (PQC). PQC is the name given to algorithms that run on classical hardware and software while resisting attacks by quantum computers, making it well suited to replace the current asymmetric schemes. A primary motivation for the project is to guide publicly available research toward the singular goal of finding weaknesses in the proposed next generation of PKC. For public-key encryption (PKE) or digital signature (DS) schemes to be considered secure, they must be shown to rely on well-known mathematical problems, with theoretical proofs of security under established models such as indistinguishability under chosen-ciphertext attack (IND-CCA). They must also withstand serious attack attempts by well-renowned cryptographers, both on the theoretical security and on the actual software/hardware instantiations. It is well known that security models such as IND-CCA are not designed to capture the intricacies of inner-state leakage. Such leakages are called side-channels, currently a major topic of interest in the NIST PQC project. This dissertation focuses on two questions: 1) how does the low but non-zero probability of decryption failures affect the cryptanalysis of the new PQC candidates? And 2) how might side-channel vulnerabilities inadvertently be introduced when going from theory to practical software/hardware implementations? Of main concern are PQC algorithms based on lattice theory and coding theory. The primary contributions are the discovery of novel decryption-failure side-channel attacks, improvements on existing attacks, an alternative implementation of part of a PQC scheme, and further theoretical cryptanalytic results.
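    To see where decryption failures come from, consider the following toy LWE-style encryption of a single bit in Python. The parameters are illustrative assumptions, not those of any NIST candidate: decoding fails exactly when the accumulated noise drifts past roughly q/4, an event that is rare but has non-zero probability, which is what decryption failure attacks try to detect and amplify.

        import random

        # Toy LWE-style bit encryption (illustrative parameters, not a real scheme).
        q, n = 97, 16                  # modulus and dimension

        def small_noise():
            # Sum of a few small errors: usually tiny, occasionally large.
            return sum(random.randint(-4, 4) for _ in range(8))

        def keygen():
            return [random.randrange(q) for _ in range(n)]

        def encrypt(s, bit):
            a = [random.randrange(q) for _ in range(n)]
            b = (sum(x * y for x, y in zip(a, s)) + small_noise() + bit * (q // 2)) % q
            return a, b

        def decrypt(s, ct):
            a, b = ct
            m = (b - sum(x * y for x, y in zip(a, s))) % q
            return 1 if q // 4 <= m < 3 * q // 4 else 0   # nearest-multiple decoding

        # Empirically estimate the (small but non-zero) failure rate.
        s = keygen()
        fails = sum(decrypt(s, encrypt(s, bit)) != bit
                    for _ in range(10000) for bit in (0, 1))
        print(f"decryption failures: {fails} / 20000")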

    Cryptographic Schemes based on Elliptic Curve Pairings

    This thesis introduces the concept of certificateless public key cryptography (CL-PKC). Elliptic curve pairings are then used to construct concrete CL-PKC schemes, as well as other efficient key agreement protocols. CL-PKC can be viewed as a model for the use of public key cryptography that is intermediate between traditional certificate-based PKC and identity-based PKC (ID-PKC): in contrast to traditional public key systems, CL-PKC does not require certificates to guarantee the authenticity of public keys. It does rely on a trusted authority (TA) in possession of a master key, and in this respect CL-PKC is similar to ID-PKC. On the other hand, CL-PKC does not suffer from the key escrow property inherent in ID-PKC. Applications for the new infrastructure are discussed. We illustrate how CL-PKC schemes can be built by constructing several certificateless public key encryption schemes and by modifying other existing identity-based schemes. The lack of certificates, and the desire to prove the schemes secure against an adversary who has access to the master key or can replace public keys, require the careful development of new security models. We prove that some of our schemes are secure, provided that the Bilinear Diffie-Hellman problem is hard. We then examine Joux's protocol, a one-round tripartite key agreement protocol that is more bandwidth-efficient than any previous three-party key agreement protocol. However, Joux's protocol is insecure: it suffers from a simple man-in-the-middle attack. We show how to make Joux's protocol secure, presenting several tripartite, authenticated key agreement protocols that still require only one round of communication. The security properties of the new protocols are studied, and applications for the protocols are discussed.
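    The following Python sketch shows the shape of Joux's one-round protocol. Real instantiations use a bilinear pairing on an elliptic curve; here, purely for illustration, a symmetric pairing is faked on a tiny multiplicative group by brute-forcing discrete logs, which is feasible (and insecure) only because the group is toy-sized.

        import hashlib
        import secrets

        # Toy group: g generates the order-q subgroup of GF(p)* (illustrative only).
        p, q, g = 1019, 509, 4

        def dlog(h):
            """Brute-force discrete log base g; possible only in this tiny group."""
            x, acc = 0, 1
            while acc != h:
                acc = acc * g % p
                x += 1
            return x

        def e(u, v):
            """Fake symmetric bilinear map: e(g^a, g^b) = g^(a*b)."""
            return pow(g, dlog(u) * dlog(v) % q, p)

        # One round: each party broadcasts a single group element.
        a, b, c = (secrets.randbelow(q - 1) + 1 for _ in range(3))
        A, B, C = pow(g, a, p), pow(g, b, p), pow(g, c, p)

        # Each party pairs the other two broadcasts and raises the result to its
        # own secret, so all three arrive at e(g, g)^(abc).
        k_a = pow(e(B, C), a, p)
        k_b = pow(e(A, C), b, p)
        k_c = pow(e(A, B), c, p)
        assert k_a == k_b == k_c
        session_key = hashlib.sha256(str(k_a).encode()).digest()

    Nothing in this exchange authenticates the broadcast values, which is precisely the man-in-the-middle weakness that the thesis's authenticated variants repair.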

    Digital Light

    Light symbolises the highest good; it enables all visual art, and today it lies at the heart of billion-dollar industries. The control of light forms the foundation of contemporary vision. Digital Light brings together artists, curators, technologists and media archaeologists to study the historical evolution of digital light-based technologies. It provides a critical account of the capacities and limitations of contemporary digital light-based technologies and techniques by tracing their genealogies and comparing them with their predecessor media. As digital light remediates multiple historical forms (photography, print, film, video, projection, paint), the collection draws on all of these histories, connecting them to the digital present and placing them in dialogue with one another. Light is at once universal and deeply historical. The invention of mechanical media (including photography and cinematography), allied with changing print technologies (half-tone, lithography), helped structure the emerging electronic media of television and video, which in turn shaped the bitmap processing and raster display of digital visual media. Digital light is, as Stephen Jones points out in his contribution, an oxymoron: light is photons, particulate and discrete, and therefore always digital. But photons are also waveforms, subject to manipulation in myriad ways. From Fourier transforms to chip design, colour management to the translation of vector graphics into arithmetic displays, light is constantly disciplined to human purposes. In the form of fibre optics, light is now the infrastructure of all our media; in urban plazas and handheld devices, screens have become ubiquitous and standardised. This collection addresses how this occurred, what it means, and how artists, curators and engineers confront and challenge the constraints of increasingly normalised digital visual media. While various art pieces and other content are considered throughout the collection, the focus is specifically on what such pieces suggest about the intersection of technique and technology. Including accounts by prominent artists and professionals, the collection emphasises the centrality of use and experimentation in the shaping of technological platforms. Indeed, a recurring theme is how techniques of previous media become technologies, inscribed in both digital software and hardware. Contributions include considerations of image-oriented software and file formats; screen technologies; projection and urban screen surfaces; histories of computer graphics, 2D and 3D image editing software, photography and cinematic art; and transformations of light-based art resulting from the distributed architectures of the internet and the logic of the database. Digital Light brings together high-profile figures in diverse but increasingly convergent fields, from Academy Award winner and Pixar co-founder Alvy Ray Smith to feminist philosopher Cathryn Vasseleu.

    Coherent Light from Projection to Fibre Optics

    Fabricate

    Bringing together pioneers in design and making within architecture, construction, engineering, manufacturing, materials technology and computation, Fabricate is a triennial international conference, now in its third edition (ICD, University of Stuttgart, April 2017). Each edition produces a supporting publication, to date the only one of its kind specialising in digital fabrication. The 2017 edition features 32 illustrated articles on built projects and works in progress from academia and practice, including contributions from leading practices such as Foster + Partners, Zaha Hadid Architects, Arup and Ron Arad, and from world-renowned institutions including ICD Stuttgart, Harvard, Yale, MIT, Princeton University, The Bartlett School of Architecture (UCL) and the Architectural Association.