55 research outputs found

    Mixed radix design flow for security applications

    The purpose of secure devices, such as smartcards, is to protect sensitive information against software and hardware attacks. Implementation of the appropriate protection techniques often implies non-standard methods that are not supported by conventional design tools. Over the past decade the designers of secure devices have been working hard on customising the workflow. The presented research aims at collecting the up-to-date experience in this area and creating a generic approach to the secure design flow that can be used as guidance by engineers. Well-known countermeasures to hardware attacks rely on specific signal encodings. Therefore, multi-valued logic has been considered a primary aspect of secure design. The choice of radix is crucial for multi-valued logic synthesis. Practical examples reveal that it is not always possible to find the optimal radix when taking into account the actual physical parameters of multi-valued operations; in other words, each radix has its advantages and disadvantages. Our proposal is to synthesise logic in different radices, so that a circuit can benefit from their combination.
    With respect to the design opportunities of the existing tools and the possibilities of developing new tools that would fill the gaps in the flow, two distinct design approaches have been formed: conversion driven design and pre-synthesis. The conversion driven design approach takes the outputs of mature and time-proven electronic design automation (EDA) synthesis tools and generates mixed radix datapath circuits in order to investigate their relative advantages and disadvantages. An algorithm underpinning the approach is presented and formally described together with secure gate-level implementations. The obtained results show an increase in power consumption, giving further motivation for the second approach.
    The pre-synthesis approach aims at improving efficiency by using multi-valued logic synthesis techniques to produce an abstract component-level circuit before mapping it into a technology library. Reed-Muller expansions over Galois field arithmetic have been chosen as the theoretical foundation for this approach. In order to enable the combination of radices at the mathematical level, the multi-valued Reed-Muller expansions have been developed into mixed radix Reed-Muller expansions. The goals of the work are to estimate the potential of the new approach and to analyse its impact on circuit parameters down to the level of physical gates. The benchmark results show that the approach extends the search space for optimisation and provides information on how the implemented functions are related to different radices. The theory of two-level radix models and the corresponding computation methods are the primary theoretical contribution. It has been implemented in the RMMixed tool and interfaced to the standard EDA tools to form a complete security-aware design flow.
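    As a point of reference for the expansions mentioned above, a fixed-radix Reed-Muller expansion over a Galois field GF(q) represents every function of q-valued variables as a polynomial in which each variable appears with degree at most q-1 and all arithmetic is carried out in GF(q). The single-variable GF(3) form below is a minimal illustrative sketch only and does not reproduce the mixed radix generalisation developed in the thesis:

    \[ f(x) = c_0 + c_1 x + c_2 x^{2} \pmod{3}, \qquad c_0, c_1, c_2 \in \mathrm{GF}(3), \]

    where the coefficients are determined by the value table of f (for instance, c_0 = f(0)); the mixed radix case combines such expansions taken over fields of different order.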

    Automatic generation of high speed elliptic curve cryptography code

    Apparently, trust is a rare commodity when power, money or life itself is at stake. History is full of examples. Julius Caesar did not trust his generals, so that: "If he had anything confidential to say, he wrote it in cipher, that is, by so changing the order of the letters of the alphabet, that not a word could be made out. If anyone wishes to decipher these, and get at their meaning, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others." And so the history of cryptography took its first steps. Nowadays, encryption is no longer an emperor's prerogative; it has become an everyday operation. Cryptography is pervasive, ubiquitous and, best of all, completely transparent to the unaware user. Each time we buy something on the Internet we use it. Each time we search for something on Google we use it. All without (almost) realizing that it silently protects our privacy and our secrets. Encryption is a very interesting instrument in the "toolbox of security" because it has very few side effects, at least on the user side. A particularly important one is the intrinsic slowdown that its use imposes on communications. High speed cryptography is very important for the Internet, where busy servers proliferate. Being faster is a double advantage: more throughput and less server overhead. In this context, however, public key algorithms start with a big handicap: their performance is poor compared to that of their symmetric counterparts. For this reason their use is often reduced to the essential operations, most notably key exchanges and digital signatures. The high speed public key cryptography challenge is a very practical topic with serious repercussions in our technocentric world. Using weak algorithms with a reduced key length to increase the performance of a system can lead to catastrophic results.
    In 1985, Miller and Koblitz independently proposed to use the group of rational points of an elliptic curve over a finite field to create an asymmetric algorithm. Elliptic Curve Cryptography (ECC) is based on a problem known as the ECDLP (Elliptic Curve Discrete Logarithm Problem) and offers several advantages over more traditional encryption systems such as RSA and DSA. The main benefit is that it requires smaller keys to provide the same security level, since the best known attacks on the ECDLP are far less efficient than those on the problems underlying RSA and DSA. In addition, a good ECC implementation can be very efficient in both time and memory consumption, making it a good candidate for high speed public key cryptography. Moreover, some elliptic curve based techniques, such as SIDH (Supersingular Isogeny Diffie-Hellman), were designed to resist quantum computing attacks.
    Traditional elliptic curve cryptography implementations are optimized by hand, taking into account the mathematical properties of the underlying algebraic structures, the target machine architecture and the compiler facilities. This process is time-consuming, requires a high degree of expertise and is, ultimately, error-prone. This dissertation's ultimate goal is to automate the whole optimization process of cryptographic code, with a special focus on ECC. The framework presented in this thesis is able to produce high speed cryptographic code by automatically choosing the best algorithms and applying a number of code-improving techniques inspired by compiler theory. Its central component is a flexible and powerful compiler able to translate an algorithm written in a high-level language into highly optimized C code for a particular algebraic structure and hardware platform. The system is generic enough to accommodate a wide array of number-theoretic algorithms; however, this document focuses only on optimizing primitives based on elliptic curves defined over binary fields.
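    Purely to illustrate the asymmetry the abstract refers to (computing a scalar multiple k*P of a curve point is cheap, while recovering k from k*P is the ECDLP), the following Python sketch implements double-and-add scalar multiplication on a toy curve over a small prime field; the curve parameters, point and function names are assumptions chosen for demonstration, the field is prime rather than binary for brevity, and none of it is taken from the thesis, its framework or the C code it generates.

        P_MOD = 97                  # toy field prime (real curves use ~256-bit primes)
        A, B = 2, 3                 # assumed coefficients: y^2 = x^3 + 2x + 3 over F_97

        def inv(x, p=P_MOD):
            # modular inverse via Fermat's little theorem (valid because p is prime)
            return pow(x, p - 2, p)

        def point_add(P, Q, p=P_MOD, a=A):
            # add two affine points; None stands for the point at infinity
            if P is None:
                return Q
            if Q is None:
                return P
            (x1, y1), (x2, y2) = P, Q
            if x1 == x2 and (y1 + y2) % p == 0:
                return None                                 # P + (-P) = infinity
            if P == Q:
                lam = (3 * x1 * x1 + a) * inv(2 * y1, p) % p   # tangent slope
            else:
                lam = (y2 - y1) * inv((x2 - x1) % p, p) % p    # chord slope
            x3 = (lam * lam - x1 - x2) % p
            y3 = (lam * (x1 - x3) - y1) % p
            return (x3, y3)

        def scalar_mul(k, P):
            # left-to-right double-and-add: cost grows with log2(k), not with k
            R = None
            for bit in bin(k)[2:]:
                R = point_add(R, R)        # double
                if bit == '1':
                    R = point_add(R, P)    # add
            return R

        G = (3, 6)                  # on the toy curve: 6^2 = 3^3 + 2*3 + 3 (mod 97)
        print(scalar_mul(20, G))    # computing 20*G is easy; recovering 20 is the ECDLP

    A hand-tuned or compiler-generated implementation of the kind the thesis targets would replace this generic loop with field- and platform-specific arithmetic, but the underlying double-and-add structure is the same.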