13 research outputs found

    Soft Linear Logic and Polynomial Complexity Classes

    We describe some results inspired by Lafont's Soft Linear Logic (SLL), a subsystem of second-order linear logic with restricted rules for exponentials that is correct and complete for polynomial time computations. SLL is the basis for the design of type assignment systems for the lambda-calculus characterizing the complexity classes PTIME, PSPACE and NPTIME. PTIME is characterized by a type assignment system whose types are a proper subset of SLL formulae. The characterization consists in the fact that a well-typed term can be reduced to normal form in a number of beta-reductions polynomial in its length, and moreover all polynomial time functions can be computed by well-typed terms. PSPACE is characterized by a type assignment system obtained from the previous one by extending the set of types with a type for booleans, and the lambda-calculus with two boolean constants and a conditional constructor. The system assigns types to terms in such a way that the evaluation of programs (closed terms of boolean type) can be performed in polynomial space. Moreover, all polynomial space decision problems can be computed by terms typable in this system. In order to characterize NPTIME, we extend the lambda-calculus with a nondeterministic choice operator, and the system with a rule for dealing with this new term constructor.
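    As a rough illustration of the calculi described above (not the authors' formal definitions), the following OCaml sketch gives a term grammar for the plain lambda-calculus extended with the boolean constants, the conditional constructor and the nondeterministic choice operator mentioned in the abstract; all constructor names are illustrative.

```ocaml
(* A minimal sketch, assuming a standard presentation: plain lambda-calculus
   terms extended with the constructs named in the abstract. Names are
   illustrative, not the authors' notation. *)
type term =
  | Var of string                (* variables *)
  | Lam of string * term         (* lambda abstraction *)
  | App of term * term           (* application *)
  | True | False                 (* boolean constants (PSPACE extension) *)
  | If of term * term * term     (* conditional constructor (PSPACE extension) *)
  | Choice of term * term        (* nondeterministic choice (NPTIME extension) *)

(* Example: a closed program of boolean type using the conditional. *)
let example = App (Lam ("b", If (Var "b", False, True)), True)
```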

    Control structures in programs and computational complexity

    This thesis is concerned with analysing the impact of nesting (restricted) control structures in programs, such as primitive recursion or loop statements, on the running time or computational complexity. The method obtained gives insight into why some nestings of control structures cause a blow-up in computational complexity, while others do not. The method is demonstrated for three types of programming languages.
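    As a hedged illustration of the phenomenon the thesis studies (not its actual method or programming languages), the OCaml sketch below shows the classic way nested loop statements blow up running time: a single bounded loop can compute an exponentially large value, and feeding that value in as the bound of another loop forces exponentially many steps.

```ocaml
(* Illustrative sketch only: a LOOP-style combinator where the blow-up comes
   from using the result of one loop as the iteration bound of another. *)

(* loop n body x : run body n times starting from x. *)
let loop n body x =
  let rec go i acc = if i <= 0 then acc else go (i - 1) (body acc) in
  go n x

(* A single loop of n steps already computes an exponentially large value. *)
let exp2 n = loop n (fun v -> 2 * v) 1          (* = 2^n after n steps *)

(* Nesting: using that value as the bound of an inner loop takes 2^n steps,
   so the running time of the composed program is exponential in n. *)
let slow n = loop (exp2 n) (fun v -> v + 1) 0   (* 2^n iterations *)
```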

    The Bang Calculus Revisited

    Call-by-Push-Value (CBPV) is a programming paradigm subsuming both Call-by-Name (CBN) and Call-by-Value (CBV) semantics. The paradigm was recently modelled by means of the Bang Calculus, a term language connecting CBPV and Linear Logic. This paper presents a revisited version of the Bang Calculus, called λ!, enjoying some important properties missing in the original system. Indeed, the new calculus integrates commutative conversions to unblock value redexes while being confluent at the same time. A second contribution is related to non-idempotent types. We provide a quantitative type system for our λ!-calculus, and we show that the length of the (weak) reduction of a typed term to its normal form plus the size of this normal form is bounded by the size of its type derivation. We also explore the properties of this type system with respect to CBN/CBV translations. We keep the original CBN translation from the λ-calculus to the Bang Calculus, which preserves normal forms and is sound and complete with respect to the (quantitative) type system for CBN. However, in the case of CBV, we reformulate both the translation and the type system to restore two main properties: preservation of normal forms and completeness. Last but not least, the quantitative system is refined to a tight one, which transforms the previous upper bound on the length of reduction to normal form plus its size into two independent exact measures for them.
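    For readers unfamiliar with the term language, the OCaml sketch below gives a Bang-Calculus-style grammar following the general literature (variables, abstraction, application, boxes and dereliction); the exact syntax of the paper's λ! calculus is an assumption here, not taken from the abstract.

```ocaml
(* A minimal Bang-Calculus-style term grammar, assumed from the general
   literature rather than the paper's exact definition. *)
type term =
  | Var of string            (* variables *)
  | Abs of string * term     (* lambda abstraction *)
  | App of term * term       (* application *)
  | Bang of term             (* !t : a boxed (duplicable/discardable) term *)
  | Der of term              (* der t : unboxing (dereliction) *)

(* In the CBN translation, arguments are boxed so they may be duplicated or
   discarded; e.g. the identity applied to a boxed argument. *)
let example = App (Abs ("x", Var "x"), Bang (Var "y"))
```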

    Automatic generation of high speed elliptic curve cryptography code

    Apparently, trust is a rare commodity when power, money or life itself are at stake. History is full of examples. Julius Caesar did not trust his generals, so that: "If he had anything confidential to say, he wrote it in cipher, that is, by so changing the order of the letters of the alphabet, that not a word could be made out. If anyone wishes to decipher these, and get at their meaning, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others." And so the history of cryptography took its first steps. Nowadays, encryption is no longer an emperor's prerogative and has become a daily-life operation. Cryptography is pervasive, ubiquitous and, best of all, completely transparent to the unaware user. Each time we buy something on the Internet we use it. Each time we search for something on Google we use it. All without (almost) realizing that it silently protects our privacy and our secrets. Encryption is a very interesting instrument in the "toolbox of security" because it has very few side effects, at least on the user side. A particularly important one is the intrinsic slowdown that its use imposes on communications. High speed cryptography is very important for the Internet, where busy servers proliferate. Being faster is a double advantage: more throughput and less server overhead. In this context, however, public key algorithms start with a big handicap: their performance is very poor compared to their symmetric counterparts. For this reason their use is often reduced to the essential operations, most notably key exchanges and digital signatures. The high speed public key cryptography challenge is a very practical topic with serious repercussions in our technocentric world. Using weak algorithms with a reduced key length to increase the performance of a system can lead to catastrophic results. In 1985, Miller and Koblitz independently proposed to use the group of rational points of an elliptic curve over a finite field to create an asymmetric algorithm. Elliptic Curve Cryptography (ECC) is based on a problem known as the ECDLP (Elliptic Curve Discrete Logarithm Problem) and offers several advantages with respect to more traditional encryption systems such as RSA and DSA. The main benefit is that it requires smaller keys to provide the same security level, since breaking the ECDLP is much harder. In addition, a good ECC implementation can be very efficient both in time and in memory consumption, thus being a good candidate for performing high speed public key cryptography. Moreover, some elliptic curve based techniques are known to be extremely resilient to quantum computing attacks, such as SIDH (Supersingular Isogeny Diffie-Hellman). Traditional elliptic curve cryptography implementations are optimized by hand, taking into account the mathematical properties of the underlying algebraic structures, the target machine architecture and the compiler facilities. This process is time consuming, requires a high degree of expertise and is, ultimately, error prone. This dissertation's ultimate goal is to automate the whole optimization process of cryptographic code, with a special focus on ECC. The framework presented in this thesis is able to produce high speed cryptographic code by automatically choosing the best algorithms and applying a number of code-improving techniques inspired by compiler theory.
Its central component is a flexible and powerful compiler able to translate an algorithm written in a high level language and produce highly optimized C code for a particular algebraic structure and hardware platform. The system is generic enough to accommodate a wide array of number-theory-related algorithms; however, this document focuses only on optimizing primitives based on elliptic curves defined over binary fields.
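To give a concrete (hedged) flavour of the kind of primitive such a framework targets, the OCaml sketch below is a naive reference implementation of carry-less polynomial multiplication over GF(2), the core operation of binary-field arithmetic; it is purely illustrative and bears no relation to the code the dissertation's compiler actually generates.

```ocaml
(* Naive reference sketch of binary-field arithmetic (GF(2)[x]):
   multiplication is carry-less, i.e. XOR replaces addition. Word-sized
   operands only; real ECC code works on multi-word field elements, reduces
   modulo an irreducible polynomial, and is heavily optimized. *)

(* Carry-less multiplication of two GF(2) polynomials packed into ints. *)
let clmul a b =
  let rec go acc i =
    if i >= 63 then acc
    else
      let acc = if (b lsr i) land 1 = 1 then acc lxor (a lsl i) else acc in
      go acc (i + 1)
  in
  go 0 0

(* Example: (x^2 + x) * (x + 1) = x^3 + x, i.e. 0b110 * 0b011 = 0b1010. *)
let () = assert (clmul 0b110 0b011 = 0b1010)
```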