A statistical study of the activity of the selection function in the E-method algorithm
This work presents a statistical study of the activity associated with the selection function in the polynomial approximation algorithm known as the E-method, proposed by M. Ercegovac in \cite{Erc75:phd,Erc77}. The latitude available when selecting the result digits, in a redundant representation, makes it possible to limit the electrical activity in certain cases. This article presents a preliminary study of the gains that can be expected in this setting.
Optimizing hardware arithmetic operators based on polynomial approximations
This article presents a method for optimizing hardware arithmetic operators dedicated to the evaluation of functions by polynomials. The method, based on recent tools, reduces the size of the coefficients and of the intermediate values while bounding the total error (approximation plus evaluation). It yields small and fast operators while guaranteeing good numerical quality. The method is illustrated on several FPGA examples.
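A minimal sketch of the trade-off such a method manages, under toy assumptions not taken from the paper (a Taylor rather than a minimax polynomial, and an arbitrary 8-fractional-bit fixed-point format): the total error combines the approximation error of the polynomial with the rounding error introduced by shrinking its coefficients.

```python
import math

def quantize(c, frac_bits):
    """Round coefficient c to a fixed-point grid with frac_bits fractional bits."""
    scale = 1 << frac_bits
    return round(c * scale) / scale

def horner(coeffs, x):
    """Evaluate a polynomial given its coefficients from highest to lowest degree."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return acc

# Degree-3 Taylor approximation of sin(x) on [0, 1]: x - x^3/6
# (illustrative; an optimizing tool would pick minimax coefficients instead).
exact_coeffs = [-1.0 / 6.0, 0.0, 1.0, 0.0]             # highest degree first
small_coeffs = [quantize(c, 8) for c in exact_coeffs]  # 8 fractional bits

# Total error = approximation error + coefficient-rounding error.
max_err = max(abs(horner(small_coeffs, i / 100.0) - math.sin(i / 100.0))
              for i in range(101))
print(f"max total error on [0,1]: {max_err:.5f}")
```

With 8 fractional bits the rounding of 1/6 adds only about 1.3e-3 to the roughly 8e-3 Taylor error, so the small format costs little accuracy here; the paper's contribution is automating exactly this kind of budget.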
Security Evaluations Beyond Computing Power: How to Analyze Side-Channel Attacks you Cannot Mount?
Present key sizes for symmetric cryptography are usually required to be at least 80 bits for short-term protection, and 128 bits for long-term protection. However, current tools for security evaluations against side-channel attacks do not provide a precise estimation of the remaining key strength after some leakage has been observed, e.g., in terms of the number of candidates left to test. This leads to an uncomfortable situation, where the security of an implementation can be anywhere between enumerable values (i.e., a number of key candidates small enough to test exhaustively) and the full key size. In this paper, we mitigate this important issue and describe a key rank estimation algorithm that provides tight bounds for the security level of leaking cryptographic devices. As a result, and for the first time, we are able to analyze the full complexity of “standard” (i.e., divide-and-conquer) side-channel attacks, in terms of their trade-off between time, data, and memory complexity.
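A rough sketch of what a key rank estimation computes: bin each subkey's log-probabilities into a histogram, convolve the histograms, then count full-key candidates in bins at least as likely as the correct key's bin. This histogram approach is a simplified relative of rank estimation in general, not necessarily this paper's algorithm, and the two-subkey example and all numbers below are invented for illustration.

```python
from collections import Counter
import itertools
import math

def rank_exact(subkey_probs, correct):
    """Brute force over all full keys: count those at least as likely as the
    correct key (only feasible for tiny examples)."""
    pc = math.prod(p[i] for p, i in zip(subkey_probs, correct))
    return sum(math.prod(p[i] for p, i in zip(subkey_probs, combo)) >= pc
               for combo in itertools.product(*[range(len(p)) for p in subkey_probs]))

def rank_histogram(subkey_probs, correct, nbins=200):
    """Histogram-style estimate: bin per-subkey log-probabilities on a common
    grid, convolve the histograms (bin indices add under multiplication of
    probabilities), then count candidates in bins >= the correct key's bin."""
    logs = [[math.log(x) for x in p] for p in subkey_probs]
    width = (max(max(l) for l in logs) - min(min(l) for l in logs)) / nbins
    hists = [Counter(int(v // width) for v in l) for l in logs]
    total = hists[0]
    for h in hists[1:]:
        nxt = Counter()
        for b1, c1 in total.items():
            for b2, c2 in h.items():
                nxt[b1 + b2] += c1 * c2
        total = nxt
    correct_bin = sum(int(l[i] // width) for l, i in zip(logs, correct))
    return sum(c for b, c in total.items() if b >= correct_bin)

# Two toy 4-candidate subkey distributions produced by an attack:
probs = [[0.5, 0.3, 0.15, 0.05], [0.4, 0.35, 0.2, 0.05]]
correct = (1, 0)  # correct subkeys ranked 2nd and 1st by the attack
print(rank_exact(probs, correct), rank_histogram(probs, correct))
```

The point of the histogram variant is that its cost grows with the number of bins rather than with the number of full-key candidates, which is what makes ranks far beyond enumeration reachable.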
Improving the Rules of the DPA Contest
A DPA contest was launched at CHES 2008. The goal of this initiative is to make it possible for researchers to compare different side-channel attacks in an objective manner. For this purpose, a set of 80,000 traces corresponding to the encryption of 80,000 different plaintexts with the Data Encryption Standard and a fixed key has been made available. In this short note, we discuss the rules that the contest uses to rate the effectiveness of different distinguishers. We first describe practical examples of attacks in which these rules can be misleading. Then, we suggest an improved set of rules that can be implemented easily in order to obtain a better interpretation of the comparisons performed.
Efficient Entropy Estimation for Mutual Information Analysis Using B-Splines
The Correlation Power Analysis (CPA) is probably the most widely used side-channel attack, because it seems to fit the power model of most standard CMOS devices and can be computed very efficiently. However, the Pearson correlation coefficient used in CPA measures only linear statistical dependences, whereas the Mutual Information (MI) takes both linear and nonlinear dependences into account. Even though a large correlation coefficient and weak dependences as quantified by the MI can occur simultaneously, we can expect a more profound understanding of the interactions from a Mutual Information Analysis (MIA). We study methods that improve the non-parametric estimation of the Probability Density Functions (PDFs) used to compute the entropies, in particular the use of B-spline basis functions as PDF estimators. Our results indicate a twofold improvement in the number of required samples compared to a classic MI estimation. The B-spline smoothing technique can also be applied to the recently introduced Cramér-von-Mises test.
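A minimal sketch of the B-spline idea, assuming order-2 (linear) B-splines and data pre-scaled to [0, 1]: each sample spreads fractional mass over its two nearest bins instead of one hard bin, smoothing the histogram-based PDF estimates that feed the entropy terms of the MI. The bin count, sample sizes, and toy data are illustrative, not taken from the paper.

```python
import math
import random

def soft_bin_weights(x, nbins):
    """Order-2 (linear) B-spline weights on [0, 1]: each sample spreads its
    mass over the two nearest bins instead of a single hard bin."""
    pos = x * (nbins - 1)
    i = min(int(pos), nbins - 2)
    frac = pos - i
    w = [0.0] * nbins
    w[i] = 1.0 - frac
    w[i + 1] = frac
    return w

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mi_bspline(xs, ys, nbins=8):
    """MI estimate H(X) + H(Y) - H(X,Y) from soft-binned marginal and joint
    histograms; xs and ys must already be scaled to [0, 1]."""
    n = len(xs)
    px, py = [0.0] * nbins, [0.0] * nbins
    pxy = [[0.0] * nbins for _ in range(nbins)]
    for x, y in zip(xs, ys):
        wx, wy = soft_bin_weights(x, nbins), soft_bin_weights(y, nbins)
        for i in range(nbins):
            px[i] += wx[i] / n
            py[i] += wy[i] / n
            for j in range(nbins):
                pxy[i][j] += wx[i] * wy[j] / n
    return entropy(px) + entropy(py) - entropy([q for row in pxy for q in row])

random.seed(1)
xs = [random.random() for _ in range(2000)]
ys = [min(1.0, max(0.0, x + random.gauss(0, 0.05))) for x in xs]  # dependent
zs = [random.random() for _ in range(2000)]                       # independent
print(f"MI(x, y) = {mi_bspline(xs, ys):.3f} bits")
print(f"MI(x, z) = {mi_bspline(xs, zs):.3f} bits")
```

The dependent pair yields a clearly positive MI while the independent pair stays near zero; the soft binning is what reduces the sample count needed for the estimate to stabilize.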
Cryptanalysis of the CHES 2009/2010 Random Delay Countermeasure
Inserting random delays in cryptographic implementations is often used as a countermeasure against side-channel attacks. Most previous works on the topic focus on improving the statistical distribution of these delays. For example, efficient random delay generation algorithms have been proposed at CHES 2009/2010. These solutions increase security against attacks that deal with the lack of synchronization between different leakage traces by integrating them. In this paper, we demonstrate that integration may not be the best tool to evaluate random delay insertions. For this purpose, we first describe different attacks exploiting pattern recognition techniques and Hidden Markov Models. Using these tools, we succeed in cryptanalyzing a (straightforward) implementation of the CHES 2009/2010 proposal on an Atmel microcontroller, with the same data complexity as for an unprotected implementation of the AES Rijndael. In other words, we completely cancel the countermeasure in this case. Next, we show that our cryptanalysis tools remain remarkably effective against improved variants of the countermeasure, e.g., with additional noise or irregular dummy operations. We also show that the attacks remain applicable in a non-profiled adversarial scenario. Overall, these results suggest that the use of random delays may not be effective for protecting small embedded devices against side-channel leakage. They also confirm the need for worst-case analysis in physical security evaluations.
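The Hidden Markov Model part of such an attack boils down to Viterbi decoding: given a noisy sequence of leakage symbols, recover the most likely sequence of hidden states. The sketch below uses a hypothetical two-state model ('real' operation versus 'dummy' delay) with made-up transition and emission probabilities; the paper's actual models are far richer.

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Standard Viterbi in log space: most likely hidden-state path given obs."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # best predecessor for state s at this time step
            best_prev = max(states, key=lambda r: V[-2][r] + math.log(trans_p[r][s]))
            V[-1][s] = (V[-2][best_prev] + math.log(trans_p[best_prev][s])
                        + math.log(emit_p[s][o]))
            new_path[s] = path[best_prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

states = ("real", "dummy")
start_p = {"real": 0.5, "dummy": 0.5}
trans_p = {"real": {"real": 0.7, "dummy": 0.3},
           "dummy": {"real": 0.6, "dummy": 0.4}}
# 'hi'/'lo' power symbols; dummies leak mostly 'lo' (purely illustrative numbers)
emit_p = {"real": {"hi": 0.8, "lo": 0.2}, "dummy": {"hi": 0.25, "lo": 0.75}}

obs = ["hi", "hi", "lo", "lo", "hi"]
print(viterbi(obs, states, start_p, trans_p, emit_p))
```

Once the dummy positions are decoded this way, the traces can be realigned exactly, which is why the countermeasure's effect on data complexity vanishes.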
Back to Massey: Impressively fast, scalable and tight security evaluation tools
None of the existing rank estimation algorithms can scale to large cryptographic keys, such as 4096-bit (512-byte) RSA keys. In this paper, we present the first solution for estimating the guessing entropy of arbitrarily large keys, based on mathematical bounds, resulting in the fastest and most scalable security evaluation tool to date. Our bounds can be computed within a fraction of a second, with no memory overhead, and leave a margin of only a few bits for a full 128-bit AES key.
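The classical starting point hinted at by the title is Massey's 1994 inequality, which lower-bounds the guessing entropy G(X) by 2^(H(X)-2) + 1 whenever the Shannon entropy H(X) is at least 2 bits. Since entropies of independent subkeys add, such a bound needs only one linear pass over the per-subkey distributions, with no enumeration. A toy sketch (the distributions are invented, and the paper's actual bounds are tighter than this plain Massey bound):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def massey_lower_bound(subkey_dists):
    """Massey (1994): if H(X) >= 2 bits, the expected number of guesses to
    find X (trying candidates in decreasing-probability order) satisfies
    G(X) >= 2**(H(X) - 2) + 1. For independent subkeys the entropies add,
    so the bound costs time linear in the key size."""
    h = sum(shannon_entropy(d) for d in subkey_dists)
    return 2 ** (h - 2) + 1

# Two illustrative 4-candidate subkey distributions after an attack:
dists = [[0.7, 0.2, 0.07, 0.03], [0.6, 0.25, 0.1, 0.05]]
print(f"total entropy: {sum(shannon_entropy(d) for d in dists):.2f} bits")
print(f"guessing-entropy lower bound: {massey_lower_bound(dists):.2f} guesses")
```

Because only per-subkey entropies are needed, the same computation works unchanged for a 512-byte RSA key: 512 small sums instead of a search over 2^4096 candidates.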
Poly-Logarithmic Side Channel Rank Estimation via Exponential Sampling
Rank estimation is an important tool for side-channel evaluation laboratories. It allows estimating the remaining security after an attack has been performed, quantified as the time complexity and memory consumption required to brute-force the key given the leakages as probability distributions over subkeys (usually key bytes). These estimations are particularly useful when the correct key is ranked too deep to be reached by exhaustive search.
We propose ESrank, the first rank estimation algorithm that enjoys provable poly-logarithmic time and space complexity and also achieves excellent practical performance. Our main idea is to use exponential sampling to drastically reduce the algorithm's complexity. Importantly, ESrank is simple to build from scratch and requires no algorithmic tools beyond a sorting function. After rigorously bounding the accuracy, time, and space complexities, we evaluated the performance of ESrank on a real SCA data corpus and compared it to the currently best histogram-based algorithm. We show that ESrank gives excellent rank estimation (with roughly a 1-bit margin between lower and upper bounds), with performance on par with the Histogram algorithm: a run-time of under 1 second on a standard laptop using 6.5 MB of RAM.
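The core primitive, exponential sampling, can be sketched as follows: a descending list of subkey probabilities is downsampled at geometrically spaced indices, each kept entry recording how many original entries it stands for, so a list of size n shrinks to O(log n) while cumulative counts (and hence rank bounds) remain computable. This is only the sampling step, not the full ESrank algorithm, which interleaves it with pairwise merges of subkey distributions; the parameters below are illustrative.

```python
import math

def exp_sample(probs, gamma=1.25):
    """Downsample a descending probability list at geometrically spaced
    indices. Each kept entry is the largest probability of the block it
    replaces, paired with the block's size, so cumulative counts of entries
    at least as likely as a kept value are preserved."""
    kept, i, n = [], 0, len(probs)
    while i < n:
        j = max(i + 1, math.ceil((i + 1) * gamma))  # next index to keep
        kept.append((probs[i], min(j, n) - i))      # (representative, count)
        i = j
    return kept

# 10,000 candidate probabilities, descending (a toy Zipf-like distribution).
probs = [1.0 / (k + 1) for k in range(10000)]
sampled = exp_sample(probs)
print(len(probs), "->", len(sampled))  # shrinks to a few dozen entries
```

Each downsampling multiplies the rank uncertainty by at most roughly gamma, which is how repeated merge-and-sample steps keep both the error margin and the memory footprint small.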
Intervention in the old areas of towns and cities