90 research outputs found

    Jesuit Accounts of Chinese History and Chronology and their Chinese Sources

    When Jesuit missionaries went to China in the seventeenth century, they discovered that Chinese history was in many respects apparently longer than the history presented by the Bible. Subsequently, they began to translate Chinese histories, which they sent back to Europe and which were adopted in the eighteenth century by Enlightenment thinkers for their own purposes. The European side of this story is quite well known, but what about the Chinese side? What sources did the Jesuits use, and how did these sources interpret ancient history? As part of a larger project, these questions about the Chinese sources are answered from an intercultural perspective. The missionaries used not only classical Chinese histories written during the Song dynasty (960-1279), but also numerous newly edited or newly composed works from the seventeenth century. While they themselves came from a Europe in which the ars historica was in full transition, they encountered a situation in China where new approaches to history had emerged. They used comprehensive histories, such as the one by the late Ming scholar Nan Xuan 南軒, as well as more widespread genres, such as gangjian 綱鑑 (outline and mirror) histories, which fell into oblivion from the late eighteenth century onwards. In fact, the sources used by the Jesuits throw light not only on their own compilations that were ultimately sent to Europe, but also on the writing of history in China in the late Ming (1368-1644) and early Qing (1644-1911) dynasties.

    Security Evaluations Beyond Computing Power: How to Analyze Side-Channel Attacks you Cannot Mount?

    Present key sizes for symmetric cryptography are usually required to be at least 80 bits for short-term protection and 128 bits for long-term protection. However, current tools for security evaluations against side-channel attacks do not provide a precise estimation of the remaining key strength after some leakage has been observed, e.g. in terms of the number of candidates to test. This leads to an uncomfortable situation, where the security of an implementation can be anywhere between enumerable values (i.e. 2^40 -- 2^50 key candidates to test) and the full key size (i.e. 2^80 -- 2^128 key candidates to test). In this paper, we mitigate this important issue and describe a key rank estimation algorithm that provides tight bounds for the security level of leaking cryptographic devices. As a result, and for the first time, we are able to analyze the full complexity of “standard” (i.e. divide-and-conquer) side-channel attacks, in terms of their tradeoff between time, data and memory complexity.
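    The rank estimation problem this abstract refers to can be pictured with a histogram-convolution approach that is common in this literature. The sketch below illustrates that general idea, not the specific algorithm of the paper: given the per-subkey probabilities produced by a divide-and-conquer attack, it bounds how many full keys score better than the known correct key. All names are hypothetical.

```python
# A minimal sketch of histogram-based key rank estimation (a common
# approach in this literature, NOT necessarily this paper's algorithm).
# Assumes a divide-and-conquer attack produced a probability for every
# candidate value of each independent subkey (probabilities > 0).
import numpy as np

def estimate_key_rank(subkey_probs, correct_subkeys, n_bins=2048):
    """Bound the rank of the full key among all candidate keys.

    subkey_probs   : list of 1-D arrays, subkey_probs[i][v] = P(subkey i = v)
    correct_subkeys: known correct value of each subkey (evaluator's view)
    """
    # Work with log-probabilities so a full-key score is a sum over subkeys.
    logs = [np.log2(p) for p in subkey_probs]
    lo = min(l.min() for l in logs)
    hi = max(l.max() for l in logs)
    edges = np.linspace(lo, hi, n_bins + 1)
    width = edges[1] - edges[0]

    # Histogram each subkey's scores, then convolve the histograms: the
    # result counts how many full keys fall into each total-score bin.
    hist = np.histogram(logs[0], bins=edges)[0].astype(float)
    for l in logs[1:]:
        hist = np.convolve(hist, np.histogram(l, bins=edges)[0])

    # Locate the correct key's total score in the convolved histogram
    # (after d convolutions, bins start at d * lo with unchanged width).
    key_score = sum(l[k] for l, k in zip(logs, correct_subkeys))
    idx = min(int((key_score - len(logs) * lo) / width), len(hist) - 1)

    lower = hist[idx + 1:].sum()      # keys scoring strictly better
    return lower, lower + hist[idx]   # (lower, upper) bound on the rank
```

    For 16 AES subkeys with 256 candidates each, the convolution compresses the 2^128 candidate keys into a few thousand bins, which is what makes otherwise unenumerable ranks tractable to estimate; the bin width controls the tightness of the bounds.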

    Improving the Rules of the DPA Contest

    A DPA contest was launched at CHES 2008. The goal of this initiative is to make it possible for researchers to compare different side-channel attacks in an objective manner. For this purpose, a set of 80,000 traces corresponding to the encryption of 80,000 different plaintexts with the Data Encryption Standard and a fixed key has been made available. In this short note, we discuss the rules that the contest uses to rate the effectiveness of different distinguishers. We first describe practical examples of attacks in which these rules can be misleading. Then, we suggest an improved set of rules that can be implemented easily in order to obtain a better interpretation of the comparisons performed.
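    For context, contest-style ratings of this kind are commonly based on a stability criterion: the number of traces after which the distinguisher ranks the correct key first and keeps it first. The sketch below illustrates such a metric; it is a hedged paraphrase of the contest setting, not official evaluation code, and the function name and default window are illustrative.

```python
# A sketch of a contest-style "stability" metric: count traces until
# the correct key is top-ranked and remains top-ranked for `window`
# consecutive trace counts (hypothetical paraphrase, not official code).
def stability_point(ranks, window=100):
    """ranks[n] = rank of the correct key once n+1 traces are used
    (0 = top-ranked). Returns the 1-based trace count at which the
    stable run begins, or None if the key never stabilizes on top."""
    run = 0
    for n, r in enumerate(ranks):
        run = run + 1 if r == 0 else 0
        if run == window:
            return n - window + 2  # first trace count of the stable run
    return None
```

    A single number like this can be misleading because a distinguisher may reach a stable run early by sheer luck on one particular ordering of the traces; averaging the outcome over many random reorderings, in the spirit of a success rate, is one natural correction.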

    Cryptanalysis of the CHES 2009/2010 Random Delay Countermeasure

    Inserting random delays in cryptographic implementations is often used as a countermeasure against side-channel attacks. Most previous works on the topic focus on improving the statistical distribution of these delays. For example, efficient random delay generation algorithms have been proposed at CHES 2009/2010. These solutions increase security against attacks that deal with the lack of synchronization between different leakage traces by integrating them. In this paper, we demonstrate that integration may not be the best tool to evaluate random delay insertions. For this purpose, we first describe different attacks exploiting pattern recognition techniques and Hidden Markov Models. Using these tools, we succeed in cryptanalyzing a (straightforward) implementation of the CHES 2009/2010 proposal on an Atmel microcontroller, with the same data complexity as for an unprotected implementation of the AES Rijndael. In other words, we completely cancel the countermeasure in this case. Next, we show that our cryptanalysis tools remain remarkably effective when attacking improved variants of the countermeasure, e.g. with additional noise or irregular dummy operations. We also show that the attacks remain applicable in a non-profiled adversarial scenario. Overall, these results suggest that the use of random delays may not be effective for protecting small embedded devices against side-channel leakage. They also confirm the need for worst-case analysis in physical security evaluations.
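    The pattern-recognition step such attacks build on can be pictured as template matching on the power trace: locate every occurrence of a known operation's leakage pattern, so that genuine and dummy operations can subsequently be told apart (the paper goes further and uses Hidden Markov Models for that segmentation). The sketch below shows only this basic matching primitive; names and the small variance guard are illustrative.

```python
# A minimal sketch of sliding-window template matching on a power
# trace: normalized cross-correlation of a known operation's pattern
# at every offset. Peaks mark likely occurrences of that operation.
# (The paper's attack additionally uses Hidden Markov Models.)
import numpy as np

def match_template(trace, template):
    t = (template - template.mean()) / template.std()
    n = len(template)
    scores = np.empty(len(trace) - n + 1)
    for i in range(len(scores)):
        w = trace[i:i + n]
        # Pearson correlation between the window and the template; the
        # small constant guards against flat (zero-variance) windows.
        scores[i] = np.dot((w - w.mean()) / (w.std() + 1e-12), t) / n
    return scores  # values near 1.0 indicate a close match
```

    Once operation boundaries are recovered this way, the traces can be re-segmented or re-aligned, which is precisely what removes the benefit of the inserted random delays.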

    From Improved Leakage Detection to the Detection of Points of Interests in Leakage Traces

    Leakage detection usually refers to the task of identifying data-dependent information in side-channel measurements, independently of whether this information can be exploited. Detecting Points-Of-Interest (POIs) in leakage traces is a complementary task that is a necessary first step in most side-channel attacks, where the adversary wants to turn this information into (e.g.) a key recovery. In this paper, we discuss the differences between these tasks by investigating a popular solution to leakage detection based on a t-test, and an alternative method exploiting Pearson's correlation coefficient. We first show that the simpler t-test has better sampling complexity, and that its gain over the correlation-based test can be predicted by looking at the Signal-to-Noise Ratio (SNR) of the leakage partitions used in these tests. This implies that the sampling complexity of both tests relates more to their implicit leakage assumptions than to the actual statistics exploited. We also put forward that this gain comes at the cost of some loss of intuition regarding the localization of the exploitable leakage samples in the traces, and their informativeness. Next, and more importantly, we highlight that our reasoning based on the SNR allows defining an improved t-test with significantly faster detection speed (requiring approximately five times fewer measurements in our experiments), which is therefore highly relevant for evaluation laboratories. We finally conclude that whereas t-tests are the method of choice for leakage detection only, correlation-based tests exploiting larger partitions are preferable for detecting POIs. We confirm this intuition by improving automated tools for the detection of POIs in the leakage measurements of a masked implementation, in a black-box manner and without key knowledge, thanks to a correlation-based leakage detection test.
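    Both statistics under comparison are standard and easy to sketch. Assuming traces are stored as a NumPy matrix (rows = measurements, columns = time samples), the snippet below shows a per-sample Welch t-test on a two-class partition (as in fixed-vs-random detection) and the per-sample SNR of a finer partition, which governs the behavior of the correlation-based test. These are the generic textbook statistics, not the paper's improved t-test; the |t| > 4.5 threshold in the comment is the conventional one from the leakage-detection literature.

```python
# Minimal sketches of the two detection statistics discussed: Welch's
# t-test on a two-class partition and the per-sample SNR of a finer
# partition. Generic textbook versions, not the paper's improved test.
import numpy as np

def welch_t(traces_a, traces_b):
    """Per-sample Welch t-statistic between two sets of traces
    (rows = traces, columns = time samples); |t| > 4.5 is the usual
    detection threshold."""
    ma, mb = traces_a.mean(0), traces_b.mean(0)
    va, vb = traces_a.var(0, ddof=1), traces_b.var(0, ddof=1)
    return (ma - mb) / np.sqrt(va / len(traces_a) + vb / len(traces_b))

def snr(traces, labels):
    """Per-sample signal-to-noise ratio of the partition of `traces`
    induced by `labels`: variance of the class means divided by the
    mean of the class variances."""
    classes = [traces[labels == v] for v in np.unique(labels)]
    means = np.stack([c.mean(0) for c in classes])
    noise = np.stack([c.var(0) for c in classes])
    return means.var(0) / noise.mean(0)
```

    Roughly speaking, both statistics grow with the SNR of the partition they are computed on, which is why the sampling-complexity gap between the two tests can be predicted from the SNR alone.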

    The role of the vice-provincial Antoine Thomas in the Chinese rites controversy during the years 1701-1704

    No full text
    status: published

    Attention Ă  l'Ă©cart: l'art de l'entre-deux

    No full text
    status: published