121 research outputs found

    Examining the practical side channel resilience of arx-boxes

    Get PDF
    Implementations of ARX ciphers are hoped to have some intrinsic side channel resilience owing to the specific choice of cipher components: modular addition (A), rotation (R) and exclusive-or (X). Previous work has contributed to this understanding by developing theory regarding the side channel resilience of components (pioneered by the early works of Prouff), as well as through more recent practical investigations by Biryukov et al. that focused on lightweight cipher constructions. We add to this work by studying ARX-boxes both mathematically and practically. Our results show that previous works' reliance on the simplistic assumption that intermediates independently leak their Hamming weight has led to the incorrect conclusion that the modular addition is necessarily the best target and that ARX constructions are therefore harder to attack in practice: we show that on an ARM M0, the best practical target is the exclusive-or, and attacks succeed with only tens of traces.
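
    As a toy illustration only (not code from the paper), the sketch below simulates noise-free Hamming-weight leakage of the exclusive-or and modular-addition intermediates over random inputs; the 32-bit operand width and the leakage function are assumptions made for the example, whereas the paper's point is precisely that real devices such as the ARM M0 do not leak this simplistically.

    # Toy illustration of Hamming-weight (HW) leakage for ARX components.
    # Assumptions (not from the paper): noise-free HW leakage, 32-bit operands.
    import random

    WIDTH = 32
    MASK = (1 << WIDTH) - 1

    def hw(x: int) -> int:
        """Hamming weight of an integer."""
        return bin(x).count("1")

    def leak_xor(key: int, msg: int) -> int:
        """HW of the exclusive-or intermediate key ^ msg."""
        return hw((key ^ msg) & MASK)

    def leak_add(key: int, msg: int) -> int:
        """HW of the modular-addition intermediate (key + msg) mod 2^WIDTH."""
        return hw((key + msg) & MASK)

    if __name__ == "__main__":
        random.seed(0)
        key = random.getrandbits(WIDTH)
        msgs = [random.getrandbits(WIDTH) for _ in range(1000)]
        xor_leaks = [leak_xor(key, m) for m in msgs]
        add_leaks = [leak_add(key, m) for m in msgs]
        # Both intermediates leak their Hamming weight under this model; which
        # one is the better DPA target on real hardware is the question the
        # paper examines empirically.
        print("mean HW, xor:", sum(xor_leaks) / len(xor_leaks))
        print("mean HW, add:", sum(add_leaks) / len(add_leaks))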

    Side Channel Attacks on IoT Applications

    Get PDF

    An Analytic Attack Against ARX Addition Exploiting Standard Side-Channel Leakage

    Get PDF
    In the last few years a new design paradigm, the so-called ARX (modular addition, rotation, exclusive-or) ciphers, has gained popularity, in part because of the non-linear operation's seemingly 'inherent resilience' against Differential Power Analysis (DPA) attacks: the non-linear modular addition is not only known to be a poor target for DPA attacks, but the computational complexity of DPA-style attacks also grows exponentially with the operand size, so such attacks quickly become practically infeasible. We, however, propose a novel DPA-style attack strategy that scales linearly with the operand size in the chosen-message attack setting.
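
    As a hedged toy example of why chosen messages can collapse the cost of attacking a modular addition (this is not the paper's attack or its leakage model), the sketch below assumes an idealised oracle that reveals only the carry-out of an addition with a secret operand; binary search over chosen messages then recovers an n-bit secret in about n queries instead of 2^n guesses.

    # Toy chosen-message recovery of a secret addend from a carry-out oracle.
    # Illustrates linear (in operand size) scaling only; it is NOT the attack
    # or the leakage model described in the paper.
    import random

    WIDTH = 32
    MOD = 1 << WIDTH

    def carry_oracle(secret: int, msg: int) -> bool:
        """Idealised leakage: does (secret + msg) overflow mod 2^WIDTH?"""
        return secret + msg >= MOD

    def recover(oracle) -> int:
        # The smallest msg producing a carry is MOD - secret (no such msg if
        # secret == 0), so a binary search needs about WIDTH chosen messages.
        lo, hi = 0, MOD  # hi == MOD encodes "no message produces a carry"
        while lo < hi:
            mid = (lo + hi) // 2
            if oracle(mid):
                hi = mid
            else:
                lo = mid + 1
        return 0 if lo == MOD else MOD - lo

    if __name__ == "__main__":
        random.seed(1)
        secret = random.getrandbits(WIDTH)
        found = recover(lambda m: carry_oracle(secret, m))
        print("recovered correctly:", found == secret)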

    Towards Automated Detection of Single-Trace Side-Channel Vulnerabilities in Constant-Time Cryptographic Code

    Full text link
    Although cryptographic algorithms may be mathematically secure, it is often possible to leak secret information from the implementation of the algorithms. Timing and power side-channel vulnerabilities are some of the most widely considered threats to cryptographic algorithm implementations. Timing vulnerabilities may be easier to detect and exploit, and all high-quality cryptographic code today should be written in constant-time style. However, this does not prevent power side channels from existing. With constant-time code, potential attackers can resort to power side-channel attacks to try to leak secrets. Detecting potential power side-channel vulnerabilities is a tedious task, as it requires analyzing code at the assembly level and reasoning about which instructions could be leaking information based on their operands and their values. To help make the process of detecting potential power side-channel vulnerabilities easier for cryptographers, this work presents Pascal: Power Analysis Side Channel Attack Locator, a tool that introduces novel symbolic register analysis techniques for binary analysis of constant-time cryptographic algorithms and verifies locations of potential power side-channel vulnerabilities with high precision. Pascal is evaluated on a number of implementations of post-quantum cryptographic algorithms, and it is able to find dozens of previously reported single-trace power side-channel vulnerabilities in these algorithms, all in an automated manner.
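
    Pascal performs symbolic register analysis on binaries; as a greatly simplified dynamic analogue (an illustrative sketch, not Pascal's method or interface), the code below executes a constant-time-style selection under two different secrets and flags intermediates whose Hamming weight depends on the secret, the kind of single-trace observable such tools aim to locate.

    # Simplified dynamic analogue of single-trace leakage detection.
    # NOT Pascal's symbolic analysis; it merely flags intermediates whose
    # Hamming weight differs between two secret inputs of a constant-time routine.

    def hw(x: int) -> int:
        return bin(x & 0xFFFFFFFF).count("1")

    def ct_select(bit: int, a: int, b: int, trace: list) -> int:
        """Constant-time select: returns a if bit == 1 else b, recording intermediates."""
        mask = -bit & 0xFFFFFFFF      # 0xFFFFFFFF if bit == 1, else 0
        trace.append(mask)            # intermediate 0
        result = (a & mask) | (b & ~mask & 0xFFFFFFFF)
        trace.append(result)          # intermediate 1
        return result

    def leaky_points(secret_bits_a, secret_bits_b, a=0xDEADBEEF, b=0x12345678):
        """Return indices of intermediates whose Hamming weight depends on the secret bit."""
        flagged = set()
        for bit_a, bit_b in zip(secret_bits_a, secret_bits_b):
            ta, tb = [], []
            ct_select(bit_a, a, b, ta)
            ct_select(bit_b, a, b, tb)
            for i, (va, vb) in enumerate(zip(ta, tb)):
                if hw(va) != hw(vb):
                    flagged.add(i)
        return sorted(flagged)

    if __name__ == "__main__":
        # Even though ct_select runs in constant time, its intermediates'
        # Hamming weights differ with the secret bit, so a power side channel
        # may remain despite the timing channel being closed.
        print("secret-dependent intermediates:", leaky_points([0, 1], [1, 0]))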

    Security of Ubiquitous Computing Systems

    Get PDF
    The chapters in this open access book arise out of the EU COST Action project Cryptacus, the objective of which was to improve and adapt existing cryptanalysis methodologies and tools to the ubiquitous computing framework. The cryptanalysis implemented lies along four axes: cryptographic models, cryptanalysis of building blocks, hardware and software security engineering, and security assessment of real-world systems. The authors are top-class researchers in security and cryptography, and the contributions are of value to researchers and practitioners in these domains. This book is open access under a CC BY license.

    Water Use Patterns of a Riparian Forest in a Humid Subtropical Catchment in the Southeastern United States

    Get PDF
    The role of groundwater in sustaining plant transpiration has been studied for nearly a century. However, the literature investigating plant uptake of groundwater has largely focused on arid and semiarid climates, with few examples from more humid locations. In this dissertation, I contribute a rigorous evaluation of groundwater transpiration (TG) from a humid riparian forest to fill this knowledge gap. In chapter two, I explored groundwater use by a riparian forest using techniques that exploit diurnal water table fluctuations in groundwater wells. Specifically, I investigated the spatiotemporal variability of TG using nine groundwater wells in a small headwater catchment in the Piedmont of Georgia. Results indicated a relatively high degree of variability (3.30 ± 1.05 mm d⁻¹) but no consistent spatial pattern. Furthermore, groundwater-derived transpiration was approximately 22% of the average baseflow discharge over the growing season, indicating that even in humid regions plant transpiration can be a substantial component of the seasonal water budget. In chapter three, I incorporated an independent estimate of canopy transpiration (EC) to better constrain the estimates of TG from chapter two, motivated by the goal of partitioning the total water used by the riparian forest into groundwater and soil water sources. However, the results were not as anticipated. For the 2019 growing season TG was 455 mm, approximately twice EC (241 mm). This instead highlighted a methodological issue that had not been previously addressed: the formula for estimating TG lacks a defined area of influence, and in the present study I estimated that this area of influence would have to be 2–10 times larger than the delineated riparian zone for the fluxes to balance. In chapter four I explored the response of a riparian forest to a rapid-onset flash drought. Results indicated that there was no watershed-wide response of the forest canopy to the drought. However, water-use patterns at individual trees did suggest a drought response, one dominated by an increase in reverse sap flow, suggesting that hydraulic redistribution was occurring to compensate for the excessively dry soils.
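
    The diurnal water-table-fluctuation techniques mentioned above are commonly implemented via the White (1932) method; the sketch below shows that generic formulation only, under the assumption that it is representative of the approach, and the specific-yield and well-reading values are placeholders rather than data from the dissertation.

    # Generic White (1932) diurnal water-table-fluctuation estimate of
    # groundwater transpiration: TG = Sy * (24 * r + delta_s), where
    #   Sy      = specific yield of the aquifer material (dimensionless),
    #   r       = pre-dawn water-table recovery rate (mm per hour),
    #   delta_s = net decline in water-table elevation over the 24 h day (mm).
    # Values below are placeholders, not data from the dissertation.

    def white_method_tg(specific_yield: float, recovery_rate_mm_per_hr: float,
                        net_decline_mm: float) -> float:
        """Daily groundwater transpiration TG in mm per day."""
        return specific_yield * (24.0 * recovery_rate_mm_per_hr + net_decline_mm)

    if __name__ == "__main__":
        # Hypothetical example: Sy = 0.05, night-time recovery of 3 mm/h,
        # and a net daily water-table decline of 12 mm.
        tg = white_method_tg(0.05, 3.0, 12.0)
        print(f"TG ≈ {tg:.2f} mm/day")  # 0.05 * (72 + 12) = 4.2 mm/day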

    Performance-efficient cryptographic primitives in constrained devices

    Get PDF
    PhD Thesis. Resource-constrained devices are small, low-cost, usually fixed-function and very limited-resource devices. They are constrained in terms of memory, computational capabilities, communication bandwidth and power. In the last decade, we have seen widespread use of these devices in health care, smart homes and cities, sensor networks, wearables, automotive systems, and other fields. Consequently, there has been an increase in research activity on the security of these devices, especially in how to design and implement cryptography that meets the devices' extreme resource constraints. Cryptographic primitives are low-level cryptographic algorithms used to construct security protocols that provide security, authenticity, and integrity of messages. The building blocks of the primitives, which rely heavily on mathematical theory, are computationally complex and demand considerable computing resources. As a result, most of these primitives are either too large to fit on resource-constrained devices or highly inefficient when implemented on them. There have been many attempts to address this problem in the literature, where cryptography engineers modify conventional primitives into lightweight versions or build new lightweight primitives from scratch. Unfortunately, both solutions suffer from reduced security, low performance, or high implementation cost. This thesis investigates the performance of conventional cryptographic primitives and explores the effect of their different building blocks and design choices on their performance. It also studies the impact of various implementation approaches and optimisation techniques on their performance. Moreover, it investigates the limitations imposed by the tight processing and storage capabilities of constrained devices on implementing cryptography. Furthermore, it evaluates the performance of many newly designed lightweight cryptographic primitives and investigates the resources required to run them with acceptable performance. The thesis aims to provide an insight into the performance of cryptographic primitives and the resources needed to run them with acceptable performance, helping to provide solutions that balance performance, security, and resource requirements for these devices. The Institute of Public Administration in Riyadh, and the Saudi Arabian Cultural Bureau in London.

    Developing models for the data-based mechanistic approach to systems analysis:Increasing objectivity and reducing assumptions

    Get PDF
    Stochastic state-space Time-Varying Random Walk (TVRW) models have been developed, allowing existing stochastic state-space models to operate directly on irregularly sampled time-series. These TVRW models have been successfully applied to two different classes of models, benefiting each class in different ways. The first class, State Dependent Parameter (SDP) models, is used to investigate the dominant dynamic modes of nonlinear dynamic systems and the non-linearities in these models that are driven by arbitrary state variables. In SDP locally linearised models it is assumed that the parameters describing changes in the system's behaviour depend upon some aspect of the system (its 'state'). Each parameter can be dependent on one or more states. To estimate parameters that change at a rate related to that of their states, the estimation procedure is conducted in the state-space along the potentially multivariate trajectory of the states which drive the parameters. The introduction of the newly developed TVRW models significantly improves parameter estimation, particularly in data-rich neighbourhoods of the state-space when the parameter depends on more than one state, and at the ends of the data series when the parameter depends on one state with few data points. The second class of models, Dynamic Harmonic Regression (DHR) models, is used to identify the dominant cycles and trends of time-series. In DHR models the assumption is that a signal (such as a time-series) can be broken down into four (unobserved) components occupying different parts of the spectrum: trend, seasonal cycle, other cycles, and a high-frequency irregular component. DHR has previously been confined to uniformly sampled time-series. The introduction of the TVRW models allows DHR to operate on irregularly sampled time-series, with the added benefit that the forecasting origin is no longer confined to the end of the time-series but can begin at any point in the future; additionally, the forecasting sampling rate is no longer limited to the sampling rate of the time-series. Importantly, both classes of model were designed to follow the Data-Based Mechanistic (DBM) approach to modelling environmental systems, in which the model structure and parameters are determined by the data (data-based) and the resulting models are then validated on the basis of their physical interpretation (mechanistic). The aim is to remove the researcher's preconceptions from model development in order to eliminate bias, and then to use the researcher's knowledge to validate the models presented to them. Both classes of model lacked model structure identification procedures, so model structure had to be determined by the researcher, contrary to the DBM approach. Two model structure identification procedures, one for SDP and one for DHR, were therefore developed to bring both classes of models back within the DBM framework. These developments are presented and tested here on both simulated data and real environmental data, demonstrating their importance, benefits and role in environmental modelling and exploratory data analysis.
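
    For reference, the DHR decomposition described above is usually written as an unobserved-components model with random-walk harmonic coefficients; the formulation below uses standard textbook notation rather than equations taken from this thesis, so the symbols are illustrative only.

    % Illustrative DHR unobserved-components form (standard notation):
    % observation = trend + time-varying harmonics + irregular component.
    \[
      y_t = T_t + \sum_{j=1}^{R}\bigl[a_{j,t}\cos(\omega_j t) + b_{j,t}\sin(\omega_j t)\bigr] + e_t,
      \qquad e_t \sim \mathcal{N}(0,\sigma^2)
    \]
    % The harmonic amplitudes evolve as random walks; the TVRW development
    % generalises this state evolution to irregular sampling intervals.
    \[
      a_{j,t} = a_{j,t-1} + \eta_{j,t}, \qquad b_{j,t} = b_{j,t-1} + \zeta_{j,t}
    \]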