71 research outputs found
Certifying Machine Code Safe from Hardware Aliasing: RISC is not necessarily risky
Sometimes machine code turns out to be a better target for verification than source code. RISC machine code is especially advantaged in this regard because it has only two instructions that access memory. That architecture forms the basis here for an inference system that can prove machine code safe against 'hardware aliasing', an effect that occurs in embedded systems. There are programming memes that ensure code is safe from hardware aliasing, but we want to certify that given machine code is provably safe.
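A minimal model of the effect the abstract names may help: assume a hypothetical embedded platform whose ALU does 32-bit address arithmetic but whose address bus decodes only the low 16 bits. Two arithmetically distinct pointers then name one physical cell, which is the aliasing hazard; all widths here are illustrative, not taken from the paper.

```python
# Illustrative model of 'hardware aliasing' (not the paper's formal system):
# the ALU computes 32-bit addresses, but a hypothetical embedded bus decodes
# only the low 16 bits, so arithmetically different addresses hit one cell.
BUS_MASK = 0xFFFF            # hypothetical 16-bit address bus
ALU_MASK = 0xFFFF_FFFF       # 32-bit address arithmetic

a = (0x0001_0000 + 0x10) & ALU_MASK   # address computed one way
b = 0x10                              # the 'same' address, computed another way

assert a != b                         # distinct bit patterns in the ALU...
assert a & BUS_MASK == b & BUS_MASK   # ...but one physical cell on the bus
```

Software that compares full-width pointers concludes `a` and `b` are different locations, although a store through one is visible through the other; the programming memes the abstract mentions amount to always computing an address the same way.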
Empirical Patterns in Google Scholar Citation Counts
Scholarly impact may be metricized using an author's total number of citations as a stand-in for real worth, but this measure varies in applicability between disciplines. The number of citations per publication is nowadays mapped in much finer detail on the Web, exposing certain empirical patterns. This paper explores those patterns, using citation data from Google Scholar for a number of authors.
An Open Question on the Uniqueness of (Encrypted) Arithmetic
We ask whether two or more images of arithmetic may inhabit the same space via different encodings. The answers have significance for a class of processor design that does all its computation in an encrypted form, without ever performing any decryption or encryption itself. Against the possibility of algebraic attacks on the arithmetic in a 'crypto-processor' (KPU), we propose a defence called 'ABC encryption' and show how this kind of encryption makes it impossible for observations of the arithmetic to be used by an attacker to discover the actual values. We also show how to construct such encrypted arithmetics.
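The idea of an arithmetic living in the same space under a different encoding can be sketched with a toy bijection: conjugate the ordinary 32-bit operators by an encoding E. The XOR mask below is purely illustrative (it is trivially breakable and is not the ABC encryption the abstract proposes); it only shows that the encoded operators form a faithful image of the ordinary arithmetic.

```python
# Toy image of 32-bit arithmetic under a bijective encoding (illustration
# only: a plain XOR mask, not the paper's proposed 'ABC encryption').
MASK = (1 << 32) - 1
KEY = 0x5A5A_A5A5
def E(x): return (x ^ KEY) & MASK            # encode
def D(x): return (x ^ KEY) & MASK            # decode (XOR is an involution)
def enc_add(a, b): return E((D(a) + D(b)) & MASK)
def enc_mul(a, b): return E((D(a) * D(b)) & MASK)

# The encoded operators satisfy the same algebra as the plain ones:
assert D(enc_add(E(2), E(3))) == 5
assert enc_mul(E(2), E(2)) == enc_add(E(2), E(2))   # 2*2 = 2+2 survives
```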
On the Security of Fully Homomorphic Encryption and Encrypted Computing: Is Division safe?
Since fully homomorphic encryption and homomorphically encrypted computing preserve algebraic identities such as 2*2=2+2, a natural question is whether this extremely utilitarian feature also sets up cryptographic attacks that use the encrypted arithmetic operators to generate or identify the encryptions of known constants. In particular, software or hardware might use encrypted addition and multiplication to do encrypted division and deliver the encryption of x/x=1. That can then be used to generate 1+1=2, etc., until a complete codebook is obtained. This paper shows that there is no formula or computation using 32-bit multiplication x*y and three-input addition x+y+z that yields a known constant from unknown inputs. We characterise what operations are similarly 'safe' alone or in company, and show that 32-bit division is not safe in this sense, but there are trivial modifications that make it so.
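The codebook attack the abstract describes can be sketched concretely. The toy XOR-masked arithmetic below stands in for a real homomorphic scheme (illustration only, not a real cipher): given any ciphertext of an unknown value, an unsafe encrypted divider yields the encryption of the known constant 1, and encrypted addition then enumerates a codebook.

```python
# Sketch of the codebook attack enabled by an unsafe encrypted divider,
# over a toy XOR-masked 32-bit arithmetic (illustration only, not a cipher).
MASK = (1 << 32) - 1
KEY = 0x5A5A_A5A5
def E(x): return (x ^ KEY) & MASK
def D(x): return (x ^ KEY) & MASK
def enc_add(a, b): return E((D(a) + D(b)) & MASK)
def enc_div(a, b): return E(D(a) // D(b))    # the 'unsafe' operation

x = E(123456)            # attacker holds a ciphertext of an unknown value
one = enc_div(x, x)      # x/x delivers the encryption of the constant 1
two = enc_add(one, one)  # and a codebook E(1), E(2), ... follows
assert D(one) == 1 and D(two) == 2
```

The paper's positive result is that multiplication and three-input addition alone admit no such constant-producing formula; the attack above needs division (or a trivially unguarded variant of it).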
On obfuscating compilation for encrypted computing
This paper sets out conditions for privacy and security of data against the privileged operator on processors that 'work encrypted'. A compliant machine code architecture plus an 'obfuscating' compiler turns out to be both necessary and sufficient to achieve that, the combination mathematically assuring the privacy of user data in arbitrary computations in an encrypted computing context.
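One way to picture what an 'obfuscating' compiler contributes: on each recompilation it can displace every runtime value by a fresh random delta, folding the adjustment into the emitted constants, so that observed (encrypted) runtime values carry no fixed relation to the plaintext data. The sketch below is an illustrative interpretation of that idea, not the paper's actual compilation scheme; `compiled_add5` and the deltas `dx`, `dy` are hypothetical names.

```python
import secrets

M = (1 << 32) - 1
# Sketch: an obfuscating compilation of y = x + 5 chooses fresh random
# deltas dx, dy per recompilation; the emitted code maps x+dx to y+dy.
dx = secrets.randbelow(1 << 32)
dy = secrets.randbelow(1 << 32)

def compiled_add5(x_obf):
    # the constant (5 - dx + dy) mod 2^32 is folded in at compile time
    return (x_obf + (5 - dx + dy)) & M

x = 7
assert (compiled_add5((x + dx) & M) - dy) & M == x + 5
```

Because `dx` and `dy` are redrawn on every compilation, two runs of the 'same' program exhibit unrelated runtime values, which is the sense in which the compiler contributes to the privacy guarantee.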
The Secret Processor Will Go to the Ball: Benchmark Insider-Proof Encrypted Computing.
‘Encrypted computing’ is an approach to preventing insider attacks by the privileged operator against the unprivileged user on a computing system. It requires a processor that works natively on encrypted data in user mode, and the security barrier that protects the user is hardware-based encryption, not access. We report on progress and practical experience with our superscalar RISC-class prototype processor for encrypted computing and supporting software infrastructure. This paper aims to alert the secure hardware community that encrypted computing is possibly practical, as well as theoretically plausible. It has been shown formally impossible for operator mode to read (or write to order) the plaintext form of data originating from or being operated on in the user mode of this class of processor, given that the encryption is independently secure. Now we report standard Dhrystone benchmarks for the prototype, showing performance with AES-128 like a 433 MHz classic Pentium (1 GHz base clock), thousands of times faster than other approaches.
A practical encrypted microprocessor
This paper explores a new approach to encrypted microprocessing, potentiating new trade-offs in security versus performance engineering. The coprocessor prototype described runs standard machine code (32-bit OpenRISC v1.1) with encrypted data in registers, on buses, and in memory. The architecture is 'superscalar', executing multiple instructions simultaneously, and is sophisticated enough that it achieves speeds approaching those of contemporary off-the-shelf processor cores. The aim of the design is to protect user data against the operator or owner of the processor, and so-called 'Iago' attacks in general, for those paradigms that require trust in data-heavy computations in remote locations and/or overseen by untrusted operators. A single idea underlies the architecture, its performance and security properties: it is that a modified arithmetic is enough to cause all program execution to be encrypted. The privileged operator, running unencrypted with the standard arithmetic, can see and try their luck at modifying encrypted data, but has no special access to the information in it, as proven here. We test the issues, reporting performance in particular for 64-bit Rijndael and 72-bit Paillier encryptions, the latter running keylessly.
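The Paillier encryption the benchmarks mention is additively homomorphic, which is what lets an arithmetic unit operate on ciphertexts without keys: multiplying two ciphertexts modulo n² yields a ciphertext of the sum of the plaintexts. A minimal textbook sketch, with deliberately tiny primes for readability (real deployments use far larger moduli than the 72-bit figure in the abstract, let alone these):

```python
import random
from math import gcd

# Minimal textbook Paillier (tiny primes, illustration only).
p, q = 293, 433
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # Carmichael lambda(n)
g = n + 1
def L(u): return (u - 1) // n
mu = pow(L(pow(g, lam, n2)), -1, n)            # needs Python 3.8+

def enc(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 17, 25
# Additive homomorphism: ciphertext multiplication adds the plaintexts.
assert dec((enc(a) * enc(b)) % n2) == a + b
```

Note that `enc` and the homomorphic combination use only public values (`g`, `n`, `n2`), which is the sense in which such arithmetic can run 'keylessly' inside a processor pipeline.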
PEPtalk2: results of a pilot randomised controlled trial to compare VZIG and aciclovir as postexposure prophylaxis (PEP) against chickenpox in children with cancer.
OBJECTIVE: To determine the likely rate of patient randomisation and to facilitate sample size calculation for a full-scale phase III trial of varicella zoster immunoglobulin (VZIG) and aciclovir as postexposure prophylaxis against chickenpox in children with cancer. DESIGN: Multicentre pilot randomised controlled trial of VZIG and oral aciclovir. SETTING: England, UK. PATIENTS: Children under 16 years of age with a diagnosis of cancer: currently or within 6 months of receiving cancer treatment and with negative varicella zoster virus (VZV) serostatus at diagnosis or within the last 3 months. INTERVENTIONS: Study participants who have a significant VZV exposure were randomised to receive PEP in the form of VZIG or aciclovir after the exposure. MAIN OUTCOME MEASURES: Number of patients registered and randomised within 12 months of the trial opening to recruitment and incidence of breakthrough varicella. RESULTS: The study opened in six sites over a 13-month period. 482 patients were screened for eligibility, 32 patients were registered and 3 patients were randomised following VZV exposure. All three were randomised to receive aciclovir and there were no cases of breakthrough varicella. CONCLUSIONS: Given the limited recruitment to the PEPtalk2 pilot, it is unlikely that the necessary sample size would be achievable using this strategy in a full-scale trial. The study identified factors that could be used to modify the design of a definitive trial but other options for defining the best means to protect such children against VZV should be explored. TRIAL REGISTRATION NUMBER: ISRCTN48257441, EudraCT number: 2013-001332-22, sponsor: University of Birmingham
Body iron metabolism and pathophysiology of iron overload
Iron is an essential metal for the body, while excess iron accumulation causes organ dysfunction through the production of reactive oxygen species. There is a sophisticated balance of body iron metabolism between storage and transport, which is regulated by several factors including the newly identified peptide hepcidin. As there is no passive excretory mechanism for iron, iron is easily accumulated when exogenous iron is loaded through hereditary factors, repeated transfusions, and other diseased conditions. The free iron species, namely non-transferrin-bound iron and labile plasma iron in the circulation and the labile iron pool within the cells, are responsible for iron toxicity. The characteristic features of advanced iron overload are failure of vital organs such as the liver and heart, in addition to endocrine dysfunctions. For the estimation of body iron, there are direct and indirect methods available. Serum ferritin is the most convenient and widely available modality, even though its specificity is sometimes problematic. Recently, new physical detection methods using magnetic resonance imaging and superconducting quantum interference devices have become available to estimate iron concentration in the liver and myocardium. Wider application of iron chelators, taken with high compliance, will resolve the problems of organ dysfunction caused by excess iron and improve patient outcomes.