A Novel Latin Square Image Cipher
In this paper, we introduce a symmetric-key Latin square image cipher (LSIC)
for grayscale and color images. Our contributions to the image encryption
community include: 1) we develop new Latin square image encryption primitives,
including Latin Square Whitening, a Latin Square S-box, and a Latin Square P-box;
2) we provide a new way of integrating probabilistic encryption into image
encryption by embedding random noise in the least significant image bit-plane;
and 3) we construct the LSIC from these Latin square image encryption primitives,
all built on one keyed Latin square, in a new loom-like substitution-permutation
network. Consequently, the proposed LSIC achieves many desired properties of a
secure cipher, including a large key space, high key sensitivity, uniformly
distributed ciphertext, excellent confusion and diffusion properties,
semantic security, and robustness against channel noise. Theoretical analysis
shows that the LSIC has good resistance to many attack models, including
brute-force attacks, ciphertext-only attacks, known-plaintext attacks, and
chosen-plaintext attacks. Experimental results from extensive simulations
over the complete USC-SIPI Miscellaneous image dataset demonstrate
that the LSIC matches or outperforms the state of the art set by many peer
algorithms. All these analyses and results demonstrate that the LSIC is well
suited for digital image encryption. Finally, we open-source the LSIC MATLAB
code at https://sites.google.com/site/tuftsyuewu/source-code.

Comment: 26 pages, 17 figures, and 7 tables
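As a rough illustration of the first primitive, the sketch below (not the authors' MATLAB implementation) builds a keyed n × n Latin square by cyclically rotating a key-derived permutation, then uses it for whitening by modular addition; the SHA-256-based key schedule is an assumption made for this sketch.

```python
import hashlib

def keyed_latin_square(key: bytes, n: int = 256) -> list[list[int]]:
    """Build an n x n Latin square from a key: derive a permutation of
    0..n-1 from the key, then cyclically rotate it for each row, so each
    symbol appears exactly once per row and per column."""
    # Key-dependent permutation via sort-by-hash (illustrative only,
    # not the LSIC's actual Latin square generator).
    perm = sorted(range(n), key=lambda i: hashlib.sha256(key + bytes([i])).digest())
    return [[perm[(r + c) % n] for c in range(n)] for r in range(n)]

def whiten(pixels: list[int], square: list[list[int]]) -> list[int]:
    """Latin-square whitening: add a square entry to each pixel mod 256."""
    n = len(square)
    return [(p + square[i % n][(i // n) % n]) % 256 for i, p in enumerate(pixels)]

def unwhiten(pixels: list[int], square: list[list[int]]) -> list[int]:
    """Inverse of whiten: subtract the same square entries mod 256."""
    n = len(square)
    return [(p - square[i % n][(i // n) % n]) % 256 for i, p in enumerate(pixels)]
```

The rotation construction guarantees the Latin property, and whitening is trivially invertible with the same keyed square.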
Discovering, quantifying, and displaying attacks
In the design of software and cyber-physical systems, security is often
perceived as a qualitative need, but can only be attained quantitatively.
Especially when distributed components are involved, it is hard to predict and
confront all possible attacks. A main challenge in the development of complex
systems is therefore to discover attacks, quantify them to comprehend their
likelihood, and communicate them to non-experts for facilitating the decision
process. To address this three-sided challenge, we propose a protection analysis
over the Quality Calculus that (i) computes all the sets of data required by an
attacker to reach a given location in a system, (ii) determines the cheapest
set of such attacks for a given notion of cost, and (iii) derives an attack
tree that displays the attacks graphically. The protection analysis is first
developed in a qualitative setting, and then extended to quantitative settings
following an approach applicable to a great many contexts. The quantitative
formulation is implemented as an optimisation problem encoded into
Satisfiability Modulo Theories, allowing us to deal with complex cost
structures. The usefulness of the framework is demonstrated on a national-scale
authentication system, studied through a Java implementation of the framework.

Comment: LMCS SPECIAL ISSUE FORTE 201
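The cheapest-attack computation in step (ii) can be shown in miniature. The sketch below replaces the Satisfiability Modulo Theories encoding with brute-force subset enumeration over a hypothetical set of data items and costs; the item names, costs, and reachability condition are all invented for illustration.

```python
from itertools import combinations

# Hypothetical data items an attacker might obtain, with notional costs.
COSTS = {"password": 5, "token": 3, "server_key": 8, "badge": 2}

def reaches_target(items: set) -> bool:
    """Toy protection requirement: the target location is reachable with
    (password or token) together with (server_key or badge)."""
    return (("password" in items or "token" in items)
            and ("server_key" in items or "badge" in items))

def cheapest_attack(costs, reachable):
    """Enumerate subsets of data items and return (cost, items) for the
    cheapest subset that reaches the target -- the answer an SMT-based
    optimiser would find symbolically on far larger instances."""
    best = None
    for r in range(len(costs) + 1):
        for combo in combinations(costs, r):
            s = set(combo)
            if reachable(s):
                c = sum(costs[i] for i in s)
                if best is None or c < best[0]:
                    best = (c, s)
    return best
```

Here the cheapest attack is {token, badge} at cost 5; an SMT encoding makes the same minimisation feasible when the item sets and cost structures are too large to enumerate.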
Quantum Noise Randomized Ciphers
We review the notion of a classical random cipher and its advantages. We
sharpen the usual description of random ciphers to a particular mathematical
characterization suggested by the salient feature responsible for their
increased security. We describe a concrete system known as AlphaEta and show
that it is equivalent to a random cipher in which the required randomization is
effected by coherent-state quantum noise. We describe the currently known
security features of AlphaEta and similar systems, including lower bounds on
the unicity distances against ciphertext-only and known-plaintext attacks. We
show how AlphaEta used in conjunction with any standard stream cipher such as
AES (Advanced Encryption Standard) provides an additional, qualitatively
different layer of security from physical encryption against known-plaintext
attacks on the key. We refute some claims in the literature that AlphaEta is
equivalent to a non-random stream cipher.

Comment: Accepted for publication in Phys. Rev. A; discussion augmented and
re-organized; Section 5 contains a detailed response to 'T. Nishioka, T.
Hasegawa, H. Ishizuka, K. Imafuku, H. Imai: Phys. Lett. A 327 (2004) 28-32 /
quant-ph/0310168' and 'T. Nishioka, T. Hasegawa, H. Ishizuka, K. Imafuku, H.
Imai: Phys. Lett. A 346 (2005) 7
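The randomization idea can be sketched classically. Below is a toy random cipher, not AlphaEta's actual M-ry coherent-state scheme: a key-selected basis m places a bit on one of 2M points of a circle, and random "noise" shifts the point; a receiver who knows m rounds the noise away, while the bit-to-ciphertext mapping is deliberately many-valued for anyone who does not.

```python
import random

M = 64  # number of bases; the ciphertext alphabet has 2*M circle points

def encrypt_bit(b: int, m: int, rng: random.Random) -> int:
    """Map bit b to one of the two antipodal points of basis m, then add
    bounded random noise (|noise| < M/2). Only someone who knows m can
    remove the noise by rounding to the nearer basis point."""
    point = (m + b * M) % (2 * M)
    noise = rng.randint(-(M // 2 - 1), M // 2 - 1)
    return (point + noise) % (2 * M)

def decrypt_bit(y: int, m: int) -> int:
    """Round ciphertext y to the nearer of the two points of basis m."""
    d0 = min((y - m) % (2 * M), (m - y) % (2 * M))                 # bit 0 point
    d1 = min((y - (m + M)) % (2 * M), ((m + M) - y) % (2 * M))     # bit 1 point
    return 0 if d0 < d1 else 1
```

Since |noise| ≤ M/2 - 1, the correct point is always strictly nearer, so decryption with the key is deterministic even though encryption is randomized.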
Genetic algorithms in cryptography
Genetic algorithms (GAs) are a class of optimization algorithms. GAs attempt to solve problems by modeling a simplified version of genetic processes. There are many problems for which a GA approach is useful. It is, however, undetermined whether cryptanalysis is such a problem. This work therefore explores the use of GAs in cryptography. Both traditional cryptanalysis and GA-based methods are implemented in software, and the results are compared using the metrics of elapsed time and percentage of successful decryptions. A determination is made for each cipher under consideration as to the validity of the GA-based approaches found in the literature; in general, these approaches are typical of the field. Of the twelve genetic algorithm attacks found in the literature, seven were re-implemented. Of these seven, only three achieved any success: the attacks on the transposition and permutation ciphers by Matthews [20], Clark [4], and Gründlingh and Van Vuuren [13], respectively. These attacks were further investigated in an attempt to improve or extend their success. Unfortunately, this attempt was unsuccessful, as was the attempt to apply the Clark [4] attack to the monoalphabetic substitution cipher and achieve the same, or indeed any, level of success. Overall, the standard fitness-equation genetic algorithm approach, and the scoreboard variant thereof, are not worth the extra effort involved. Traditional cryptanalysis methods are more successful and easier to implement. While a traditional method takes more time, a faster unsuccessful attack is worthless. The failure of the genetic algorithm approach indicates that supplementary research into traditional cryptanalysis methods may be more useful and valuable than additional modification of GA-based approaches
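To make the approach concrete, here is a minimal GA attack in the spirit of those surveyed, applied to a toy block-permutation cipher with a bigram-count fitness function; the cipher, the bigram table, and the GA parameters are illustrative inventions, not those of the cited attacks.

```python
import random

# A small set of common English bigrams used as a crude fitness signal.
COMMON_BIGRAMS = {"TH", "HE", "IN", "ER", "AN", "RE", "ND", "AT", "ON", "NT",
                  "HA", "ES", "ST", "EN", "ED", "TO", "IT", "OU", "EA", "HI"}

def encrypt(text: str, key: list) -> str:
    """Block permutation cipher: within each k-letter block, cipher
    position j takes the plaintext letter at position key[j]."""
    k = len(key)
    return "".join(text[i + key[j]] for i in range(0, len(text), k) for j in range(k))

def decrypt(text: str, key: list) -> str:
    k = len(key)
    inv = sorted(range(k), key=lambda j: key[j])  # inverse permutation
    return "".join(text[i + inv[j]] for i in range(0, len(text), k) for j in range(k))

def fitness(text: str) -> int:
    """Score a candidate decryption by counting common English bigrams."""
    return sum(text[i:i + 2] in COMMON_BIGRAMS for i in range(len(text) - 1))

def ga_attack(cipher: str, k: int, rng: random.Random,
              pop_size: int = 30, generations: int = 50) -> list:
    """Toy GA: tournament-free elitist selection on candidate permutations,
    with a single swap mutation per child (no crossover, for brevity)."""
    pop = [rng.sample(range(k), k) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda perm: -fitness(decrypt(cipher, perm)))
        survivors = pop[:pop_size // 2]          # elitism: keep the best half
        children = []
        for parent in survivors:
            child = parent[:]
            a, b = rng.randrange(k), rng.randrange(k)
            child[a], child[b] = child[b], child[a]   # swap mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda perm: fitness(decrypt(cipher, perm)))
```

On a key space this small (5! = 120 permutations) the GA is overkill, which mirrors the thesis finding: where the GA succeeds, simpler traditional methods succeed more cheaply.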
A Reduced Semantics for Deciding Trace Equivalence
Many privacy-type properties of security protocols can be modelled using
trace equivalence properties in suitable process algebras. It has been shown
that such properties can be decided for interesting classes of finite processes
(i.e., without replication) by means of symbolic execution and constraint
solving. However, this does not suffice to obtain practical tools. Current
prototypes suffer from a classical combinatorial explosion problem caused by
the exploration of many interleavings in the behaviour of processes.
M\"odersheim et al. have tackled this problem for reachability properties using
partial order reduction techniques. We revisit their work, generalize it and
adapt it for equivalence checking. We obtain an optimisation in the form of a
reduced symbolic semantics that eliminates redundant interleavings on the fly.
The obtained partial order reduction technique has been integrated into a tool
called APTE. We conducted complete benchmarks showing dramatic improvements.

Comment: Accepted for publication in LMC
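The interleaving explosion and its reduction can be illustrated on a toy example. The sketch below enumerates all interleavings of fixed per-process action sequences, then applies a naive partial-order filter that discards any interleaving containing an adjacent independent pair in decreasing order; the lexicographically least representative of each Mazurkiewicz trace always survives, so no behaviour is lost. This is far simpler than the reduced symbolic semantics of the paper, which prunes on the fly during symbolic execution.

```python
def interleavings(procs):
    """Yield every interleaving of the per-process action sequences
    (exponentially many in general)."""
    if all(not p for p in procs):
        yield ()
        return
    for i, p in enumerate(procs):
        if p:
            rest = [q[1:] if j == i else q for j, q in enumerate(procs)]
            for tail in interleavings(rest):
                yield (p[0],) + tail

def reduced(procs, independent):
    """Naive partial-order reduction: drop interleavings with an adjacent
    independent pair (a, b) where a > b. Swapping such a pair yields a
    lexicographically smaller equivalent word, so the lex-least member of
    every trace class survives; the filter is sound, though it may keep
    more than one representative per class in general."""
    keep = []
    for t in interleavings(procs):
        if all(not (independent(a, b) and a > b) for a, b in zip(t, t[1:])):
            keep.append(t)
    return keep
```

For two fully independent two-action processes, the six interleavings collapse to a single representative, hinting at the "dramatic improvements" such pruning yields at scale.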