On the Gold Standard for Security of Universal Steganography
While symmetric-key steganography is quite well understood both in the
information-theoretic and in the computational setting, many fundamental
questions about its public-key counterpart resist persistent attempts to solve
them. The computational model for public-key steganography was proposed by von
Ahn and Hopper in EUROCRYPT 2004. At TCC 2005, Backes and Cachin gave the first
universal public-key stegosystem - i.e. one that works on all channels -
achieving security against replayable chosen-covertext attacks (SS-RCCA) and
asked whether security against non-replayable chosen-covertext attacks (SS-CCA)
is achievable. Later, Hopper (ICALP 2005) provided such a stegosystem for every
efficiently sampleable channel, but did not achieve universality. He posed the
question whether universality and SS-CCA-security can be achieved
simultaneously. No progress on this question has been made in more than a
decade. In our work we resolve Hopper's problem in an essentially complete manner:
As our main positive result we design an SS-CCA-secure stegosystem that works
for every memoryless channel. On the other hand, we prove that this result is
the best possible in the context of universal steganography. We provide a
family of 0-memoryless channels - where the already sent documents have only
marginal influence on the current distribution - and prove that no
SS-CCA-secure steganography for this family exists in the standard
non-look-ahead model. (Comment: EUROCRYPT 2018, LLNCS style.)
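As background for the style of construction at issue, the following is a minimal sketch of the classic rejection-sampling idea behind universal symmetric-key steganography: the encoder repeatedly draws covertexts from a channel-sampling oracle until a keyed hash of the draw has the parity of the next hidden bit. It is not the paper's SS-CCA-secure public-key scheme; the channel_sample oracle, the one-bit-per-document rate, and the 64-try cutoff are illustrative assumptions.

```python
import hashlib, hmac, os

def sample_document(channel_sample, key, counter, bit, max_tries=64):
    """Rejection-sample a covertext whose keyed-hash parity equals the hidden bit.
    channel_sample() is an oracle returning one document from the covertext channel."""
    doc = None
    for _ in range(max_tries):
        doc = channel_sample()
        mac = hmac.new(key, counter.to_bytes(8, "big") + doc, hashlib.sha256).digest()
        if mac[0] & 1 == bit:
            return doc
    return doc  # give up after max_tries; this introduces a small decoding error

def embed(channel_sample, key, message_bits):
    """Encode each message bit as one channel document (toy one-bit-per-document rate)."""
    return [sample_document(channel_sample, key, i, b) for i, b in enumerate(message_bits)]

def extract(key, documents):
    """The receiver recomputes the keyed-hash parity of each received document."""
    return [hmac.new(key, i.to_bytes(8, "big") + doc, hashlib.sha256).digest()[0] & 1
            for i, doc in enumerate(documents)]

if __name__ == "__main__":
    channel = lambda: os.urandom(16)   # toy channel: uniformly random 16-byte documents
    key = os.urandom(32)
    msg = [1, 0, 1, 1, 0]
    print(extract(key, embed(channel, key, msg)))  # equals msg with high probability
```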
Are PCPs Inherent in Efficient Arguments?
Starting with Kilian (STOC ’92), several works have shown how to use probabilistically checkable proofs (PCPs) and cryptographic primitives such as collision-resistant hashing to construct very efficient argument systems (a.k.a. computationally sound proofs), for example with polylogarithmic communication complexity. Ishai et al. (CCC ’07) raised the question of whether PCPs are inherent in efficient arguments, and to what extent. We give evidence that they are, by showing how to convert any argument system whose soundness is reducible to the security of some cryptographic primitive into a PCP system whose efficiency is related to that of the argument system and the reduction (under certain complexity assumptions).
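The following is a minimal sketch of the forward direction the abstract alludes to (Kilian-style arguments from PCPs plus collision-resistant hashing), not of the paper's converse: the prover Merkle-commits to a PCP string with a collision-resistant hash and later opens only the positions the verifier queries. The toy pcp string and the single query are stand-ins for a real PCP and the verifier's random queries.

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_tree(leaves):
    """Build a Merkle tree over the symbols of a (toy) PCP string.
    Returns the list of levels; levels[0] = hashed leaves, levels[-1] = [root]."""
    level = [H(bytes([s])) for s in leaves]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                # duplicate the last node on odd levels
            level = level + [level[-1]]
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def open_position(levels, index):
    """Authentication path for one queried PCP position."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append(level[index ^ 1])     # sibling node
        index //= 2
    return path

def verify_opening(root, index, symbol, path):
    """Check that `symbol` sits at `index` under the committed root."""
    node = H(bytes([symbol]))
    for sibling in path:
        node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
        index //= 2
    return node == root

if __name__ == "__main__":
    pcp = list(b"toy PCP proof string")   # stand-in for a real PCP string
    levels = merkle_tree(pcp)
    root = levels[-1][0]                  # the short commitment sent to the verifier
    q = 7                                 # a verifier query (normally chosen at random)
    print(verify_opening(root, q, pcp[q], open_position(levels, q)))  # True
```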
Simulating Auxiliary Inputs, Revisited
For any pair $(X, Z)$ of correlated random variables we can think of $Z$ as a
randomized function of $X$. Provided that $Z$ is short, one can make this
function computationally efficient by allowing it to be only approximately
correct. In folklore this problem is known as \emph{simulating auxiliary
inputs}. This idea of simulating auxiliary information turns out to be a
powerful tool in computer science, finding applications in complexity theory,
cryptography, pseudorandomness and zero-knowledge. In this paper we revisit
this problem, achieving the following results:
(a) We discuss and compare the efficiency of known results, finding the flaw in the best known bound claimed in the TCC'14 paper "How to Fake Auxiliary Inputs".
(b) We present a novel boosting algorithm for constructing the simulator. Our technique essentially fixes the flaw. This boosting proof is of independent interest, as it shows how to handle "negative mass" issues when constructing probability measures in descent algorithms.
(c) Our bounds are much better than those known so far. To make the simulator $(s, \epsilon)$-indistinguishable, the time/circuit-size complexity we require improves multiplicatively on previous bounds. In particular, with our technique we (finally) get meaningful provable security for the EUROCRYPT'09 leakage-resilient stream cipher instantiated with a standard 256-bit block cipher such as AES256.
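For reference, this is the standard formulation of the simulation problem the abstract refers to, in our own paraphrase; the paper's concrete complexity bounds are its contribution and are not restated here.

```latex
% Standard formulation of the auxiliary-input simulation problem (our paraphrase).
Let $(X, Z)$ be jointly distributed random variables with $Z \in \{0,1\}^{\ell}$.
A randomized function $h$ is an \emph{$(s, \epsilon)$-simulator} for $Z$ given $X$ if,
for every distinguisher $D$ of size at most $s$,
\[
  \bigl|\Pr[D(X, Z) = 1] - \Pr[D(X, h(X)) = 1]\bigr| \le \epsilon .
\]
The goal is to construct such an $h$ whose circuit size is as small as possible
as a function of $s$, $\epsilon$, and $\ell$.
```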
Adversarially Robust Property-Preserving Hash Functions
Property-preserving hashing is a method of compressing a large input x into a short hash h(x) in such a way that given h(x) and h(y), one can compute a property P(x, y) of the original inputs. The idea of property-preserving hash functions underlies sketching, compressed sensing and locality-sensitive hashing.
Property-preserving hash functions are usually probabilistic: they use the random choice of a hash function from a family to achieve compression, and as a consequence, err on some inputs. Traditionally, the notion of correctness for these hash functions requires that for every two inputs x and y, the probability that h(x) and h(y) mislead us into a wrong prediction of P(x, y) is negligible. As observed in many recent works (incl. Mironov, Naor and Segev, STOC 2008; Hardt and Woodruff, STOC 2013; Naor and Yogev, CRYPTO 2015), such a correctness guarantee assumes that the adversary (who produces the offending inputs) has no information about the hash function, and is too weak in many scenarios.
We initiate the study of adversarial robustness for property-preserving hash functions, provide definitions, derive broad lower bounds due to a simple connection with communication complexity, and show the necessity of computational assumptions to construct such functions. Our main positive results are two candidate constructions of property-preserving hash functions (achieving different parameters) for the (promise) gap-Hamming property, which checks whether x and y are "too far" or "too close". Our first construction relies on generic collision-resistant hash functions, and our second on a variant of the syndrome decoding assumption on low-density parity-check codes.
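To make the gap-Hamming property concrete, here is a toy sketch of the probabilistic, non-robust baseline the abstract contrasts with: subsample coordinates using a public seed and estimate the Hamming distance from the subsample. This is correct only against inputs chosen independently of the seed and is not either of the paper's robust constructions; the seed handling, sample size m=256, and threshold are illustrative choices.

```python
import hashlib, random

def sample_hash(x, seed, m=256):
    """Toy property-preserving hash for gap-Hamming: keep the bits of x at m
    coordinates chosen by a public random seed (non-robust, oblivious-adversary setting)."""
    n = len(x)
    rng = random.Random(int.from_bytes(hashlib.sha256(seed).digest(), "big"))
    coords = [rng.randrange(n) for _ in range(m)]
    return [x[i] for i in coords]

def estimate_distance(hx, hy, n):
    """Estimate the Hamming distance of the original n-bit inputs from the two hashes."""
    disagreements = sum(a != b for a, b in zip(hx, hy))
    return disagreements * n / len(hx)

def gap_hamming(hx, hy, n, threshold):
    """Decide the (promise) gap-Hamming property: 'far' vs 'close' around the threshold."""
    return "far" if estimate_distance(hx, hy, n) > threshold else "close"

if __name__ == "__main__":
    n, seed = 10_000, b"public-seed"
    x = [random.getrandbits(1) for _ in range(n)]
    y = list(x)
    for i in random.sample(range(n), 4000):   # flip 4000 coordinates: clearly "far"
        y[i] ^= 1
    hx, hy = sample_hash(x, seed), sample_hash(y, seed)
    print(gap_hamming(hx, hy, n, threshold=n // 4))  # "far" with high probability
```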
On Generic Constructions of Circularly-Secure, Leakage-Resilient Public-Key Encryption Schemes
We propose generic constructions of public-key encryption schemes, satisfying key-dependent message (KDM) security for projections and different forms of key-leakage resilience, from CPA-secure private-key encryption schemes with two main abstract properties: (1) additive homomorphism with respect to both messages and randomness, and (2) reproducibility, providing a means for reusing encryption randomness across independent secret keys. More precisely, our construction transforms a private-key scheme with the stated properties (and one more mild condition) into a public-key one, providing:
- n-KDM-projection security, an extension of circular security, where the adversary may also ask for encryptions of negated secret-key bits;
- a (1-o(1)) resilience rate in the bounded-memory leakage model of Akavia et al. (TCC 2009); and
- auxiliary-input security against subexponentially hard functions.
We introduce homomorphic weak pseudorandom functions, a homomorphic version of the weak PRFs proposed by Naor and Reingold (FOCS ’95), and use them to realize our base encryption scheme. We obtain homomorphic weak PRFs under assumptions including subgroup indistinguishability (implied, in particular, by QR and DCR) and homomorphic hash-proof systems (HHPS). As corollaries of our results, we obtain (1) a projection-secure encryption scheme (as well as a scheme with a (1-o(1)) resilience rate) based solely on the HHPS assumption, and (2) a unifying approach explaining the results of Boneh et al. (CRYPTO ’08) and Brakerski and Goldwasser (CRYPTO ’10). Finally, by observing that Applebaum’s KDM amplification method (EUROCRYPT ’11) preserves both types of leakage resilience, we obtain schemes providing at the same time high leakage resilience and KDM security against any fixed polynomial-sized circuit family.
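The two abstract properties the transformation relies on can be stated roughly as follows (our paraphrase, with E denoting the base private-key encryption algorithm; the exact definitions are the paper's).

```latex
% Our paraphrase of the two abstract properties of the base private-key scheme.
\begin{itemize}
  \item \textbf{Additive homomorphism (in messages and randomness):}
        $E(k, m_1; r_1) + E(k, m_2; r_2) = E(k, m_1 + m_2; r_1 + r_2)$,
        where ciphertexts, messages, and randomness each live in an additive group.
  \item \textbf{Reproducibility:} given a ciphertext $E(k, m; r)$ (but not $r$ itself),
        together with another key $k'$ and message $m'$, one can efficiently compute
        $E(k', m'; r)$, i.e.\ reuse the same encryption randomness under an independent key.
\end{itemize}
```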
Asynchronous Probabilistic Couplings in Higher-Order Separation Logic
Probabilistic couplings are the foundation for many probabilistic relational
program logics and arise when relating random sampling statements across two
programs. In relational program logics, this manifests as dedicated coupling
rules that, e.g., say we may reason as if two sampling statements return the
same value. However, this approach fundamentally requires aligning or
"synchronizing" the sampling statements of the two programs which is not always
possible.
In this paper, we develop Clutch, a higher-order probabilistic relational
separation logic that addresses this issue by supporting asynchronous
probabilistic couplings. We use Clutch to develop a logical step-indexed
logical relation to reason about contextual refinement and equivalence of
higher-order programs written in a rich language with higher-order local state
and impredicative polymorphism. Finally, we demonstrate the usefulness of our
approach on a number of case studies.
All the results that appear in the paper have been formalized in the Coq
proof assistant using the Coquelicot library and the Iris separation logic
framework.
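As a small illustration of why couplings may need to be asynchronous, consider an eagerly sampled random bit versus a lazily sampled one: the two objects are observationally equivalent, but their sampling statements occur at different program points and cannot be aligned. The Python rendering below is only a stand-in for the higher-order language with local state treated in the paper.

```python
import random

class EagerCoin:
    """Samples its bit when constructed."""
    def __init__(self):
        self._bit = random.getrandbits(1)
    def flip(self):
        return self._bit

class LazyCoin:
    """Samples its bit only on first use; later calls return the cached value."""
    def __init__(self):
        self._bit = None
    def flip(self):
        if self._bit is None:
            self._bit = random.getrandbits(1)
        return self._bit

# Both classes expose the same interface and the same observable behaviour: a single
# uniformly random bit that stays fixed across repeated calls to flip(). Relating them
# requires coupling a sampling statement in one program with a sampling statement that
# executes at a different point in the other, which is what asynchronous couplings allow.
if __name__ == "__main__":
    e, l = EagerCoin(), LazyCoin()
    assert e.flip() == e.flip()
    assert l.flip() == l.flip()
```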