The ghost in the radiation: robust encodings of the black hole interior
We reconsider the black hole firewall puzzle, emphasizing that quantum error correction, computational complexity, and pseudorandomness are crucial concepts for understanding the black hole interior. We assume that the Hawking radiation emitted by an old black hole is pseudorandom, meaning that it cannot be distinguished from a perfectly thermal state by any efficient quantum computation acting on the radiation alone. We then infer the existence of a subspace of the radiation system which we interpret as an encoding of the black hole interior. This encoded interior is entangled with the late outgoing Hawking quanta emitted by the old black hole, and is inaccessible to computationally bounded observers who are outside the black hole. Specifically, efficient operations acting on the radiation, those with quantum computational complexity polynomial in the entropy of the remaining black hole, commute with a complete set of logical operators acting on the encoded interior, up to corrections which are exponentially small in the entropy. Thus, under our pseudorandomness assumption, the black hole interior is well protected from exterior observers as long as the remaining black hole is macroscopic. On the other hand, if the radiation is not pseudorandom, an exterior observer may be able to create a firewall by applying a polynomial-time quantum computation to the radiation.
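A compact way to state the protection claim above, in schematic notation of our own (not taken verbatim from the paper): for any operation $U$ on the radiation implementable by a quantum circuit of size $\mathrm{poly}(S_{\mathrm{BH}})$, and any logical operator $L$ from a complete set acting on the encoded interior,
\[
\bigl\lVert [U, L] \bigr\rVert \le 2^{-\Omega(S_{\mathrm{BH}})},
\]
where $S_{\mathrm{BH}}$ denotes the entropy of the remaining black hole; the encoded interior is thus effectively untouched by any computationally bounded exterior observer.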
On Pseudorandom Encodings
We initiate a study of pseudorandom encodings: efficiently computable and decodable encoding functions that map messages from a given distribution to a random-looking distribution. For instance, every distribution that can be perfectly and efficiently compressed admits such a pseudorandom encoding. Pseudorandom encodings are motivated by a variety of cryptographic applications, including password-authenticated key exchange, “honey encryption” and steganography. The main question we ask is whether every efficiently samplable distribution admits a pseudorandom encoding. Under different cryptographic assumptions, we obtain positive and negative answers for different flavors of pseudorandom encodings, and relate this question to problems in other areas of cryptography. In particular, by establishing a two-way relation between pseudorandom encoding schemes and efficient invertible sampling algorithms, we reveal a connection between adaptively secure multiparty computation for randomized functionalities and questions in the domain of steganography.
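To make the compression example concrete, here is a minimal Python sketch of our own (not from the paper): the uniform distribution over even-parity bit strings admits perfect compression, and the compressed output is exactly uniform, hence trivially random-looking.

import random

def sample_even_parity(n):
    # Sample uniformly from n-bit strings whose bits XOR to zero.
    bits = [random.randrange(2) for _ in range(n - 1)]
    bits.append(sum(bits) % 2)  # the last bit is determined by the rest
    return bits

def encode(bits):
    # Drop the redundant parity bit; the output is uniform on (n-1)-bit strings.
    return bits[:-1]

def decode(code):
    # Recompute the parity bit; decoding is exact and efficient.
    return code + [sum(code) % 2]

msg = sample_even_parity(8)
assert decode(encode(msg)) == msg  # perfect correctness
# encode(sample_even_parity(n)) is uniform: a perfect, and therefore
# pseudorandom, encoding of this particular distribution.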
On Foundations of Protecting Computations
Information technology systems have become indispensable for upholding our way of life, our economy, and our safety. Failure of these systems can have devastating effects. Consequently, securing these systems against malicious intentions deserves our utmost attention.
Cryptography provides the necessary foundations for that purpose. In particular, it provides a set of building blocks which make it possible to secure larger information systems. Furthermore, cryptography develops concepts and techniques towards realizing these building blocks. The protection of computations is one invaluable concept for cryptography which paves the way towards realizing a multitude of cryptographic tools. In this thesis, we contribute to this concept of protecting computations in several ways.
Protecting computations of probabilistic programs. An indistinguishability obfuscator (IO) compiles (deterministic) code such that it becomes provably unintelligible. This can be viewed as the ultimate way to protect (deterministic) computations. Thanks to very recent research, plausible candidate constructions of such obfuscators exist.
In certain settings, however, it is necessary to protect probabilistic computations. The only known construction of an obfuscator for probabilistic programs is due to Canetti, Lin, Tessaro, and Vaikuntanathan (TCC 2015) and requires an indistinguishability obfuscator which satisfies extreme security guarantees. We improve this construction and thereby reduce the requirements on the security of the underlying indistinguishability obfuscator.
(Agrikola, Couteau, and Hofheinz, PKC 2020)
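The derandomization idea underlying such constructions can be summarized in a few lines of Python (a sketch under our own naming; iO is assumed as a black box, and an ordinary PRF stands in for the puncturable PRF the actual construction requires):

import hmac, hashlib

def prf(key, x):
    # Stand-in PRF (HMAC-SHA256); the real construction needs a
    # puncturable PRF for the security proof to go through.
    return hmac.new(key, x, hashlib.sha256).digest()

def derandomize(prob_circuit, key):
    # Turn a probabilistic circuit C(x; r) into a deterministic one
    # by fixing the random tape to r = PRF_key(x).
    return lambda x: prob_circuit(x, prf(key, x))

def piO(prob_circuit, key, iO):
    # Obfuscate the derandomized circuit with an ordinary
    # (deterministic) indistinguishability obfuscator.
    return iO(derandomize(prob_circuit, key))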
Protecting computations in cryptographic groups. To facilitate the analysis of building blocks which are based on cryptographic groups, these groups are often overidealized such that computations in the group are protected from the outside. Such overidealizations make it possible to prove the security of building blocks which are sometimes beyond the reach of standard-model techniques. However, these overidealizations are subject to certain impossibility results. Recently, Fuchsbauer, Kiltz, and Loss (CRYPTO 2018) introduced the algebraic group model (AGM) as a relaxation which is closer to the standard model but in several aspects preserves the power of said overidealizations. However, their model still suffers from implausibilities. We develop a framework which makes it possible to transport several security proofs from the AGM into the standard model, thereby evading the above implausibility results, and instantiate this framework using an indistinguishability obfuscator.
(Agrikola, Hofheinz, and Kastner, EUROCRYPT 2020)
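For orientation, the defining constraint of the AGM (in standard notation, ours): whenever an algebraic adversary outputs a group element $Z$, it must also output exponents $(z_1, \dots, z_n)$ explaining $Z$ in terms of the group elements $g_1, \dots, g_n$ it has received so far,
\[
Z = \prod_{i=1}^{n} g_i^{z_i}.
\]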
Protecting computations using compression. Perfect compression algorithms have the property that the compressed distribution is truly random, leaving no room for any further compression. This property is invaluable for several cryptographic applications such as “honey encryption” or password-authenticated key exchange. However, perfect compression algorithms only exist for a very small number of distributions. We relax the notion of compression and rigorously study the resulting notion, which we call “pseudorandom encodings”. As a result, we identify various surprising connections between seemingly unrelated areas of cryptography. In particular, we derive novel results for adaptively secure multi-party computation, which allows for protecting computations in distributed settings. Furthermore, we instantiate the weakest version of pseudorandom encodings which suffices for adaptively secure multi-party computation using an indistinguishability obfuscator.
(Agrikola, Couteau, Ishai, Jarecki, and Sahai, TCC 2020)
Assisted Common Information: Further Results
We presented assisted common information as a generalization of Gács-Körner (GK) common information at ISIT 2010. The motivation for our formulation was to improve upper bounds on the efficiency of protocols for secure two-party sampling (which is a form of secure multi-party computation). Our upper bound was based on a monotonicity property of a rate region (called the assisted residual information region) associated with the assisted common information formulation. In this note we present further results. We explore the connection of assisted common information with the Gray-Wyner system. We show that the assisted residual information region and the Gray-Wyner region are connected by a simple relationship: the assisted residual information region is the increasing hull of the Gray-Wyner region under an affine map. Several known relationships between GK common information and the Gray-Wyner system fall out as consequences of this. Quantities which arise in other source coding contexts acquire new interpretations. In previous work we showed that assisted common information can be used to derive upper bounds on the rate at which a pair of parties can \emph{securely sample} correlated random variables, given correlated random variables from another distribution. Here we present an example where the bound derived using assisted common information is much better than previously known bounds, and in fact is tight. This example considers correlated random variables defined in terms of standard variants of oblivious transfer, and is interesting on its own as it answers a natural question about these cryptographic primitives.
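For reference, the standard formulation of GK common information (notation ours):
\[
C_{\mathrm{GK}}(X;Y) = \max_{W:\; H(W \mid X) = H(W \mid Y) = 0} H(W),
\]
the largest entropy of a random variable that can be computed exactly from $X$ alone and from $Y$ alone.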
On the Complexity of Breaking Pseudoentropy
Pseudoentropy has found a lot of important applications to cryptography and complexity theory.
In this paper we focus on a foundational problem that has not been investigated so far, namely:
by how much does pseudoentropy (the amount seen by computationally bounded attackers) differ from its information-theoretic counterpart (seen by unbounded observers), given certain limits on the attacker's computational power?
We provide the following answer for HILL pseudoentropy, which exhibits a \emph{threshold behavior}
around the size exponential in the entropy amount:
\begin{itemize}
\item If the attacker size ($s$) and advantage ($\epsilon$) satisfy $s \gg 2^{k}\epsilon^{2}$, where $k$ is the claimed amount of pseudoentropy, then the pseudoentropy boils down to the information-theoretic smooth entropy.
\item If $s \ll 2^{k}\epsilon^{2}$, then pseudoentropy could be arbitrarily bigger than the information-theoretic smooth entropy.
\end{itemize}
Besides answering the posed question, we show an elegant application of our result to complexity theory, namely that it implies the classical result on the existence of functions hard to approximate (due to Pippenger). In our approach we utilize non-constructive techniques: the duality of linear programming and the probabilistic method.
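For reference, the HILL notion used above (standard definition, notation ours): $X$ has HILL pseudoentropy at least $k$ against size-$s$ distinguishers with advantage $\epsilon$ if there exists a $Y$ of min-entropy at least $k$ that no such distinguisher can tell apart from $X$:
\[
H^{\mathrm{HILL}}_{s,\epsilon}(X) \ge k \iff \exists Y:\ H_{\infty}(Y) \ge k \ \text{and}\ \max_{|D| \le s} \bigl| \Pr[D(X)=1] - \Pr[D(Y)=1] \bigr| \le \epsilon.
\]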
Complexity, parallel computation and statistical physics
The intuition that a long history is required for the emergence of complexity
in natural systems is formalized using the notion of depth. The depth of a
system is defined in terms of the number of parallel computational steps needed
to simulate it. Depth provides an objective, irreducible measure of history
applicable to systems of the kind studied in statistical physics. It is argued
that physical complexity cannot occur in the absence of substantial depth and
that depth is a useful proxy for physical complexity. The ideas are illustrated
for a variety of systems in statistical physics.
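To make the definition of depth concrete, a toy Python sketch of our own (not from the paper): summing n numbers takes n-1 additions in total, yet only about log2(n) parallel steps, because disjoint pairs can be combined simultaneously; the depth is the number of rounds, not the total work.

def parallel_sum_depth(values):
    # Pairwise reduction: each round combines disjoint pairs, and all
    # pairs within a round could run in parallel, so the number of
    # rounds is the depth: ceil(log2(n)).
    depth = 0
    while len(values) > 1:
        values = [sum(values[i:i + 2]) for i in range(0, len(values), 2)]
        depth += 1  # one parallel step per round
    return values[0], depth

total, depth = parallel_sum_depth(list(range(16)))
print(total, depth)  # 120 4 -- fifteen additions, but only four parallel steps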