Zero-error communication over networks
Zero-error communication investigates communication without any error. By
defining channels without probabilities, results from Elias can be used to
completely characterize which channels can simulate which other channels. We
introduce the ambiguity of a channel, which completely characterizes whether
a channel can in principle simulate any other given channel. In the second
part we look at networks of players connected by channels, where some players
may be corrupted. We show how the ambiguity of a virtual channel connecting
two arbitrary players can be calculated. This means that we can exactly
specify what kind of zero-error communication is possible between two
players in any network of players connected by channels.
Comment: 10 pages, full version of the paper presented at the 2004 IEEE
International Symposium on Information Theory
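The channel model above can be sketched concretely. Below is a minimal illustration (our own toy construction, not the paper's): a channel "without probabilities" is just a map from each input to the set of outputs it may produce; two inputs are confusable when their output sets intersect, and zero-error communication uses pairwise non-confusable inputs.

```python
def confusable(channel, x1, x2):
    """Two inputs are confusable if some output can arise from both."""
    return bool(channel[x1] & channel[x2])

def zero_error_inputs(channel):
    """Greedily pick a set of pairwise non-confusable inputs.
    (Finding the largest such set is an independent-set problem,
    NP-hard in general.)"""
    chosen = []
    for x in channel:
        if all(not confusable(channel, x, y) for y in chosen):
            chosen.append(x)
    return chosen

# A toy noisy channel: inputs 'a' and 'b' overlap on output 1, so they
# cannot both be used; 'c' is distinguishable from everything.
channel = {'a': {0, 1}, 'b': {1, 2}, 'c': {3}}
codewords = zero_error_inputs(channel)  # ['a', 'c']
```

The "ambiguity" the paper introduces is a more refined invariant than this confusability relation; the sketch only shows the underlying probability-free channel model.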
On the Efficiency of Classical and Quantum Secure Function Evaluation
We provide bounds on the efficiency of secure one-sided output two-party
computation of arbitrary finite functions from trusted distributed randomness
in the statistical case. From these results we derive bounds on the efficiency
of protocols that use different variants of OT as a black-box. When applied to
implementations of OT, these bounds generalize most known results to the
statistical case. Our results hold in particular for transformations between a
finite number of primitives and for any error. In the second part we study the
efficiency of quantum protocols implementing OT. While most classical lower
bounds for perfectly secure reductions of OT to distributed randomness still
hold in the quantum setting, we present a statistically secure protocol that
violates these bounds by an arbitrarily large factor. We then prove a weaker
lower bound that does hold in the statistical quantum setting and implies that
even quantum protocols cannot extend OT. Finally, we present two lower bounds
for reductions of OT to commitments and a protocol based on string commitments
that is optimal with respect to both of these bounds.
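The "OT from trusted distributed randomness" setting that these bounds apply to can be made concrete with the standard Beaver-style precomputed OT (a textbook construction, sketched here for single bits; the variable names are ours):

```python
# 1-out-of-2 oblivious transfer precomputed from trusted randomness:
# a dealer hands out correlated randomness ahead of time, and the
# online phase is purely one-time-pad style XOR masking.

import secrets

def deal():
    """Trusted dealer: sender gets (r0, r1), receiver gets (c, r_c)."""
    r0, r1, c = secrets.randbits(1), secrets.randbits(1), secrets.randbits(1)
    return (r0, r1), (c, r1 if c else r0)

def ot(m0, m1, b):
    """Sender inputs bits m0, m1; receiver with choice bit b learns m_b."""
    (r0, r1), (c, rc) = deal()
    e = b ^ c                      # receiver -> sender: masks choice bit
    f0 = m0 ^ (r1 if e else r0)    # sender -> receiver: f_i = m_i XOR r_{i XOR e}
    f1 = m1 ^ (r0 if e else r1)
    return (f1 if b else f0) ^ rc  # receiver unmasks the chosen message

# Correctness over all inputs:
assert all(ot(m0, m1, b) == (m1 if b else m0)
           for m0 in (0, 1) for m1 in (0, 1) for b in (0, 1))
```

The efficiency question the abstract studies is precisely how much such distributed randomness (or how many OT instances) protocols of this kind must consume.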
Composable Security in the Bounded-Quantum-Storage Model
We present a simplified framework for proving sequential composability in the
quantum setting. In particular, we give a new, simulation-based, definition for
security in the bounded-quantum-storage model, and show that this definition
allows for sequential composition of protocols. Damgard et al. (FOCS '05,
CRYPTO '07) showed how to securely implement bit commitment and oblivious
transfer in the bounded-quantum-storage model, where the adversary is only
allowed to store a limited number of qubits. However, their security
definitions applied only to the stand-alone setting, and it was not clear
whether their protocols could be composed. Indeed, we first give a simple
attack that
shows that these protocols are not composable without a small refinement of the
model. Finally, we prove the security of their randomized oblivious transfer
protocol in our refined model. Secure implementations of oblivious transfer and
bit commitment then follow easily by a (classical) reduction to randomized
oblivious transfer.
Comment: 21 pages
Oblivious transfer and quantum channels
We show that oblivious transfer can be seen as the classical analogue to a
quantum channel in the same sense as non-local boxes are for maximally
entangled qubits.
Comment: Invited paper at the 2006 IEEE Information Theory Workshop (ITW 2006)
One-Shot Decoupling
If a quantum system A, which is initially correlated to another system, E,
undergoes an evolution separated from E, then the correlation to E generally
decreases. Here, we study the conditions under which the correlation disappears
(almost) completely, resulting in a decoupling of A from E. We give a criterion
for decoupling in terms of two smooth entropies, one quantifying the amount of
initial correlation between A and E, and the other characterizing the mapping
that describes the evolution of A. The criterion applies to arbitrary such
mappings in the general one-shot setting. Furthermore, the criterion is tight
for mappings that satisfy certain natural conditions. Decoupling has a number
of applications both in physics and information theory, e.g., as a building
block for quantum information processing protocols. As an example, we give a
one-shot state merging protocol and show that it is essentially optimal in
terms of its entanglement consumption/production.
Comment: v2: improved converse theorem, v3: published version
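A classical analogue of decoupling (our own toy illustration, not the paper's quantum statement) already shows the phenomenon: if A is a uniform string and E a perfectly correlated bit, then applying a random permutation to A and discarding most of its bits leaves the kept bits almost independent of E.

```python
# Classical toy analogue of decoupling: random evolution + discarding
# a large subsystem destroys correlation with E.

import random

n, k = 8, 2                       # 8-bit system A, keep only k output bits
N = 1 << n
random.seed(0)
perm = list(range(N))
random.shuffle(perm)              # the "random evolution" of A

def tv(joint, pf, pe):
    """Total variation distance between a joint distribution and the
    product of its marginals."""
    return 0.5 * sum(abs(joint.get((f, e), 0.0) - pf[f] * pe[e])
                     for f in pf for e in pe)

# Initially E = (A mod 2) is determined by A: maximal correlation.
joint0 = {(a, a & 1): 1 / N for a in range(N)}
pa = {a: 1 / N for a in range(N)}
pe = {0: 0.5, 1: 0.5}
initial = tv(joint0, pa, pe)      # = 0.5, maximal for a binary E

# After the evolution, keep only the top k bits of perm[a].
joint1 = {}
for a in range(N):
    key = (perm[a] >> (n - k), a & 1)
    joint1[key] = joint1.get(key, 0.0) + 1 / N
pf = {f: sum(joint1.get((f, e), 0.0) for e in pe) for f in range(1 << k)}
final = tv(joint1, pf, pe)

assert final < 0.2 < initial      # the kept bits are nearly decoupled from E
```

In the quantum setting the same effect is governed by the smooth-entropy criterion the abstract describes, with the discarded subsystem playing the role of the bits dropped above.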
Information-Theoretic Conditions for Two-Party Secure Function Evaluation
The standard security definition of unconditionally secure function evaluation, which is based on the ideal/real model paradigm, has the disadvantage of being overly complicated to work with in practice. On the other hand, simpler ad hoc definitions tailored to special scenarios have often been flawed. Motivated by this unsatisfactory situation, we give an information-theoretic security definition of secure function evaluation which is very simple yet provably equivalent to the standard, simulation-based definitions.
On the Efficiency of Bit Commitment Reductions
Two fundamental building blocks of secure two-party computation are oblivious transfer and bit commitment. While there exist unconditionally secure implementations of oblivious transfer from noisy correlations or channels that achieve constant rates, similar constructions are not known for bit commitment.
In this paper we show a lower bound on the number of instances of a given
resource, such as oblivious transfer or a noisy channel, that any protocol
implementing bit commitments with a given error needs. This implies in
particular that it is impossible to achieve a constant rate.
We then show that it is possible to circumvent the above lower bound by restricting the way in which the bit commitments can be opened. In the special case where only a constant number of instances can be opened, our protocol achieves a constant rate, which is optimal. Our protocol implements these restricted bit commitments from string commitments and is universally composable. The protocol provides significant speed-up over individual commitments in situations where restricted commitments are sufficient.
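The commit/open interface these reductions implement can be illustrated with a hash-based stand-in. Note the caveat: this sketch is only computationally secure, whereas the paper builds unconditionally secure commitments from noisy resources and string commitments, which this code does not provide.

```python
# Commit/open interface for bit commitment, with SHA-256 as a
# (computational) stand-in for the information-theoretic resource.

import hashlib, secrets

def commit(bit):
    """Return (commitment, opening). Hiding comes from the random
    opening string; binding rests on collision resistance of SHA-256."""
    opening = secrets.token_bytes(32)
    commitment = hashlib.sha256(opening + bytes([bit])).digest()
    return commitment, opening

def verify(commitment, opening, bit):
    """Verifier's check in the open phase."""
    return hashlib.sha256(opening + bytes([bit])).digest() == commitment

c, o = commit(1)
assert verify(c, o, 1) and not verify(c, o, 0)
```

The paper's restricted variant limits how many of the committed bits may later be opened, which is what allows it to beat the general lower bound.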
Extended Validity and Consistency in Byzantine Agreement
A broadcast protocol allows a sender to distribute a value among a set of
$n$ players such that it is guaranteed that all players receive the same
value (consistency), and if the sender is honest, then all players
receive the sender's value (validity). Classical broadcast protocols for
$n$ players provide security with respect to a fixed threshold $t<n/3$,
where both consistency and validity are guaranteed as long as at most $t$
players are corrupted, and no security at all is guaranteed as soon as
$t+1$ players are corrupted. Depending on the environment, validity or
consistency may be the more important property.
We generalize the notion of broadcast by introducing an additional
threshold $T\ge t$. In a {\em broadcast protocol with extended
validity}, both consistency and validity are achieved when no more than
$t$ players are corrupted, and validity is achieved even when up to $T$
players are corrupted. Similarly, we define {\em broadcast with extended
consistency}. We prove that broadcast with extended validity as well as
broadcast with extended consistency is achievable if and only if
$t+2T<n$ (or $t=T<n/3$).
For example, six players can achieve broadcast when at most one player is
corrupted (this result was known to be optimal), but they can even
achieve consistency (or validity) when two players are corrupted.
Furthermore, our protocols achieve {\em detection} in case of failure,
i.e., if at most $t$ players are corrupted then broadcast is achieved,
and if at most $T$ players are corrupted then broadcast is achieved or
every player learns that the protocol failed. This protocol can be
employed in the precomputation of a secure multi-party computation
protocol, resulting in {\em detectable multi-party computation}, where up
to $t$ corruptions can be tolerated and up to $T$ corruptions can
either be tolerated or detected in the precomputation, for any $t$ and
$T$ with $t+2T<n$.
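Assuming the achievability condition is $t+2T<n$ with $T\ge t$ (our reading of the abstract's stripped formulas), the six-player example can be checked mechanically:

```python
def extended_broadcast_achievable(n, t, T):
    """Feasibility check for broadcast with extended validity/consistency,
    assuming the condition t + 2T < n with T >= t: t is the full-security
    threshold, T the extended validity (or consistency) threshold."""
    return t <= T and t + 2 * T < n

# Six players: full broadcast tolerates t = 1 corruption, and validity
# (or consistency) alone extends to T = 2 but not to T = 3.
assert extended_broadcast_achievable(6, 1, 2)
assert not extended_broadcast_achievable(6, 1, 3)
```

With $T=t$ the condition reduces to the classical bound $3t<n$ for ordinary broadcast.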
Spatially resolved spectroscopy of monolayer graphene on SiO2
We have carried out scanning tunneling spectroscopy measurements on
exfoliated monolayer graphene on SiO2 to probe the correlation between its
electronic and structural properties. Maps of the local density of states are
characterized by electron and hole puddles that arise due to long-range
intravalley scattering from intrinsic ripples in graphene and random charged
impurities. At low energy, we observe short-range intervalley scattering which
we attribute to lattice defects. Our results demonstrate that the electronic
properties of graphene are influenced by intrinsic ripples, defects and the
underlying SiO2 substrate.
Comment: 6 pages, 7 figures, extended version