Optimal Data Authentication from Directed Transitive Signatures
An authenticated dictionary of size N is said to be optimal when an update operation or proof computation requires at most O(log N) accesses to the data structure, and the size of a proof is O(1) with respect to N.
In this note we show that an optimal authenticated dictionary (OAD) can be built using transitive signatures for directed graphs (DTS). As the existence of both DTS and OAD is still open, our result can be interpreted as follows: if optimal authenticated dictionaries do not exist, then transitive signatures for directed graphs do not exist either.
Range-Based Set Reconciliation and Authenticated Set Representations
Range-based set reconciliation is a simple approach to efficiently computing
the union of two sets over a network, based on recursively partitioning the
sets and comparing fingerprints of the partitions to probabilistically detect
whether a partition requires further work. Whereas prior presentations of this
approach focus on specific fingerprinting schemes for specific use-cases, we
give a more generic description and analysis in the broader context of set
reconciliation. Precisely capturing the design space for fingerprinting schemes
allows us to survey for cryptographically secure schemes. Furthermore, we
reduce the time complexity of local computations by a logarithmic factor
compared to previous publications. In investigating secure associative hash
functions, we open up a new class of tree-based authenticated data structures
which need not prescribe a deterministic balancing scheme.
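The recursive partition-and-fingerprint idea can be sketched as follows. This is a minimal local simulation, not the paper's construction: the XOR-of-hashes fingerprint, the range bounds, and the termination threshold are illustrative choices, and plain XOR combination is exactly the kind of scheme the abstract's security survey cautions about.

```python
import hashlib

def fp(items):
    """Fingerprint a set by XOR-ing item hashes (associative and fast,
    but NOT cryptographically secure -- illustration only)."""
    acc = 0
    for x in items:
        acc ^= int.from_bytes(hashlib.sha256(str(x).encode()).digest(), "big")
    return acc

def reconcile(a, b, lo, hi, threshold=2):
    """Find the symmetric difference of sets a and b within [lo, hi).
    Returns (items b has that a lacks, items a has that b lacks)."""
    sa = {x for x in a if lo <= x < hi}
    sb = {x for x in b if lo <= x < hi}
    if fp(sa) == fp(sb):                    # ranges (probably) agree: stop
        return set(), set()
    if hi - lo <= threshold or len(sa) <= 1 or len(sb) <= 1:
        return sb - sa, sa - sb             # small range: exchange directly
    mid = (lo + hi) // 2                    # otherwise split and recurse
    l = reconcile(a, b, lo, mid, threshold)
    r = reconcile(a, b, mid, hi, threshold)
    return l[0] | r[0], l[1] | r[1]

a = {1, 5, 9, 12}
b = {1, 5, 7, 12, 30}
need_a, need_b = reconcile(a, b, 0, 32)
print(need_a, need_b)   # a lacks {7, 30}; b lacks {9}
```

In a networked setting only the fingerprints and the items in mismatching small ranges cross the wire, which is where the communication savings come from.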
Still Wrong Use of Pairings in Cryptography
Several pairing-based cryptographic protocols have recently been proposed with a wide variety of novel applications, including ones in emerging technologies such as cloud computing, the Internet of Things (IoT), e-health systems, and wearable technologies. There has, however, been a wide range of incorrect uses of these primitives. The paper of Galbraith, Paterson, and Smart (2006) pointed out most of the issues related to the incorrect use of pairing-based cryptography. However, we noticed that some recently proposed applications still do not use these primitives correctly, which leads to unrealizable, insecure, or overly inefficient designs of pairing-based protocols. We observed that one reason is a lack of awareness of recent advances in solving the discrete logarithm problem in certain groups. The main purpose of this article is to give understandable, informative, and up-to-date criteria for the correct use of pairing-based cryptography. We deliberately avoid most of the technical details and instead emphasize the importance of the correct use of bilinear maps in realizing secure cryptographic protocols. We list a collection of recent papers with wrong security assumptions or realizability/efficiency issues. Finally, we give a compact and up-to-date recipe for the correct use of pairings.
Vector Commitments with Efficient Updates
Dynamic vector commitments that enable local updates of opening proofs have
applications ranging from verifiable databases with membership changes to
stateless clients on blockchains. In these applications, each user maintains a
relevant subset of the committed messages and the corresponding opening proofs
with the goal of ensuring a succinct global state. When the messages are
updated, users are given some global update information and update their
opening proofs to match the new vector commitment. We investigate the relation
between the size of the update information and the runtime complexity needed to
update an individual opening proof. Existing vector commitment schemes require
that either the information size or the runtime scale linearly in the number k
of updated state elements. We construct a vector commitment scheme that
asymptotically achieves both length and runtime that is sublinear in k. We
prove an information-theoretic lower bound on the relation between the update
information size and runtime complexity that shows the asymptotic optimality of
our scheme. While the construction is not yet competitive in practice with Verkle commitments, our approach may point the way towards more performant vector commitments.
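The update problem the abstract describes can be made concrete with a plain Merkle-tree vector commitment. This is not the paper's scheme: in a Merkle tree both the global update information and the per-proof update work are logarithmic in the vector size, whereas the paper studies the tradeoff in the number k of updated elements. The sketch below (power-of-two vector, SHA-256 hashing) only shows what "patching an opening proof from update information" means.

```python
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def build(leaves):
    """Build a Merkle tree over a power-of-two message vector; returns levels."""
    levels = [[H(m) for m in leaves]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([H(prev[i] + prev[i + 1]) for i in range(0, len(prev), 2)])
    return levels

def prove(levels, i):
    """Opening proof for index i: the sibling hash at each level."""
    proof = []
    for lvl in levels[:-1]:
        proof.append(lvl[i ^ 1])
        i >>= 1
    return proof

def verify(root, i, msg, proof):
    h = H(msg)
    for sib in proof:
        h = H(h + sib) if i % 2 == 0 else H(sib + h)
        i >>= 1
    return h == root

msgs = [b"a", b"b", b"c", b"d"]
lv = build(msgs)
root = lv[-1][0]
p2 = prove(lv, 2)
assert verify(root, 2, b"c", p2)

# Message 3 changes. The broadcast update information here is just the new
# leaf hash; the user holding a proof for index 2 patches one sibling.
msgs[3] = b"D"
new_root = build(msgs)[-1][0]
p2[0] = H(b"D")            # index 3 is the level-0 sibling of index 2
assert verify(new_root, 2, b"c", p2)
```

The paper's question is how small this update information can be made while keeping the per-proof patching work low when k positions change at once.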
Efficient KEA-Style Lattice-Based Authenticated Key Exchange
Lattice-based cryptographic primitives are believed to be resistant to attacks by quantum computers. In this work, we present a KEA-style authenticated key exchange protocol based on the ring learning with errors problem, whose security is proven in the BR model with weak perfect forward secrecy. Along with properties of KEA such as implicit key authentication and simplicity, our protocol also enjoys many advantages of lattice-based cryptography, namely asymptotic efficiency, conceptual simplicity, worst-case hardness assumptions, and resistance to attacks by quantum computers. Our lattice-based authenticated key exchange protocol is more efficient than the protocol of Zhang et al. (EUROCRYPT 2015), with a more concise structure, smaller key sizes, and lower bandwidth. Our protocol also achieves optimal online efficiency, and we further improve it with pre-computation.
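The core mechanism behind such protocols, noisy shares that agree only approximately, can be shown with a scalar toy. This is not the protocol above (real schemes work over polynomial rings and add authentication and a reconciliation step that turns approximate agreement into exact key bits); the modulus and noise bounds below are made-up demo parameters.

```python
import random

# Scalar toy of (R)LWE-style key agreement: each party publishes a*s + e
# for a small secret s and small error e, then multiplies the other's
# share by its own secret. Demo parameters, insecure by construction.
q = 12289
random.seed(7)
a = random.randrange(q)                  # public parameter

def keygen():
    s = random.randrange(-4, 5)          # small secret
    e = random.randrange(-4, 5)          # small error
    return s, (a * s + e) % q            # (secret, public share)

s_alice, p_alice = keygen()
s_bob, p_bob = keygen()

k_alice = (p_bob * s_alice) % q          # ~ a * s_alice * s_bob
k_bob = (p_alice * s_bob) % q            # ~ a * s_alice * s_bob

# The shares differ by e_bob*s_alice - e_alice*s_bob, bounded by 2*4*4 = 32.
diff = min((k_alice - k_bob) % q, (k_bob - k_alice) % q)
assert diff <= 32
print("approximate agreement, noise:", diff)
```

Reconciliation mechanisms exist precisely to close this small gap so both parties derive identical key material.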
An Improved Framework for Biometric Database Privacy
Security and privacy are major challenges in biometric systems. Biometrics are sensitive data that should be protected from any attacker, especially attackers targeting the confidentiality and integrity of biometric data. In this paper, an extensive review of different physiological biometric techniques is provided. A comparative analysis of the aforementioned biometrics, including their characteristics and properties, is conducted, together with a qualitative and quantitative evaluation of the most relevant physiological biometrics. Furthermore, we propose a new framework for biometric database privacy. Our approach is based on the use of the promising fully homomorphic encryption technology. As a proof of concept, we provide an initial implementation of our security module in the Java programming language.
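The idea of computing on encrypted templates can be illustrated with an additively homomorphic scheme. This is not the paper's module (which uses fully homomorphic encryption in Java); the sketch below is a toy Paillier cryptosystem in Python with tiny, insecure demo primes, showing only that ciphertexts can be combined so the server learns sums without decrypting individual values.

```python
import math, random

# Toy Paillier cryptosystem. The primes are far too small to be secure;
# real deployments use primes of at least 1024 bits each.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # precomputed decryption helper

def enc(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Additive homomorphism: multiplying ciphertexts adds plaintexts, so a
# server could aggregate encrypted template distances without seeing them.
a, b = 17, 25
assert dec((enc(a) * enc(b)) % n2) == a + b
print("homomorphic sum:", dec((enc(a) * enc(b)) % n2))
```

A biometric match score such as a squared Euclidean distance can be assembled from such additions on the server side, with only the final decision decrypted by the key holder.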
Incremental Composition in Distributional Semantics
Despite the incremental nature of Dynamic Syntax (DS), its semantic grounding remains that of predicate logic, itself grounded in set theory, and so is poorly suited to expressing the rampantly context-relative nature of word meaning and related phenomena such as the incremental judgements of similarity needed for modelling disambiguation. Here, we show how DS can be assigned a compositional distributional semantics which enables such judgements and makes it possible to incrementally disambiguate language constructs using vector space semantics. Building on a proposal in our previous work, we implement and evaluate our model on real data, showing that it outperforms a commonly used additive baseline. In conclusion, we argue that these results set the ground for an account of the non-determinism of lexical content, in which the nature of word meaning is its dependence on surrounding context for its construal.