Semantic Security and Indistinguishability in the Quantum World
At CRYPTO 2013, Boneh and Zhandry initiated the study of quantum-secure
encryption. They proposed the first indistinguishability definitions for the
quantum world, in which indistinguishability holds only for classical
messages, and they gave arguments for why a stronger notion might be hard to
achieve. In this work, we show that stronger notions are achievable, where the
indistinguishability holds for quantum superpositions of messages. We
investigate exhaustively the possibilities and subtle differences in defining
such a quantum indistinguishability notion for symmetric-key encryption
schemes. We justify our stronger definition by showing its equivalence to novel
quantum semantic-security notions that we introduce. Furthermore, we show that
our new security definitions cannot be achieved by a large class of ciphers --
those which quasi-preserve the message length. On the other hand, we
provide a secure construction based on quantum-resistant pseudorandom
permutations; this construction can be used as a generic transformation for
turning a large class of encryption schemes into quantum indistinguishable and
hence quantum semantically secure ones. Moreover, our construction is the first
completely classical encryption scheme shown to be secure against an even
stronger notion of indistinguishability, which was previously known to be
achievable only by using quantum messages and arbitrary quantum encryption
circuits.

Comment: 37 pages, 2 figures
Homomorphic Encryption for Speaker Recognition: Protection of Biometric Templates and Vendor Model Parameters
Data privacy is crucial when dealing with biometric data. Accounting for the
latest European data privacy regulation and payment service directive,
biometric template protection is essential for any commercial application.
Ensuring unlinkability across biometric service operators, irreversibility of
leaked encrypted templates, and renewability of, e.g., voice models following
the i-vector paradigm prepares biometric voice-based systems for the latest EU
data privacy legislation. Employing Paillier cryptosystems, Euclidean and
cosine comparators are known to meet these data privacy demands without loss
of discrimination or calibration performance. Bridging the gap from template
protection to speaker recognition, two architectures are proposed for the
two-covariance comparator, serving as a generative model in this study. The
first architecture preserves privacy of biometric data capture subjects. In the
second architecture, model parameters of the comparator are encrypted as well,
such that biometric service providers can supply the same comparison modules
employing different key pairs to multiple biometric service operators. An
experimental proof-of-concept and complexity analysis are carried out on data
from the 2013-2014 NIST i-vector machine learning challenge.
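The Paillier cryptosystem mentioned above is additively homomorphic, which is what lets a comparator evaluate an inner-product-style score on encrypted i-vectors. Below is a minimal Python sketch of that idea; the helper names and toy parameters are assumptions for illustration, not the paper's architecture:

```python
import math
import secrets

# Toy Paillier cryptosystem (hypothetical helper names; the primes are far too
# small for real use -- production systems use >= 2048-bit moduli).
def keygen():
    p, q = 104723, 104729              # toy primes
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)               # valid because we fix g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pk, m):
    (n,) = pk
    n2 = n * n
    while True:                        # pick r coprime to n
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    n, lam, mu = sk
    n2 = n * n
    ell = (pow(c, lam, n2) - 1) // n   # the function L(u) = (u - 1) / n
    return (ell * mu) % n

# Encrypted inner product: a client encrypts its feature vector x; the server,
# holding plaintext weights w, computes Enc(sum_i w_i * x_i) without decrypting.
def enc_inner_product(pk, enc_x, w):
    (n,) = pk
    n2 = n * n
    acc = encrypt(pk, 0)               # Enc(0)
    for c, wi in zip(enc_x, w):
        # multiplying ciphertexts adds plaintexts; exponentiation scales them
        acc = (acc * pow(c, wi, n2)) % n2
    return acc
```

The server never decrypts: decrypting `enc_inner_product(pk, [encrypt(pk, 3), encrypt(pk, 1), encrypt(pk, 4)], [2, 7, 1])` with the secret key yields 3·2 + 1·7 + 4·1 = 17.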
Cloud-based Quadratic Optimization with Partially Homomorphic Encryption
The development of large-scale distributed control systems has led to the
outsourcing of costly computations to cloud-computing platforms, as well as to
concerns about privacy of the collected sensitive data. This paper develops a
cloud-based protocol for a quadratic optimization problem involving multiple
parties, each holding information it seeks to keep private. The protocol is
based on the projected gradient ascent on the Lagrange dual problem and
exploits partially homomorphic encryption and secure multi-party computation
techniques. Using formal cryptographic definitions of indistinguishability, the
protocol is shown to achieve computational privacy, i.e., there is no
computationally efficient algorithm that any involved party can employ to
obtain private information beyond what can be inferred from the party's inputs
and outputs only. In order to reduce the communication complexity of the
proposed protocol, we introduce a variant that achieves this objective at the
expense of weaker privacy guarantees. We discuss in detail the computational
and communication complexity properties of both algorithms theoretically and
also through implementations. We conclude the paper with a discussion on
computational privacy and other notions of privacy, such as the non-unique
retrieval of the private information from the protocol outputs.
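The core iteration underneath the protocol, projected gradient ascent on the Lagrange dual, can be sketched in the clear before any encryption is layered on top. This is a plain-NumPy illustration with made-up problem data, not the paper's encrypted protocol:

```python
import numpy as np

# Projected gradient ascent on the Lagrange dual of
#   min_x 0.5 x^T Q x + c^T x   s.t.  A x <= b.
# Problem data below are illustrative, not from the paper.
def dual_gradient_ascent(Q, c, A, b, step=0.1, iters=2000):
    lam = np.zeros(A.shape[0])               # dual variables, one per constraint
    Q_inv = np.linalg.inv(Q)
    for _ in range(iters):
        x = -Q_inv @ (c + A.T @ lam)         # primal minimizer of the Lagrangian
        grad = A @ x - b                     # gradient of the dual function
        lam = np.maximum(0.0, lam + step * grad)  # ascent step + projection onto lam >= 0
    return x, lam

# Example: min 0.5 * ||x||^2  s.t.  x1 + x2 >= 1; the optimum is x = (0.5, 0.5).
Q = np.eye(2)
c = np.zeros(2)
A = np.array([[-1.0, -1.0]])                 # -x1 - x2 <= -1
b = np.array([-1.0])
x, lam = dual_gradient_ascent(Q, c, A, b)
```

The projection `np.maximum(0.0, ...)` keeps the multipliers feasible; in the encrypted protocol each of these affine updates is what the partially homomorphic operations evaluate.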
High-Precision Arithmetic in Homomorphic Encryption
In most RLWE-based homomorphic encryption schemes the native plaintext elements are polynomials in a ring Z_t[x]/(x^n + 1), where n is a power of 2 and t is an integer modulus. For performing integer or rational number arithmetic one typically uses an encoding scheme, which converts the inputs to polynomials, and allows the result of the homomorphic computation to be decoded to recover the result as an integer or rational number, respectively. The problem is that the modulus t often needs to be extremely large to prevent the plaintext polynomial coefficients from being reduced modulo t during the computation, which is a requirement for the decoding operation to work correctly. This results in larger noise growth, and prevents the evaluation of deep circuits, unless the encryption parameters are significantly increased.
We combine a trick of Hoffstein and Silverman, where the modulus t is replaced by a polynomial x - b, with the Fan-Vercauteren homomorphic encryption scheme. This yields a new scheme with a very convenient plaintext space Z_{b^n + 1}. We then show how rational numbers can be encoded as elements of this plaintext space, enabling homomorphic evaluation of deep circuits with high-precision rational number inputs. We perform a fair and detailed comparison to the Fan-Vercauteren scheme with the Non-Adjacent Form encoder, and find that the new scheme significantly outperforms this approach: in parameter settings where the new scheme can evaluate circuits of a given depth on large integer inputs, the Fan-Vercauteren scheme only reaches a much smaller depth. We conclude by discussing how known applications can benefit from the new scheme.
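The role of the evaluation point can be illustrated concretely: writing an integer in base b as polynomial coefficients, multiplying modulo x^n + 1, and evaluating the result at x = b agrees with integer multiplication modulo b^n + 1, because x^n + 1 itself evaluates to b^n + 1 at x = b. A toy Python sketch of this encoding arithmetic (an illustration of the identity, not the scheme itself):

```python
def encode(m, b, n):
    """Base-b digits of m, least significant first, padded to length n."""
    digits = []
    for _ in range(n):
        digits.append(m % b)
        m //= b
    return digits

def negacyclic_mul(f, g, n):
    """Multiply two coefficient vectors modulo x^n + 1 (so x^n == -1)."""
    out = [0] * n
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            k = i + j
            if k < n:
                out[k] += fi * gj
            else:
                out[k - n] -= fi * gj   # wrap-around picks up a sign flip
    return out

def decode(f, b, n):
    """Evaluate the polynomial at x = b, reduced modulo b**n + 1."""
    return sum(fi * b**i for i, fi in enumerate(f)) % (b**n + 1)

b, n = 4, 8                             # toy parameters: b^n + 1 = 65537
m1, m2 = 1234, 5678
prod = decode(negacyclic_mul(encode(m1, b, n), encode(m2, b, n), n), b, n)
# prod equals (m1 * m2) % (b**n + 1): reducing modulo x^n + 1 changes the
# polynomial by a multiple of x^n + 1, whose value at x = b is a multiple
# of b^n + 1, so the decoded result is unchanged.
```

In the actual scheme the coefficient growth that this encoding permits is what avoids the large modulus t and the noise blow-up described above.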
Quantum Noise Randomized Ciphers
We review the notion of a classical random cipher and its advantages. We
sharpen the usual description of random ciphers to a particular mathematical
characterization suggested by the salient feature responsible for their
increased security. We describe a concrete system known as AlphaEta and show
that it is equivalent to a random cipher in which the required randomization is
effected by coherent-state quantum noise. We describe the currently known
security features of AlphaEta and similar systems, including lower bounds on
the unicity distances against ciphertext-only and known-plaintext attacks. We
show how AlphaEta used in conjunction with any standard stream cipher such as
AES (Advanced Encryption Standard) provides an additional, qualitatively
different layer of security from physical encryption against known-plaintext
attacks on the key. We refute some claims in the literature that AlphaEta is
equivalent to a non-random stream cipher.

Comment: Accepted for publication in Phys. Rev. A; discussion augmented and
re-organized; Section 5 contains a detailed response to T. Nishioka, T.
Hasegawa, H. Ishizuka, K. Imafuku, H. Imai: Phys. Lett. A 327 (2004) 28-32
(quant-ph/0310168) and T. Nishioka, T. Hasegawa, H. Ishizuka, K. Imafuku, H.
Imai: Phys. Lett. A 346 (2005) 7
Secure data sharing and processing in heterogeneous clouds
The extensive cloud adoption among European public-sector players has empowered them to own and operate a range of cloud infrastructures. These deployments vary both in size and capabilities, as well as in the range of employed technologies and processes. The public sector, however, lacks the necessary technology to enable effective, interoperable and secure integration of a multitude of its computing clouds and services. In this work we focus on the federation of private clouds and the approaches that enable secure data sharing and processing among the collaborating infrastructures and services of public entities. We investigate the aspects of access control, data and security policy languages, as well as cryptographic approaches that enable fine-grained security and data processing in semi-trusted environments. We identify the main challenges and frame the future work that serves as an enabler of interoperability among heterogeneous infrastructures and services. Our goal is to enable both security and legal conformance, as well as to facilitate transparency, privacy and effectiveness of private cloud federations for the public sector's needs. © 2015 The Authors
Constructing Cryptographic Primitives with Strong Security and Advanced Functionality via Generic Constructions of Cryptographic Primitives
Recent years have witnessed active research on cryptographic primitives with complex functionality beyond simple encryption or authentication. A cryptographic primitive is expected to be proposed together with a formal model of its usage and a rigorous proof of security under that model. This approach has suffered from two drawbacks: (1) security models are defined in a very specific manner for each primitive, which leaves the relationships between these security models unclear; and (2) there is no comprehensive way to confirm that a formal security model really captures every possible scenario in practice. This research addresses these two drawbacks as follows. (1) Observing that one cryptographic primitive is often an essential building block in the construction of another, we identify an easy-to-understand approach for constructing various cryptographic primitives and systematize the relationships between their security requirements. (2) Consider two closely related cryptographic primitives A and B, where A has no known security requirement corresponding to some well-known security requirement (b) of B. We argue that this situation suggests that the missing requirement for A can capture a practical attack, which enables us to detect unknown threats that current security models have missed. Applying this methodology to group signatures and public-key encryption with non-interactive opening, we identify an overlooked security threat for group signatures. Furthermore, we apply methodology (2) to "revocable" group signatures and obtain an efficient construction of a new extension of public-key encryption that allows one to restrict the plaintexts that can be securely encrypted. The University of Electro-Communications
Quantum Key Distribution
This chapter describes the application of lasers, specifically diode lasers,
in the area of quantum key distribution (QKD). First, we motivate the
distribution of cryptographic keys based on quantum physical properties of
light, give a brief introduction to QKD assuming the reader has no or very
little knowledge about cryptography, and briefly present the state-of-the-art
of QKD. In the second half of the chapter we describe, as an example of a
real-world QKD system, the system deployed between the University of Calgary
and SAIT Polytechnic. We conclude the chapter with a brief discussion of
quantum networks and future steps.

Comment: 20 pages, 12 figures
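For readers new to QKD, the sifting step of BB84, the prototypical QKD protocol such an introduction covers, can be simulated classically in a few lines. The sketch below is an illustration with an assumed function name, not the Calgary deployment described above; it omits eavesdropping, channel noise, error estimation, reconciliation, and privacy amplification:

```python
import random

# Toy BB84 sifting simulation (no eavesdropper, no noise): Alice sends random
# bits in random bases; Bob measures in random bases; both keep only the
# positions where their bases happened to agree.
def bb84_sift(n_pulses, rng):
    alice_bits  = [rng.randint(0, 1) for _ in range(n_pulses)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_pulses)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_pulses)]
    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            bob_bits.append(bit)                # matching basis: deterministic outcome
        else:
            bob_bits.append(rng.randint(0, 1))  # wrong basis: uniformly random outcome
    keep = [i for i in range(n_pulses) if alice_bases[i] == bob_bases[i]]
    alice_key = [alice_bits[i] for i in keep]
    bob_key   = [bob_bits[i] for i in keep]
    return alice_key, bob_key

alice_key, bob_key = bb84_sift(1000, random.Random(1))
# With no eavesdropper the sifted keys agree exactly, and on average half
# of the pulses survive sifting.
```

In a real system, comparing a random sample of the sifted key estimates the error rate; a rate above threshold signals eavesdropping and aborts the exchange.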