279 research outputs found

    On Security Notions for Multi-Party Computation

    Most of the security notions in use today date back to the 1980s. Given the much better theoretical understanding gained since then, the question arises whether they can be developed further. One limiting factor consists of so-called impossibility results, which prove mathematically which security guarantees cannot be achieved. These set a limit, but their statements should not be overstated: each proof is valid only in its own setting and covers only that one specific security notion. Historically, however, the established security notions have evolved into something considerably weaker, which has opened a gap between what is used in practice and what is known to be impossible. In this dissertation we exhibit several of these gaps and study security notions related to secure multi-party computation (MPC) that lie between the established and the impossible.
    Mapping business models and legal regulations in MPC. Secure multi-party computation (MPC) allows parties to securely compute a function over private inputs in such a way that nothing about the other parties' inputs is revealed beyond the output of the function. Today, MPC incurs only a comparatively small overhead compared to computing directly. Yet even though data minimization is rewarded in practice, MPC is hardly used. We believe that one reason for this is that MPC ignores business models and legal regulations that require a certain amount of leakage of the data, while general-purpose MPC strives for near-perfect privacy. We present a new building block that allows businesses (represented by a central operator) to efficiently express the desired amount of leakage needed to sustain the business or to satisfy legal requirements, while users collect data anonymously and without being linkable across multiple interactions. We model the requirements in the Universal Composability (UC) framework. This guarantees that the security guarantees hold regardless of which protocols are executed in parallel. Despite these strong guarantees, the protocol is efficient enough to run on modern hardware, even when users collect the data on smartphones with limited computational power. (Fetzer, Keller, Maier, Raiber, Rupp, Schwerdt, PETS 2022)
    An instantiation of stronger commitments. A bit commitment scheme allows a sender to commit to a bit towards a receiver without revealing it (hiding), but in a way that does not allow the sender to later convince the receiver that the commitment was made to a different bit (binding); a toy illustration of these two properties is given at the end of this abstract. In the quantum world, commitments are strong enough to construct MPC, so there is an incentive to make commitments as secure as possible; however, impossibility results state that the two security notions, hiding and binding, cannot both hold unconditionally at the same time. As a consequence, modern bit commitment schemes relax one of the security properties, which then holds only computationally, i.e., based on complexity-theoretic assumptions. We present the first bit commitment protocol in the Quantum Random Oracle Model (QROM) that offers unconditional security for the receiver (binding) and long-term security for the sender (hiding) without requiring additional hardware. Our result is based on a new assumption about the hardness of storing quantum states over a long period of time. Long-term security models the technological progress of the adversary: transcripts that cannot be broken efficiently today may become easy to extract in the future, once faster machines are available. We prove the security of the commitment protocol in the QROM under the above assumption and show that an instantiation in the standard model admits a new attack on the long-term hiding property. (Döttling, Koch, Maier, Mechler, Müller, Müller-Quade, Tiepelt, IN SUBMISSION)
    Undetectable multi-party computation. Covert MPC is an extension of MPC that hides not only the inputs but the very existence of the computation. Participants learn the output only if all other parties ran the protocol and the output is favourable for all parties. Otherwise the participants learn nothing, not even which other parties attempted to take part in the computation. A single non-participant can therefore unintentionally abort the entire computation. This raises the question: can N participants run a computation while K > N parties are present, such that the output depends only on the inputs of the N participants, while the identities of the participants are hidden among the present parties? This should hold in particular when the remaining parties do not even know that a computation is taking place. We connect this question to the theoretical feasibility of anonymous whistleblowing, in which a single party tries to leak a message without revealing its own identity and without requiring the other parties to behave in any particular way. Unfortunately, we show that no primitive can achieve both correctness and anonymity with overwhelming probability in the asymptotic setting, even under very strong assumptions. However, we construct a heuristic instantiation in the fine-grained setting with overwhelming correctness and any desired target anonymity. Our results provide strong foundations for the study of anonymous message transfer over authenticated channels, a fascinating goal that we believe to be of fundamental interest. (Agrikola, Couteau, Maier, TCC 2022)
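    As a point of reference for the hiding and binding terminology used above, the following is a minimal sketch of a classical hash-based bit commitment in Python. It is purely illustrative and is not the QROM protocol of the abstract: both properties here are only computational, with hiding argued heuristically by modelling SHA-256 as a random oracle and binding resting on its collision resistance.

```python
import os
import hashlib

def commit(bit: int) -> tuple[bytes, bytes]:
    """Commit to a bit: returns (commitment, opening)."""
    r = os.urandom(32)                           # random blinding value
    opening = bytes([bit]) + r
    commitment = hashlib.sha256(opening).digest()
    return commitment, opening

def verify(commitment: bytes, opening: bytes) -> int | None:
    """Check an opening against a commitment; return the bit, or None if invalid."""
    if hashlib.sha256(opening).digest() != commitment:
        return None
    return opening[0]

# Usage: the sender publishes `com` now and reveals `opening` later.
com, opening = commit(1)
assert verify(com, opening) == 1
```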

    Classical and quantum sublinear algorithms

    This thesis investigates the capabilities of classical and quantum sublinear algorithms through the lens of complexity theory. The formal classification of problems between “tractable” (by constructing efficient algorithms that solve them) and “intractable” (by proving no efficient algorithm can) is among the most fruitful lines of work in theoretical computer science, which includes, amongst an abundance of fundamental results and open problems, the notorious P vs. NP question. This particular incarnation of the decision-versus-verification question stems from a choice of computational model: polynomial-time Turing machines. It is far from the only model worthy of investigation, however; indeed, measuring time up to polynomial factors is often too “coarse” for practical applications. We focus on quantum computation, a more complete model of physically realisable computation where quantum mechanical phenomena (such as interference and entanglement) may be used as computational resources; and sublinear algorithms, a formalisation of ultra-fast computation where merely reading or storing the entire input is impractical, e.g., when processing massive datasets such as social networks or large databases. We begin our investigation by studying structural properties of local algorithms, a large class of sublinear algorithms that includes property testers and is characterised by the inability to even see most of the input. We prove that, in this setting, queries – the main complexity measure – can be replaced with random samples. Applying this transformation yields, among other results, the state-of-the-art query lower bound for relaxed local decoders. Focusing our attention on property testers, we begin to chart the complexity-theoretic landscape arising from the classical vs. quantum and decision vs. verification questions in testing. We show that quantum hardware and communication with a powerful but untrusted prover are “orthogonal” resources, so that one cannot be substituted for the other. This implies all of the possible separations among the analogues of QMA, MA and BQP in the property-testing setting. We conclude with a study of zero-knowledge for (classical) streaming algorithms, which receive one-pass access to the entirety of their input but only have sublinear space. Inspired by cryptographic tools, we construct commitment protocols that are unconditionally secure in the streaming model and can be leveraged to obtain zero-knowledge streaming interactive proofs – and, in particular, show that zero-knowledge is achievable in this model.
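    To give a concrete feel for the sublinear regime described above, here is a minimal sketch of a sample-based property tester (Python; an illustration only, not taken from the thesis). It distinguishes the all-zeros n-bit string from strings that are eps-far from it, i.e., that differ in at least eps*n positions, while reading only O(1/eps) uniformly random positions of the input.

```python
import math
import random

def test_all_zeros(get_bit, n, eps=0.1, fail_prob=0.01):
    """Sample-based tester: accept if the n-bit input is all zeros; reject with
    probability >= 1 - fail_prob if it differs from the all-zeros string in at
    least eps * n positions. Reads only O(1/eps) bits of the input."""
    samples = math.ceil(math.log(1 / fail_prob) / eps)
    for _ in range(samples):
        i = random.randrange(n)              # uniformly random position
        if get_bit(i) == 1:
            return False                     # found a witness, reject
    return True                              # consistent with all zeros

# Usage with a huge implicit input; the second input is 1 on 20% of positions.
n = 10**9
assert test_all_zeros(lambda i: 0, n) is True
print(test_all_zeros(lambda i: 1 if i % 5 == 0 else 0, n, eps=0.1))
```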

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume.

    From Information Theory Puzzles in Deletion Channels to Deniability in Quantum Cryptography

    Research questions, originally rooted in quantum key exchange (QKE), have branched off into independent lines of inquiry ranging from information theory to fundamental physics. In a similar vein, the first part of this thesis is dedicated to information theory problems in deletion channels that arose in the context of QKE. From the output produced by a memoryless deletion channel with a uniformly random input of known length n, one obtains a posterior distribution on the channel input. The difference between the Shannon entropy of this distribution and that of the uniform prior measures the amount of information about the channel input which is conveyed by the output of length m. We first conjecture on the basis of experimental data that the entropy of the posterior is minimized by the constant strings 000..., 111... and maximized by the alternating strings 0101..., 1010.... Among other things, we derive analytic expressions for minimal entropy and propose alternative approaches for tackling the entropy extremization problem. We address a series of closely related combinatorial problems involving binary (sub/super)-sequences and prove the original minimal entropy conjecture for the special cases of single and double deletions using clustering techniques and a run-length encoding of strings. The entropy analysis culminates in a fundamental characterization of the extremal entropic cases in terms of the distribution of embeddings. We confirm the minimization conjecture in the asymptotic limit using results from hidden word statistics by showing how the analytic-combinatorial methods of Flajolet, Szpankowski and Vallée, relying on generating functions, can be applied to resolve the case of fixed output length and n → ∞. In the second part, we revisit the notion of deniability in QKE, a topic that remains largely unexplored. In a work by Donald Beaver it is argued that QKE protocols are not necessarily deniable due to an eavesdropping attack that limits key equivocation. We provide more insight into the nature of this attack and discuss how it extends to other prepare-and-measure QKE schemes such as QKE obtained from uncloneable encryption. We adopt the framework for quantum authenticated key exchange developed by Mosca et al. and extend it to introduce the notion of coercer-deniable QKE, formalized in terms of the indistinguishability of real and fake coercer views. We also elaborate on the differences between our model and the standard simulation-based definition of deniable key exchange in the classical setting. We establish a connection between the concept of covert communication and deniability by applying results from a work by Arrazola and Scarani on obtaining covert quantum communication and covert QKE to propose a simple construction for coercer-deniable QKE. We prove the deniability of this scheme via a reduction to the security of covert QKE. We relate deniability to fundamental concepts in quantum information theory and suggest a generic approach based on entanglement distillation for achieving information-theoretic deniability, followed by an analysis of other closely related results such as the relation between the impossibility of unconditionally secure quantum bit commitment and deniability. Finally, we present an efficient coercion-resistant and quantum-secure voting scheme, based on fully homomorphic encryption (FHE) and recent advances in various FHE primitives such as hashing, zero-knowledge proofs of correct decryption, verifiable shuffles and threshold FHE
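    For toy parameters, the posterior described above can be computed by brute force. The sketch below (Python; an illustration of the stated setup, not code from the thesis) uses the fact that, once the output length is fixed, the posterior weight of a candidate input x is proportional to the number of embeddings of the output y in x as a subsequence, and evaluates the entropy H(X | Y = y) so that the conjectured extremal behaviour of constant versus alternating outputs can be checked on small instances.

```python
import itertools
import math

def subseq_count(x, y):
    """Number of ways y occurs in x as a subsequence (number of embeddings)."""
    # dp[j] = number of embeddings of y[:j] in the prefix of x processed so far
    dp = [1] + [0] * len(y)
    for xc in x:
        for j in range(len(y), 0, -1):
            if y[j - 1] == xc:
                dp[j] += dp[j - 1]
    return dp[len(y)]

def posterior_entropy(y, n):
    """H(X | Y = y) in bits for a uniformly random n-bit input sent through a
    memoryless deletion channel; P(x | y) is proportional to the number of
    embeddings of y in x, since the d^(n-m)(1-d)^m factor is common to all x."""
    weights = [subseq_count(x, y) for x in itertools.product("01", repeat=n)]
    total = sum(weights)
    return -sum(w / total * math.log2(w / total) for w in weights if w > 0)

n = 8
print(posterior_entropy("0000", n))   # constant output: conjectured minimum
print(posterior_entropy("0101", n))   # alternating output: conjectured maximum
```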

    More Efficient Two-Round Multi-Signature Scheme with Provably Secure Parameters

    In this paper, we propose the first two-round multi-signature scheme that can guarantee 128-bit concrete security under a standardized EC without using the Algebraic Group Model (AGM). To construct our scheme, we introduce a new technique to tailor a certain special homomorphic commitment scheme for use with the Katz-Wang DDH-based signature scheme. We prove that an EC with at least a 321-bit order is sufficient for our scheme to achieve the standard 128-bit security. This means that our scheme is easy to implement in practice, because we can use the NIST-standardized EC P-384 for 128-bit security. The signature size of our proposed scheme under P-384 is 1152 bits, which is the smallest among existing schemes that do not use the AGM. Our experiment on an ordinary machine shows that signing and verification can each be completed in about 65 ms with 100 signers. This shows that our scheme has sufficiently reasonable running time in practice.

    Theory and Practice of Cryptography and Network Security Protocols and Technologies

    In an age of explosive worldwide growth of electronic data storage and communications, effective protection of information has become a critical requirement. When used in coordination with other tools for ensuring information security, cryptography in all of its applications, including data confidentiality, data integrity, and user authentication, is a most powerful tool for protecting information. This book presents a collection of research work in the field of cryptography. It discusses some of the critical challenges that are being faced by the current computing world and also describes some mechanisms to defend against these challenges. It is a valuable source of knowledge for researchers, engineers, graduate and doctoral students working in the field of cryptography. It will also be useful for faculty members of graduate schools and universities

    A Study on Verifiable Computation for Approximate Arithmetic (근사 연산에 대한 계산 검증 연구)

    Doctoral dissertation, Graduate School of Seoul National University, College of Natural Sciences, Department of Mathematical Sciences, February 2020. Jung Hee Cheon. Verifiable Computing (VC) is a complexity-theoretic method to secure the integrity of computations. The need is increasing as more computations are outsourced to untrusted parties, e.g., cloud platforms. Existing techniques, however, have mainly focused on exact computations, not approximate arithmetic such as floating-point or fixed-point arithmetic. This makes it hard to apply them to certain types of computations (e.g., machine learning, data analysis, and scientific computation) that inherently require approximate arithmetic. In this thesis, we present an efficient interactive proof system for arithmetic circuits with rounding gates that can represent approximate arithmetic. The main idea is to represent the rounding gate as a small sub-circuit and to reuse the machinery of the protocol of Goldwasser, Kalai, and Rothblum (the GKR protocol) and its recent refinements; a sketch of the underlying sum-check machinery is given after the table of contents below. Specifically, we shift the algebraic structure from a field to a ring to better deal with the notion of “digits”, and generalize the original GKR protocol over a ring. Then, we represent the rounding operation by a low-degree polynomial over a ring, and develop a novel, optimal circuit construction of an arbitrary polynomial to transform the rounding polynomial into an optimal circuit representation. Moreover, we further optimize the proof generation cost for rounding by employing a Galois ring. We provide experimental results that show the efficiency of our system for approximate arithmetic. For example, our implementation performed two orders of magnitude better than the existing system for a nested 128 x 128 matrix multiplication of depth 12 on 16-bit fixed-point arithmetic.
    Table of contents:
    1 Introduction
    1.1 Verifiable Computing
    1.2 Verifiable Approximate Arithmetic
    1.2.1 Problem: Verification of Rounding Arithmetic
    1.2.2 Motivation: Verifiable Machine Learning (AI)
    1.3 List of Papers
    2 Preliminaries
    2.1 Interactive Proof and Argument
    2.2 Sum-Check Protocol
    2.3 The GKR Protocol
    2.4 Notation and Cost Model
    3 Related Work
    3.1 Interactive Proofs
    3.2 (Non-)Interactive Arguments
    4 Interactive Proof for Rounding Arithmetic
    4.1 Overview of Our Approach and Result
    4.2 Interactive Proof over a Ring
    4.2.1 Sum-Check Protocol over a Ring
    4.2.2 The GKR Protocol over a Ring
    4.3 Verifiable Rounding Operation
    4.3.1 Lowest-Digit-Removal Polynomial over Z_{p^e}
    4.3.2 Verification of Division-by-p Layer
    4.4 Delegation of Polynomial Evaluation in Optimal Cost
    4.4.1 Overview of Our Circuit Construction
    4.4.2 Our Circuit for Polynomial Evaluation
    4.4.3 Cost Analysis
    4.5 Cost Optimization
    4.5.1 Galois Ring over Z_{p^e} and a Sampling Set
    4.5.2 Optimization of Prover's Cost for Rounding Layers
    5 Experimental Results
    5.1 Experimental Setup
    5.2 Verifiable Rounding Operation
    5.2.1 Effectiveness of Optimization via Galois Ring
    5.2.2 Efficiency of Verifiable Rounding Operation
    5.3 Comparison to Thaler's Refinement of GKR Protocol
    5.4 Discussion
    6 Conclusions
    6.1 Towards Verifiable AI
    6.2 Verifiable Cryptographic Computation
    Abstract (in Korean)
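    Since the system in the abstract above reuses and generalizes the sum-check and GKR machinery from fields to rings such as Z_{p^e} and Galois rings, the following is a minimal sketch of the classical sum-check protocol over a prime field (Python). It illustrates only the underlying machinery, with a single function playing both the honest, brute-force prover and the verifier; it is not the ring-based protocol of the thesis.

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is over the field F_P

def bits(k, m):
    """The k low-order bits of m, most significant bit first."""
    return [(m >> (k - 1 - j)) & 1 for j in range(k)]

def interp_eval(points, r):
    """Evaluate at r the unique polynomial through the (x, y) pairs in points."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (r - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def sumcheck(g, v, deg):
    """Sum-check for the claim H = sum over x in {0,1}^v of g(x).
    One function plays both the honest brute-force prover and the verifier;
    returns (claimed_sum, accepted)."""
    claim = sum(g(*bits(v, m)) for m in range(2 ** v)) % P
    H, rs = claim, []
    for i in range(v):
        # Prover: the round-i univariate polynomial, given by deg+1 evaluations.
        poly = []
        for t in range(deg + 1):
            s = 0
            for m in range(2 ** (v - i - 1)):
                s = (s + g(*rs, t, *bits(v - i - 1, m))) % P
            poly.append((t, s))
        # Verifier: check consistency with the running claim, then challenge.
        if (poly[0][1] + poly[1][1]) % P != claim:
            return H, False
        r = random.randrange(P)
        claim = interp_eval(poly, r)
        rs.append(r)
    # Verifier: a single oracle query to g at the random point.
    return H, g(*rs) % P == claim

# Example: g(x1, x2, x3) = x1*x2 + 2*x3 is multilinear, so deg=2 is ample.
g = lambda x1, x2, x3: (x1 * x2 + 2 * x3) % P
print(sumcheck(g, 3, deg=2))
```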