2,334 research outputs found

    SoK: Privacy Preserving Machine Learning using Functional Encryption: Opportunities and Challenges

    Full text link
    With the advent of functional encryption, new possibilities for computation on encrypted data have arisen. Functional Encryption enables data owners to grant third-party access to perform specified computations without disclosing their inputs. Unlike Fully Homomorphic Encryption, it also provides computation results in plaintext. The ubiquity of machine learning has led to the collection of massive amounts of private data in the cloud computing environment. This raises potential privacy issues and the need for more private and secure computing solutions. Numerous efforts have been made in privacy-preserving machine learning (PPML) to address security and privacy concerns. There are approaches based on fully homomorphic encryption (FHE), secure multiparty computation (SMC), and, more recently, functional encryption (FE). However, FE-based PPML is still in its infancy and has not yet received much attention compared to FHE-based PPML approaches. In this paper, we provide a systematization of FE-based PPML works, summarizing the state of the art in the literature. We focus on inner-product-FE- and quadratic-FE-based machine learning models for PPML applications. We analyze the performance and usability of the available FE libraries and their applications to PPML. We also discuss potential directions for FE-based PPML approaches. To the best of our knowledge, this is the first work to systematize FE-based PPML approaches.

    Functional encryption based approaches for practical privacy-preserving machine learning

    Get PDF
    Machine learning (ML) is increasingly being used in a wide variety of application domains. However, deploying ML solutions poses a significant challenge because of growing privacy concerns and requirements imposed by privacy-related regulations. To tackle serious privacy concerns in ML-based applications, significant recent research efforts have focused on developing privacy-preserving ML (PPML) approaches by integrating into the ML pipeline existing anonymization mechanisms or emerging privacy-protection approaches such as differential privacy, secure computation, and other architectural frameworks. While promising, existing secure-computation-based approaches have significant computational-efficiency issues and hence are not practical. In this dissertation, we address several challenges related to PPML and propose practical secure-computation-based approaches to solve them. We consider both two-tier cloud-based and three-tier hybrid cloud-edge-based PPML architectures and address both emerging deep learning models and federated learning approaches. The proposed approaches enable us to outsource data or update a locally trained model in a privacy-preserving manner by employing computation over encrypted datasets or local models. Our proposed secure-computation solutions are based on functional encryption (FE) techniques. Evaluation shows that the proposed approaches are more efficient and practical than existing ones and provide strong privacy guarantees. We also address issues related to the trustworthiness of various entities within the proposed PPML infrastructures, including the cloud service providers and a third-party authority (TPA) that plays a critical role in the proposed FE-based PPML solutions. To ensure that such entities can be trusted, we propose a transparency and accountability framework using blockchain. We show that the proposed transparency framework is effective and guarantees security properties, and experimental evaluation shows that it is efficient.
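The federated-learning use case described above can be pictured with a functionality-only mock (no actual cryptography; every class and function name here is illustrative, not from the dissertation): clients encrypt their local model updates, and the aggregation server, holding a functional key for a weight vector issued by the TPA, learns only the weighted sum of the updates, never an individual client's contribution.

```python
# Functionality-only mock of FE-based federated aggregation (no real crypto).
# In a real system, Ciphertext would be an IPFE encryption and the TPA would
# derive sk_w from a master secret key; here we only model who learns what.
from dataclasses import dataclass

@dataclass(frozen=True)
class Ciphertext:
    values: tuple              # stands in for an FE encryption of one update

class MockTPA:                 # third-party authority: issues functional keys
    def functional_key(self, w):
        return tuple(w)

def client_encrypt(update):
    return Ciphertext(tuple(update))

def server_aggregate(sk_w, cts):
    # With inner-product FE, decrypting coordinate j yields exactly
    # sum_i w_i * update_i[j]; individual client updates stay hidden.
    n = len(cts[0].values)
    return [sum(w * ct.values[j] for w, ct in zip(sk_w, cts)) for j in range(n)]

tpa = MockTPA()
sk = tpa.functional_key([0.5, 0.5])   # equal-weight average over two clients
cts = [client_encrypt([1.0, 3.0]), client_encrypt([3.0, 5.0])]
print(server_aggregate(sk, cts))      # [2.0, 4.0]
```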

    Privacy-Preserving Credit Scoring via Functional Encryption

    Get PDF
    The majority of financial organizations managing confidential data are aware of security threats and leverage widely accepted solutions (e.g., storage encryption, transport-level encryption, intrusion detection systems) to prevent or detect attacks. Yet these hardening measures do little against even worse threats posed to data-in-use. Solutions such as Homomorphic Encryption (HE) and hardware-assisted Trusted Execution Environments (TEE) are nowadays among the preferred approaches for mitigating this type of threat. However, given the high performance overhead of HE, financial institutions, whose processing-rate requirements are stringent, are more oriented towards TEE-based solutions. The X-Margin Inc. company, for example, offers secure financial computations by combining the Intel SGX TEE technology with HE-based Zero-Knowledge Proofs, which shield customers' data-in-use even against malicious insiders, i.e., users having privileged access to the system. Although such a solution offers strong security guarantees, it is constrained by having to trust Intel and by the availability of the SGX hardware extension. In this paper, we evaluate a new frontier for X-Margin, i.e., performing privacy-preserving credit-risk scoring via an emerging cryptographic scheme: Functional Encryption (FE), which allows a user to learn only a function of the encrypted data. We describe how the X-Margin application can benefit from this innovative approach and, most importantly, evaluate its performance impact.
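A linear credit-scoring model maps naturally onto the inner product an FE scheme can evaluate. The sketch below runs on plaintext only, and the feature names and coefficients are invented for illustration; its point is the fixed-point scaling step that real FE-based scoring needs, since FE constructions compute over integers.

```python
# Illustrative linear credit score as the integer inner product an IPFE
# decryption would return. All features and weights are made-up values.
SCALE = 100                               # fixed-point scaling factor

features = [34, 2, 7, 1]                  # e.g. age, delinquencies, accounts, homeowner
weights  = [0.8, -15.0, 1.2, 10.0]        # model coefficients (illustrative)

int_weights = [round(w * SCALE) for w in weights]
raw = sum(f * w for f, w in zip(features, int_weights))  # what Dec(sk_w, Enc(x)) yields
score = raw / SCALE                                      # rescale to real-valued score
print(score)   # 15.6
```

Under FE, the scorer would hold a function key for `int_weights` and learn only `raw` from the encrypted feature vector, never the individual features.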

    A Privacy-Preserving Online Targeted Advertising System Using Functional Encryption

    Get PDF
    Master's thesis, Department of Computer Science and Engineering, College of Engineering, Seoul National University, February 2023. Advisor: Taekyoung Kwon. As interest in protecting user privacy surges, the online advertising industry, a multi-billion-dollar market, faces the same challenge. Currently, online ads are delivered through real-time bidding (RTB) and behavioral targeting: users are tracked across websites to infer their interests and preferences, which are then used when selecting ads to present to them. The user profile sent in an ad request contains data that infringes on user privacy and is delivered to various actors in the RTB ecosystem, not to mention the data stored by bidders to increase their performance and profitability. I propose a framework named FAdE that preserves user privacy while enabling behavioral targeting and supporting the current RTB ecosystem with minimal changes to its protocols and data structures. My design leverages a functional encryption (FE) scheme to preserve the user's privacy in behaviorally targeted advertising. Specifically, I introduce a trusted third party (TTP) that acts as the key generator in my FE scheme. The user profile originally used for behavioral targeting is now encrypted and cannot be decrypted by the participants of the RTB ecosystem. However, demand-side platforms (DSPs) can submit their functions to the TTP and receive function keys. Each function derives a metric, a user score, from the user profile that can be used in the DSP's bidding algorithm: decrypting the encrypted user profile with a function key yields the function's output on the user profile as input. As a result, the user's privacy is preserved within the RTB ecosystem, while DSPs can still submit their bids through behavioral targeting.
My evaluation showed that with a user-profile bit vector of length 2,000, it took less than 20 ms to decrypt the encrypted user profile and derive the user-score metric through the inner-product function. This is well below my criterion of 50 ms, which is based on the typical bidding timeframe (100-1,000 ms) used in the ad industry, and smaller than state-of-the-art privacy-preserving proposals using homomorphic encryption or multi-party computation. To demonstrate the potential for real-world deployment, I build a prototype implementation of my design that consists of a publisher's website, an ad exchange (ADX), DSPs, and the TTP.
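The user-score computation at the core of this design can be sketched as follows (plaintext arithmetic only; the weights, threshold, and seed are illustrative assumptions, not values from the thesis): the DSP's function key encodes a campaign weight vector, and decrypting the encrypted 2,000-entry profile bit vector under IPFE yields exactly this inner product and nothing else about the profile.

```python
# Sketch of the FAdE-style user score a DSP's function key would encode:
# a 2,000-entry interest-segment bit vector dotted with campaign weights.
import random

random.seed(7)
PROFILE_LEN = 2000
profile = [random.randint(0, 1) for _ in range(PROFILE_LEN)]  # browser-built bit vector
weights = [random.randint(0, 5) for _ in range(PROFILE_LEN)]  # one campaign's weights

# Under IPFE the DSP would obtain exactly this value by decrypting the
# encrypted profile with its function key, learning nothing else about it.
user_score = sum(p * w for p, w in zip(profile, weights))

BID_THRESHOLD = 2400                     # illustrative campaign cutoff
should_bid = user_score >= BID_THRESHOLD
print(user_score, should_bid)
```

The bidding algorithm then uses only `user_score`, which is why the reported sub-20 ms decryption fits comfortably inside a 100-1,000 ms bidding window.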

    Conditionals in Homomorphic Encryption and Machine Learning Applications

    Get PDF
    Homomorphic encryption aims at allowing computations on encrypted data without any decryption other than that of the final result. This could provide an elegant solution to the issue of privacy preservation in data-based applications, such as those using machine learning, but several open issues hamper this plan. In this work we assess the possibility for homomorphic encryption to fully implement its program without relying on other techniques, such as secure multiparty computation (SMPC), which may be impossible in many use cases (for instance due to the high level of communication required). We proceed in two steps: i) on the basis of the structured program theorem (Böhm-Jacopini theorem) we identify the minimal set of operations homomorphic encryption must be able to perform to implement any algorithm; and ii) we analyse the possibility to solve -- and propose an implementation for -- the most fundamental issue emerging from our analysis, namely the implementation of conditionals (requiring comparison and selection/jump operations). We show how this issue clashes with the fundamental requirements of homomorphic encryption and could represent a drawback for its use as a complete solution for privacy preservation in data-based applications, in particular machine learning ones. Our approach to comparisons is novel and entirely embedded in homomorphic encryption, whereas previous studies relied on other techniques, such as SMPC, that demand a high level of communication among parties and decryption of intermediate results by data owners. Our protocol is also provably safe (sharing the same safety as the underlying homomorphic encryption scheme), unlike other techniques such as Order-Preserving/Order-Revealing Encryption (OPE/ORE).
    Comment: 14 pages, 1 figure; corrected typos, added introductory pedagogical section on polynomial approximation
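One standard way to realize comparisons inside HE, which permits only additions and multiplications, is a polynomial approximation of the sign function, for instance iterating the degree-3 polynomial f(x) = (3x - x^3)/2, which converges to sign(x) on (-1, 1) (the composite-polynomial idea studied by Cheon et al.). The sketch below runs on plaintext; an HE scheme would evaluate the same polynomial on ciphertexts. Whether this abstract's own construction takes exactly this route is not stated, so treat it as one illustrative approach.

```python
# HE-friendly comparison via polynomial approximation of sign(x) on (-1, 1):
# only additions and multiplications are used, so an HE scheme could evaluate
# the same circuit on ciphertexts. Convergence is quadratic near +-1.
def approx_sign(x, iterations=10):
    for _ in range(iterations):
        x = (3 * x - x ** 3) / 2       # degree-3 polynomial, HE-evaluable
    return x

def approx_greater(a, b):
    # comparison reduces to the sign of the difference;
    # inputs are assumed pre-scaled so that |a - b| < 1
    return approx_sign(a - b)

print(round(approx_sign(0.3), 4), round(approx_sign(-0.7), 4))   # 1.0 -1.0
```

The iteration count trades multiplicative depth (the scarce resource in levelled HE) against accuracy near zero, which is exactly the tension with HE's requirements that the paper discusses.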