17 research outputs found

    Semi-Trusted Mixer Based Privacy Preserving Distributed Data Mining for Resource Constrained Devices

    In this paper, a homomorphic privacy-preserving association rule mining algorithm is proposed that can be deployed on resource-constrained devices (RCD). Privacy-preserving exchange of itemset counts among distributed mining sites is a vital part of the association rule mining process. Existing cryptography-based privacy-preserving solutions consume a lot of computation due to the complex mathematical operations involved; privacy solutions with low computational cost are therefore essential for deploying mining applications on RCD. In this algorithm, a semi-trusted mixer is used to unify the itemset counts encrypted by all mining sites without revealing individual values. The proposed algorithm builds on a well-known communication-efficient association rule mining algorithm named count distribution (CD). Security proofs, along with performance analysis and comparison, show the acceptability and effectiveness of the proposed algorithm. Its efficient, straightforward privacy model and satisfactory performance make the protocol one of the first initiatives for deploying data mining applications on RCD.
    Comment: IEEE Publication format, International Journal of Computer Science and Information Security, IJCSIS, Vol. 8 No. 1, April 2010, USA. ISSN 1947-5500, http://sites.google.com/site/ijcsis
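
    The mixer's role — combining per-site counts so that only the global total is revealed — can be sketched with zero-sum blinding pads standing in for the paper's actual encryption scheme; the pad setup, the modulus, and all function names below are illustrative assumptions, not the protocol itself:

```python
import secrets

MOD = 2**32  # counts are assumed to fit well below this modulus

def make_zero_sum_pads(n_sites):
    """Random pads that sum to 0 mod MOD (agreed among the sites)."""
    pads = [secrets.randbelow(MOD) for _ in range(n_sites - 1)]
    pads.append(-sum(pads) % MOD)
    return pads

def blind(count, pad):
    """Each site masks its local itemset count before sending to the mixer."""
    return (count + pad) % MOD

def mixer_unify(blinded_counts):
    """The semi-trusted mixer sums the blinded values; the pads cancel,
    so only the global count is revealed, not any site's individual count."""
    return sum(blinded_counts) % MOD

counts = [13, 7, 22]                 # local support counts for one itemset
pads = make_zero_sum_pads(3)
msgs = [blind(c, p) for c, p in zip(counts, pads)]
total = mixer_unify(msgs)            # global count, individual values hidden
```

    A site's single message reveals nothing about its count, since the pad is uniformly random; only the mixer's sum over all messages is meaningful.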

    Cross-Domain Opinion Classification Using Sentiment-Sensitive Embeddings in Data Mining

    Firewall policies are growing rapidly in size with the explosive expansion of Internet-based services, so optimizing firewall policies is essential for improving network performance. Designing a scheme that lets neighbouring firewalls perform inter-firewall optimization without either party revealing its own policy is a critical concern. Prior work on inter-firewall optimization applies only within a single administrative domain, where policies can be shared freely over trusted networks; across domains, the goals of optimization and privacy preservation are in conflict. The approach considered here allows adjacent firewalls to detect and remove redundant rules in each other's policies through a privacy-preserving comparison, without disclosing the policies themselves, and it applies to both stateful and stateless firewalls.

    Privacy-preserving Cross-domain Routing Optimization -- A Cryptographic Approach

    Today's large-scale enterprise networks, data center networks, and wide area networks can be decomposed into multiple administrative or geographical domains. Domains may be owned by different administrative units or organizations, so protecting domain information is an important concern. Existing general-purpose Secure Multi-Party Computation (SMPC) methods that preserve privacy for domains are extremely slow for cross-domain routing problems. In this paper we present PYCRO, a cryptographic protocol specifically designed for privacy-preserving cross-domain routing optimization in Software Defined Networking (SDN) environments. PYCRO provides two fundamental routing functions, policy-compliant shortest path computing and bandwidth allocation, while ensuring strong protection for the private information of domains. We rigorously prove the privacy guarantee of our protocol. We have implemented a prototype system that runs PYCRO on servers in a campus network. Experimental results using real ISP network topologies show that PYCRO is very efficient in computation and communication costs.
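
    The first routing function — policy-compliant shortest paths — can be illustrated in the clear. The sketch below is ordinary Dijkstra with a hypothetical `allowed(u, v)` policy predicate on inter-domain links; it shows what PYCRO computes under encryption, not how it computes it privately:

```python
import heapq

def policy_shortest_path(graph, src, dst, allowed):
    """Cleartext analogue of PYCRO's shortest-path function: Dijkstra
    over inter-domain links, skipping edges that violate the policy
    predicate `allowed(u, v)`. graph: {node: [(neighbor, cost), ...]}."""
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:                       # reconstruct the found path
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return d, path[::-1]
        if d > dist.get(u, float("inf")):  # stale queue entry
            continue
        for v, w in graph.get(u, []):
            if not allowed(u, v):
                continue                   # policy-violating link: skipped
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return float("inf"), []

g = {"A": [("B", 1), ("C", 5)], "B": [("C", 1)], "C": []}
cost, path = policy_shortest_path(g, "A", "C", lambda u, v: True)
```

    In the protocol itself, edge costs and policies stay encrypted; forbidding the A-B link in this toy graph, for instance, forces the direct A-C path instead.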

    The misty crystal ball: Efficient concealment of privacy-sensitive attributes in predictive analytics

    Individuals are becoming increasingly concerned with privacy. This curtails their willingness to share sensitive attributes like age, gender or personal preferences; yet firms largely rely upon customer data in any type of predictive analytics. Hence, organizations are confronted with a dilemma in which they need to make a trade-off between a sparse use of data and the utility from better predictive analytics. This paper proposes a masking mechanism that obscures sensitive attributes while maintaining a large degree of predictive power. More precisely, we efficiently identify data partitions that are best suited for (i) shuffling, (ii) swapping and, as a form of randomization, (iii) perturbing attributes by conditional replacement. By operating on data partitions that are derived from a predictive algorithm, we achieve the objective of masking privacy-sensitive attributes with marginal downsides for predictive modeling. The resulting trade-off between masking and predictive utility is empirically evaluated in the context of customer churn where, for instance, a stratified shuffling of attribute values rarely impedes predictive accuracy by more than a percentage point. Our proposed framework entails direct managerial implications as a growing share of firms adopts predictive analytics and thus requires mechanisms that better adhere to user demands for information privacy.
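
    The stratified-shuffling idea can be sketched as follows. Here the partition is just a grouping key (the paper derives partitions from the predictive algorithm itself), and all names are illustrative:

```python
import random
from collections import defaultdict

def stratified_shuffle(records, partition_key, sensitive_key, seed=None):
    """Mask a sensitive attribute by shuffling its values *within* each
    partition. Per-partition value distributions are preserved, which is
    why predictive utility degrades only marginally, while the link from
    an individual record to its true value is broken."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for i, rec in enumerate(records):
        groups[rec[partition_key]].append(i)
    out = [dict(rec) for rec in records]
    for idxs in groups.values():
        vals = [records[i][sensitive_key] for i in idxs]
        rng.shuffle(vals)
        for i, v in zip(idxs, vals):
            out[i][sensitive_key] = v
    return out

customers = [
    {"segment": "high", "age": 63}, {"segment": "high", "age": 58},
    {"segment": "low", "age": 24},  {"segment": "low", "age": 31},
]
masked = stratified_shuffle(customers, "segment", "age", seed=7)
```

    After masking, each segment still contains exactly the same multiset of ages, so aggregate statistics per partition are untouched.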

    Secure equality testing protocols in the two-party setting

    Protocols for securely testing the equality of two encrypted integers are common building blocks for a number of proposals in the literature that aim for privacy preservation. Being used repeatedly in many cryptographic protocols, designing efficient equality testing protocols is important in terms of computation and communication overhead. In this work, we consider a scenario with two parties where party A has two integers encrypted using an additively homomorphic scheme and party B has the decryption key. Party A would like to obtain an encrypted bit that shows whether the integers are equal or not, but nothing more. We propose three secure equality testing protocols, which are more efficient in terms of communication, computation, or both compared to existing work. To support our claims, we present experimental results showing that our protocols achieve up to 99% improvement in computation compared to state-of-the-art protocols in a fair experimental set-up.
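
    The textbook building block behind such protocols can be sketched with a toy Paillier instance: A homomorphically forms an encryption of r·(x − y), so B's decryption is zero exactly when x = y. This is a generic construction, not one of the paper's three protocols, and the tiny primes make it insecure by design (in this simplified form B also learns the equality bit, which the paper's protocols avoid):

```python
from math import gcd
import secrets

# Toy Paillier keypair with tiny primes -- illustration only, not secure.
p, q = 101, 113
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)        # lcm(p-1, q-1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def enc(m):
    """Paillier encryption: c = g^m * r^n mod n^2, r random and coprime to n."""
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return pow(g, m, n2) * pow(r, n, n2) % n2

def dec(c):
    """Paillier decryption: L(c^lam mod n^2) * mu mod n, L(u) = (u-1)//n."""
    return (pow(c, lam, n2) - 1) // n * mu % n

def blinded_difference(cx, cy):
    """Party A, holding only ciphertexts and the public key, computes
    Enc(r*(x - y)): homomorphic subtraction, then a random exponent r
    to blind the magnitude of the gap."""
    r = secrets.randbelow(n - 2) + 2
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 2) + 2
    c_diff = cx * pow(cy, -1, n2) % n2              # Enc(x - y mod n)
    return pow(c_diff, r, n2)                        # Enc(r*(x - y) mod n)

# Party B decrypts: the plaintext is 0 iff x == y.
cx, cy = enc(42), enc(42)
result = dec(blinded_difference(cx, cy))
```

    The blinding exponent r hides |x − y| from B when the values differ; since r is coprime to n, r·(x − y) mod n is zero only when x = y.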

    Privacy preserving distributed optimization using homomorphic encryption

    This paper studies how a system operator and a set of agents can securely execute a distributed projected gradient-based algorithm. In particular, each participant holds a set of problem coefficients and/or states whose values are private to the data owner. This problem raises two questions: how to securely compute the given functions, and which functions should be computed in the first place. For the first question, using techniques of homomorphic encryption, we propose novel algorithms that achieve secure multiparty computation with perfect correctness. For the second question, we identify a class of functions that can be securely computed. The correctness and computational efficiency of the proposed algorithms are verified by two case studies of power systems, one on a demand response problem and the other on an optimal power flow problem.
    Comment: 24 pages, 5 figures, journal
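
    The iteration being secured can be shown in the clear. Below is a minimal, non-private projected gradient loop on a box-constrained quadratic; the function names, step size, and toy objective are all illustrative assumptions — in the paper's setting, the coefficient-dependent parts of the gradient are evaluated on ciphertexts:

```python
def project_box(x, lo, hi):
    """Euclidean projection onto per-coordinate box constraints [lo, hi]."""
    return [min(max(xi, l), h) for xi, l, h in zip(x, lo, hi)]

def projected_gradient(grad, x0, lo, hi, step=0.1, iters=200):
    """Skeleton of the distributed iteration: take a gradient step,
    then project back onto the feasible set."""
    x = list(x0)
    for _ in range(iters):
        gvec = grad(x)
        x = project_box([xi - step * gi for xi, gi in zip(x, gvec)], lo, hi)
    return x

# minimize (x0 - 3)^2 + (x1 + 1)^2 subject to 0 <= x <= 2
grad = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
sol = projected_gradient(grad, [0.0, 0.0], [0.0, 0.0], [2.0, 2.0])
```

    The unconstrained minimizer (3, −1) lies outside the box, so the projection is active and the iterates settle at the boundary point (2, 0).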

    Privacy-Preserving Aggregation of Time-Series Data

    The conference paper can be viewed at: http://www.isoc.org/isoc/conferences/ndss/11/proceedings.shtml (Session 9: Privacy)
    We consider how an untrusted data aggregator can learn desired statistics over multiple participants’ data, without compromising each individual’s privacy. We propose a construction that allows a group of participants to periodically upload encrypted values to a data aggregator, such that the aggregator is able to compute the sum of all participants’ values in every time period, but is unable to learn anything else. We achieve strong privacy guarantees using two main techniques. First, we show how to utilize applied cryptographic techniques to allow the aggregator to decrypt the sum from multiple ciphertexts encrypted under different user keys. Second, we describe a distributed data randomization procedure that guarantees the differential privacy of the outcome statistic, even when a subset of participants might be compromised.
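
    The key-cancellation structure of the first technique can be mimicked with a linear toy: user keys and an aggregator key sum to zero, so per-period pads of the form s_i·H(t) cancel in the sum. The real construction works in a group where DDH is hard and adds noise for differential privacy; everything below is a simplified assumption, not the scheme itself:

```python
import hashlib
import secrets

MOD = 2**61 - 1  # prime modulus; reported values are assumed far smaller

def h(t):
    """Public hash of the time period t, mapped into the group."""
    return int.from_bytes(hashlib.sha256(str(t).encode()).digest(), "big") % MOD

def keygen(n_users):
    """User keys s_1..s_n plus an aggregator key s_0 with s_0 + sum(s_i) = 0."""
    s = [secrets.randbelow(MOD) for _ in range(n_users)]
    s0 = -sum(s) % MOD
    return s0, s

def enc(x, s_i, t):
    """User i's padded report for period t; the pad hides x from everyone."""
    return (x + s_i * h(t)) % MOD

def aggregate(ciphertexts, s0, t):
    """The aggregator adds its own pad; all pads cancel, leaving only the
    sum of the users' values -- it learns nothing about individual terms."""
    return (sum(ciphertexts) + s0 * h(t)) % MOD

s0, keys = keygen(3)
reports = [enc(x, s, 5) for x, s in zip([10, 20, 12], keys)]
total = aggregate(reports, s0, 5)   # sum over all users for period t = 5
```

    Because the pads are fresh per period (via H(t)), reports from different periods cannot be mixed, mirroring the time-series structure of the scheme.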

    CryptDB: A Practical Encrypted Relational DBMS

    CryptDB is a DBMS that provides provable and practical privacy in the face of a compromised database server or curious database administrators. CryptDB works by executing SQL queries over encrypted data. At its core are three novel ideas: an SQL-aware encryption strategy that maps SQL operations to encryption schemes, adjustable query-based encryption which allows CryptDB to adjust the encryption level of each data item based on user queries, and onion encryption to efficiently change data encryption levels. CryptDB only empowers the server to execute queries that the users requested, and achieves maximum privacy given the mix of queries issued by the users. The database server fully evaluates queries on encrypted data and sends the result back to the client for final decryption; client machines do not perform any query processing and client-side applications run unchanged. Our evaluation shows that CryptDB has modest overhead: on the TPC-C benchmark on Postgres, CryptDB reduces throughput by 27% compared to regular Postgres. Importantly, CryptDB does not change the innards of existing DBMSs: we implemented CryptDB using client-side query rewriting/encryption, user-defined functions, and server-side tables for public-key information. As such, CryptDB is portable; porting CryptDB to MySQL required changing 86 lines of code, mostly at the connectivity layer.
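
    The onion idea can be sketched with two toy layers: an inner deterministic (DET) layer that supports equality checks, wrapped in an outer randomized (RND) layer that is peeled only when a query needs equality. The HMAC-derived keystream and all names below are illustrative stand-ins for CryptDB's actual encryption schemes, not a vetted cipher:

```python
import hashlib
import hmac
import secrets

def keystream(key, nonce, length):
    """Toy PRF keystream from HMAC-SHA256 in counter mode (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

K_DET, K_RND = secrets.token_bytes(32), secrets.token_bytes(32)

def det_enc(value):
    """Inner DET layer: same plaintext -> same ciphertext, so the server
    can evaluate equality predicates once this layer is exposed."""
    return xor(value, keystream(K_DET, b"det", len(value)))

def rnd_wrap(det_ct):
    """Outer RND layer: fresh randomness per row, so nothing is leaked
    at rest until a query requires the layer to be peeled."""
    iv = secrets.token_bytes(16)
    return iv, xor(det_ct, keystream(K_RND, iv, len(det_ct)))

def rnd_peel(wrapped):
    """Adjustable encryption: strip RND down to DET when a query needs it."""
    iv, ct = wrapped
    return xor(ct, keystream(K_RND, iv, len(ct)))

row1 = rnd_wrap(det_enc(b"alice"))
row2 = rnd_wrap(det_enc(b"alice"))
```

    Before peeling, the two rows are indistinguishable; after peeling to DET, equal plaintexts produce equal ciphertexts, which is exactly the trade-off the adjustable, query-driven layer management controls.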
