7 research outputs found

    Private Mobile Pay-TV From Priced Oblivious Transfer

    In pay-TV, a service provider offers TV programs and channels to users. To ensure that only authorized users gain access, conditional access systems (CAS) have been proposed. In existing CAS, users disclose to the service provider which TV programs and channels they purchase. We propose a pay-per-view and a pay-per-channel CAS that protect users' privacy. Our pay-per-view CAS employs priced oblivious transfer (POT) to allow a user to purchase TV programs without disclosing to the service provider which programs were bought. In our pay-per-channel CAS, POT is employed together with broadcast attribute-based encryption (BABE) to achieve low storage overhead, collusion resistance, efficient revocation and broadcast efficiency. We propose a new POT scheme and show its feasibility by implementing and testing our CAS on a representative mobile platform.
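
    As a concrete illustration of the pay-per-view flow, the toy sketch below (Python) hides the POT protocol behind a single interface: each program is encrypted under its own key, and a purchase releases exactly one key while debiting a prepaid balance. The names (Provider, PricedOT, kdf) and the prepaid-balance model are assumptions made for illustration, not the paper's scheme, and no actual oblivious transfer takes place.

```python
# Toy pay-per-view sketch: the POT step is a stand-in class, so the privacy
# property is only modeled, not enforced cryptographically.
import os, hashlib

def kdf(master: bytes, label: str) -> bytes:
    """Derive a per-program content key (stand-in for real key management)."""
    return hashlib.sha256(master + label.encode()).digest()

class Provider:
    def __init__(self, programs):                 # {title: price}
        self.master = os.urandom(32)
        self.prices = dict(programs)
        # Each program is encrypted under its own key before broadcast.
        self.keys = {t: kdf(self.master, t) for t in self.prices}

class PricedOT:
    """Ideal-functionality stand-in: releases exactly one program key,
    charges its price to a prepaid balance, and (in the real protocol)
    reveals nothing about the choice to the provider."""
    def __init__(self, provider: Provider, balance: int):
        self.provider, self.balance = provider, balance
    def purchase(self, title: str) -> bytes:
        price = self.provider.prices[title]
        if price > self.balance:
            raise ValueError("insufficient prepaid balance")
        self.balance -= price
        return self.provider.keys[title]

provider = Provider({"news": 1, "movie": 5, "match": 3})
pot = PricedOT(provider, balance=10)
key = pot.purchase("movie")    # the provider never learns which title was chosen
print(len(key), pot.balance)   # 32 5
```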

    UC Priced Oblivious Transfer with Purchase Statistics and Dynamic Pricing

    Priced oblivious transfer (POT) is a cryptographic protocol that can be used to protect customer privacy in e-commerce applications. Namely, it allows a buyer to purchase an item from a seller without disclosing to the latter which item was purchased and at which price. Unfortunately, existing POT schemes have drawbacks in terms of both design and functionality. First, their design is not modular. Typically, a POT scheme extends a k-out-of-N oblivious transfer (OT) scheme by adding prices to the items, but none of these schemes uses OT as a black-box building block with well-defined security guarantees. Consequently, the security of the OT scheme must be reanalyzed when proving the security of the POT scheme, and the underlying OT scheme cannot be swapped for another one. Second, existing POT schemes do not allow the seller to obtain any kind of statistics about the buyer's purchases, which hinders customer and sales management. Moreover, the seller cannot change item prices without restarting the protocol from scratch. We propose a POT scheme that addresses these drawbacks and prove its security in the UC framework. We modify a standard POT functionality to allow the seller to receive aggregate statistics about the buyer's purchases and to change prices dynamically, and we present a modular construction that realizes this functionality in the hybrid model. One of the building blocks is an ideal functionality for OT, so our protocol separates the tasks carried out by the underlying OT scheme from the additional tasks needed by a POT scheme. Thanks to this, our protocol is a good example of modular design and can be instantiated with any secure OT scheme, as well as with other building blocks, without reanalyzing security from scratch.
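
    The modular structure described above can be sketched as follows (Python): a POT wrapper treats an ideal 1-out-of-N OT functionality as a black box, maintains a prepaid balance, lets the seller reprice items between purchases, and exposes only aggregate purchase counts. This toy code only illustrates the interfaces and information flow; it performs no cryptography, so it does not reproduce the privacy guarantees of the UC-secure protocol, and all names are assumptions.

```python
# Sketch of the modular interface only: an "ideal" OT object is used as a
# black box by the POT wrapper, which adds prices, dynamic repricing, and
# aggregate statistics.
from collections import Counter

class IdealOT:
    """Ideal 1-out-of-N OT: the buyer gets messages[choice]; the seller
    learns nothing about choice (modeled, not enforced)."""
    def __init__(self, messages):
        self._messages = list(messages)
    def transfer(self, choice: int):
        return self._messages[choice]

class ModularPOT:
    def __init__(self, items, prices, deposit: int):
        self.items, self.prices, self.balance = list(items), dict(prices), deposit
        self._stats = Counter()                    # aggregate counts only
    def set_price(self, item_id: int, price: int):
        self.prices[item_id] = price               # dynamic pricing between purchases
    def buy(self, item_id: int):
        if self.prices[item_id] > self.balance:
            raise ValueError("balance too low")
        self.balance -= self.prices[item_id]
        self._stats[item_id] += 1
        return IdealOT(self.items).transfer(item_id)   # OT used as a black box
    def aggregate_statistics(self):
        return dict(self._stats)                   # totals per item, no transcript of choices

pot = ModularPOT(items=["e-book", "song", "film"], prices={0: 2, 1: 1, 2: 4}, deposit=6)
pot.buy(2)
pot.set_price(1, 3)
print(pot.aggregate_statistics(), pot.balance)     # {2: 1} 2
```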

    Optimistic Fair Priced Oblivious Transfer

    Priced oblivious transfer (POT) is a two-party protocol between a vendor and a buyer in which the buyer purchases digital goods without the vendor learning what is bought. Although privacy properties are guaranteed, current schemes do not offer fair exchange. A malicious vendor can, e.g., prevent the buyer from retrieving the goods after receiving the payment, and a malicious buyer can accuse an honest vendor of misbehavior without the vendor being able to prove the accusation false. To address these problems, we define the concept of optimistic fair priced oblivious transfer and propose a generic construction that extends secure POT schemes to realize this functionality. Our construction, based on verifiably encrypted signatures, employs a neutral adjudicator that is involved only in case of dispute, and it ensures that disputes can be resolved without the buyer losing her privacy, i.e., without the buyer having to disclose which digital goods she is interested in. We show that our construction can be instantiated with an existing universally composable POT scheme, and we furthermore propose a novel full-simulation-secure POT scheme that is much more efficient. © 2010 Springer-Verlag.
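
    A rough sketch of the optimistic structure, with the verifiably encrypted signature modeled as an escrow the adjudicator can open (Python; all names are illustrative and no real cryptography is involved). The point of the sketch is only that the adjudicator never appears in an honest run and is contacted solely to resolve a dispute.

```python
# Rough sketch of the optimistic flow built on verifiably encrypted
# signatures (VES). "PaymentSignature" and "VES" are plain stand-in objects;
# in the real protocol the VES is verifiable by the vendor and only the
# adjudicator can extract the underlying signature.
from dataclasses import dataclass

@dataclass
class PaymentSignature:
    buyer: str
    amount: int

@dataclass
class VES:
    """Stand-in for a signature encrypted under the adjudicator's key."""
    hidden_signature: PaymentSignature

class Adjudicator:
    def extract(self, ves: VES) -> PaymentSignature:
        # Dispute resolution: recover the payment signature without
        # learning which digital good was purchased.
        return ves.hidden_signature

# Optimistic run: the buyer escrows a payment signature, the vendor delivers goods.
sig = PaymentSignature(buyer="alice", amount=4)
escrow = VES(hidden_signature=sig)
goods = b"\x17" * 32             # encrypted item obtained via the POT step

buyer_finalizes = False          # a malicious buyer withholds the final signature
payment = sig if buyer_finalizes else None
if payment is None:              # only now does the adjudicator get involved
    payment = Adjudicator().extract(escrow)
assert payment == sig            # the vendor is paid; the buyer's choice stays hidden
```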

    Anonymous Point Collection - Improved Models and Security Definitions

    This work is a comprehensive, formal treatment of anonymous point collection. The proposed definition not only provides a strong notion of security and privacy, but also covers features that are important for practical use. An efficient realization is presented and proven to fulfill the proposed definition. The resulting building block is the first one that allows for anonymous two-way transactions, has semi-offline capabilities, yields constant storage size, and is provably secure.
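
    The constant storage size claimed above can be illustrated with an additive-homomorphic (Pedersen-style) commitment to the point balance: the user keeps a single group element and updates it both when earning and when spending points. The parameters below are tiny demo values and h has a known discrete logarithm, so the sketch is insecure by design; it only shows why two-way updates with constant storage are possible and is not the paper's construction.

```python
# Toy Pedersen-style commitment to a point balance over a small group.
import secrets

# toy group: subgroup of prime order q in Z_p^*, with safe prime p = 2q + 1
q = 1019                       # small prime (demo only)
p = 2 * q + 1                  # 2039 is prime, so p is a safe prime
g = 4                          # 4 = 2^2 is a quadratic residue, so its order is q
h = pow(g, 57, p)              # INSECURE: real deployments need an unknown dlog of h

def commit(value: int, blinding: int) -> int:
    return (pow(g, value % q, p) * pow(h, blinding % q, p)) % p

def update(com: int, delta: int, delta_blind: int) -> int:
    # Homomorphism: C(v, r) * C(d, s) = C(v + d, r + s)
    return (com * commit(delta, delta_blind)) % p

balance, blind = 0, secrets.randbelow(q)
com = commit(balance, blind)

s1 = secrets.randbelow(q); com = update(com, +10, s1)   # earn 10 points
s2 = secrets.randbelow(q); com = update(com, -3, s2)    # spend 3 points
balance, blind = 7, (blind + s1 + s2) % q
assert com == commit(balance, blind)                    # still a single group element
```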

    Privacy protection of user profiles in personalized information systems

    In recent times we are witnessing the emergence of a wide variety of information systems that tailor their information-exchange functionality to the specific interests of their users. Most of these personalized information systems capitalize on, or lend themselves to, the construction of user profiles, either declared directly by a user or inferred from past activity. The ability of these systems to profile users is therefore what enables such intelligent functionality, but at the same time it is the source of serious privacy concerns.

    Although there exists a broad range of privacy-enhancing technologies aimed at mitigating many of those concerns, their use is far from widespread. The main reason is a certain ambiguity about these technologies and their effectiveness in terms of privacy protection. Besides, since these technologies normally come at the expense of system functionality and utility, it is challenging to assess whether the gain in privacy compensates for the cost in utility. Assessing the privacy provided by a privacy-enhancing technology is thus crucial to determine its overall benefit, to compare its effectiveness with other technologies, and ultimately to optimize it in terms of the privacy-utility trade-off posed. Considerable effort has consequently been devoted to investigating both privacy and utility metrics. However, most of these metrics are specific to concrete systems and adversary models, and hence are difficult to generalize or translate to other contexts. Moreover, in applications involving user profiles there are few proposals for the evaluation of privacy, and those that exist are not appropriately justified.

    The first part of this thesis approaches the fundamental problem of quantifying user privacy. Firstly, we present a theoretical framework for privacy-preserving systems, endowed with a unifying view of privacy in terms of the estimation error incurred by an attacker who aims to disclose the private information that the system is designed to conceal. Our theoretical analysis shows that numerous privacy metrics emerging from a broad spectrum of applications are bijectively related to this estimation error, which permits interpreting and comparing these metrics under a common perspective. Secondly, we tackle the issue of measuring privacy in personalized information systems. Specifically, we propose two information-theoretic quantities as measures of the privacy of user profiles, and justify these metrics by building on Jaynes' rationale behind entropy-maximization methods and on fundamental results from the method of types and hypothesis testing.

    Equipped with quantifiable measures of privacy and utility, the second part of this thesis investigates privacy-enhancing, data-perturbative mechanisms and architectures for two important classes of personalized information systems. In particular, we study the elimination of tags in semantic-Web applications, and the combination of the forgery and the suppression of ratings in personalized recommendation systems. We design these mechanisms to achieve the optimal privacy-utility trade-off, in the sense of maximizing privacy for a desired utility, or vice versa, proceeding systematically by drawing upon the methodology of multiobjective optimization. Our theoretical analysis finds a closed-form solution to the problem of optimal tag suppression, and to the problem of optimal forgery and suppression of ratings. In addition, we provide an extensive theoretical characterization of the trade-off between the contrasting aspects of privacy and utility. Experimental results in real-world applications show the effectiveness of our mechanisms in terms of privacy protection, system functionality and data utility.
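
    As a small illustration of the kind of information-theoretic privacy measure discussed above (the abstract does not name its two quantities, so these are plausible stand-ins rather than the thesis' metrics), the sketch below computes the Shannon entropy of a normalized tag profile and its KL divergence from a population profile, and shows how suppressing part of a revealing tag moves the profile toward the population distribution.

```python
# Illustrative privacy measures for a tag profile: Shannon entropy and
# KL divergence from a population profile, before and after tag suppression.
from math import log2

def normalize(counts):
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

def entropy(p):
    return -sum(pi * log2(pi) for pi in p.values() if pi > 0)

def kl(p, q):
    # D(p || q); assumes q[t] > 0 wherever p[t] > 0
    return sum(pi * log2(pi / q[t]) for t, pi in p.items() if pi > 0)

population = {"sports": 0.25, "politics": 0.25, "music": 0.25, "health": 0.25}
user_counts = {"sports": 1, "politics": 1, "music": 2, "health": 16}

profile = normalize(user_counts)
print(round(entropy(profile), 3), round(kl(profile, population), 3))    # 1.022 0.978

# Suppress part of the revealing "health" tags (perturbation by suppression).
suppressed = dict(user_counts, health=4)
profile2 = normalize(suppressed)
print(round(entropy(profile2), 3), round(kl(profile2, population), 3))  # 1.75 0.25
# Higher entropy / lower divergence indicates a less distinctive, more private profile.
```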