
    Redrawing the Boundaries on Purchasing Data from Privacy-Sensitive Individuals

    We prove new positive and negative results concerning the existence of truthful and individually rational mechanisms for purchasing private data from individuals with unbounded and sensitive privacy preferences. We strengthen the impossibility results of Ghosh and Roth (EC 2011) by extending them to a much wider class of privacy valuations. In particular, these include privacy valuations that are based on $(\epsilon, \delta)$-differentially private mechanisms for non-zero $\delta$, ones where the privacy costs are measured in a per-database manner (rather than taking the worst case), and ones that do not depend on the payments made to players (which might not be observable to an adversary). To bypass this impossibility result, we study a natural special setting where individuals have monotonic privacy valuations, which captures common contexts where certain values for private data are expected to lead to higher valuations for privacy (e.g. having a particular disease). We give new mechanisms that are individually rational for all players with monotonic privacy valuations, truthful for all players whose privacy valuations are not too large, and accurate if there are not too many players with too-large privacy valuations. We also prove matching lower bounds showing that in some respects our mechanism cannot be improved significantly.
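
    A minimal sketch of the privacy primitive these valuations refer to (an illustration only, not the paper's purchasing mechanism or payment rule): the standard Gaussian mechanism achieves $(\epsilon, \delta)$-differential privacy for non-zero $\delta$, using the textbook noise scale valid for $0 < \epsilon < 1$.

    import math, random

    def gaussian_mechanism(bits, epsilon, delta, sensitivity=1.0):
        # (epsilon, delta)-DP Gaussian mechanism for a counting query:
        # sigma >= sqrt(2 ln(1.25/delta)) * sensitivity / epsilon, for 0 < epsilon < 1.
        sigma = math.sqrt(2 * math.log(1.25 / delta)) * sensitivity / epsilon
        return sum(bits) + random.gauss(0.0, sigma)

    # Example: a noisy count of a sensitive bit over 1000 individuals.
    bits = [random.randint(0, 1) for _ in range(1000)]
    print(gaussian_mechanism(bits, epsilon=0.5, delta=1e-5))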

    The Strange Case of Privacy in Equilibrium Models

    We study how privacy technologies affect user and advertiser behavior in a simple economic model of targeted advertising. In our model, a consumer first decides whether or not to buy a good, and then an advertiser chooses an advertisement to show the consumer. The consumer's value for the good is correlated with her type, which determines which ad the advertiser would prefer to show to her---and hence, the advertiser would like to use information about the consumer's purchase decision to target the ad that he shows. In our model, the advertiser is given only a differentially private signal about the consumer's behavior---which can range from no signal at all to a perfect signal, as we vary the differential privacy parameter. This allows us to study equilibrium behavior as a function of the level of privacy provided to the consumer. We show that this behavior can be highly counter-intuitive, and that the effect of adding privacy in equilibrium can be completely different from what we would expect if we ignored equilibrium incentives. Specifically, we show that increasing the level of privacy can actually increase the amount of information about the consumer's type contained in the signal the advertiser receives, decrease the consumer's utility, and increase the advertiser's profit, and that in general these quantities can be non-monotonic and even discontinuous in the privacy level of the signal.
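
    One concrete way to realize the differentially private signal in this model (an assumed instantiation; the abstract does not fix a particular channel) is randomized response, which interpolates from a useless signal at $\epsilon = 0$ to a near-perfect one as $\epsilon$ grows:

    import math, random

    def private_signal(purchased: bool, epsilon: float) -> bool:
        # Report the true purchase bit with probability exp(eps) / (1 + exp(eps)),
        # which is epsilon-differentially private: epsilon = 0 gives a fair coin,
        # large epsilon gives an almost perfect signal to the advertiser.
        p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
        return purchased if random.random() < p_truth else not purchased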

    Buying Private Data without Verification

    We consider the problem of designing a survey to aggregate non-verifiable information from a privacy-sensitive population: an analyst wants to compute some aggregate statistic from the private bits held by each member of a population, but cannot verify the correctness of the bits reported by participants in his survey. Individuals in the population are strategic agents with a cost for privacy, i.e., they not only account for the payments they expect to receive from the mechanism, but also their privacy costs from any information revealed about them by the mechanism's outcome---the computed statistic as well as the payments---to determine their utilities. How can the analyst design payments to obtain an accurate estimate of the population statistic when individuals strategically decide both whether to participate and whether to truthfully report their sensitive information? We design a differentially private peer-prediction mechanism that supports accurate estimation of the population statistic as a Bayes-Nash equilibrium in settings where agents have explicit preferences for privacy. The mechanism requires knowledge of the marginal prior distribution on bits $b_i$, but does not need full knowledge of the marginal distribution on the costs $c_i$, instead requiring only an approximate upper bound. Our mechanism guarantees $\epsilon$-differential privacy to each agent $i$ against any adversary who can observe the statistical estimate output by the mechanism, as well as the payments made to the $n-1$ other agents $j \neq i$. Finally, we show that with slightly more structured assumptions on the privacy cost functions of each agent, the cost of running the survey goes to $0$ as the number of agents diverges.
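
    A rough illustration of the accuracy side, assuming randomized response as the privacy step and omitting the peer-prediction payments that make truthful reporting a Bayes-Nash equilibrium: the analyst can debias locally perturbed reports to recover the population mean of the bits $b_i$.

    import math, random

    def dp_survey_estimate(reports, epsilon):
        # Keep each reported bit with probability exp(eps) / (1 + exp(eps)) and flip it
        # otherwise, giving each respondent epsilon-differential privacy against anyone
        # who observes the output; then invert the expected bias to estimate the mean.
        p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
        noisy = [b if random.random() < p else 1 - b for b in reports]
        mean_noisy = sum(noisy) / len(noisy)
        # E[noisy bit] = (1 - p) + b * (2p - 1), so solve for the mean of b.
        return (mean_noisy - (1 - p)) / (2 * p - 1)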

    Designing a personal information transaction object

    © 2016 IEEE. As mobile and wearable technologies grow in popularity, ever-increasing volumes of valuable, fine-grained personal information are generated as people go about their daily lives. This information may be exchanged by individuals for "free" services, but there is currently no widely adopted means by which individuals can benefit financially from their personal information. To address this problem, we consider a Primary Personal Information Market (PPIM): a market on which individuals can be financially compensated in exchange for access to their personal information. We draw on Design Science and Market Engineering to justify design choices for a permissions-based Personal Information Transaction Object (PITO), a commodity that could be successfully traded on a Primary Personal Information Market.
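
    A hypothetical sketch of what a permissions-based PITO might look like as a data structure; every field name below is an illustrative assumption rather than the paper's design, which is derived from Design Science and Market Engineering.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class PersonalInformationTransactionObject:
        # Illustrative only: a tradeable object bundling a pointer to personal data
        # with the permissions and price under which access may be bought on a PPIM.
        owner_id: str                                   # the individual offering access
        data_reference: str                             # pointer to the underlying personal data
        permitted_uses: List[str] = field(default_factory=list)  # e.g. ["analytics"]
        access_expires: Optional[datetime] = None       # access permission is time-bounded
        asking_price: float = 0.0                       # compensation sought by the individual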

    Designing a Primary Personal Information Market as an Industry Platform: a Service Innovation Approach

    It is well recognised that personal data have intrinsic value to B2C companies. However, there are no widely adopted means by which individuals can benefit financially from the personal data they generate. Furthermore, there is a substantial lack of empirical research on markets for online personal data. Nevertheless, prior work has shown that a Primary Personal Information Market (PPIM) is a viable solution to the problem of monetising personal data. This paper explores how a PPIM could be conceptualised and designed as an Industry Platform. Using an integrated Service Innovation Method (iSIM), we incorporate into our design a multi-sided personal information business model to facilitate commercialisation. An initial prototype is developed and its utility from a data product consumer’s perspective is evaluated using semi-structured interviews with industry practitioners. We find that a PPIM conceptualised as an industry platform has significant commercial appeal and that it resolves a number of objections raised in response to previous designs.

    Optimal Data Acquisition for Statistical Estimation

    We consider a data analyst's problem of purchasing data from strategic agents to compute an unbiased estimate of a statistic of interest. Agents incur private costs to reveal their data, and these costs can be arbitrarily correlated with the data. Once revealed, data are verifiable. This paper focuses on linear unbiased estimators. We design an individually rational and incentive compatible mechanism that optimizes the worst-case mean-squared error of the estimate, where the worst case is over the unknown correlation between costs and data, subject to a budget constraint in expectation. We characterize the optimal mechanism in closed form. We further extend our results to acquiring data for estimating a parameter in regression analysis, where private costs can correlate with the values of the dependent variable but not with the values of the independent variables.
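
    For intuition only (the paper's mechanism additionally chooses purchase probabilities and payments to minimize worst-case mean-squared error under the budget, which is not reproduced here), the sketch below shows why inverse-probability weighting keeps a linear estimator unbiased no matter how costs correlate with the data.

    import random

    def ipw_mean_estimate(data, probs):
        # Acquire agent i's verifiable data point with probability probs[i] and weight
        # it by 1 / probs[i]; the expected value of the estimate equals the true mean
        # regardless of how the acquisition probabilities (or costs) relate to the data.
        n = len(data)
        total = 0.0
        for x, p in zip(data, probs):
            if random.random() < p:     # agent i's data is purchased
                total += x / p
        return total / n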