We study a market for private data in which a data analyst publicly releases
a statistic over a database of private information. Individuals who own the
data incur a cost for their loss of privacy proportional to the differential
privacy guarantee given by the analyst at the time of the release. The analyst
incentivizes individuals by compensating them, giving rise to a \emph{privacy
auction}. Motivated by recommender systems, we take the statistic to be a
linear predictor function with publicly known weights. The statistic can be
viewed as a prediction of the unknown data of a new individual, based on the
data of individuals in the database. We formalize the trade-off between privacy
and accuracy in this setting, and show that a simple class of estimates
achieves an order-optimal trade-off. It thus suffices to focus on auction
mechanisms that output such estimates. We use this observation to design a
truthful, individually rational, proportional-purchase mechanism under a fixed
budget constraint. We show that our mechanism is $5$-approximate in terms of
accuracy compared to the optimal mechanism, and that no truthful mechanism can
achieve a $(2-\varepsilon)$-approximation, for any $\varepsilon > 0$.
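For concreteness, a minimal sketch of the released statistic under the setting described above (the symbols $x_i$, $w_i$, and $\hat{s}$ are illustrative notation, not taken from the abstract): with private values $x_1,\dots,x_n$ held by the $n$ individuals in the database and publicly known weights $w_1,\dots,w_n$, the analyst releases a perturbed estimate
\[
  \hat{s} \;\approx\; \sum_{i=1}^{n} w_i x_i ,
\]
which can be read as a prediction of the unknown value of a new individual, with the perturbation calibrated to the differential privacy guarantee promised to the data owners.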