Anonymous and Publicly Linkable Reputation Systems
We consider reputation systems where users are allowed to rate products that they purchased previously. To obtain trustworthy reputations, they are allowed to rate these products only once. As long as they do so, the users stay anonymous. Everybody is able to detect users deviating from the rate-products-only-once policy, and the anonymity of such dishonest users can be revoked by a system manager. In this paper we present formal models for such reputation systems and their security. Based on group signatures, we design an efficient reputation system that meets all our requirements.
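The abstract's construction is built on group signatures; as a toy illustration of the public linkability idea only (not the paper's scheme), a deterministic tag derived from a user secret and a product identifier makes two ratings of the same product by the same user publicly linkable while saying nothing about the user otherwise. All names here are illustrative.

```python
import hashlib

def rating_tag(user_secret: bytes, product_id: str) -> str:
    """Deterministic per-(user, product) tag: two ratings of the same
    product by the same user yield identical tags and are thus linkable;
    tags for different products do not identify the user."""
    return hashlib.sha256(user_secret + product_id.encode()).hexdigest()

def find_duplicates(ratings):
    """Return tags appearing more than once, i.e. violations of the
    rate-products-only-once policy; anyone holding the ratings can run this."""
    seen, dupes = set(), set()
    for tag, _score in ratings:
        if tag in seen:
            dupes.add(tag)
        seen.add(tag)
    return dupes

alice, bob = b"alice-secret", b"bob-secret"
ratings = [
    (rating_tag(alice, "p1"), 5),
    (rating_tag(bob, "p1"), 4),
    (rating_tag(alice, "p1"), 1),  # Alice rates p1 a second time -> linkable
]
```

In the actual system the tag would come with a zero-knowledge proof of purchase; here the hash is only a stand-in for the linking mechanism.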
Practical, Anonymous, and Publicly Linkable Universally-Composable Reputation Systems
We consider reputation systems in the Universal Composability Framework where users can anonymously rate each other's products that they purchased previously. To obtain trustworthy, reliable, and honest ratings, users are allowed to rate products only once. Everybody is able to detect users that rate products multiple times. In this paper we present an ideal functionality for such reputation systems and give an efficient realization that is usable in practical applications.
Anonymity and Rewards in Peer Rating Systems
When peers rate each other, they may choose to rate inaccurately in order to boost their own reputation or unfairly lower another's. This could be successfully mitigated by having a reputation server incentivise accurate ratings with a reward. However, assigning rewards becomes a challenge when ratings are anonymous, since the reputation server cannot tell which peers to reward for rating accurately. To address this, we propose an anonymous peer rating system in which users can be rewarded for accurate ratings, and we formally define its model and security requirements. In our system ratings are rewarded in batches, so that users claiming their rewards only reveal that they authored one rating in the batch. To ensure the anonymity set of rewarded users is not reduced, we also split the reputation server into two entities: the Rewarder, who knows which ratings are rewarded, and the Reputation Holder, who knows which users were rewarded. We give a provably secure construction satisfying all the security properties required. For our construction we use a modification of a Direct Anonymous Attestation scheme to ensure that peers can prove their own reputation when rating others, and that multiple feedback on the same subject can be detected. We then use Linkable Ring Signatures to enable peers to be rewarded for their accurate ratings, while still ensuring that ratings are anonymous. Our work results in a system which allows accurate ratings to be rewarded, whilst still providing anonymity of ratings with respect to the central entities managing the system.
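The batched-reward claim above relies on linkable ring signatures: a claim reveals only that its author wrote some rating in the rewarded batch, and a second claim by the same user on the same batch links. As a heavily simplified sketch (the linkage tag below stands in for the ring signature's linking mechanism; all names are illustrative):

```python
import hashlib

def claim_tag(user_secret: bytes, batch_id: str) -> str:
    """One tag per (user, batch): a second claim on the same batch produces
    the same tag, so double claims link publicly without identifying the
    claimed rating. Stand-in for a linkable ring signature's linkage tag."""
    return hashlib.sha256(user_secret + batch_id.encode()).hexdigest()

class ReputationHolder:
    """Pays out reward claims; learns only that a user authored *some*
    rating in the rewarded batch, never which one."""
    def __init__(self):
        self.claimed = set()

    def claim_reward(self, tag: str) -> bool:
        # A real verifier would also check the ring signature over the batch.
        if tag in self.claimed:
            return False  # double claim detected via linkage
        self.claimed.add(tag)
        return True

holder = ReputationHolder()
t = claim_tag(b"peer-secret", "batch-7")
first = holder.claim_reward(t)    # accepted
second = holder.claim_reward(t)   # rejected: same user, same batch
```

Splitting the Rewarder from the Reputation Holder means neither entity alone can connect a rewarded rating to the user who claimed it.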
Hang With Your Buddies to Resist Intersection Attacks
Some anonymity schemes might in principle protect users from pervasive network surveillance - but only if all messages are independent and unlinkable. Users in practice often need pseudonymity - sending messages intentionally linkable to each other but not to the sender - but pseudonymity in dynamic networks exposes users to intersection attacks. We present Buddies, the first systematic design for intersection attack resistance in practical anonymity systems. Buddies groups users dynamically into buddy sets, controlling message transmission to make buddies within a set behaviorally indistinguishable under traffic analysis. To manage the inevitable tradeoffs between anonymity guarantees and communication responsiveness, Buddies enables users to select independent attack mitigation policies for each pseudonym. Using trace-based simulations and a working prototype, we find that Buddies can guarantee non-trivial anonymity set sizes in realistic chat/microblogging scenarios, for both short-lived and long-lived pseudonyms.
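The buddy-set idea can be sketched in a few lines: users are partitioned into fixed sets, and a pseudonym's messages are released only while its entire buddy set is online, so a traffic observer cannot narrow the sender below the set. This is a toy policy under assumed names, not the Buddies implementation.

```python
def partition_into_buddy_sets(users, set_size):
    """Partition users into fixed buddy sets (toy policy: sorted order).
    Buddies within a set are meant to stay behaviorally indistinguishable."""
    users = sorted(users)
    return [users[i:i + set_size] for i in range(0, len(users), set_size)]

def can_transmit(buddy_set, online):
    """Release a message only while the whole buddy set is online, so an
    intersection attacker observing who was present learns nothing that
    distinguishes one buddy from another."""
    return all(u in online for u in buddy_set)

sets = partition_into_buddy_sets(["dave", "alice", "carol", "bob"], 2)
```

The real system's policies are per-pseudonym and richer (e.g. trading responsiveness for larger guaranteed sets), but the gating logic above is the core invariant.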
A Generic Construction of an Anonymous Reputation System and Instantiations from Lattices
With an anonymous reputation system one can realize the process of rating sellers anonymously in an online shop. While raters can stay anonymous, sellers still have the guarantee that they can only be reviewed by raters who bought their product. We present the first generic construction of a reputation system from basic building blocks, namely digital signatures, encryption schemes, non-interactive zero-knowledge proofs, and linking indistinguishable tags. We then show the security of the reputation system in a strong security model. Among others, we instantiate the generic construction with building blocks based on lattice problems, leading to the first module lattice-based reputation system.
A flexible architecture for privacy-aware trust management
In service-oriented systems a constellation of services cooperate, sharing potentially sensitive information and responsibilities. Cooperation is only possible if the different participants trust each other. As trust may depend on many different factors, in a flexible framework for Trust Management (TM) trust must be computed by combining different types of information. In this paper we describe the TAS3 TM framework which integrates independent TM systems into a single trust decision point. The TM framework supports intricate combinations whilst still remaining easily extensible. It also provides a unified trust evaluation interface to the (authorization framework of the) services. We demonstrate the flexibility of the approach by integrating three distinct TM paradigms: reputation-based TM, credential-based TM, and Key Performance Indicator TM. Finally, we discuss privacy concerns in TM systems and the directions to be taken for the definition of a privacy-friendly TM architecture.
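A single trust decision point combining pluggable TM engines can be sketched as below. This is an illustrative skeleton, not the TAS3 API; the combinator shown (all engines must approve) is the simplest possible policy, whereas the framework supports far richer combinations.

```python
from typing import Callable, Dict

class TrustDecisionPoint:
    """Combines verdicts from independently registered TM engines into one
    trust decision (toy combinator: unanimous approval)."""
    def __init__(self):
        self.engines: Dict[str, Callable[[dict], bool]] = {}

    def register(self, name: str, engine: Callable[[dict], bool]) -> None:
        """Plug in a TM engine; engines stay independent and replaceable."""
        self.engines[name] = engine

    def decide(self, request: dict) -> bool:
        """Unified evaluation interface for the services' authorization layer."""
        return all(engine(request) for engine in self.engines.values())

# Two of the three paradigms from the paper, as hypothetical toy engines:
tdp = TrustDecisionPoint()
tdp.register("reputation", lambda req: req.get("vendor_score", 0.0) >= 0.7)
tdp.register("credential", lambda req: "clinician" in req.get("roles", ()))
```

A KPI-based engine would register the same way, which is the extensibility point the abstract highlights.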
Beaver: A Decentralized Anonymous Marketplace with Secure Reputation
Amid growing concerns of government surveillance and corporate data sharing, web users increasingly demand tools for preserving their privacy without placing trust in a third party. Unfortunately, existing centralized reputation systems need to be trusted for either privacy, correctness, or both. Existing decentralized approaches, on the other hand, are either vulnerable to Sybil attacks, present inconsistent views of the network, or leak critical information about the actions of their users. In this paper, we present Beaver, a decentralized anonymous marketplace that is resistant against Sybil attacks on vendor reputation, while preserving user anonymity. Beaver allows its participants to enjoy open enrollment, and provides every user with the same global view of the reputation of other users through public ledger based consensus. Various cryptographic primitives allow Beaver to offer high levels of usability and practicality, along with strong anonymity guarantees. Operations such as creating a listing, purchasing an item, and leaving feedback take just milliseconds to compute and require generating just a few kilobytes of state while often constructing convenient anonymity sets of hundreds of transactions.
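The "same global view through public ledger based consensus" property can be illustrated with a toy append-only log: every participant replays the same entries and therefore computes identical vendor reputations, with no trusted server holding the scores. This sketch assumes the consensus layer away entirely; the class and names are illustrative.

```python
class Ledger:
    """Toy append-only public log of feedback. Any node replaying the same
    entries derives the same reputation for every vendor, which is the
    consistent-global-view property (consensus itself is assumed)."""
    def __init__(self):
        self.entries = []

    def append(self, vendor: str, score: int) -> None:
        self.entries.append((vendor, score))

    def reputation(self, vendor: str):
        """Average feedback score for a vendor, or None if unrated."""
        scores = [s for v, s in self.entries if v == vendor]
        return sum(scores) / len(scores) if scores else None

node_a = Ledger()
for entry in [("v1", 5), ("v1", 3), ("v2", 4)]:
    node_a.append(*entry)

# A second node replaying the same log reaches the same view.
node_b = Ledger()
for entry in node_a.entries:
    node_b.append(*entry)
```

Beaver additionally makes the feedback anonymous and Sybil-resistant via cryptographic primitives not modeled here.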