
Verification in referral-based crowdsourcing

By Victor Naroditskiy, Iyad Rahwan, Manuel Cebrian and Nicholas R. Jennings


Online social networks offer unprecedented potential for rallying large numbers of people to accomplish a given task. Here we focus on information-gathering tasks in which rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications and difficult or costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge, where the level of misreporting was very high. To undertake a formal study of verification, we introduce a model in which agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification, and it opens many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.
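The winning strategy referred to above is the recursive "split" contract used by the MIT team in the Red Balloon Challenge: the finder of a balloon receives a fixed reward, the person who recruited the finder receives half that amount, their recruiter half of that, and so on up the referral chain. As a minimal sketch (the function name, parameters, and the $2,000 finder reward are illustrative of the publicly reported MIT scheme, not taken from this paper's text):

```python
def split_contract_rewards(chain_length, finder_reward=2000.0):
    """Rewards paid along a referral chain under a recursive 'split' contract.

    The finder (index 0) receives finder_reward; each person one step up
    the referral chain receives half of what the person below received.
    """
    rewards = []
    r = finder_reward
    for _ in range(chain_length):
        rewards.append(r)
        r /= 2.0
    return rewards

# Because the payments form a geometric series, the total payout per balloon
# is bounded by 2 * finder_reward no matter how long the referral chain is.
```

This bounded-total property is what makes the geometric split attractive: the principal can commit to rewarding an arbitrarily deep referral tree without risking an unbounded payout.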

Topics: QA75
Year: 2012
Provided by: e-Prints Soton

