
    Worker Performance in a Situated Crowdsourcing Market

    No full text
    We present an empirical study that investigates crowdsourcing performance in a situated market. Unlike online markets, situated crowdsourcing markets consist of workers who become serendipitously available for work in a particular location and context. So far, the literature has lacked a systematic study of task performance and uptake in such markets under varying incentives. In a 3-week field study, we demonstrate that in a situated crowdsourcing market, task uptake and accuracy are generally comparable with online markets. We also show that increasing task rewards in situated crowdsourcing leads to increased task uptake but not accuracy, while decreasing task rewards leads to decreases in both task uptake and accuracy.

    RESEARCH HIGHLIGHTS
    • We present a 3-week empirical study on worker performance in a situated crowdsourcing market.
    • We manipulate task rewards to investigate their effects on performance.
    • Increasing task rewards led to increased task uptake but not accuracy.
    • Decreasing task rewards led to decreased task uptake and accuracy.
    • We compare the performance of our tasks against results reported in the literature for several types of crowdsourcing.
