2 research outputs found

    Tragedy of the Commons - A Critical Study of Data Quality and Validity Issues in Crowd Work-Based Research

    Crowd work platforms such as MTurk have been leveraged by academic scholars to conduct research and collect data. Although prior studies have discussed data quality and validity issues in crowd work via surveys and experiments, they have largely neglected the ethical concerns of scholars and, in particular, of the IRB in these respects. In this study, we interviewed 17 scholars across six disciplines and 15 IRB directors/analysts in the U.S. to fill this research gap. We identified common themes among our respondents but also discovered distinctive and even opposing views regarding approval rates, rejection, and internal and external research validity. Based on these findings, we discuss a potential Tragedy of the Commons in the deterioration of data quality and the disciplinary differences in how validity is treated in crowd work-based research. Finally, we advocate that the IRB's ethical concerns should be heard and respected.

    What do crowd workers think about creative work?

    Crowdsourcing platforms are a powerful and convenient means for recruiting participants in online studies and collecting data from the crowd. As information work is increasingly automated by machine learning algorithms, creativity, that is, a human's ability for divergent and convergent thinking, will play an increasingly important role on online crowdsourcing platforms. However, we lack insights into what crowd workers think about creative work. In Human-Computer Interaction (HCI) studies, the ability and willingness of the crowd to participate in creative work seem to be largely unquestioned. Insights into the workers' perspective are rare but important, as they may inform the design of studies with higher validity. Given that creativity will play an increasingly important role in crowdsourcing, it is imperative to develop an understanding of how workers perceive creative work. In this paper, we summarize our recent worker-centered study of creative work on two general-purpose crowdsourcing platforms (Amazon Mechanical Turk and Prolific). Our study illuminates what creative work is like for crowd workers on these two platforms. The work identifies several archetypal types of workers with different attitudes towards creative work and discusses common pitfalls with creative work on crowdsourcing platforms.