Quantifying the Impact of Cognitive Biases in Question-Answering Systems
Crowdsourcing can identify high-quality solutions to problems; however,
individual decisions are constrained by cognitive biases. We investigate some
of these biases in an experimental model of a question-answering system. In
both natural and controlled experiments, we observe a strong position bias in
favor of answers appearing earlier in a list of choices. This effect is
enhanced by three cognitive factors: the attention an answer receives, its
perceived popularity, and cognitive load, measured by the number of choices a
user has to process. While separately weak, these effects synergistically
amplify position bias and decouple user choices of best answers from their
intrinsic quality. We conclude by discussing how these findings can be applied
to substantially improve how high-quality answers are identified in
question-answering systems.

Comment: 9 pages, 5 figures
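The interplay the abstract describes — intrinsic quality competing with a bonus for earlier list positions — can be sketched with a toy Monte Carlo model. Everything here is an illustrative assumption, not the paper's actual experimental model: the additive `quality + bias/(rank+1)` weighting and the parameter values are hypothetical.

```python
import random

def simulate_choices(qualities, position_bias=1.0, trials=10_000, seed=0):
    """Toy model of users picking a 'best answer' from an ordered list.

    Hypothetical weighting: each answer's selection weight is its
    intrinsic quality plus a bonus that decays with list position.
    Returns the fraction of trials in which each position was chosen.
    """
    rng = random.Random(seed)
    n = len(qualities)
    # Position bonus decays as 1/(rank+1): earlier answers get more weight.
    weights = [q + position_bias / (i + 1) for i, q in enumerate(qualities)]
    counts = [0] * n
    for _ in range(trials):
        pick = rng.choices(range(n), weights=weights)[0]
        counts[pick] += 1
    return [c / trials for c in counts]

# A mediocre answer listed first can outdraw a better answer listed last,
# decoupling user choice from intrinsic quality.
shares = simulate_choices([0.4, 0.5, 0.9])
```

Under these assumed parameters, the first-position answer (quality 0.4) is chosen more often than the highest-quality answer in third position, mirroring the decoupling effect the abstract reports.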