Open Moderation for Democratic Fact Checking

Abstract

We consume a lot of information on the web, yet it is relatively hard to know whether what we read is true and unbiased. Mainstream content-sharing platforms like YouTube implement recommendation algorithms that ensure users are most likely to agree with what they see. Content that contains controversial claims often suffers from sampling bias that skews its like-dislike ratio, which makes such platforms largely useless for determining whether something is true. It is also easy to cherry-pick facts that favour one side of an issue. This work presents a solution that mitigates these problems through open moderation, a simple variation of liquid democracy with linking capabilities built on top, and a Google Chrome extension that shows the relevant information when needed. The target users are people who value truth more than hearing what they already believe.
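To illustrate the liquid-democracy component mentioned above, the following is a minimal sketch of transitive vote delegation: a participant either votes directly on a claim or delegates their vote to another participant, and delegation chains are followed until a direct vote (or a cycle) is reached. All names and data structures here are illustrative assumptions, not the system's actual implementation.

```python
from collections import Counter

def tally(direct_votes, delegations):
    """Tally votes under liquid democracy (illustrative sketch).

    direct_votes: voter -> verdict (e.g. 'true' / 'false')
    delegations:  voter -> voter they delegate to
    """
    results = Counter()
    voters = set(direct_votes) | set(delegations)
    for voter in voters:
        seen = set()
        current = voter
        # Follow the delegation chain until a direct vote is found;
        # a cycle with no direct vote means the vote is uncounted.
        while current in delegations and current not in direct_votes:
            if current in seen:
                current = None  # delegation cycle: vote is lost
                break
            seen.add(current)
            current = delegations[current]
        if current in direct_votes:
            results[direct_votes[current]] += 1
    return results

votes = {"alice": "true", "bob": "false"}
delegs = {"carol": "alice", "dave": "carol", "erin": "bob"}
print(tally(votes, delegs))  # Counter({'true': 3, 'false': 2})
```

In this sketch a direct vote always overrides an outgoing delegation, and cyclic delegations are discarded; a real deployment would also need to handle vote weighting and revocation, which are beyond the scope of this illustration.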

Similar works