Bursting the Filter Bubble: Democracy, Design, and Ethics
Online web services such as Google and Facebook have started using personalization algorithms. Because these algorithms customize information per user, two users who issue the same search query or share the same friend list may get different results. Online services argue that personalization lets them show the most relevant information to each user, thereby increasing user satisfaction. Critics, however, argue that the opaque filters these services use will show users only agreeable political viewpoints, so that users are never challenged by opposing perspectives. Since users are already biased toward seeking like-minded perspectives, viewpoint diversity will diminish and users may become trapped in a “filter bubble”. This is undesirable under almost all models of democracy. In this thesis we first analyzed the filter bubble phenomenon conceptually, identifying the internal processes and factors in online web services that might cause filter bubbles. We then analyzed the issue empirically. We first studied existing metrics from viewpoint diversity research in the computer science literature and extended them with a new metric, minority access, drawn from media and communication studies. After conducting an empirical study of Dutch and Turkish Twitter users, we showed that minorities cannot reach a large percentage of users in the Turkish Twittersphere. We also analyzed software tools and design attempts to combat filter bubbles, and showed that almost all of these tools implement norms required by two popular democracy models. We argue that democracy is an essentially contested concept, and that less popular democracy models should also be included in the design of such tools.
Values, Technology and Innovation; Technology, Policy and Management
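The abstract does not spell out how the viewpoint diversity metrics are computed. As a rough illustration only: Shannon entropy is one common way the computer science literature quantifies diversity over a set of labels, and a minority-access score can be sketched as the fraction of the user population that minority content reaches. The function names and the exact definition of minority access below are hypothetical, not the thesis's own formulation:

```python
from collections import Counter
from math import log2

def shannon_diversity(viewpoints):
    """Shannon entropy over viewpoint labels: higher means more diverse."""
    counts = Counter(viewpoints)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def minority_access(reached_users, all_users):
    """Sketch: fraction of the whole user population reached by minority content."""
    return len(set(reached_users)) / len(set(all_users))

# A timeline dominated by one viewpoint scores low; an even split scores high.
print(shannon_diversity(["A", "A", "A", "B"]))  # below 1 bit
print(shannon_diversity(["A", "B", "A", "B"]))  # 1.0 bit, maximal for two viewpoints
print(minority_access(reached_users=[1, 2], all_users=[1, 2, 3, 4]))  # 0.5
```

A low `minority_access` value would correspond to the thesis's empirical finding that minorities cannot reach a large share of the Turkish Twittersphere, though the actual study's operationalization may differ.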
Values in the filter bubble Ethics of Personalization Algorithms in Cloud Computing
Cloud services such as Facebook and Google search have started to use personalization algorithms to deal with the growing amount of data online, often with the aim of reducing “information overload”. A user’s interactions with the system are recorded under a single identity, and information is personalized for the user on the basis of that identity. However, as we argue, such filters often ignore the context of information and are never value neutral. These algorithms operate without the control or knowledge of the user, leading to a “filter bubble”. In this paper, building on existing philosophical work, we discuss three human values implicated in personalized filtering: autonomy, identity, and transparency.
Values and Technology; Technology, Policy and Management
Analyzing viewpoint diversity in Twitter
Information diversity has a long tradition in human history. Recently, there have been claims that the diversity of information available in social networks is diminishing. On the other hand, some studies suggest that diversity is actually quite high in social networks such as Twitter. However, these studies focus only on the concept of source diversity and consider only American users. In this paper we analyze different dimensions of diversity. We also provide an experimental design for an empirical study comparing different concepts of diversity for Twitter users from different countries.
Values and Technology; Technology, Policy and Management