Characterising User Content on a Multi-lingual Social Network
Social media has been at the vanguard of political information diffusion in
the 21st century. Most studies of disinformation, political
influence and fake news focus on mainstream social media platforms. This has
inevitably made English an important factor in our current understanding of
political activity on social media. As a result, there have been only a limited
number of studies covering large portions of the world, including the largest
multilingual and multicultural democracy: India. In this paper we present our
characterisation of a multilingual social network in India called ShareChat. We
collect an exhaustive dataset across 72 weeks before and during the Indian
general elections of 2019, across 14 languages. We investigate the
cross-lingual dynamics by clustering visually similar images together and exploring
how they move across language barriers. We find that Telugu, Malayalam, Tamil
and Kannada languages tend to be dominant in soliciting political images (often
referred to as memes), and posts from Hindi have the largest cross-lingual
diffusion across ShareChat (as well as images containing text in English). In
the case of images containing text that cross language barriers, we see that
language translation is used to widen the accessibility. That said, we find
cases where the same image is associated with very different text (and
therefore meanings). This initial characterisation paves the way for more
advanced pipelines to understand the dynamics of fake and political content in
a multi-lingual and non-textual setting.Comment: Accepted at ICWSM 2020, please cite the ICWSM versio
Wikipedia and Westminster: Quality and Dynamics of Wikipedia Pages about UK Politicians
Wikipedia is a major source of information providing a large variety of
content online, trusted by readers from around the world. Readers go to
Wikipedia to get reliable information about different subjects, one of the most
popular being living people, and especially politicians. While a lot is known
about the general usage and information consumption on Wikipedia, less is known
about the life-cycle and quality of Wikipedia articles in the context of
politics. The aim of this study is to quantify and qualify content production
and consumption for articles about politicians, with a specific focus on UK
Members of Parliament (MPs). First, we analyze spatio-temporal patterns of
readers' and editors' engagement with MPs' Wikipedia pages, finding huge peaks
of attention during election times, related to signs of engagement on other
social media (e.g. Twitter). Second, we quantify editors' polarisation and find
that most editors specialize in a specific party and choose specific news
outlets as references. Finally, we observe that the average citation quality is
relatively high, with statements in the 'Early life and career' sections missing
citations most often (18%).

Comment: A preprint of the publication accepted at the 31st ACM Conference on
Hypertext and Social Media (HT'20).