
    Navigating the Post-Truth Era: Trust, Misinformation, and Credibility Assessment on Online Social Media

    As access to news is increasingly mediated through social media platforms, there are rising concerns about citizens’ ability to evaluate online information and detect potentially misleading items. While many studies have reported on how people assess the credibility of information, few have examined the processes involved in evaluating information online and people’s decisions to trust and share that information with others. This paper reports on the first part of a three-phase study that aimed to gain an in-depth understanding of citizens’ practices and needs in assessing the credibility of information shared online and to co-create solutions to address this problem. Data were collected from three European countries through a survey on misinformation perceptions, focus groups, follow-up individual interviews, and co-creation activities with three stakeholder groups. The data were analyzed qualitatively, primarily using a grounded theory approach. Results from the citizens’ stakeholder group indicate that personal biases, emotions, time constraints, and a lack of supporting technologies impact the credibility assessment of online news. Study participants also discussed the need for increased media literacy actions, especially for youth. Based on these preliminary findings, we argue that a diversified approach is needed to support citizens’ resilience against the spread of misinformation.

    The Role of Epistemic Emotions During Engagement with Online Information Encounters

    This exploratory case study aimed to understand to what extent epistemic emotions are connected to effortful engagement when young adults were incidentally exposed to textual information snippets on a simulated social media timeline. Using discourse analysis, we draw on think-aloud data from fifteen young adults to identify participants' emotional reactions to different textual snippets and to examine when these might prompt effortful engagement, as indicated by reliance on epistemic beliefs, source evaluation, evidence evaluation, or science literacy. Our findings provide two important insights. First, during online information encounters, frustration, curiosity, and confusion co-occurred with effortful engagement and were triggered, in particular, by posts that included negative words. Second, epistemic engagement occurred even when boredom was verbalized first and was followed by frustration or confusion. We discuss the implications of this work for informal and formal learning environments.

    Widening Participation to Events that Encourage Teenagers to take an Interest in Computer Science

    Many developed economies across the world are confronted with a significant challenge: a shortage of workers with programming skills. The fact that fewer women choose to study computer science further aggravates this challenge. Multiple national and international efforts have been initiated to tackle the issue by highlighting the value of programming skills and thereby encouraging high-school students to consider computer science as a career path. This paper discusses the motivation, challenges, and lessons learned from organising events that encourage teenagers to take an interest in computer science, as well as findings related to gender balance in computing professions.

    Social Media Use, Trust and Technology Acceptance: Investigating the Effectiveness of a Co-Created Browser Plugin in Mitigating the Spread of Misinformation on Social Media

    Social media have become online spaces where misinformation abounds and spreads virally in the absence of professional gatekeeping. This information landscape requires everyday citizens, who rely on these technologies to access information, to cede control over that information. This work examined whether such control can be regained by humans with the support of a co-created browser plugin that integrated credibility labels and nudges and was informed by artificial intelligence models and rule engines. Given the literature on the complexity of information evaluation on social media, we investigated the role of technological, situational, and individual characteristics in “liking” or “sharing” misinformation. We adopted a mixed-methods research design with 80 participants from four European sites, who viewed a curated timeline of credible and non-credible posts on Twitter, either with (n=40) or without (n=40) the plugin. The role of the technological intervention was important: the absence of the plugin strongly correlated with misinformation endorsement (via “liking”). Trust in the technology and technology acceptance were correlated and emerged as important situational characteristics, with participants with higher trust profiles being less likely to share misinformation. Findings on individual characteristics indicated that only social media use was a significant predictor of trust in the plugin. This work extends ongoing research on deterring the spread of misinformation by situating the findings in an authentic social media environment using a co-created technological intervention. It holds implications for how to support a misinformation-resilient citizenry with the use of artificial intelligence-driven tools.

    Combating misinformation online: re-imagining social media for policy-making

    Social media have created communication channels between citizens and policymakers but are also susceptible to rampant misinformation. This new context demands new social media policies that can aid policymakers in making evidence-based decisions for combating misinformation online. This paper reports on data collected from policymakers in Austria, Greece, and Sweden through focus groups and in-depth interviews. The analyses provide insights into policymakers’ challenges and identify four important themes for supporting policy-making aimed at combating misinformation: a) creating a trusted network of experts and collaborators, b) facilitating the validation of online information, c) providing access to visualisations of data at different levels of granularity, and d) increasing the transparency and explainability of flagged misinformative content. These recommendations have implications for rethinking how revised social media policies can contribute to evidence-based decision-making.

    Trust in scientists and their role in society across 67 countries

    Scientific information is crucial for evidence-based decision-making. Public trust in science can help decision-makers act on the best available evidence, especially during crises such as climate change or the COVID-19 pandemic. However, in recent years the epistemic authority of science has been challenged, raising concerns about low public trust in scientists. Here we interrogate these concerns with a pre-registered 67-country survey of 71,417 respondents on all inhabited continents and find that in most countries a majority of the public trust scientists and think that scientists should be more engaged in policymaking. We further show that there is a discrepancy between the public’s perceived and desired priorities of scientific research. Moreover, we find variations between and within countries, which we explain with individual- and country-level variables, including political orientation. While these results do not show a widespread lack of trust in scientists, we cannot discount the concern that a lack of trust among even a small minority may affect the consideration of scientific evidence in policymaking. These findings have implications for scientists and policymakers seeking to maintain and increase trust in scientists.