Algorithms, Automation, and News
This special issue examines the growing importance of algorithms and automation in the gathering, composition, and distribution of news. It connects a long line of research on journalism and computation with scholarly and professional terrain yet to be explored. Taken as a whole, these articles share some of the noble ambitions of the pioneering publications on ‘reporting algorithms’, such as a desire to see computing help journalists in their watchdog role by holding power to account. However, they also go further. Firstly, they address the fuller range of technologies that computational journalism now consists of: from chatbots and recommender systems to artificial intelligence and atomised journalism. Secondly, they advance the literature by demonstrating the increased variety of uses for these technologies, including engaging underserved audiences, selling subscriptions, and recombining and re-using content. Thirdly, they problematize computational journalism by, for example, pointing out some of the challenges inherent in applying AI to investigative journalism and in trying to preserve public service values. Fourthly, they offer suggestions for future research and practice, including by presenting a framework for developing democratic news recommenders and another that may help us think about computational journalism in a more integrated, structured manner.
Computational Journalism
This chapter provides a summary of, and commentary on, academic studies of computational journalism. It considers computational journalism to be the advanced application of computing, algorithms, and automation to the gathering, evaluation, composition, presentation, and distribution of news. As will be shown, the focus of computational journalism’s literature has broadened over time. An initial emphasis on searching for and analyzing data as part of investigative journalism endeavors has faded as automated news writing, novel forms of interactive news presentation, and personalized news distribution have been addressed. There has also been growing critical engagement, tempering the early, broadly optimistic analyses with more realistic assessments of computation’s effects on the practice of journalism, its content, and its reception. The chapter ends with a discussion of how the literature is evolving, addressing new practices, such as “sensor journalism” and interactive chatbots, and questioning whether computational journalism’s technical essence has been adequately addressed by the sociological contributions to its current corpus.
Computational fact checking from knowledge networks
Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem, this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation.
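The shortest-path idea above can be illustrated with a small, self-contained toy. Everything here is hypothetical: the graph, the degree-based edge cost, and the `truth_score` function are illustrative stand-ins for the paper's semantic proximity metrics, not its actual method. The one idea the sketch does carry over is that paths through very generic, high-degree concepts should count for less than paths through specific ones.

```python
import heapq
import math

# Toy knowledge graph as an adjacency list (hypothetical triples).
# Nodes are concepts; an edge means two concepts are directly linked.
GRAPH = {
    "Barack Obama": ["United States", "Hawaii", "Harvard"],
    "United States": ["Barack Obama", "Hawaii", "Washington"],
    "Hawaii": ["Barack Obama", "United States"],
    "Harvard": ["Barack Obama"],
    "Washington": ["United States"],
}

def degree(node):
    return len(GRAPH.get(node, []))

def truth_score(source, target):
    """Score the claim 'source is related to target' by the cheapest path,
    where entering a generic (high-degree) node v costs log(1 + degree(v)).
    Dijkstra's algorithm finds the minimum-cost path."""
    best = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == target:
            # Convert path cost to a proximity score in (0, 1].
            return 1.0 / (1.0 + cost)
        if cost > best.get(node, float("inf")):
            continue  # stale heap entry
        for nxt in GRAPH.get(node, []):
            new_cost = cost + math.log(1.0 + degree(nxt))
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(heap, (new_cost, nxt))
    return 0.0  # no connecting path: no support for the claim

# A directly linked pair scores higher than a claim that must route
# through several intermediate hub nodes.
direct = truth_score("Barack Obama", "Hawaii")
indirect = truth_score("Harvard", "Washington")
assert direct > indirect
```

On the real Wikipedia-derived graph the same mechanism rewards short, specific paths, which is what lets true statements consistently outscore false ones in the paper's evaluation.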
Incremental Discovery of Prominent Situational Facts
We study the novel problem of finding new, prominent situational facts, which are emerging statements about objects that stand out within certain contexts. Many such facts are newsworthy: e.g., an athlete's outstanding performance in a game, or a viral video's impressive popularity. Effective and efficient identification of these facts assists journalists in reporting, one of the main goals of computational journalism. Technically, we consider an ever-growing table of objects with dimension and measure attributes. A situational fact is a "contextual" skyline tuple that stands out against historical tuples in a context, specified by a conjunctive constraint involving dimension attributes, when a set of measure attributes are compared. New tuples are constantly added to the table, reflecting events happening in the real world. Our goal is to discover constraint-measure pairs that qualify a new tuple as a contextual skyline tuple, and to discover them quickly, before the event becomes yesterday's news. A brute-force approach requires exhaustive comparison with every tuple, under every constraint, and in every measure subspace. We design algorithms in response to these challenges using three corresponding ideas: tuple reduction, constraint pruning, and sharing computation across measure subspaces. We also adopt a simple prominence measure to rank the discovered facts when they are numerous. Experiments over two real datasets validate the effectiveness and efficiency of our techniques.
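The definition of a contextual skyline tuple can be made concrete with a brute-force check, i.e., the very baseline the paper's algorithms (tuple reduction, constraint pruning, shared computation) are designed to avoid. The table, attribute names, and data below are hypothetical stand-ins.

```python
# Toy table: each row has dimension attributes (player, season) and
# measure attributes (points, rebounds). Entirely hypothetical data.
HISTORY = [
    {"player": "A", "season": 2019, "points": 30, "rebounds": 10},
    {"player": "B", "season": 2019, "points": 42, "rebounds": 7},
    {"player": "A", "season": 2020, "points": 25, "rebounds": 12},
]

def dominates(t, u, measures):
    """t dominates u if t >= u in every measure and t > u in at least one."""
    return (all(t[m] >= u[m] for m in measures)
            and any(t[m] > u[m] for m in measures))

def is_contextual_skyline(new_tuple, history, constraint, measures):
    """Brute-force check: is new_tuple a skyline point within the context
    selected by a conjunctive equality constraint on dimension attributes?"""
    context = [u for u in history
               if all(u[d] == v for d, v in constraint.items())]
    return not any(dominates(u, new_tuple, measures) for u in context)

new = {"player": "C", "season": 2019, "points": 35, "rebounds": 11}
# Within season 2019, no historical row beats the new row on BOTH
# points and rebounds, so it qualifies as a situational fact there.
print(is_contextual_skyline(new, HISTORY, {"season": 2019},
                            ["points", "rebounds"]))  # True
```

The combinatorial blow-up is visible even here: a full scan would repeat this check for every conjunctive constraint over the dimension attributes and every subset of the measure attributes, which is what motivates the paper's pruning techniques.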
Automated Fact Checking in the News Room
Fact checking is an essential task in journalism; its importance has been highlighted by recently increased concerns about, and efforts in combating, misinformation. In this paper, we present an automated fact-checking platform which, given a claim, retrieves relevant textual evidence from a document collection, predicts whether each piece of evidence supports or refutes the claim, and returns a final verdict. We describe the architecture of the system and the user interface, focusing on the choices made to improve its user-friendliness and transparency. We conduct a user study of the fact-checking platform in a journalistic setting: we integrated it with a collection of news articles and evaluated the platform using feedback from journalists in their workflow. We found that the predictions of our platform were correct 58% of the time, and 59% of the returned evidence was relevant.
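The retrieve-classify-aggregate pipeline described in this abstract can be sketched end to end. This is a minimal stand-in, not the paper's platform: retrieval is plain word overlap and the "stance classifier" is a one-line negation heuristic, where the real system would use trained models for both stages.

```python
import re

def tokens(text):
    """Lower-case word set; a stand-in for real text preprocessing."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(claim, sentences, k=3):
    """Rank candidate evidence sentences by word overlap with the claim
    (stand-in for the platform's retrieval component)."""
    claim_words = tokens(claim)
    scored = sorted(sentences, key=lambda s: -len(claim_words & tokens(s)))
    return [s for s in scored[:k] if claim_words & tokens(s)]

def classify(claim, evidence):
    """Toy stance detector: negation flips the label (stand-in for a
    trained supports/refutes model)."""
    return "refutes" if "not" in tokens(evidence) else "supports"

def verdict(claim, sentences):
    """Aggregate per-evidence stances into a final verdict by majority."""
    evidence = retrieve(claim, sentences)
    if not evidence:
        return "not enough info", []
    labels = [classify(claim, e) for e in evidence]
    majority = max(set(labels), key=labels.count)
    return majority, list(zip(evidence, labels))

docs = [
    "The Eiffel Tower is located in Paris.",
    "The Eiffel Tower is not located in Rome.",
    "The tower stands in Paris, France.",
    "Berlin is the capital of Germany.",
]
label, evidence = verdict("The Eiffel Tower is located in Paris", docs)
print(label)  # "supports": two of the three retrieved sentences agree
```

Returning the per-evidence labels alongside the verdict mirrors the transparency goal the paper emphasises: journalists can inspect which sentences drove the decision rather than trusting an opaque final label.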
Computational journalism in the UK newsroom
As new forms of multimedia, data-driven storytelling are produced by news organisations around the world, programming skills are increasingly required in newsrooms to conduct data analysis and create interactive tools and news apps. This has prompted some universities to combine journalism courses with computing skills, and there is much hype about the emergence of hybrid programmer-journalists, journo-coders, and journo-devs who are equally proficient at writing code and copy. To date, most of the academic research into computational journalism in the newsroom has been restricted to the United States, where studies suggest a model whereby the roles of journalist and programmer are merged. There is, therefore, a need to identify the extent to which this organisational model is replicated in newsrooms in other parts of the world. This paper is an exploratory study of two news organisations in the UK – the BBC and the Financial Times – investigating the extent to which journalism skills and programming skills are being combined and the different professional identities being created. It finds that journalists and programmers are considered two distinct professions, and the idea of a hybrid role is rejected by the newsroom staff interviewed. A new model is identified whereby teams consisting of journalists, programmers, and designers work closely together on interactive, data-driven projects. These findings are valuable to journalism educators in that they identify the technical skills and attitudes required by journalists working on innovative storytelling formats.
Social Epistemology as a New Paradigm for Journalism and Media Studies
Journalism and media studies lack robust theoretical concepts for studying journalistic knowledge generation. More specifically, conceptual challenges attend the emergence of big data and algorithmic sources of journalistic knowledge. A family of frameworks apt to this challenge is provided by “social epistemology”: a young philosophical field which regards society’s participation in knowledge generation as inevitable. Social epistemology offers the best of both worlds for journalists and media scholars: a thorough familiarity with biases and failures of obtaining knowledge, and a strong orientation toward best practices in the realm of knowledge-acquisition and truth-seeking. This paper articulates the lessons of social epistemology for two central nodes of knowledge-acquisition in contemporary journalism: human-mediated knowledge and technology-mediated knowledge.