
    Computational fact checking from knowledge networks

    Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem, this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information, using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation.
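
    The path-based scoring idea lends itself to a compact implementation. Below is a minimal sketch, assuming an undirected concept graph and a degree-penalised path cost in the spirit of the paper: a claim scores 1 when subject and object are directly linked, and lower when the only connecting paths run through generic, high-degree hub nodes. The graph contents and entity names are toy illustrations, not the Wikipedia-derived knowledge graph the authors use.

```python
# A minimal sketch of shortest-path fact checking on a concept graph.
# Assumption: proximity is penalised by the log-degree of intermediate
# nodes, so specific links score higher than detours through hubs.
import math
import networkx as nx

def semantic_proximity(G, subject, obj):
    """Truth score in (0, 1]: 1 for a direct edge, lower when the
    connecting path must pass through high-degree hub concepts."""
    if not nx.has_path(G, subject, obj):
        return 0.0
    # Dijkstra with edge weight log(deg(v)) minimises the summed
    # log-degree of visited nodes; the target's own term is the same
    # for every path, so this also minimises the intermediate penalty.
    weight = lambda u, v, d: math.log(G.degree(v))
    path = nx.shortest_path(G, subject, obj, weight=weight)
    penalty = sum(math.log(G.degree(v)) for v in path[1:-1])
    return 1.0 / (1.0 + penalty)

# Toy graph: a directly linked claim outscores one routed via a hub.
G = nx.Graph()
G.add_edges_from([
    ("Barack Obama", "Honolulu"),        # direct, specific link
    ("Barack Obama", "United States"),   # hub with many neighbours
    ("United States", "Chicago"),
    ("United States", "Texas"),
    ("United States", "Alaska"),
])
print(semantic_proximity(G, "Barack Obama", "Honolulu"))  # 1.0
print(semantic_proximity(G, "Barack Obama", "Chicago"))   # ~0.42, via hub
```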

    Incremental Discovery of Prominent Situational Facts

    We study the novel problem of finding new, prominent situational facts: emerging statements about objects that stand out within certain contexts. Many such facts are newsworthy, e.g., an athlete's outstanding performance in a game or a viral video's impressive popularity. Effective and efficient identification of these facts assists journalists in reporting, one of the main goals of computational journalism. Technically, we consider an ever-growing table of objects with dimension and measure attributes. A situational fact is a "contextual" skyline tuple that stands out against historical tuples in a context, specified by a conjunctive constraint involving dimension attributes, when a set of measure attributes is compared. New tuples are constantly added to the table, reflecting events happening in the real world. Our goal is to discover constraint-measure pairs that qualify a new tuple as a contextual skyline tuple, and to discover them quickly, before the event becomes yesterday's news. A brute-force approach requires exhaustive comparison with every tuple, under every constraint, and in every measure subspace. We design algorithms in response to these challenges using three corresponding ideas: tuple reduction, constraint pruning, and sharing computation across measure subspaces. We also adopt a simple prominence measure to rank the discovered facts when they are numerous. Experiments over two real datasets validate the effectiveness and efficiency of our techniques.
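
    The underlying test is easy to state in code. Below is a brute-force sketch of the contextual-skyline check, assuming tuples are flat records, a context is a conjunctive equality constraint on dimension attributes, and larger measure values are better; the attribute names are hypothetical. The paper's contribution is precisely avoiding this exhaustive comparison, so the sketch shows only the semantics, not the optimised algorithms.

```python
# Brute-force contextual-skyline test: a new tuple qualifies iff it
# matches the context constraint and no earlier constraint-matching
# tuple dominates it in the chosen measure subspace.
def dominates(a, b, measures):
    """True if a dominates b: >= on every measure, > on at least one."""
    return (all(a[m] >= b[m] for m in measures)
            and any(a[m] > b[m] for m in measures))

def is_contextual_skyline(new_tuple, history, constraint, measures):
    if any(new_tuple[d] != v for d, v in constraint.items()):
        return False
    peers = [t for t in history
             if all(t[d] == v for d, v in constraint.items())]
    return not any(dominates(t, new_tuple, measures) for t in peers)

history = [
    {"player": "A", "season": 2023, "points": 30, "rebounds": 12},
    {"player": "B", "season": 2023, "points": 25, "rebounds": 15},
]
new = {"player": "C", "season": 2023, "points": 28, "rebounds": 14}
# C is not dominated: A has fewer rebounds, B has fewer points.
print(is_contextual_skyline(new, history, {"season": 2023},
                            ["points", "rebounds"]))  # True
```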

    Automated Fact Checking in the News Room

    Fact checking is an essential task in journalism; its importance has been highlighted by recent concerns about misinformation and efforts to combat it. In this paper, we present an automated fact-checking platform which, given a claim, retrieves relevant textual evidence from a document collection, predicts whether each piece of evidence supports or refutes the claim, and returns a final verdict. We describe the architecture of the system and the user interface, focusing on the choices made to improve its user-friendliness and transparency. We conduct a user study of the fact-checking platform in a journalistic setting: we integrated it with a collection of news articles and evaluated the platform using feedback from journalists in their workflow. We found that the predictions of our platform were correct 58% of the time, and that 59% of the returned evidence was relevant.
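
    The abstract does not specify the platform's retrieval or stance models, but the claim-to-verdict flow can be sketched schematically. The snippet below uses TF-IDF cosine similarity as a stand-in retriever and a placeholder stance classifier; both would be replaced by the platform's trained components in a real system.

```python
# Schematic claim -> evidence -> verdict pipeline; the retrieval and
# stance models here are stand-ins, not the paper's actual components.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def retrieve_evidence(claim, documents, k=3):
    """Rank documents by TF-IDF cosine similarity to the claim."""
    vec = TfidfVectorizer().fit(documents + [claim])
    scores = cosine_similarity(vec.transform([claim]),
                               vec.transform(documents))[0]
    ranked = sorted(zip(scores, documents), reverse=True)
    return [doc for _, doc in ranked[:k]]

def classify_stance(claim, evidence):
    """Placeholder: a real system would use a trained entailment model
    returning SUPPORTS / REFUTES / NOT ENOUGH INFO. This naive word
    overlap is only a stub to make the pipeline runnable."""
    overlap = set(claim.lower().split()) & set(evidence.lower().split())
    return "SUPPORTS" if overlap else "NOT ENOUGH INFO"

def verdict(claim, documents):
    evidence = retrieve_evidence(claim, documents)
    stances = [classify_stance(claim, e) for e in evidence]
    supports, refutes = stances.count("SUPPORTS"), stances.count("REFUTES")
    label = ("TRUE" if supports > refutes
             else "FALSE" if refutes > supports else "UNVERIFIED")
    return label, list(zip(evidence, stances))

docs = ["The mayor announced cuts to the library budget.",
        "The city opened a new park downtown."]
print(verdict("The mayor cut the library budget", docs))
```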

    Computational journalism in the UK newsroom

    As new forms of multimedia, data-driven storytelling are produced by news organisations around the world, programming skills are increasingly required in newsrooms to conduct data analysis and create interactive tools and news apps. This has prompted some universities to combine journalism courses with computing skills, and there is much hype about the emergence of hybrid programmer-journalists, journo-coders, and journo-devs who are equally proficient at writing code and copy. To date, most of the academic research into computational journalism in the newsroom has been restricted to the United States, where studies suggest a model whereby the roles of journalist and programmer are merged. There is, therefore, a need to identify the extent to which this organisational model is replicated in newsrooms in other parts of the world. This paper is an exploratory study of two news organisations in the UK – the BBC and the Financial Times – investigating the extent to which journalism skills and programming skills are being combined and the different professional identities being created. This study finds that journalists and programmers are considered two distinct professions, and the idea of a hybrid role is rejected by the newsroom staff interviewed. A new model is identified whereby teams of journalists, programmers, and designers work closely together on interactive, data-driven projects. These findings are valuable to journalism educators in that they identify the technical skills and attitudes required by journalists working on innovative storytelling formats.

    Social Epistemology as a New Paradigm for Journalism and Media Studies

    Journalism and media studies lack robust theoretical concepts for studying journalistic knowledge generation. More specifically, conceptual challenges attend the emergence of big data and algorithmic sources of journalistic knowledge. A family of frameworks apt to this challenge is provided by “social epistemology”: a young philosophical field which regards society’s participation in knowledge generation as inevitable. Social epistemology offers the best of both worlds for journalists and media scholars: a thorough familiarity with biases and failures of obtaining knowledge, and a strong orientation toward best practices in the realm of knowledge acquisition and truth-seeking. This paper articulates the lessons of social epistemology for two central nodes of knowledge acquisition in contemporary journalism: human-mediated knowledge and technology-mediated knowledge.