39 research outputs found

    A Guide for Social Science Journal Editors on Easing into Open Science

    Journal editors have a large amount of power to advance open science in their respective fields by incentivising and mandating open policies and practices at their journals. The Data PASS Journal Editors Discussion Interface (JEDI, an online community for social science journal editors: www.dpjedi.org) has collated several resources on embedding open science in journal editing (www.dpjedi.org/resources). However, it can be overwhelming as an editor new to open science practices to know where to start. For this reason, we created a guide for journal editors on how to get started with open science. The guide outlines steps that editors can take to implement open policies and practices within their journal, and goes through the what, why, how, and worries of each policy and practice. This manuscript introduces and summarizes the guide (full guide: https://osf.io/hstcx).

    Do Participants Want Their Data to Be Shared?

    No full text
    Poster presented at SPSP 201

    Using Machine Learning To Explore Social Behavior In Large Image Datasets

    No full text
    Poster presented at SPSP 201

    Detecting Social Content in News Images of Politically Polarizing Events: A Machine Learning Approach

    No full text
    Poster presented at APS 201

    The Effects of Political Polarization on News Consumption and Trust

    No full text
    Poster presented at SPSP 201

    Julia G. Bottesini's Quick Files

    No full text
    The Quick Files feature was discontinued and its files were migrated into this Project on March 11, 2022. The file URLs will still resolve properly, and the Quick Files logs are available in the Project's Recent Activity.

    Research Practices in Psychology and How We Communicate About Them

    No full text
    This dissertation examines research practices, and the way we communicate about them, in parts of the research process that may not always be at the forefront of people's minds. When researchers recruit participants for their studies, do we ever wonder what they think about how we treat their data? In Chapter 1, I examined psychology research participants' opinions about (mostly) common research practices in psychology, including questionable research practices (QRPs; e.g., p-hacking, HARKing) and practices to increase transparency and replicability. After running a study, researchers then write it up as a manuscript, which is how most research gets communicated to relevant stakeholders. But do different groups of researchers communicate their findings differently? In Chapter 2, I investigated which groups of researchers might be more or less prone to hedging their conclusions in their research articles, a first step towards better understanding when and why researchers make strong claims about their findings. Finally, when findings get disseminated to the public, which research practices are being rewarded with media attention? In Chapter 3, I explored what information science journalists use when evaluating psychology findings' trustworthiness and newsworthiness. By examining these often-forgotten aspects of research practices and their consequences, I hope to encourage more research on how we do and communicate psychological science.

    Do participants care if we p-hack their data?

    No full text
    Poster presented at Metascience 201