6 research outputs found

    Crowdsourcing hypothesis tests: Making transparent how design choices shape research results

    To what extent are research results influenced by subjective decisions that scientists make as they design studies? Fifteen research teams independently designed studies to answer five original research questions related to moral judgments, negotiations, and implicit cognition. Participants from two separate large samples (total N > 15,000) were then randomly assigned to complete one version of each study. Effect sizes varied dramatically across different sets of materials designed to test the same hypothesis: materials from different teams rendered statistically significant effects in opposite directions for four out of five hypotheses, with the narrowest range in estimates being d = -0.37 to +0.26. Meta-analysis and a Bayesian perspective on the results revealed overall support for two hypotheses, and a lack of support for three hypotheses. Overall, practically none of the variability in effect sizes was attributable to the skill of the research team in designing materials, while considerable variability was attributable to the hypothesis being tested. In a forecasting survey, predictions of other scientists were significantly correlated with study results, both across and within hypotheses. Crowdsourced testing of research hypotheses helps reveal the true consistency of empirical support for a scientific claim.

    sj-docx-1-psp-10.1177_01461672231219391 – Supplemental material for Children Value Animals More Than Adults Do: A Conceptual Replication and Extension

    Supplemental material, sj-docx-1-psp-10.1177_01461672231219391 for Children Value Animals More Than Adults Do: A Conceptual Replication and Extension by Mariola Paruzel-Czachura, Maximilian Maier, Roksana Warmuz, Matti Wilks and Lucius Caviola in Personality and Social Psychology Bulletin

    Addressing climate change with behavioral science: A global intervention tournament in 63 countries

    Effectively reducing climate change requires marked, global behavior change. However, it is unclear which strategies are most likely to motivate people to change their climate beliefs and behaviors. Here, we tested 11 expert-crowdsourced interventions on four climate mitigation outcomes: beliefs, policy support, information sharing intention, and an effortful tree-planting behavioral task. Across 59,440 participants from 63 countries, the interventions' effectiveness was small, largely limited to nonclimate skeptics, and differed across outcomes: Beliefs were strengthened mostly by decreasing psychological distance (by 2.3%), policy support by writing a letter to a future-generation member (2.6%), and information sharing by negative emotion induction (12.1%). No intervention increased the more effortful behavior; several interventions even reduced tree planting. Last, the effects of each intervention differed depending on people's initial climate beliefs. These findings suggest that the impact of behavioral climate interventions varies across audiences and target behaviors.
