
    Many Labs 5: Testing pre-data-collection peer review as an intervention to increase replicability

    Replication studies in psychological science sometimes fail to reproduce prior findings. If these studies use methods that are unfaithful to the original study or ineffective in eliciting the phenomenon of interest, then a failure to replicate may be a failure of the protocol rather than a challenge to the original finding. Formal pre-data-collection peer review by experts may address shortcomings and increase replicability rates. We selected 10 replication studies from the Reproducibility Project: Psychology (RP:P; Open Science Collaboration, 2015) for which the original authors had expressed concerns about the replication designs before data collection; only one of these studies had yielded a statistically significant effect (p < .05). Commenters suggested that lack of adherence to expert review and low-powered tests were the reasons that most of these RP:P studies failed to replicate the original effects. We revised the replication protocols and received formal peer review prior to conducting new replication studies. We administered the RP:P and revised protocols in multiple laboratories (median number of laboratories per original study = 6.5, range = 3–9; median total sample = 1,279.5, range = 276–3,512) for high-powered tests of each original finding with both protocols. Overall, following the preregistered analysis plan, we found that the revised protocols produced effect sizes similar to those of the RP:P protocols (Δr = .002 or .014, depending on analytic approach). The median effect size for the revised protocols (r = .05) was similar to that of the RP:P protocols (r = .04) and the original RP:P replications (r = .11), and smaller than that of the original studies (r = .37). Analysis of the cumulative evidence across the original studies and the corresponding three replication attempts provided very precise estimates of the 10 tested effects and indicated that their effect sizes (median r = .07, range = .00–.15) were 78% smaller, on average, than the original effect sizes (median r = .37, range = .19–.50).
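
    As a rough illustration of how the closing comparison is computed, the short Python sketch below uses hypothetical per-study correlations chosen only to match the reported medians and ranges; it is not the actual Many Labs 5 data or analysis code.

        import numpy as np

        # Hypothetical effect sizes for the 10 effects -- NOT the Many Labs 5
        # data, only values consistent with the reported medians (.37, .07)
        # and ranges (.19-.50, .00-.15).
        original = np.array([.19, .24, .30, .34, .36, .38, .42, .45, .48, .50])
        replication = np.array([.00, .02, .03, .05, .06, .08, .10, .12, .14, .15])

        print(np.median(original))     # 0.37
        print(np.median(replication))  # 0.07

        # Per-effect proportional shrinkage, averaged across effects -- one
        # plausible reading of "78% smaller, on average" (these made-up
        # inputs land near 82% instead).
        print((1 - replication / original).mean())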

    Master of Science

    No full text
    Emerging research investigating psychophysiological and neurobiological indicators of posttraumatic stress disorder (PTSD) suggests that posttraumatic stress symptoms (PTSS) may be characterized by two distinct symptom patterns, termed overmodulation and undermodulation. However, little of this research has been conducted among youth, and none has directly investigated whether psychophysiological responses may serve as indicators of these forms of affect dysregulation. To address this gap in the literature, this study investigated whether the over- versus undermodulation patterns emerged among a sample of 822 detained adolescents (185 girls, 596 boys) and tested whether autonomic stress reactivity and recovery reliably corresponded to patterns of emotional over- and undermodulation. Among boys in the sample, three profiles emerged that largely reflected low PTSS, overmodulation, and undermodulation; among girls in the sample, only two profiles emerged, reflecting low PTSS and overmodulation. However, class membership for boys and girls was not associated with distinct patterns of physiological response. The current findings offer evidence that symptom patterns of over- and undermodulation may be distinguishable in youth but do not yet provide support for physiological indicators of these profiles.
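
    The abstract does not name the modeling approach or software used to derive these profiles; as a hedged sketch of the general technique (enumerating latent profiles and choosing the number of classes by model fit), here is an example using scikit-learn's GaussianMixture on simulated scores. Every value below is a placeholder, not the thesis's data or method.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Placeholder PTSS item scores for 822 adolescents on 8 items; the
        # thesis's real measures and sample are not reproduced here.
        X = rng.normal(size=(822, 8))

        # Fit 1-5 latent profiles and compare BIC -- a common way to decide
        # how many symptom profiles "emerge" (lower BIC = a better trade-off
        # between fit and complexity).
        for k in range(1, 6):
            gm = GaussianMixture(n_components=k, random_state=0).fit(X)
            print(k, round(gm.bic(X), 1))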

    Actually available, correct, usable, and complete materials.

    No full text
    Percentage of articles with materials reported available at an independent archive or personal website that were actually available, had correct materials, had usable materials, and had complete materials. Once Psychological Science started offering badges, some articles reported availability but did not earn a badge, and others reported availability and did earn a badge. These are represented separately. Total number of articles reported in data points. Underlying data (https://osf.io/8ds2g/) and scripts (https://osf.io/f7kqr/) to reproduce this figure are available on the Open Science Framework.

    Article coding scheme.

    No full text
    A visual illustration of the full coding scheme used to evaluate the availability of data and materials. This figure is available for download on the Open Science Framework at https://osf.io/kjsxv/.

    Actually available, correct, usable, and complete data.

    No full text
    Percentage of articles with data reported available at an independent archive or personal website that were actually available, had correct data, had usable data, and had complete data. Once Psychological Science started offering badges, some articles reported availability but either did not apply for a badge or did not earn one; others reported availability and did earn a badge. These are represented separately. Total number of articles reported in data points. Underlying data (https://osf.io/srgjb/) and scripts (https://osf.io/d78cf/) to reproduce this figure are available on the Open Science Framework.

    Reportedly available materials.

    No full text
    Percentage of articles reporting open materials, by half year and by journal. The darker line indicates Psychological Science, and the dotted red line marks when badges were introduced in Psychological Science but not in any of the comparison journals. Underlying data (https://osf.io/a29bt/) and scripts (https://osf.io/bdtnq/) to reproduce this figure are available on the Open Science Framework.

    Reportedly available data.

    No full text
    Percentage of articles reporting open data, by half year and by journal. The darker line indicates Psychological Science, and the dotted red line marks when badges were introduced in Psychological Science but not in any of the comparison journals. Underlying data (https://osf.io/a29bt/) and scripts (https://osf.io/bdtnq/) to reproduce this figure are available on the Open Science Framework.