
    Emerging Adulthood Measured at Multiple Institutions 2: The Data

    Collaborators from 32 academic institutions, primarily in the United States, collected data from emerging adults (N_raw = 4220, N_processed = 3134). Participants completed self-report measures assessing markers of adulthood, the IDEA inventory of dimensions of emerging adulthood, subjective well-being, mindfulness, belonging, self-efficacy, disability identity, somatic health, perceived stress, perceived social support, social media use, political affiliation, beliefs about the American dream, interpersonal transgressions, narcissism, interpersonal exploitativeness, beliefs about marriage, and demographics. The data are available at https://osf.io/qtqpb/, with details about the study and contributors at our main EAMMi2 page (https://osf.io/te54b/). These data may be used to examine new research questions, provide authentic research experiences for students, and provide demonstrations for research and statistics courses.

    Creative destruction in science

    Drawing on the concept of a gale of creative destruction in a capitalistic economy, we argue that initiatives to assess the robustness of findings in the organizational literature should aim to simultaneously test competing ideas operating in the same theoretical space. In other words, replication efforts should seek not just to support or question the original findings, but also to replace them with revised, stronger theories with greater explanatory power. Achieving this will typically require adding new measures, conditions, and subject populations to research designs in order to carry out conceptual tests of multiple theories in addition to directly replicating the original findings. To illustrate the value of the creative destruction approach for theory pruning in organizational scholarship, we describe recent replication initiatives re-examining culture and work morality, working parents' reasoning about day care options, and gender discrimination in hiring decisions.
    Significance statement
    It is becoming increasingly clear that many, if not most, published research findings across scientific fields are not readily replicable when the same method is repeated. Although extremely valuable, failed replications risk leaving a theoretical void: they reduce confidence that the original theoretical prediction is true without replacing it with positive evidence in favor of an alternative theory. We introduce the creative destruction approach to replication, which combines theory-pruning methods from the field of management with emerging best practices from the open science movement, with the aim of making replications as generative as possible. In effect, we advocate for a Replication 2.0 movement in which the goal shifts from checking the reliability of past findings to actively engaging in competitive theory testing and theory building.
    Scientific transparency statement
    The materials, code, and data for this article are posted publicly on the Open Science Framework, with links provided in the article.

    Estimating the reproducibility of psychological science

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals, using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and, if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
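    The replication metrics the abstract reports can be made concrete with a short sketch. The code below uses invented (original effect, replication effect, replication standard error) triples, not the actual Reproducibility Project data, and computes three of the abstract's quantities: the share of significant replications, the share of original effects falling inside the replication's 95% confidence interval, and the overall effect-size decline ratio.

    ```python
    # Hypothetical (original_effect, replication_effect, replication_se) triples;
    # illustrative numbers only, not the actual Reproducibility Project data.
    studies = [
        (0.50, 0.21, 0.10),
        (0.40, 0.35, 0.12),
        (0.30, 0.02, 0.09),
        (0.60, 0.33, 0.11),
    ]

    # Share of replications that are statistically significant (|z| > 1.96).
    sig_rate = sum(abs(rep / se) > 1.96 for _, rep, se in studies) / len(studies)

    # Share of original effects inside the replication's 95% confidence interval.
    ci_coverage = sum(
        rep - 1.96 * se <= orig <= rep + 1.96 * se
        for orig, rep, se in studies
    ) / len(studies)

    # Ratio of mean replication effect to mean original effect (the abstract
    # reports this ratio was roughly one half across the 100 studies).
    decline = sum(rep for _, rep, _ in studies) / sum(orig for orig, _, _ in studies)

    print(sig_rate, ci_coverage, decline)
    ```

    With real data each replication effect would come with its own standard error from the published analysis; the 1.96 cutoff corresponds to a two-sided test at the 5% level under a normal approximation.
    
    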

    Data from: Estimating the reproducibility of psychological science

    This record contains the underlying research data for the publication "Estimating the reproducibility of psychological science". The full text is available from https://ink.library.smu.edu.sg/lkcsb_research/5257.