
    Improving transparency and scientific rigor in academic publishing.

    Progress in basic and clinical research is slowed when researchers fail to provide a complete and accurate report of how a study was designed, executed, and analyzed. Publishing rigorous scientific research requires a full description of the methods, materials, procedures, and outcomes. Investigators may omit a complete description of how their study was designed and executed because they do not know how to report the information accurately, or because mechanisms are not in place to facilitate transparent reporting. Here, we provide an overview of how authors can write manuscripts in a transparent and thorough manner. We introduce a set of reporting criteria that can be used for publishing, including recommendations on reporting the experimental design and statistical approaches. We also discuss how to accurately visualize the results and provide recommendations for peer reviewers to enhance rigor and transparency. Incorporating transparency practices into research manuscripts will significantly improve the reproducibility of the results by independent laboratories.

    Estimating the reproducibility & transparency of smoking cessation behaviour change interventions

    Introduction: Activities promoting research reproducibility and transparency are crucial for generating trustworthy evidence. Evaluation of smoking interventions is one area where vested interests may motivate reduced reproducibility and transparency. Aims: To assess markers of transparency and reproducibility in reports evaluating smoking behaviour change interventions. Methods: One hundred evaluation reports of randomised controlled trials of smoking behaviour change interventions published in 2018-2019 were identified. Reproducibility markers (pre-registration, protocol sharing, data, materials and analysis script sharing, replication of a previous study, and open access publication) were coded in the identified reports. Transparency markers of funding and conflict of interest declarations were also coded. Coding was performed by two researchers, with inter-rater reliability calculated using Krippendorff's alpha. Results: Seventy-one percent of reports were open access and 73% were pre-registered. However, only 13% provided accessible materials, 7% accessible data, and 1% accessible analysis scripts. No reports were replication studies. Ninety-four percent of reports provided a funding source statement and 88% provided a conflict of interest statement. Conclusions: Open data, materials, analysis scripts, and replications are rare in smoking behaviour change intervention research, whereas funding source and conflict of interest declarations are common. Future smoking research should be more reproducible to enable knowledge accumulation.
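    The inter-rater reliability statistic used in this study, Krippendorff's alpha, compares the disagreement observed between coders against the disagreement expected by chance. The abstract does not include the underlying coding data, so the sketch below is a minimal, hypothetical implementation for the simplest case (two coders, nominal categories, no missing values); the function name and example ratings are illustrative, not taken from the study.

```python
from collections import Counter

def krippendorff_alpha_nominal(coder1, coder2):
    """Krippendorff's alpha for two coders, nominal data, no missing values."""
    assert len(coder1) == len(coder2)
    # Coincidence matrix: each coded unit contributes the ordered pairs
    # (a, b) and (b, a) of its two codes.
    o = Counter()
    for a, b in zip(coder1, coder2):
        o[(a, b)] += 1
        o[(b, a)] += 1
    n_total = sum(o.values())  # = 2 * number of coded units
    # Marginal frequency of each category in the coincidence matrix.
    n_c = Counter()
    for (a, _), count in o.items():
        n_c[a] += count
    # Observed disagreement: off-diagonal mass of the coincidence matrix.
    d_o = sum(count for (a, b), count in o.items() if a != b) / n_total
    # Expected disagreement under chance, from the category marginals.
    d_e = sum(n_c[a] * n_c[b]
              for a in n_c for b in n_c if a != b) / (n_total * (n_total - 1))
    return 1.0 if d_e == 0 else 1.0 - d_o / d_e

# Two coders agree on 4 of 5 hypothetical binary codings:
alpha = krippendorff_alpha_nominal([1, 1, 0, 0, 1], [1, 1, 0, 0, 0])
```

    An alpha of 1.0 indicates perfect agreement; values at or below 0 indicate agreement no better than chance.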

    A meta-review of transparency and reproducibility-related reporting practices in published meta-analyses on clinical psychological interventions (2000–2020)

    Meta-analysis is a powerful and important tool for synthesizing the literature on a research topic. Like other kinds of research, meta-analyses must be reproducible to comply with the principles of the scientific method. Furthermore, reproducible meta-analyses can be easily updated with new data and reanalysed with newer, more refined analysis techniques. We empirically assessed the prevalence of transparency and reproducibility-related reporting practices in published meta-analyses from clinical psychology by examining a random sample of 100 meta-analyses. Our purpose was to identify the key points that could be improved, with the aim of providing recommendations for carrying out reproducible meta-analyses. We conducted a meta-review of meta-analyses of psychological interventions published between 2000 and 2020, searching the PubMed, PsycInfo and Web of Science databases. A structured coding form to assess transparency indicators was created based on previous studies and existing meta-analysis guidelines. We found major issues concerning: incomplete reporting of fully reproducible search procedures, failure to specify the exact method used to compute effect sizes and the choice of weighting factors and estimators, lack of availability of the raw statistics used to compute the effect sizes, lack of interoperability of available data, and a near-total absence of analysis script sharing. Based on these findings, we conclude with recommendations intended to improve the transparency, openness, and reproducibility-related reporting practices of meta-analyses in clinical psychology and related areas. This research was funded by a grant from the Ministerio de Ciencia e Innovación and by FEDER funds (Project n° PID2019-104080GB-I00).
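    One of the reporting gaps flagged above is failure to specify how per-study effect sizes were weighted and pooled. As an illustration of why that choice matters (not the method of any study in the reviewed sample), the sketch below shows the common inverse-variance fixed-effect estimator; the function name and numbers are hypothetical.

```python
def fixed_effect_estimate(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled effect size.

    effects:   per-study effect sizes (e.g. standardized mean differences)
    variances: corresponding sampling variances
    Returns (pooled_effect, pooled_variance).
    """
    # Each study is weighted by the inverse of its sampling variance,
    # so larger, more precise studies contribute more to the pooled estimate.
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# Three hypothetical studies: the third (smallest variance) dominates.
pooled, pooled_var = fixed_effect_estimate([0.2, 0.5, 0.3], [0.04, 0.25, 0.01])
```

    A random-effects estimator would instead add a between-study variance component (e.g. the DerSimonian-Laird estimate) to each study's variance before weighting; reporting exactly which estimator was used is the kind of detail the meta-review asks authors to make explicit.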