
    A checklist is associated with increased quality of reporting preclinical biomedical research: A systematic review

    Irreproducibility of preclinical biomedical research has gained recent attention. It has been suggested that requiring authors to complete a checklist at the time of manuscript submission would improve the quality and transparency of scientific reporting and ultimately enhance reproducibility. Whether a checklist enhances quality and transparency in reporting preclinical animal studies, however, has not been empirically studied. Here we searched two highly cited life science journals, one that requires a checklist at submission (Nature) and one that does not (Cell), to identify in vivo animal studies. After screening 943 articles, a total of 80 articles were identified in 2013 (pre-checklist) and 2015 (post-checklist) and included for detailed evaluation of reported methodological and analytical information. We compared the quality of reporting of preclinical animal studies between the two journals, accounting for differences between journals and changes in reporting over time. We find that reporting of randomization, blinding, and sample-size estimation improved significantly more in Nature than in Cell from 2013 to 2015, likely due to implementation of the checklist. Specifically, improvement in reporting of these three methodological items was at least three times greater when a mandatory checklist was implemented than when it was not. Reporting of the sex of animals and of the number of independent experiments performed also improved from 2013 to 2015, likely from factors unrelated to the checklist. Our study demonstrates that completing a checklist at manuscript submission is associated with improved reporting of key methodological information in preclinical animal studies.

    Outline of the study.

    (A) Selection of articles: Twenty consecutive articles that met the inclusion criteria were selected from among those published beginning in January of 2013 and of 2015 in Nature (which implemented a pre-submission checklist) and Cell (which did not). These periods fall before and after the implementation of the checklist in May 2013. (B) Flow of the analysis: To examine whether the quality of reporting improved over time, the degree of key information reported in 2015 was compared with that in 2013 in both journals combined (Objective 1). To assess whether a checklist is associated with improved quality of reporting, we first compared the change over time observed in Nature (④ vs. ③). If there was a significant difference, we then compared 2015 vs. 2013 within Cell (② vs. ①) and Nature vs. Cell within 2013 (③ vs. ①) and within 2015 (④ vs. ②), to adjust for differences between journals and changes in reporting over time (Objective 2).
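    The comparison scheme above amounts to pairwise contrasts on counts of articles that did or did not report a given item in each journal-year group (①-④). Below is a minimal sketch of that scheme using Fisher's exact test on 2×2 tables; the counts are illustrative placeholders (20 articles per group, as in the selection design), not the study's data, and Python/scipy is used only as an example tool.

```python
# Sketch of the four contrasts (④ vs. ③, ② vs. ①, ③ vs. ①, ④ vs. ②) described above.
# Each group contributes a pair: (articles reporting the item, articles not reporting it).
from scipy.stats import fisher_exact

counts = {  # illustrative counts only
    "Cell 2013":   (2, 18),   # ①
    "Cell 2015":   (4, 16),   # ②
    "Nature 2013": (3, 17),   # ③
    "Nature 2015": (14, 6),   # ④
}

contrasts = [
    ("Nature 2015 vs. Nature 2013", "Nature 2015", "Nature 2013"),  # ④ vs. ③
    ("Cell 2015 vs. Cell 2013",     "Cell 2015",   "Cell 2013"),    # ② vs. ①
    ("Nature vs. Cell within 2013", "Nature 2013", "Cell 2013"),    # ③ vs. ①
    ("Nature vs. Cell within 2015", "Nature 2015", "Cell 2015"),    # ④ vs. ②
]

for label, a, b in contrasts:
    table = [list(counts[a]), list(counts[b])]        # 2x2 table: group x (reported, not reported)
    odds_ratio, p_value = fisher_exact(table)
    print(f"{label}: odds ratio = {odds_ratio:.2f}, P = {p_value:.3f}")
```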

    Distribution of reporting study designs across time.

    The distributions of reporting status are presented as stacked bar graphs. The numbers inside the stacks are the numbers of articles corresponding to each percentage. The data for 2013 and 2015 are the total numbers of articles assessed from Cell and Nature within a given year. A Fisher exact test was performed to assess the difference in reporting of each methodological item across time. Significant P values (< 0.05) are provided.
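    As a concrete reading of this figure, the test for one item compares the pooled 2013 and 2015 columns (Objective 1 in the outline). A minimal sketch of that single comparison, assuming the item is scored as reported / not reported and using illustrative counts (40 articles per year), not the study's data:

```python
# Fisher exact test for one methodological item, pooling both journals within each year.
# Counts below are hypothetical placeholders.
from scipy.stats import fisher_exact

reported_2013, not_reported_2013 = 5, 35    # out of 40 articles in 2013
reported_2015, not_reported_2015 = 18, 22   # out of 40 articles in 2015

table = [
    [reported_2013, not_reported_2013],
    [reported_2015, not_reported_2015],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.4f}")  # P < 0.05 would be flagged in the figure
```

    Items scored with more than two reporting categories (as in stacked bars with several levels) would need an R×2 exact test, for example fisher.test in R, since scipy's fisher_exact handles only 2×2 tables.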