<p>Abstract</p> <p>Background</p> <p>Citation counts are often regarded as a measure of the utilization and contribution of published articles. The objective of this study was to assess whether statistical reporting and statistical errors in the analysis of the primary outcome are associated with the number of citations received.</p> <p>Methods</p> <p>We evaluated all original research articles published in 1996 in four psychiatric journals. The statistical and reporting quality of each paper was assessed, and the number of citations received up to 2005 was obtained from the Web of Science database. We then examined whether the number of citations was associated with the quality of the statistical analysis and reporting.</p> <p>Results</p> <p>A total of 448 research papers were included in the citation analysis. Unclear or inadequate reporting of the research question and primary outcome was not statistically significantly associated with citation counts. After adjusting for journal, an extended description of statistical procedures had a positive effect on the number of citations received. Inappropriate statistical analysis did not affect the number of citations received. Adequate reporting of the primary research question, statistical methods, and primary findings was associated with journal visibility and prestige.</p> <p>Conclusion</p> <p>In this cohort of published research, measures of reporting quality and appropriateness of statistical analysis were not associated with the number of citations. The journal in which a study is published appears to be as important as the quality of statistical reporting in ensuring dissemination of published medical science.</p>