Journals’ instructions to authors: A cross-sectional study across scientific disciplines
In light of increasing calls for transparent reporting of research and prevention of detrimental research practices, we conducted a cross-sectional machine-assisted analysis of a representative sample of scientific journals’ instructions to authors (ItAs) across all disciplines. We investigated whether ItAs addressed 19 topics related to transparency in reporting and research integrity. Only three topics were addressed in more than one third of ItAs: conflicts of interest, plagiarism, and the type of peer review the journal employs. Health and Life Sciences journals, journals published by medium or large publishers, and journals registered in the Directory of Open Access Journals (DOAJ) were more likely to address many of the analysed topics, while Arts & Humanities journals were least likely to do so. Despite the recent calls for transparency and integrity in research, our analysis shows that most scientific journals need to update their ItAs to align them with practices that prevent detrimental research practices and ensure transparent reporting of research.
Transparency in conducting and reporting research: A survey of authors, reviewers, and editors across scholarly disciplines
Calls have been made for improving transparency in conducting and reporting research, improving work climates, and preventing detrimental research practices. To assess attitudes and practices regarding these topics, we sent a survey to authors, reviewers, and editors. We received 3,659 (4.9%) responses out of 74,749 delivered emails. We found no significant differences between authors’, reviewers’, and editors’ attitudes towards transparency in conducting and reporting research, or towards their perceptions of work climates. Undeserved authorship was perceived by all groups as the most prevalent detrimental research practice, while fabrication, falsification, plagiarism, and not citing prior relevant research were seen as more prevalent by editors than by authors or reviewers. Overall, 20% of respondents admitted sacrificing the quality of their publications for quantity, and 14% reported that funders interfered in their study design or reporting. While survey respondents came from 126 different countries, the survey’s overall low response rate means our results might not be generalizable. Nevertheless, the results indicate that greater involvement of all stakeholders is needed to align actual practices with current recommendations.
Systematic review and meta-analyses of studies analysing instructions to authors from 1987 to 2017
To gain insight into changes in scholarly journals’ recommendations, we conducted a systematic review of studies that analysed journals’ Instructions to Authors (ItAs). We summarised the results of 153 studies, and meta-analysed how often ItAs addressed: 1) authorship, 2) conflicts of interest, 3) data sharing, 4) ethics approval, 5) funding disclosure, and 6) the International Committee of Medical Journal Editors’ Uniform Requirements for Manuscripts. For each topic we found large between-study heterogeneity. We identified six factors that explained most of that heterogeneity: 1) time (addressing of topics generally increased over time), 2) country (large differences found between countries), 3) database indexation (large differences found between databases), 4) impact factor (topics were more often addressed in the highest than in the lowest impact factor journals), 5) discipline (topics were more often addressed in Health Sciences than in other disciplines), and 6) sub-discipline (topics were more often addressed in general than in sub-disciplinary journals).
Recommended from our members
The FAIR Guiding Principles for scientific data management and stewardship
There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders, representing academia, industry, funding agencies, and scholarly publishers, have come together to design and jointly endorse a concise and measurable set of principles that we refer to as the FAIR Data Principles. The intent is that these may act as a guideline for those wishing to enhance the reusability of their data holdings. Distinct from peer initiatives that focus on the human scholar, the FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals. This Comment is the first formal publication of the FAIR Principles, and includes the rationale behind them and some exemplar implementations in the community.