Biomedical supervisors' role modeling of open science practices
Supervision is one important way to socialize Ph.D. candidates into open and responsible research. We hypothesized that open science practices (here, publishing open access and sharing data) would be more likely in empirical publications that were part of a Ph.D. thesis when the candidate's supervisor engaged in these practices than when the supervisor did not, or did so less often. Starting from thesis repositories at four Dutch university medical centers, we included 211 pairs of supervisors and Ph.D. candidates, resulting in a sample of 2062 publications. We determined open access status using UnpaywallR and open data status using Oddpub, and additionally screened publications with potential open data statements manually. Eighty-three percent of our sample was published open access, and 9% had open data statements. Having a supervisor who published open access more often than the national average was associated with 1.99 times the odds of publishing open access, although this effect became nonsignificant when correcting for institution. Having a supervisor who shared data was associated with 2.22 (CI: 1.19-4.12) times the odds of sharing data compared to having a supervisor who did not; this odds ratio increased to 4.6 (CI: 1.86-11.35) after removing false positives. The prevalence of open data in our sample was comparable to international studies; open access rates were higher. Whilst Ph.D. candidates spearhead initiatives to promote open science, this study adds value by investigating the role of supervisors in promoting open science.
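The odds ratios and confidence intervals reported above can be illustrated with a minimal sketch of how such estimates are computed from a 2x2 table. The counts below are hypothetical examples, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR), then back-transform the interval
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20 of 100 candidates with data-sharing supervisors
# shared data themselves, vs 10 of 100 with non-sharing supervisors.
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
```

With these made-up counts the point estimate is (20*90)/(80*10) = 2.25, close in magnitude to the study's reported 2.22, and the Wald interval shows how a small number of events widens the CI, as in the study's 1.86-11.35 interval after removing false positives.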
Preregistering Qualitative Research: A Delphi Study
Preregistrations—records made a priori about study designs and analysis plans and placed in open repositories—are thought to strengthen the credibility and transparency of research. Various authors have put forth arguments in favor of introducing this practice in qualitative research and have made suggestions for what to include in a qualitative preregistration form. The goal of this study was to gauge and understand which parts of preregistration templates qualitative researchers would find helpful and informative. We used an online Delphi study design consisting of two rounds with feedback reports in between. In total, 48 researchers participated (response rate: 16%). In round 1, panelists considered 14 proposed items relevant to include in the preregistration form, but two items had relevance scores just below our predefined criterion (68%) with mixed arguments, and were put forth again. We combined items where possible, leading to 11 revised items. In round 2, panelists agreed on including the two remaining items. Panelists also converged on suggested terminology and elaborations, except for two terms for which they provided clear arguments. The result is an agreement-based form for the preregistration of qualitative studies that consists of 13 items. The form will be made available as a registration option on the Open Science Framework (osf.io). We believe it is important to ensure that the strength of qualitative research, which is its flexibility to adapt, adjust and respond, is not lost in preregistration. The preregistration should provide a systematic starting point.
Eleven strategies for making reproducible research and open science training the norm at research institutions
Across disciplines, researchers increasingly recognize that open science and reproducible research practices may accelerate scientific progress by allowing others to reuse research outputs and by promoting rigorous research that is more likely to yield trustworthy results. While initiatives, training programs, and funder policies encourage researchers to adopt reproducible research and open science practices, these practices are uncommon in many fields. Researchers need training to integrate these practices into their daily work. We organized a virtual brainstorming event, in collaboration with the German Reproducibility Network, to discuss strategies for making reproducible research and open science training the norm at research institutions. Here, we outline eleven strategies, concentrated in three areas: (1) offering training, (2) adapting research assessment criteria and program requirements, and (3) building communities. We provide a brief overview of each strategy, offer tips for implementation, and provide links to resources. Our goal is to encourage members of the research community to think creatively about the many ways they can contribute and collaborate to build communities, and make reproducible research and open science training the norm. Researchers may act in their roles as scientists, supervisors, mentors, instructors, and members of curriculum, hiring or evaluation committees. Institutional leadership and research administration and support staff can accelerate progress by implementing change across their institution.
Preregistering qualitative research
The threat to reproducibility and awareness of current rates of research misbehavior have sparked initiatives to improve academic science. One such initiative is the preregistration of quantitative research. We investigate whether the preregistration format could also be used to boost the credibility of qualitative research. A crucial distinction underlying preregistration is that between prediction and postdiction. In qualitative research, data are used to decide which way interpretation should move forward, using data to generate hypotheses and new research questions. Qualitative research is thus a real-life example of postdiction research. Some may object to the idea of preregistering qualitative studies because qualitative research generally does not test hypotheses, and because qualitative research design is typically flexible and subjective. We rebut these objections, arguing that making hypotheses explicit is just one feature of preregistration, that flexibility can be tracked using preregistration, and that preregistration would provide a check on subjectivity. We then contextualize preregistration alongside another initiative to enhance credibility in qualitative research: the confirmability audit. Moreover, preregistering qualitative studies is practically useful for combating dissemination bias and could incentivize qualitative researchers to report continuously on their study's development. We conclude with suggested modifications to the Open Science Framework preregistration form to tailor it to qualitative studies.
Perceived publication pressure in Amsterdam: Survey of all disciplinary fields and academic ranks
Publications determine to a large extent the possibility to stay in academia (“publish or perish”). While some pressure to publish may incentivise high-quality research, too much publication pressure is likely to have detrimental effects on both the scientific enterprise and individual researchers. Our research question was: what is the level of perceived publication pressure in the four academic institutions in Amsterdam, and does the pressure to publish differ between academic ranks and disciplinary fields? Surveying researchers in Amsterdam with the revised Publication Pressure Questionnaire, we find that a negative attitude towards the current publication climate is present across academic ranks and disciplinary fields. Postdocs and assistant professors (M = 3.42) perceive the greatest publication stress, and PhD students (M = 2.44) perceive a significant lack of resources to relieve publication stress. Results indicate the need for a healthier publication climate where the quality and integrity of research is rewarded.
Perceptions of research integrity climate differ between academic ranks and disciplinary fields: Results from a survey among academic researchers in Amsterdam
Breaches of research integrity have shocked the academic community. Initially, explanations were sought at the level of individual researchers, but over time the important role that the research integrity climate may play in influencing researchers' (mis)behavior became increasingly recognized. In this study we assess whether researchers from different academic ranks and disciplinary fields experience the research integrity climate differently. We sent an online questionnaire to academic researchers in Amsterdam using the Survey of Organizational Research Climate. Bonferroni-corrected mean differences showed that junior researchers (PhD students, postdocs, and assistant professors) perceive the research integrity climate more negatively than senior researchers (associate and full professors). Junior researchers note that their supervisors are less committed to talking about key research integrity principles than senior researchers report (MD = -.39, CI = -.55, -.24). PhD students perceive more competition and suspicion among colleagues (MD = -.19, CI = -.35, -.05) than associate and full professors. We found that researchers from the natural sciences overall express a more positive perception of the research integrity climate. Researchers from the social sciences and the humanities perceive less fairness in their departments' expectations for publishing and acquiring funding than researchers in the natural sciences and biomedical sciences (MD = -.44, CI = -.74, -.15; MD = -.36, CI = -.61, -.11). Results suggest that department leaders in the humanities and social sciences should do more to set fairer expectations for their researchers, and that senior scientists should ensure junior researchers are socialized into research integrity practices and foster a climate in their group where suspicion among colleagues has no place.
Personally perceived publication pressure: revising the Publication Pressure Questionnaire (PPQ) by using work stress models
Background: The emphasis on impact factors and the quantity of publications intensifies competition between researchers. This competition was traditionally considered an incentive to produce high-quality work, but it has unwanted side-effects, such as publication pressure. To measure the effect of publication pressure on researchers, the Publication Pressure Questionnaire (PPQ) was developed. Upon using the PPQ, some issues came to light that motivated a revision. Method: We constructed two new subscales based on work stress models using the facet method. We administered the revised PPQ (PPQr) to a convenience sample together with the Maslach Burnout Inventory (MBI) and the Work Design Questionnaire (WDQ). To assess which items best measured publication pressure, we carried out a principal component analysis (PCA). Reliability was considered sufficient when Cronbach’s alpha > 0.7. Finally, we administered the PPQr in a larger, independent sample of researchers to check the reliability of the revised version. Results: Three components were identified: ‘stress’, ‘attitude’, and ‘resources’. We selected 3 × 6 = 18 items with high loadings in the three-component solution. Based on the convenience sample, Cronbach’s alphas were 0.83 for stress, 0.80 for attitude, and 0.76 for resources. We checked the validity of the PPQr by inspecting the correlations with the MBI and the WDQ: stress correlated 0.62 with MBI’s emotional exhaustion, and resources correlated 0.50 with relevant WDQ subscales. To assess the internal structure of the PPQr in the independent reliability sample, we again conducted a principal component analysis; the three-component solution explains 50% of the variance. Cronbach’s alphas were 0.80, 0.78, and 0.75 for stress, attitude, and resources, respectively. Conclusion: We conclude that the PPQr is a valid and reliable instrument to measure publication pressure in academic researchers from all disciplinary fields. The PPQr strongly relates to burnout and could also be useful for policy makers and research institutions to assess the degree of publication pressure in their institute.
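The reliability criterion used above (Cronbach’s alpha > 0.7) can be made concrete with a minimal sketch; the item scores below are made up for illustration and are not PPQ data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    items: list of per-item score lists, all over the same respondents."""
    k = len(items)          # number of items in the scale
    n = len(items[0])       # number of respondents

    def var(xs):
        # Sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Three hypothetical items answered by four respondents; the items move
# together, so alpha comes out well above the 0.7 threshold.
items = [[1, 2, 3, 4],
         [2, 3, 4, 5],
         [1, 2, 4, 5]]
alpha = cronbach_alpha(items)
```

The formula implemented here is the standard one, k/(k-1) * (1 - sum of item variances / variance of total scores); subscales whose items vary together relative to the total score yield alpha closer to 1.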