6 research outputs found

    Eleven strategies for making reproducible research and open science training the norm at research institutions

    Get PDF
    Across disciplines, researchers increasingly recognize that open science and reproducible research practices may accelerate scientific progress by allowing others to reuse research outputs and by promoting rigorous research that is more likely to yield trustworthy results. While initiatives, training programs, and funder policies encourage researchers to adopt reproducible research and open science practices, these practices are uncommon in many fields. Researchers need training to integrate these practices into their daily work. We organized a virtual brainstorming event, in collaboration with the German Reproducibility Network, to discuss strategies for making reproducible research and open science training the norm at research institutions. Here, we outline eleven strategies, concentrated in three areas: (1) offering training, (2) adapting research assessment criteria and program requirements, and (3) building communities. We provide a brief overview of each strategy, offer tips for implementation, and provide links to resources. Our goal is to encourage members of the research community to think creatively about the many ways they can contribute and collaborate to build communities, and make reproducible research and open science training the norm. Researchers may act in their roles as scientists, supervisors, mentors, instructors, and members of curriculum, hiring, or evaluation committees. Institutional leadership and research administration and support staff can accelerate progress by implementing change across their institution.

    Approaches and Criteria for Provenance in Biomedical Data Sets and Workflows: Protocol for a Scoping Review

    No full text
    Background: Provenance supports the understanding of data genesis, and it is a key factor in ensuring the trustworthiness of digital objects containing (sensitive) scientific data. Provenance information contributes to a better understanding of scientific results and fosters collaboration on existing data as well as data sharing. This encompasses defining comprehensive concepts and standards for transparency and traceability, reproducibility, validity, and quality assurance during clinical and scientific data workflows and research. Objective: The aim of this scoping review is to investigate existing evidence regarding approaches and criteria for provenance tracking and to disclose current knowledge gaps in the biomedical domain. The review covers modeling aspects as well as metadata frameworks for meaningful and usable provenance information during the creation, collection, and processing of (sensitive) scientific biomedical data. It also covers the examination of quality aspects of provenance criteria. Methods: This scoping review will follow the methodological framework by Arksey and O'Malley. Relevant publications will be obtained by querying PubMed and Web of Science. All papers in English published between January 1, 2006, and March 23, 2021, will be included. Data retrieval will be accompanied by a manual search for grey literature. Potential publications will then be exported into a reference management software, and duplicates will be removed. Afterwards, the obtained set of papers will be transferred into a systematic review management tool. All publications will be screened, extracted, and analyzed: title and abstract screening will be carried out by 4 independent reviewers, and a majority vote is required for consensus on the eligibility of papers based on the defined inclusion and exclusion criteria. Full-text reading will be performed independently by 2 reviewers, and in the last step, key information will be extracted using a pretested template. If agreement cannot be reached, the conflict will be resolved by a domain expert. Charted data will be analyzed by categorizing and summarizing the individual data items based on the research questions. Tabular or graphical overviews will be given where applicable. Results: The reporting follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews. Electronic database searches in PubMed and Web of Science resulted in 469 matches after deduplication. As of September 2021, the scoping review is in the full-text screening stage. Data extraction using the pretested charting template will follow the full-text screening stage. We expect the scoping review report to be completed by February 2022. Conclusions: Information about the origin of healthcare data has a major impact on the quality and reusability of scientific results as well as on follow-up activities. This protocol outlines plans for a scoping review that will provide information about current approaches, challenges, and knowledge gaps in provenance tracking in the biomedical sciences. International Registered Report Identifier (IRRID): DERR1-10.2196/3175
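As an illustration of the deduplication step described in this protocol, the following minimal sketch merges records exported from the two database searches and keeps one entry per DOI or normalized title. The file names and column names (doi, title) are assumptions for illustration, not part of the published protocol.

```python
import csv
import re


def normalize_title(title: str) -> str:
    """Lowercase a title and collapse punctuation so near-identical records
    from different databases compare equal."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()


def deduplicate(records):
    """Keep the first occurrence of each record, matching on DOI when present
    and falling back to the normalized title otherwise."""
    seen, unique = set(), []
    for rec in records:
        key = (rec.get("doi") or "").strip().lower() or normalize_title(rec.get("title") or "")
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique


# Hypothetical CSV exports from the PubMed and Web of Science searches.
records = []
for export in ("pubmed_export.csv", "wos_export.csv"):
    with open(export, newline="", encoding="utf-8") as fh:
        records.extend(csv.DictReader(fh))

print(f"{len(deduplicate(records))} unique records after deduplication")
```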

    Experiences from compiling a FAIR survey in the German Network University Medicine - Poster

    No full text
    The FAIR guiding principles for data stewardship are a set of recommendations for making research objects findable, accessible, interoperable, and reusable. FAIR assessment tools implement measures for these principles and thus enable research networks to evaluate how well they comply with current standards in open and reproducible science. Based on questions from two different FAIR assessment tools, we built a tailor-made survey for the FAIR evaluation of projects within the German Network University Medicine (NUM). Established at the start of the Covid-19 pandemic, the NUM addressed the need to collect and integrate Covid-19 data across German university hospitals, and its technical developments aimed to follow, among other guidelines, the FAIR principles. To assess the actual status of FAIRness, we conducted an online survey across German Network University Medicine projects in 2022. The goal was to identify positive examples of FAIR data in the German Network University Medicine and thus motivate other projects to take similar routes.
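A minimal sketch of how answers from such a FAIR survey could be aggregated into per-principle scores; the question identifiers, the 0-1 answer scale, and the equal weighting are assumptions for illustration and do not reflect the actual NUM survey instrument.

```python
from collections import defaultdict

# Hypothetical survey responses: (FAIR principle, question id, answer on a 0-1 scale).
responses = [
    ("Findable", "persistent_identifier", 1.0),
    ("Findable", "rich_metadata", 0.5),
    ("Accessible", "standard_retrieval_protocol", 1.0),
    ("Interoperable", "controlled_vocabularies", 0.5),
    ("Reusable", "clear_usage_license", 0.0),
]


def score_by_principle(responses):
    """Average the answers per FAIR principle into a simple 0-1 score."""
    totals, counts = defaultdict(float), defaultdict(int)
    for principle, _question, value in responses:
        totals[principle] += value
        counts[principle] += 1
    return {principle: totals[principle] / counts[principle] for principle in totals}


for principle, score in sorted(score_by_principle(responses).items()):
    print(f"{principle}: {score:.2f}")
```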

    Eleven strategies for making reproducible research and open science training the norm at research institutions

    Get PDF
    Kohrs FE, Auer S, Bannach-Brown A, et al. Eleven strategies for making reproducible research and open science training the norm at research institutions. eLife. 2023;12:e89736. Reproducible research and open science practices have the potential to accelerate scientific progress by allowing others to reuse research outputs, and by promoting rigorous research that is more likely to yield trustworthy results. However, these practices are uncommon in many fields, so there is a clear need for training that helps and encourages researchers to integrate reproducible research and open science practices into their daily work. Here, we outline eleven strategies for making training in these practices the norm at research institutions. The strategies, which emerged from a virtual brainstorming event organized in collaboration with the German Reproducibility Network, are concentrated in three areas: (i) adapting research assessment criteria and program requirements; (ii) training; (iii) building communities. We provide a brief overview of each strategy, offer tips for implementation, and provide links to resources. We also highlight the importance of allocating resources and monitoring impact. Our goal is to encourage researchers, in their roles as scientists, supervisors, mentors, instructors, and members of curriculum, hiring, or evaluation committees, to think creatively about the many ways they can promote reproducible research and open science practices in their institutions. © 2023, Kohrs et al.