31 research outputs found

    Developmental refinements in temporally precise auditory brainstem circuits

    Get PDF

    Leveraging Open Tools to Realize the Potential of Self-Archiving: A Cohort Study in Clinical Trials

    No full text
    While open access (OA) is growing, many publications remain behind a paywall. This limits the impact of research and entrenches global inequalities by restricting access to knowledge to those who can afford it. Many journal policies allow researchers to make a version of their publication openly accessible through self-archiving in a repository, sometimes after an embargo period (green OA). Unpaywall and Shareyourpaper are open tools that help users find OA articles and support authors in legally self-archiving their papers, respectively. This study leveraged these tools to assess the potential of green OA to increase discoverability in a cohort of clinical trial results publications from German university medical centers. Of the 1897 publications in this cohort, 46% (n = 871/1897, 95% confidence interval (CI) 44% to 48%) were not openly accessible via either a journal or a repository. Of these, 85% (n = 736/871, 95% CI 82% to 87%) had permission to self-archive the accepted or published version in an institutional repository. Thus, most of the closed-access clinical trial results in this cohort could be made openly accessible in a repository, in line with World Health Organization (WHO) recommendations. In addition to providing further evidence of the unrealized potential of green OA, this study demonstrates the use of open tools to obtain actionable information on self-archiving at scale and empowers efforts to increase science discoverability.
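    As an illustration of the kind of screening this abstract describes, the sketch below queries the public Unpaywall REST API for a single DOI and summarizes its open-access status. This is not the study's own pipeline: the DOI and contact email are placeholders, and the self-archiving permissions check (the Shareyourpaper side) is omitted.

```python
# Minimal sketch: look up the open-access status of one publication via Unpaywall.
# The contact email and DOI below are placeholders, not values from the study.
import requests

UNPAYWALL_API = "https://api.unpaywall.org/v2/{doi}"
CONTACT_EMAIL = "you@example.org"  # Unpaywall asks callers to identify themselves


def oa_status(doi: str) -> dict:
    """Return a small summary of the OA status reported by Unpaywall for a DOI."""
    resp = requests.get(
        UNPAYWALL_API.format(doi=doi),
        params={"email": CONTACT_EMAIL},
        timeout=30,
    )
    resp.raise_for_status()
    record = resp.json()
    best = record.get("best_oa_location") or {}
    return {
        "doi": doi,
        "is_oa": record.get("is_oa", False),
        "oa_status": record.get("oa_status"),  # e.g. "gold", "green", "closed"
        "best_oa_url": best.get("url"),
    }


if __name__ == "__main__":
    print(oa_status("10.1234/example-doi"))  # placeholder DOI
```

    Running such a lookup over every DOI in a cohort gives the journal/repository availability split reported above; closed-access records would then be checked against journal self-archiving policies.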

    Study protocol

    No full text

    Dataset for: An institutional dashboard to drive clinical trial transparency

    No full text
    Dataset underlying a dashboard for clinical trial transparency at the level of German University Medical Centers (Data_v2 represents the most up-to-date dataset). The raw registry data associated with this dataset is openly available in Zenodo at: https://doi.org/10.5281/zenodo.7590083. The code used to generate this dataset is openly available in GitHub at: https://github.com/maia-sh/intovalue-data/releases/tag/v1.1 and https://github.com/quest-bih/clinical-dashboard.

    Protocol: Reporting and quality of patient engagement: Status quo in best practice examples

    No full text
    We aim to analyze the extent and quality of patient engagement in best practice examples.

    Institutional dashboards on clinical trial transparency for University Medical Centers: A case study.

    Get PDF
    Background: University Medical Centers (UMCs) must do their part for clinical trial transparency by fostering practices such as prospective registration, timely results reporting, and open access. However, research institutions are often unaware of their performance on these practices. Baseline assessments of these practices would highlight where there is room for change and empower UMCs to support improvement. We performed a status quo analysis of established clinical trial registration and reporting practices at German UMCs and developed a dashboard to communicate these baseline assessments to UMC leadership and the wider research community.

    Methods and findings: We developed and applied a semiautomated approach to assess adherence to established transparency practices in a cohort of interventional trials and associated results publications. Trials were registered in ClinicalTrials.gov or the German Clinical Trials Register (DRKS), led by a German UMC, and reported as complete between 2009 and 2017. To assess adherence to transparency practices, we identified results publications associated with trials and applied automated methods at the level of registry data (e.g., prospective registration) and publications (e.g., open access). We also obtained summary results reporting rates of due trials registered in the EU Clinical Trials Register (EUCTR) and conducted at German UMCs from the EU Trials Tracker. We developed an interactive dashboard to display these results across all UMCs and at the level of single UMCs. Our study included and assessed 2,895 interventional trials led by 35 German UMCs. Across all UMCs, prospective registration increased from 33% (n = 58/178) to 75% (n = 144/193) for trials registered in ClinicalTrials.gov and from 0% (n = 0/44) to 79% (n = 19/24) for trials registered in DRKS over the period considered. Of trials with a results publication, 38% (n = 714/1,895) reported the trial registration number in the publication abstract. In turn, 58% (n = 861/1,493) of trials registered in ClinicalTrials.gov and 23% (n = 111/474) of trials registered in DRKS linked the publication in the registration. In contrast to recent increases in summary results reporting of drug trials in the EUCTR, 8% (n = 191/2,253) and 3% (n = 20/642) of due trials registered in ClinicalTrials.gov and DRKS, respectively, had summary results in the registry. Across trial completion years, timely results reporting (within 2 years of trial completion) as a manuscript publication or as summary results was 41% (n = 1,198/2,892). The proportion of openly accessible trial publications steadily increased from 42% (n = 16/38) to 74% (n = 72/97) over the period considered. A limitation of this study is that some of the methods used to assess the transparency practices in this dashboard rely on registry data being accurate and up-to-date.

    Conclusions: In this study, we observed that it is feasible to assess and inform individual UMCs on their performance on clinical trial transparency in a reproducible and publicly accessible way. Beyond helping institutions assess how they perform in relation to mandates or their institutional policy, the dashboard may inform interventions to increase the uptake of clinical transparency practices and serve to evaluate the impact of these interventions.
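    To illustrate the kind of automated registry-level check the abstract describes, the sketch below computes a prospective-registration rate from a small set of registry records. This is not the study's actual pipeline (that code is linked from the associated dataset record); the field names, dummy trial IDs, and the registration-on-or-before-start-date rule are assumptions for illustration.

```python
# Illustrative sketch: estimate a prospective-registration rate from registry records.
# Field names and trial IDs are placeholders; the dashboard's exact rules may differ.
from datetime import date

def is_prospectively_registered(registration_date: date, start_date: date) -> bool:
    """Count a trial as prospectively registered if it was registered on or
    before its start date (a common operationalization)."""
    return registration_date <= start_date

# Dummy records standing in for rows of registry data
trials = [
    {"id": "TRIAL-0001", "registration_date": date(2014, 3, 1), "start_date": date(2014, 5, 1)},
    {"id": "TRIAL-0002", "registration_date": date(2015, 8, 1), "start_date": date(2015, 2, 1)},
]

n_prospective = sum(
    is_prospectively_registered(t["registration_date"], t["start_date"]) for t in trials
)
print(f"Prospective registration: {n_prospective}/{len(trials)} "
      f"({100 * n_prospective / len(trials):.0f}%)")
```

    Analogous rule-based checks over registry and publication metadata (e.g., registration number in the abstract, publication linked in the registration, summary results posted within two years) are what make per-UMC rates computable and refreshable for a dashboard.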

    Study materials

    No full text

    Open Science Dashboard

    No full text
    We developed a digital dashboard to support biomedical institutions in implementing open science. To do so, we first conducted a Delphi study to determine which open science practices the community valued, and in what form they wanted them measured. We reached consensus on 19 practices, 9 of which we were able to validate in a prototype dashboard. This work was funded by the Wellcome Trust: 223828/Z/21/

    Reporting of patient involvement: a mixed-methods analysis of current practice in health research publications using a targeted search strategy

    No full text
    Objectives: To evaluate the extent and quality of patient involvement reporting in examples of current practice in health research.

    Design: Mixed-methods study. We used a targeted search strategy across three cohorts to identify health research publications that reported patient involvement: original research articles published in 2019 in the British Medical Journal (BMJ), articles listed in the Patient-Centered Outcomes Research Institute (PCORI) database (2019), and articles citing the GRIPP2 (Guidance for Reporting Involvement of Patients and the Public) reporting checklist for patient involvement or a critical appraisal guideline for user involvement. Publications were coded according to three coding schemes: ‘phase of involvement’, the GRIPP2-Short Form (GRIPP2-SF) reporting checklist and the critical appraisal guideline.

    Outcome measures: The phase of the study in which patients were actively involved. For the BMJ sample, the proportion of publications that reported patient involvement. The quality of reporting based on the GRIPP2-SF reporting guideline. The quality of patient involvement based on the critical appraisal guideline. Quantitative and qualitative results are reported.

    Results: We included 86 publications that reported patient involvement. Patients were most frequently involved in study design (90% of publications, n=77), followed by study conduct (71%, n=61) and dissemination (42%, n=36). Reporting of patient involvement was often incomplete, for example, only 40% of publications (n=34) reported the aim of patient involvement. While the methods (57%, n=49) and results (59%, n=51) of involvement were reported more frequently, reporting was often unspecific and the influence of patients’ input remained vague. Therefore, a systematic assessment of the quality and impact of patient involvement according to the critical appraisal guideline was not feasible across samples.

    Conclusions: As patient involvement is increasingly seen as an integral part of the research process and requested by funding bodies, it is essential that researchers receive specific guidance on how to report patient involvement activities. Complete reporting builds the foundation for assessing the quality of patient involvement and its impact on research.