
    Reproducible Research is like riding a bike

    Reproducibility is a fundamental pillar of science, yet numerous editorials and papers describe it as hard and challenging to achieve, some of them warning of a “reproducibility crisis”. In this article we outline (1) the approach taken to put Reproducible Research (RR) on the agenda of the GIScience community, (2) the first actions taken and initial lessons learned from discussing and adopting RR principles and practices in the workflows and habits of researchers, and (3) our short-term (two-year) strategy and the specific actions through which we aim to make RR an integral part of the scientific workflows of the GIScience community.

    Improving reproducibility of geospatial conference papers: lessons learned from a first implementation of reproducibility reviews

    Paper presented at the 15th Munin Conference on Scholarly Publishing, held in Tromsø, Norway, on 18 November 2020.

    In an attempt to increase the reproducibility of contributions to a long-running and established geospatial conference series, the 23rd AGILE Conference on Geographic Information Science 2020 (https://agile-online.org/conference-2020) for the first time provided guidelines on preparing reproducible papers (Nüst et al., 2020) and appointed a reproducibility committee to evaluate the computational workflows of accepted papers (https://www.agile-giscience-series.net/review_process.html). Here, the committee’s members report the lessons learned from reviewing 23 accepted full papers and outline future plans for the conference series. In summary, six submissions were partially reproduced by reproducibility reviewers, whose reports are published openly on OSF (https://osf.io/6k5fh/), and these papers are promoted with badges on the proceedings’ website (https://agile-giss.copernicus.org/articles/1/index.html). Compared to previous years’ submissions (cf. Nüst et al., 2018), the guidelines and increased community awareness markedly improved reproducibility. However, the reproduction attempts also revealed problems, most importantly insufficient documentation. This was partly mitigated by the non-blind reproducibility review, conducted after paper acceptance, in which interaction between reviewers and authors can provide the input and attention needed to increase reproducibility. The reviews also showed that anonymisation and public repositories, when properly documented, can enable a successful reproduction without any interaction, as was the case for one manuscript.

    Individual and organisational challenges due to the COVID-19 pandemic, and the conference’s eventual cancellation, compounded these teething problems. Nevertheless, even under normal circumstances, future iterations will have to reduce reviewers’ efforts to be sustainable, ideally through more readily executable workflows and a larger reproducibility committee. We also discuss changes to the reproducibility review process and their challenges: reproducibility reports could be made available to “regular” reviewers, or could be given equal weight in acceptance/rejection decisions, so that insufficient information or invalid arguments for not disclosing material could lead to a submission being rejected or not being sent out for peer review. Further organisational improvements include publishing reviewers’ activities in public databases, making the guidelines mandatory, and collecting data on the tools and repositories used, the effort spent, and the communications involved. Finally, we summarise the revision of the guidelines, including their new section for reproducibility reviewers, and the status of the initiative “Reproducible Publications at AGILE Conferences” (https://reproducible-agile.github.io/initiative/), which we connect to related undertakings such as CODECHECK (Eglen et al., 2019). The AGILE Conference’s experiences may help other communities transition towards more open and reproducible research publications.

    Reproducible Research and GIScience: An Evaluation Using GIScience Conference Papers

    Paper presented at the 11th International Conference on Geographic Information Science (GIScience 2021).

    GIScience conference authors and researchers face the same computational reproducibility challenges as authors and researchers from other disciplines who use computers to analyse data. Here, to assess the reproducibility of GIScience research, we apply a reproducibility assessment rubric to 75 conference papers published in the GIScience conference series between 2012 and 2018. Since the rubric and process were previously applied to publications of the AGILE conference series, this paper is itself an attempt to replicate that analysis, while going beyond the previous work by evaluating and discussing proposed measures to improve reproducibility in the specific context of the GIScience conference series. The results of the GIScience paper assessment are in line with the previous findings: although the descriptions of workflows and the inclusion of data and software suffice to explain the presented work, in most published papers they do not allow a third party to reproduce the results and findings with reasonable effort. We summarise and adapt previous recommendations for improving this situation and propose that the GIScience community start a broad discussion on the reusability, quality, and openness of its research. Further, we critically reflect on the process of assessing paper reproducibility and provide suggestions for improving future assessments.
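
    As a rough illustration of what such a rubric-based assessment can look like in practice, the minimal Python sketch below models per-paper category scores and aggregates them across papers. It assumes the four categories and the ordinal 0–3 scale used in the earlier AGILE evaluation (Nüst et al., 2018); the exact names, the validation step, and the aggregation shown here are illustrative and are not code from the paper.

```python
from dataclasses import dataclass, field
from statistics import mean

# Categories assumed from the earlier AGILE evaluation (Nüst et al., 2018);
# each is scored on an ordinal 0-3 scale, from 0 (unavailable/undocumented)
# to 3 (available, documented, and open). Illustrative only.
CATEGORIES = (
    "input data",
    "methods/analysis/processing",
    "computational environment",
    "results",
)

@dataclass
class PaperAssessment:
    """Reproducibility scores assigned to one conference paper."""
    title: str
    scores: dict[str, int] = field(default_factory=dict)

    def validate(self) -> None:
        """Ensure every category has a score in the 0-3 range."""
        for category in CATEGORIES:
            score = self.scores.get(category)
            if score is None or not 0 <= score <= 3:
                raise ValueError(f"{self.title!r}: bad score for {category!r}")

def category_means(assessments: list[PaperAssessment]) -> dict[str, float]:
    """Mean score per category across all assessed papers."""
    return {
        category: mean(a.scores[category] for a in assessments)
        for category in CATEGORIES
    }

if __name__ == "__main__":
    # Hypothetical scores for two papers; not data from the study.
    papers = [
        PaperAssessment("Paper A", {c: 1 for c in CATEGORIES}),
        PaperAssessment("Paper B", {c: 2 for c in CATEGORIES}),
    ]
    for paper in papers:
        paper.validate()
    print(category_means(papers))  # e.g. {'input data': 1.5, ...}
```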