
    Reproducible Research and GIScience: An Evaluation Using GIScience Conference Papers

    Paper presented at: 11th International Conference on Geographic Information Science (GIScience 2021). GIScience conference authors and researchers face the same computational reproducibility challenges as authors and researchers from other disciplines who use computers to analyse data. Here, to assess the reproducibility of GIScience research, we apply a rubric for assessing the reproducibility of 75 conference papers published at the GIScience conference series in the years 2012-2018. Since the rubric and process were previously applied to the publications of the AGILE conference series, this paper is itself an attempt to replicate that analysis, going beyond the previous work by evaluating and discussing proposed measures to improve reproducibility in the specific context of the GIScience conference series. The results of the GIScience paper assessment are in line with previous findings: although the descriptions of workflows and the inclusion of data and software suffice to explain the presented work, in most published papers they do not allow a third party to reproduce the results and findings with reasonable effort. We summarise and adapt previous recommendations for improving this situation and propose that the GIScience community start a broad discussion on the reusability, quality, and openness of its research. Further, we critically reflect on the process of assessing paper reproducibility and provide suggestions for improving future assessments.

    Reproducible research and GIScience: an evaluation using AGILE conference papers

    The demand for reproducible research is on the rise in disciplines concerned with data analysis and computational methods. Therefore, we reviewed current recommendations for reproducible research and translated them into criteria for assessing the reproducibility of articles in the field of geographic information science (GIScience). Using these criteria, we assessed a sample of GIScience studies from the Association of Geographic Information Laboratories in Europe (AGILE) conference series, and we collected feedback about the assessment from the study authors. Results from the author feedback indicate that although authors support the concept of performing reproducible research, the incentives for doing so in practice are too small. Therefore, we propose concrete actions for individual researchers and the GIScience conference series to improve transparency and reproducibility. For example, to support researchers in producing reproducible work, the GIScience conference series could offer awards and paper badges, provide author guidelines for computational research, and publish articles in Open Access formats.

    Reproducible Research in Geoinformatics: Concepts, Challenges and Benefits (Vision Paper)

    Geoinformatics deals with spatial and temporal information and its analysis. Research in this field often follows established practices of first developing computational solutions for specific spatiotemporal problems and then publishing the results and insights in a (static) paper, e.g. as a PDF. Not every detail can be included in such a paper, and in particular, the complete set of computational steps is frequently left out. While this approach conveys key knowledge to other researchers, it makes it difficult to effectively re-use and reproduce the reported results. In this vision paper, we propose an alternative approach to carrying out and reporting research in Geoinformatics. It is based on (computational) reproducibility, promises to make re-use and reproduction more effective, and creates new opportunities for further research. We report on experiences with executable research compendia (ERCs) as alternatives to classic publications in Geoinformatics, and we discuss how ERCs combined with a supporting research infrastructure can transform how we do research in Geoinformatics. We point out which challenges this idea entails and what new research opportunities emerge, in particular for the COSIT community.


    An Adaptation of the Profile of Mood States for Use in Adults With Phenylketonuria

    Adults with phenylketonuria (PKU) experience disturbances in mood. This study used qualitative and quantitative techniques to adapt the 65-item Profile of Mood States (POMS) for the assessment of key mood domains in adults with PKU. First, cognitive interviews on 58 POMS items (excluding the 7 Friendliness domain items) were conducted among 15 adults and adolescents (age ≥16 years) with PKU to eliminate items that were poorly understood or considered irrelevant to PKU; 17 items were removed. Next, the remaining POMS items were quantitatively examined (Mokken scaling and Rasch analysis) in 115 adult patients with PKU. An additional 21 items were removed iteratively, resulting in the 20-item draft PKU-POMS. Finally, the psychometric properties of the draft PKU-POMS were examined. The instrument displayed strong psychometric properties (reliability, validity, and responsiveness) across 6 domains (Anxiety, Depression, Anger, Activity, Tiredness, and Confusion), and all items were well understood in the final cognitive interviews with 10 adults with PKU.

    Assessing the Content Validity of the Investigator-Rated ADHD Rating Scale Version IV Instrument Inattention Subscale for Use in Adults With Phenylketonuria

    Content validity of the 18-item Investigator-Rated Attention-Deficit Hyperactivity Disorder (ADHD) Rating Scale IV (I-ADHD RS-IV) with adult prompts was investigated using qualitative interviews of US clinicians who had prior experience rating adults with phenylketonuria (PKU) using the I-ADHD RS-IV. Fourteen qualitative interviews were conducted to obtain key symptom experiences of adults with PKU and to assess the relevance, clarity, and administration of the I-ADHD RS-IV. Participants (n = 13, 92.9%) endorsed the inattention symptoms as key experiences of adults with PKU and endorsed the instrument as fit for purpose for adults with PKU. Participants generally reported low frequencies of occurrence for the 9 I-ADHD RS-IV hyperactivity/impulsivity items. Despite some clinicians' concerns about a lack of patient self-awareness, the participants reported no difficulty selecting a rating on these items. This in-depth study of the content validity of the I-ADHD RS-IV provides evidence that this clinician-reported instrument captures the severity of important inattention symptoms in adults with PKU.