Reproducibility: A Researcher-Centered Definition
Recent years have introduced major shifts in scientific reporting and publishing. The scientific community, publishers, funding agencies, and the public expect research to adhere to principles of openness, reproducibility, replicability, and repeatability. However, studies have shown that scientists often have neither the right tools nor suitable support at their disposal to meet these modern science challenges. In fact, even the concrete expectations connected to these terms may be unclear and subject to field-specific, organizational, and personal interpretations. Based on a narrative literature review of work that defines characteristics of open science, reproducibility, replicability, and repeatability, as well as a review of recent work on researcher-centered requirements, we find that the bottom-up practices and needs of researchers contrast with the top-down expectations encoded in terms related to reproducibility and open science. We identify and define reproducibility as a central term that concerns the ease of access to scientific resources, as well as their completeness, to the degree required for efficiently and effectively interacting with scientific work. We hope that this characterization helps to create a mutual understanding across science stakeholders, in turn paving the way for suitable and stimulating environments, fit to address the challenges of modern science reporting and publishing.
Demo: Simulation-as-a-Service to Benchmark Opportunistic Networks
Repeatability, reproducibility, and replicability are essential aspects of experimental and simulation-driven research. Use of benchmarks in such evaluations further assists corroborative performance evaluations. In this work, we present a demonstrator of a simulation service, called "OPS on the bench", which tackles these challenges in performance evaluations of opportunistic networks.
Assessing the quality of land system models: moving from valibration to evaludation
Reviews suggest that evaluation of land system models is largely inadequate, with undue reliance on a vague concept of validation. Efforts to improve and standardise evaluation practices have so far had limited effect. In this article we examine the issues surrounding land system model evaluation and consider the relevance of the TRACE framework for environmental model documentation. In doing so, we discuss the application of a comprehensive range of evaluation procedures to existing models, and the value of each specific procedure. We develop a tiered checklist for going beyond what seems to be a common practice of "valibration" (the repeated variation of model parameter values to achieve agreement with data) to achieving "evaludation" (the rigorous, broad-based assessment of model quality and validity). We propose the Land Use Change – TRACE (LUC-TRACE) model evaludation protocol and argue that engagement with a comprehensive protocol of this kind (even if not this particular one) is valuable in ensuring that land system model results are interpreted appropriately. We also suggest that the main benefit of such formalised structures is to assist the process of critical thinking about model utility, and that the variety of legitimate modelling approaches precludes universal tests of whether a model is "valid". Evaludation is therefore a detailed and subjective process requiring the sustained intellectual engagement of model developers and users.
Modulation of renal oxygenation and perfusion in rat kidney monitored by quantitative diffusion and blood oxygen level dependent magnetic resonance imaging on a clinical 1.5T platform
Data Qualification Report: Calculated Porosity and Porosity-Derived Values for Lithostratigraphic Units for use on the Yucca Mountain Project
The qualification is being completed in accordance with the Data Qualification Plan DQP-NBS-GS-000006, Rev. 00 (CRWMS M&O 2001). The purpose of this data qualification activity is to evaluate for qualification the unqualified developed input and porosity output included in Data Tracking Number (DTN) M09910POROCALC.000. The main output of the analyses documented in DTN M09910POROCALC.000 is the calculated total porosity and effective porosity for 40 Yucca Mountain Project boreholes. The porosity data are used as input to Analysis Model Report (AMR) 10040, "Rock Properties Model" (MDL-NBS-GS-000004, Rev. 00), Interim Change Notice [ICN] 02 (CRWMS M&O 2000b). The output from the rock properties model is used as input to numerical physical-process modeling within the context of a relationship developed in the AMR between hydraulic conductivity, bound water, and zeolitic zones for use in the unsaturated zone model. In accordance with procedure AP-3.15Q, the porosity output is not used in the direct calculation of Principal Factors for post-closure safety or disruptive events. The original source for DTN M09910POROCALC.000 is a Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) report, "Combined Porosity from Geophysical Logs" (CRWMS M&O 1999a, hereafter referred to as Rael 1999). That report recalculated porosity results for both the historical boreholes covered in Nelson (1996) and the modern boreholes reported in CRWMS M&O (1996a,b). The porosity computations in Rael (1999) are based on density-porosity mathematical relationships requiring various input parameters, including bulk density, matrix density, air and/or fluid density, and volumetric water content. The main output is computed total porosity and effective porosity reported on a foot-by-foot basis for each borehole, although volumetric water content is derived from neutron data as an interim output.
This qualification report uses technical assessment and corroboration to evaluate the original subject DTN. Rael (1999) provides many of the details of the technical assessment and corroboration methods and partially satisfies the intent of the qualification plan for this analysis. Rael presents a modified method based on Nelson (1996) to recompute porosity and porosity-derived values and uses some of the same inputs. Rael's (1999) intended purpose was to document porosity output relatively free of biases introduced by differing computational methods or parameter selections used for different boreholes. The qualification report necessarily evaluates the soundness of the pre-Process Validation and Re-engineering (PVAR) analyses and methodology, as reported in Rael (1999).
Considering the Development Workflow to Achieve Reproducibility with Variation
The ability to reproduce an experiment is fundamental in computer science. Existing approaches focus on repeatability, but this is only the first step to reproducibility: continuing a scientific work from a previous experiment requires being able to modify it. This ability is called Reproducibility with Variation. In this contribution, we show that capturing the environment of execution is necessary but not sufficient; we also need the environment of development. Variation also implies that those environments are subject to evolution, so the whole software development lifecycle needs to be considered. To take these evolutions into account, software environments need to be clearly defined, reconstructible with variation, and easy to share. We propose to leverage functional package managers to achieve this goal.
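The proposal above — declaratively defined, reconstructible, shareable environments via a functional package manager — can be sketched with GNU Guix, one such manager. This is a minimal illustration under stated assumptions: the package names and the pinned commit placeholder are illustrative, not taken from the abstract.

```shell
# manifest.scm declares WHAT the experiment needs (illustrative packages):
#   (specifications->manifest
#     (list "gcc-toolchain" "python" "python-numpy"))
#
# channels.scm pins WHICH revision of the package collection is used,
# making the environment reconstructible bit-for-bit later:
#   (list (channel
#           (name 'guix)
#           (url "https://git.savannah.gnu.org/git/guix.git")
#           (commit "<pinned-commit>")))

# Reconstruct the original development environment exactly:
guix time-machine -C channels.scm -- shell -m manifest.scm

# Reproduce WITH variation: edit manifest.scm (e.g. swap one library),
# re-run the same command; everything not edited stays identical.
guix time-machine -C channels.scm -- shell -m manifest.scm
```

Because both files are small text artifacts, sharing them alongside the paper lets others rebuild, and deliberately vary, the same environment.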
Practicing a Science of Security: A Philosophy of Science Perspective
Our goal is to refocus the question about cybersecurity research from 'is this process scientific' to 'why is this scientific process producing unsatisfactory results'. We focus on five common complaints that claim cybersecurity is not or cannot be scientific. Many of these complaints presume views associated with the philosophical school known as Logical Empiricism that more recent scholarship has largely modified or rejected. Modern philosophy of science, supported by mathematical modeling methods, provides constructive resources to mitigate all purported challenges to a science of security. Therefore, we argue the community currently practices a science of cybersecurity. A philosophy of science perspective suggests the following form of practice: structured observation to seek intelligible explanations of phenomena, evaluating explanations in many ways, with specialized fields (including engineering and forensics) constraining explanations within their own expertise, inter-translating where necessary. A natural question to pursue in future work is how collecting, evaluating, and analyzing evidence for such explanations is different in security than in other sciences.