Sharing and Preserving Computational Analyses for Posterity with encapsulator
Open data and open-source software may be part of the solution to science's
"reproducibility crisis", but they are insufficient to guarantee
reproducibility. Requiring minimal end-user expertise, encapsulator creates a
"time capsule" with reproducible code in a self-contained computational
environment. encapsulator provides end-users with a fully featured desktop
environment for reproducible research.
Comment: 11 pages, 6 figures
A Backend Platform for Supporting the Reproducibility of Computational Experiments
In recent years, the research community has raised serious questions about
the reproducibility of scientific work. In particular, since many studies
include some kind of computing work, reproducibility is also a technological
challenge, not only in computer science, but in most research domains.
Replicability and computational reproducibility are not easy to achieve, not
only because researchers have diverse proficiency in computing technologies,
but also because of the variety of computational environments that can be used.
Indeed, it is challenging to recreate the same environment using the same
frameworks, code, data sources, programming languages, dependencies, and so on.
In this work, we propose an Integrated Development Environment that supports
the sharing, configuration, packaging, and execution of an experiment: the user
sets the code and data used and defines the programming languages,
dependencies, databases, and commands to execute, so that each experiment
yields consistent results. After the initial creation and configuration, the
experiment can be executed any number of times, always producing exactly the
same results.
Furthermore, the experiment can be executed with a different associated
dataset, making it possible to verify both the reproducibility and the
replicability of the results. The outcome is a reproducible package that
researchers in any field can create for their work and that anyone can
re-execute on any other computer.
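The platform itself is not detailed in this abstract, so the following is only a minimal sketch of the underlying idea: a "reproducible pack" pins the code, data, dependencies, and command in a manifest that can be fingerprinted, and the packaged analysis is made deterministic (here via a fixed seed) so every re-execution produces identical results. All names in the manifest (`analyze.py`, `dataset_v1.csv`, the pinned dependency) are hypothetical placeholders, not artifacts from the paper.

```python
import hashlib
import json
import random

# Hypothetical manifest for a reproducible pack: everything needed to
# re-run the experiment is recorded explicitly (names are illustrative).
manifest = {
    "language": "python3",
    "dependencies": ["numpy==1.26.4"],        # pinned versions
    "data": "dataset_v1.csv",                 # fixed input dataset
    "command": "python analyze.py --seed 42", # exact invocation
}

def manifest_fingerprint(m):
    """Content-address the configuration so any change is detectable."""
    blob = json.dumps(m, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def run_experiment(seed):
    """Stand-in for the packaged analysis: deterministic given the seed."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(1000))

fingerprint = manifest_fingerprint(manifest)
result_a = run_experiment(42)
result_b = run_experiment(42)
print(fingerprint[:12], result_a == result_b)
```

Because the configuration is fingerprinted and the run is seeded, re-executing the pack on another machine with the same manifest reproduces the same numbers, which is the property the platform's evaluation checks.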
To evaluate our platform, we used it to reproduce 25 experiments extracted
from published papers. We successfully reproduced 20 (80%) of these
experiments, achieving the results reported in the original works with minimal
effort, which shows that our approach is effective.