3 research outputs found

    UTPB: A Benchmark for Scientific Workflow Provenance Storage and Querying Systems

    A crucial challenge for scientific workflow management systems is to support the efficient and scalable storage and querying of large provenance datasets that record the history of in silico experiments. As new provenance management systems are being developed, it is important to have benchmarks that can evaluate these systems and provide an unbiased comparison. In this paper, based on the requirements for scientific workflow provenance systems, we design an extensible benchmark that features a collection of techniques and tools for workload generation, query selection, performance measurement, and experimental result interpretation.
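
    As a rough illustration of the "performance measurement" component such a benchmark would need (this is not code from the UTPB paper; connect, run_query, and WORKLOAD are hypothetical placeholders for the provenance store under test), a timing harness might look like the following sketch:

    # Illustrative only: a minimal latency-measurement harness for a
    # provenance query workload. All names below are hypothetical.
    import time
    import statistics

    WORKLOAD = [
        "lineage of dataset D42",      # example provenance queries,
        "tasks that consumed file F7", # stated informally here
    ]

    def run_query(connection, query):
        # Placeholder: execute the query against the system under test.
        raise NotImplementedError

    def measure(connection, workload=WORKLOAD, repetitions=5):
        results = {}
        for query in workload:
            timings = []
            for _ in range(repetitions):
                start = time.perf_counter()
                run_query(connection, query)
                timings.append(time.perf_counter() - start)
            # Report mean and standard deviation of query latency.
            results[query] = (statistics.mean(timings), statistics.stdev(timings))
        return results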

    Web ontology reasoning with logic databases [online]


    Generating Random Benchmarks for Description Logics

    In this paper, we address the problem of generating…