
Benchmarking ontology tools. A case study for the WebODE platform

Abstract

As the Semantic Web grows, the number of tools that support it increases, and a new need arises: assessing these tools to determine whether they can meet current and future performance requirements. Evaluating the performance of ontology tools requires the development and use of benchmark suites for them. In this paper we describe the design and execution of a benchmark suite for assessing the performance of the WebODE ontology engineering workbench.