Benchmarking ontology tools. A case study for the WebODE platform
Oscar Corcho, Raúl García-Castro, Asunción Gómez-Pérez
Ontology Group. Departamento de Inteligencia Artificial, Facultad de Informática, Universidad Politécnica de Madrid, Spain
As the Semantic Web grows, the number of tools that support it increases, and a new need arises: assessing these tools to determine whether they can meet current and future performance requirements. Evaluating the performance of ontology tools requires the development and use of benchmark suites for them. In this paper we describe the design and execution of a benchmark suite for assessing the performance of the WebODE ontology engineering workbench.
benchmarking, evaluation, performance, WebODE