Toward an Objective and Generic Method for Spoken Language Understanding Systems Evaluation: An Extension of the DCR Method


Mohamed-Zakaria KURDI (Natural Interactive Systems Laboratory (NISLab), University of Southern Denmark, Main campus: Odense University, Science Park 10, DK-5230 Odense M; Laboratoire CLIPS-IMAG, BP 53, 38041 Grenoble cedex 09)

Mohamed AHAFHAF (Laboratoire CLIPS-IMAG, BP 53, 38041 Grenoble cedex 09)


EP1: Evaluation


In this paper, we present an extension of the DCR method, a framework for the in-depth evaluation of Spoken Language Understanding (SLU) systems. The key point of our contribution is the use of a linguistic typology to generate an evaluation corpus that covers a significant number of the linguistic phenomena on which we want to evaluate a system. This allows a more objective and deeper evaluation of SLU systems.


Evaluation, Spoken Language Understanding Systems, Parsing, DCR
