LREC 2000 2nd International Conference on Language Resources & Evaluation  


Title Dialogue and Prompting Strategies Evaluation in the DEMON System
Authors Lavelle Carine-Alexia (Institut de Recherche en Informatique de Toulouse, Universite Paul Sabatier, 118, route de Narbonne, 31062 Toulouse, France, lavelle@irit.fr)
De Calmes Martine (Institut de Recherche en Informatique, Universite Paul Sabatier, 118, route de Narbonne, 31062 Toulouse Cedex, France, decalmes@irit.fr)
Perennou Guy (Institut de Recherche en Informatique, Universite Paul Sabatier, 118, route de Narbonne, 31062 Toulouse Cedex, France, perennou@irit.fr)
Keywords Prompting Strategy, Spoken Dialogue Systems, Usability
Session Session SO2 - Dialogue Evaluation Methods
Abstract A major issue in improving the usability and efficiency of dialogue systems is adapting them to their intended users. This requires a good knowledge of users' behaviour when interacting with a dialogue system. In this regard, we based the evaluation of the dialogue and prompting strategies implemented in our system on how they influence users' answers. In this paper we describe the measure we used to evaluate the effect of the size of the welcome prompt, and a set of measures we defined to evaluate three different confirmation strategies. We then describe five criteria we used to evaluate the complexity of the system's questions and their effect on users' answers. The overall aim is to design a set of metrics that could be used to automatically decide which of the possible prompts should be uttered at a given state in a dialogue.
