Title Language Modeling Using Dynamic Bayesian Networks
Author(s) Murat Deviren, Khalid Daoudi, Kamel Smaili


Session O35-W
Abstract In this paper we propose a new approach to language modeling based on dynamic Bayesian networks. The principal idea of our approach is to find the dependence relations between variables that represent the different linguistic units (word, class, concept, ...) that constitute a language model. In the context of this paper the linguistic units that we consider are syntactic classes and words. Our approach should not be considered as a model combination technique. Rather, it is an original and coherent methodology that processes words and classes in the same model. We attempt to identify and model the dependence of words and classes on their linguistic context. Our ultimate goal is to devise an automatic mechanism that extracts the best dependence relations between a word and its context, i.e., lexical and syntactic. Preliminary results are very encouraging; in particular, the model obtained with a Bayesian network in which a word depends not only on the previous word but also on the syntactic classes of the two previous words outperforms the bi-gram model.
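The best-performing dependence structure described above conditions each word on the previous word and on the syntactic classes of the two previous words, i.e., P(w_t | w_{t-1}, c_{t-2}, c_{t-1}). The following is a minimal sketch of how such a conditional distribution could be estimated by relative frequency from a class-tagged corpus; the toy corpus and class tags are invented for illustration, and the paper's actual estimation and structure-learning procedure is not reproduced here:

```python
from collections import defaultdict

# Hypothetical class-tagged corpus: (word, syntactic class) pairs.
corpus = [
    ("the", "DET"), ("cat", "NOUN"), ("sits", "VERB"),
    ("the", "DET"), ("dog", "NOUN"), ("sits", "VERB"),
    ("the", "DET"), ("cat", "NOUN"), ("sleeps", "VERB"),
]

# Count each context (w_{t-1}, c_{t-2}, c_{t-1}) and each
# (w_t, context) pair to form the maximum-likelihood estimate
# of P(w_t | w_{t-1}, c_{t-2}, c_{t-1}).
context_counts = defaultdict(int)
joint_counts = defaultdict(int)
for t in range(2, len(corpus)):
    w_t = corpus[t][0]
    ctx = (corpus[t - 1][0], corpus[t - 2][1], corpus[t - 1][1])
    context_counts[ctx] += 1
    joint_counts[(w_t, ctx)] += 1

def prob(w, ctx):
    """P(w_t = w | w_{t-1}, c_{t-2}, c_{t-1}) by relative frequency (no smoothing)."""
    if context_counts[ctx] == 0:
        return 0.0
    return joint_counts[(w, ctx)] / context_counts[ctx]

# After "cat" (NOUN) preceded by a DET, "sits" and "sleeps"
# each occur once in this toy corpus.
print(prob("sits", ("cat", "DET", "NOUN")))    # 0.5
print(prob("sleeps", ("cat", "DET", "NOUN")))  # 0.5
```

In a full dynamic Bayesian network these conditional probability tables are the local distributions attached to each word node; in practice they would need smoothing, which this sketch omits.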
Keyword(s) Language models, Dynamic Bayesian networks, structure learning
Language(s) French
Full Paper 640.pdf