Summary of the paper

Title: Retrofitting Word Representations for Unsupervised Sense Aware Word Similarities
Authors: Steffen Remus and Chris Biemann
Abstract: Standard word embeddings cannot distinguish between the senses of a word, since they project all of a word's senses onto a single vector. This has a negative effect particularly when computing similarity scores between words with standard vector-based similarity measures such as cosine similarity. We argue that minor senses play an important role in word similarity computations; hence, we use an unsupervised sense inventory resource to retrofit monolingual word embeddings, producing sense-aware embeddings. Using these retrofitted sense-aware embeddings, we show improved word similarity and relatedness results for multiple word embeddings on multiple established word similarity tasks, sometimes by a margin of up to 0.15 in Spearman correlation.
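The abstract contrasts single-vector cosine similarity with a sense-aware comparison. As a minimal illustration (not the paper's exact method), a common way to score similarity over sense inventories is to compare every sense vector of one word with every sense vector of the other and keep the best match, so that a shared minor sense can still produce a high score; the function names below are hypothetical:

```python
import math

def cosine(u, v):
    # Standard cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def max_sense_similarity(senses_w1, senses_w2):
    # Sense-aware similarity sketch: take the maximum cosine over all
    # pairs of sense vectors, so agreement on any one sense (even a
    # minor one) dominates the score.
    return max(cosine(u, v) for u in senses_w1 for v in senses_w2)
```

With a single vector per word, a minor shared sense is averaged away; with per-sense vectors, the max over sense pairs recovers it.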
Topics: Knowledge Discovery/Representation, Word Sense Disambiguation, Semantics
Full paper: Retrofitting Word Representations for Unsupervised Sense Aware Word Similarities
BibTeX: @InProceedings{REMUS18.290,
  author = {Steffen Remus and Chris Biemann},
  title = "{Retrofitting Word Representations for Unsupervised Sense Aware Word Similarities}",
  booktitle = {Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
  year = {2018},
  month = {May 7-12, 2018},
  address = {Miyazaki, Japan},
  editor = {Nicoletta Calzolari (Conference chair) and Khalid Choukri and Christopher Cieri and Thierry Declerck and Sara Goggi and Koiti Hasida and Hitoshi Isahara and Bente Maegaard and Joseph Mariani and Hélène Mazo and Asuncion Moreno and Jan Odijk and Stelios Piperidis and Takenobu Tokunaga},
  publisher = {European Language Resources Association (ELRA)},
  isbn = {979-10-95546-00-9},
  language = {english}
  }