
Released

Paper

Using Multi-Sense Vector Embeddings for Reverse Dictionaries

MPS-Authors

Yates,  Andrew
Databases and Information Systems, MPI for Informatics, Max Planck Society;

Fulltext (public)

arXiv:1904.01451.pdf (Preprint), 309 KB

Citation

Hedderich, M. A., Yates, A., Klakow, D., & de Melo, G. (2019). Using Multi-Sense Vector Embeddings for Reverse Dictionaries. Retrieved from http://arxiv.org/abs/1904.01451.


Cite as: http://hdl.handle.net/21.11116/0000-0004-02B4-E
Abstract
Popular word embedding methods such as word2vec and GloVe assign a single vector representation to each word, even if a word has multiple distinct meanings. Multi-sense embeddings instead provide different vectors for each sense of a word. However, they typically cannot serve as a drop-in replacement for conventional single-sense embeddings, because the correct sense vector needs to be selected for each word. In this work, we study the effect of multi-sense embeddings on the task of reverse dictionaries. We propose a technique to easily integrate them into an existing neural network architecture using an attention mechanism. Our experiments demonstrate that large improvements can be obtained when employing multi-sense embeddings both in the input sequence and for the target representation. An analysis of the sense distributions and of the learned attention is provided as well.
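
The integration described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the module and parameter names (SenseAttention, embed_dim, query_dim, num_senses) and the choice of an additive scoring function are illustrative assumptions. The idea shown is the general one named in the abstract: an attention layer scores each of a word's candidate sense vectors against a query (for example, the current encoder state), and the attention-weighted sum yields a single vector that can stand in for a conventional single-sense embedding.

# Illustrative sketch only, assuming each word comes with K candidate
# sense embeddings; names and dimensions are hypothetical.
import torch
import torch.nn as nn

class SenseAttention(nn.Module):
    """Collapses K sense vectors of a word into one vector via attention,
    so multi-sense embeddings can feed an otherwise single-sense model."""
    def __init__(self, embed_dim: int, query_dim: int):
        super().__init__()
        self.score = nn.Linear(embed_dim + query_dim, 1)  # additive scoring

    def forward(self, sense_vecs: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # sense_vecs: (batch, K, embed_dim); query: (batch, query_dim)
        K = sense_vecs.size(1)
        q = query.unsqueeze(1).expand(-1, K, -1)              # (batch, K, query_dim)
        scores = self.score(torch.cat([sense_vecs, q], -1))   # (batch, K, 1)
        weights = torch.softmax(scores, dim=1)                # distribution over senses
        return (weights * sense_vecs).sum(dim=1)              # (batch, embed_dim)

# Toy usage: 2 words in a batch, 3 senses each, 50-dim sense embeddings,
# 64-dim query vector (e.g. an encoder hidden state).
attn = SenseAttention(embed_dim=50, query_dim=64)
senses = torch.randn(2, 3, 50)
query = torch.randn(2, 64)
word_vec = attn(senses, query)   # shape (2, 50): one vector per word

Because the softmax produces a soft mixture over senses rather than a hard selection, the surrounding network can be trained end to end without an explicit sense-disambiguation step, which is what makes this kind of attention a convenient way to plug multi-sense embeddings into an existing architecture.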