Contextualized Word Vectors
Year: 2017
Journal: Conference on Neural Information Processing Systems
Languages: English, German
Programming languages: Python
Input data: sentences
Output data: word vectors (context vectors)
Project website: https://github.com/salesforce/cove
In this paper, we use a deep LSTM encoder from an attentional sequence-to-sequence model trained for machine translation (MT) to contextualize word vectors. We show that adding these context vectors (CoVe) improves performance over using only unsupervised word and character vectors on a wide variety of common NLP tasks.
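To illustrate the idea of contextualization described above, the following is a minimal PyTorch sketch: pretrained word vectors are fed through a two-layer bidirectional LSTM encoder, and its hidden states (the context vectors) are concatenated with the original embeddings for downstream use. This is not the released salesforce/cove code; the class name CoVeSketch, the randomly initialized encoder, and the placeholder embedding matrix are illustrative assumptions, whereas the actual repository ships an encoder with weights taken from an MT-trained model.

```python
import torch
import torch.nn as nn


class CoVeSketch(nn.Module):
    """Sketch of CoVe-style contextualization: a biLSTM encoder runs over
    fixed pretrained word embeddings, and its hidden states serve as
    context vectors concatenated with the original embeddings."""

    def __init__(self, embedding_matrix, hidden_size=300, num_layers=2):
        super().__init__()
        # Fixed, pretrained word vectors (e.g. GloVe); not fine-tuned here.
        self.embed = nn.Embedding.from_pretrained(embedding_matrix, freeze=True)
        # In the paper the encoder comes from an attentional seq2seq MT model;
        # here it is an untrained stand-in with the same overall shape.
        self.encoder = nn.LSTM(
            input_size=embedding_matrix.size(1),
            hidden_size=hidden_size,
            num_layers=num_layers,
            bidirectional=True,
            batch_first=True,
        )

    def forward(self, token_ids):
        word_vecs = self.embed(token_ids)      # (batch, seq, emb_dim)
        cove, _ = self.encoder(word_vecs)      # (batch, seq, 2 * hidden_size)
        # Downstream models consume the concatenation [word vector; CoVe].
        return torch.cat([word_vecs, cove], dim=-1)


if __name__ == "__main__":
    vocab_size, emb_dim = 1000, 300
    fake_vectors = torch.randn(vocab_size, emb_dim)  # stand-in for real GloVe
    model = CoVeSketch(fake_vectors)
    tokens = torch.randint(0, vocab_size, (2, 7))    # batch of 2 short sentences
    print(model(tokens).shape)                        # torch.Size([2, 7, 900])
```

In actual use, one would load the pretrained MT-LSTM weights provided at the project website instead of a randomly initialized encoder, since the value of CoVe comes from the encoder having been trained on machine translation.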