Contextualized Word Vectors

Year: 2017
Venue: Conference on Neural Information Processing Systems (NeurIPS)
Languages: English, German
Programming languages: Python
Input data:

text (sequences of words)
Output data:

word vectors (context vectors)

In this paper, we use a deep LSTM encoder from an attentional sequence-to-sequence model trained for machine translation (MT) to contextualize word vectors. We show that adding these context vectors (CoVe) improves performance over using only unsupervised word and character vectors on a wide variety of common NLP tasks.
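The idea can be sketched in a few lines: run each token's pretrained embedding through a bidirectional recurrent encoder and concatenate the resulting context vector onto the original embedding. This is a minimal illustrative sketch, not the paper's implementation — the toy vocabulary, random weights, and the plain tanh RNN (standing in for the pretrained MT-LSTM and GloVe embeddings) are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and random embeddings (stand-in for pretrained GloVe).
vocab = {"the": 0, "bank": 1, "river": 2, "money": 3}
d_emb, d_hid = 4, 3
E = rng.normal(size=(len(vocab), d_emb))
Wx = rng.normal(size=(d_hid, d_emb)) * 0.1   # input-to-hidden weights
Wh = rng.normal(size=(d_hid, d_hid)) * 0.1   # hidden-to-hidden weights

def rnn_pass(X, reverse=False):
    """One tanh-RNN pass over token embeddings X of shape (T, d_emb).

    Stand-in for one direction of the paper's pretrained MT-LSTM encoder.
    """
    h = np.zeros(d_hid)
    out = []
    idx = range(len(X) - 1, -1, -1) if reverse else range(len(X))
    for t in idx:
        h = np.tanh(Wx @ X[t] + Wh @ h)
        out.append(h)
    if reverse:
        out.reverse()  # restore left-to-right token order
    return np.stack(out)

def cove_like(tokens):
    """Return [embedding; context vector] per token, CoVe-style."""
    X = E[[vocab[t] for t in tokens]]
    # Context vector = concatenated forward and backward hidden states.
    ctx = np.concatenate([rnn_pass(X), rnn_pass(X, reverse=True)], axis=1)
    return np.concatenate([X, ctx], axis=1)  # shape (T, d_emb + 2 * d_hid)

vecs = cove_like(["the", "bank"])
print(vecs.shape)  # (2, 10)
```

Because the recurrent states depend on neighboring tokens, the vector for "bank" differs between "the bank" and "river bank" — the contextualization the static embedding alone cannot provide.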
