Language model augmented sequence taggers
Year: 2017
Journal: Annual Meeting of the Association for Computational Linguistics (ACL); affiliation: Allen Institute for Artificial Intelligence
Languages: English
Programming languages: Python
Input data:
words/sentences
Output data:
IOB-format (inside-outside-beginning) tag sequences; words represented as vectors (see the example below)
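A minimal illustration of IOB-formatted output for a named entity recognition task (the sentence and entity labels are hypothetical, not taken from the paper):

tokens = ["Mark", "Watney", "visited", "New", "York", "."]
tags   = ["B-PER", "I-PER", "O", "B-LOC", "I-LOC", "O"]
# B- marks the first token of an entity span, I- a continuation of that span,
# and O a token outside any entity.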
A general semi-supervised approach for adding pretrained context embeddings from bidirectional language models to NLP systems, applied here to sequence labeling tasks.
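A minimal sketch (assuming PyTorch; not the authors' implementation) of the core idea: pretrained bidirectional-LM context embeddings are concatenated with learned token embeddings before the tagging layer. All layer sizes, names, and the dummy inputs are illustrative assumptions.

import torch
import torch.nn as nn

class LMAugmentedTagger(nn.Module):
    # Hypothetical tagger: token embeddings are concatenated with precomputed
    # (frozen) biLM context embeddings, encoded by a BiLSTM, and projected to
    # per-token IOB tag scores.
    def __init__(self, vocab_size, num_tags, token_dim=100, lm_dim=1024, hidden_dim=200):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, token_dim)
        self.encoder = nn.LSTM(token_dim + lm_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.tag_proj = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids, lm_embeddings):
        # token_ids:     (batch, seq_len)          integer word ids
        # lm_embeddings: (batch, seq_len, lm_dim)  precomputed biLM context vectors
        x = torch.cat([self.token_emb(token_ids), lm_embeddings], dim=-1)
        h, _ = self.encoder(x)
        return self.tag_proj(h)  # (batch, seq_len, num_tags)

# Usage with random stand-ins for real inputs:
tagger = LMAugmentedTagger(vocab_size=50_000, num_tags=9)
ids = torch.randint(0, 50_000, (2, 12))
lm_vecs = torch.randn(2, 12, 1024)   # would come from a pretrained biLM
scores = tagger(ids, lm_vecs)        # per-token scores over IOB tags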