Sentence Embedding

Year: 2018
Journal: International Conference on Computational Linguistics
Languages: All Languages
Programming languages: Python
Input data:

sentence

Pooling is an essential component of a wide variety of sentence representation and embedding models. This paper explores generalized pooling methods to enhance sentence embedding. We propose vector-based multi-head attention that includes the widely used max pooling, mean pooling, and scalar self-attention as special cases. The model benefits from properly designed penalization terms to reduce redundancy in multi-head attention.
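The pooling mechanism described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact implementation: the function name `vector_multihead_pool`, the weight shapes, and the single-layer `tanh` scoring network are assumptions made for the example. The key idea shown is that each head produces a separate softmax weight *per embedding dimension* (vector-based attention), and that mean pooling falls out as the special case of uniform weights.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def vector_multihead_pool(H, W1, W2):
    """Vector-based multi-head attention pooling (sketch).

    H:  (n_tokens, d)       token hidden states for one sentence
    W1: (heads, d_a, d)     first projection per head (assumed shape)
    W2: (heads, d, d_a)     second projection per head (assumed shape)

    Returns (heads, d): one pooled vector per head. Unlike scalar
    self-attention, each of the d dimensions gets its own softmax
    distribution over the n tokens.
    """
    pooled = []
    for w1, w2 in zip(W1, W2):
        logits = w2 @ np.tanh(w1 @ H.T)   # (d, n_tokens)
        A = softmax(logits, axis=1)       # per-dimension weights over tokens
        pooled.append((A * H.T).sum(axis=1))  # weighted sum -> (d,)
    return np.stack(pooled)
```

Two special cases follow directly: with all-zero scoring weights the logits are constant, so every dimension averages the tokens uniformly (mean pooling); and if the logits for a dimension grow proportional to that dimension's values, the softmax concentrates on the largest token value (max pooling in the limit).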
