Enhanced Representation through kNowledge IntEgration

Year: 2019
Journal: Association for Computational Linguistics
Languages: Chinese, English
Programming languages: Python
Input data:

words/characters

We present a novel language representation model enhanced by knowledge, called ERNIE (Enhanced Representation through kNowledge IntEgration). Inspired by the masking strategy of BERT (Devlin et al., 2018), ERNIE is designed to learn language representation enhanced by knowledge masking strategies, which include entity-level masking and phrase-level masking. The entity-level strategy masks entities, which are usually composed of multiple words. The phrase-level strategy masks the whole phrase, which is composed of several words standing together as a conceptual unit.
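The key difference from BERT's token-level masking is that ERNIE masks whole entity or phrase spans at once, forcing the model to predict the complete unit from context. The following is a minimal sketch of that span-masking idea (not the authors' implementation); the function name, the word-level tokenization, and the assumption that entity/phrase spans are given as precomputed, non-overlapping index pairs are all illustrative assumptions.

```python
import random

MASK = "[MASK]"

def knowledge_mask(tokens, spans, mask_prob=0.15, seed=0):
    """Mask whole spans (entities or phrases) rather than single tokens.

    tokens: list of word-level tokens
    spans:  list of (start, end) index pairs marking entities or phrases
            (end exclusive); assumed non-overlapping
    mask_prob: probability of masking each span as a unit
    """
    rng = random.Random(seed)
    out = list(tokens)
    for start, end in spans:
        # Decide once per span, then mask every token inside it,
        # so the model must recover the whole conceptual unit.
        if rng.random() < mask_prob:
            for i in range(start, end):
                out[i] = MASK
    return out

tokens = ["Harry", "Potter", "is", "a", "series", "of", "fantasy", "novels"]
# entity-level span: "Harry Potter"; phrase-level span: "a series of"
spans = [(0, 2), (3, 6)]
masked = knowledge_mask(tokens, spans, mask_prob=1.0)
print(masked)
```

With `mask_prob=1.0` both spans are masked in full, so "Harry" and "Potter" are always masked together, unlike BERT's independent per-token masking.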
