Masked Sequence to Sequence

Year: 2019
Journal: International Conference on Machine Learning
Languages: Chinese, English, French, German, Romanian
Programming languages: Python
Input data: words/sentences

Output data: words, sentences, tokens

MASS: Masked Sequence to Sequence Pre-training for Language Generation is a pre-training method for sequence-to-sequence language generation tasks. It randomly masks a contiguous fragment of the input sentence on the encoder side and trains the decoder to predict that masked fragment.
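The sketch below illustrates this masking scheme on toy tokenized input; the `mass_mask` helper, the `[MASK]` symbol, and the 50% mask ratio are illustrative assumptions for this example, not the authors' implementation.

```python
import random

MASK = "[MASK]"

def mass_mask(tokens, mask_ratio=0.5):
    """Illustrative MASS-style masking (a sketch, not the authors' code).

    Masks a contiguous fragment of `tokens` in the encoder input and
    returns the fragment as the decoder's prediction target.
    """
    k = max(1, int(len(tokens) * mask_ratio))    # fragment length
    start = random.randint(0, len(tokens) - k)   # fragment start position
    fragment = tokens[start:start + k]           # tokens the decoder must predict
    encoder_input = tokens[:start] + [MASK] * k + tokens[start + k:]
    return encoder_input, fragment, start

# Example usage
tokens = "the quick brown fox jumps over the lazy dog".split()
enc_in, target, pos = mass_mask(tokens)
print(enc_in)   # sentence with a masked fragment
print(target)   # fragment the decoder learns to generate
```

Predicting a contiguous fragment (rather than isolated tokens) forces the decoder to model dependencies within the masked span, which is what makes the objective suited to generation tasks.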
