BERT Sequence to Sequence

Year: 2020
Journal: Association for Computational Linguistics
Languages: All Languages
Programming languages: Python
Input data: text

In this paper, we demonstrate the efficacy of pre-trained checkpoints for sequence generation. We developed a Transformer-based sequence-to-sequence model that is compatible with publicly available pre-trained BERT, GPT-2, and RoBERTa checkpoints.
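As a rough illustration of the idea, the sketch below shows how an encoder-decoder model can be warm-started from public BERT checkpoints. This is not the authors' original implementation; it uses the Hugging Face Transformers `EncoderDecoderModel` API, and the checkpoint name `bert-base-uncased` and the toy input sentence are assumptions chosen for the example.

```python
# Minimal sketch (not the paper's original code): assemble a BERT2BERT-style
# sequence-to-sequence model from publicly available BERT checkpoints using
# the Hugging Face Transformers library.
from transformers import BertTokenizer, EncoderDecoderModel

# Warm-start both the encoder and the decoder from the same public BERT
# checkpoint; the decoder's cross-attention weights are newly initialized
# and must be learned during fine-tuning.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Generation needs the special-token ids set on the model config.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# Toy usage: encode a source sentence and generate an (untrained) output.
inputs = tokenizer("This is a test input sentence.", return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_length=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Before fine-tuning on a downstream task, the generated text will be largely meaningless, since only the encoder and decoder self-attention layers carry pre-trained weights.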
