BERT Sequence to Sequence
Year: 2020
Journal: Transactions of the Association for Computational Linguistics
Languages: All Languages
Programming languages: Python
Input data: text
In this paper, we demonstrate the efficacy of pre-trained checkpoints for sequence generation. We develop a Transformer-based sequence-to-sequence model that is compatible with publicly available pre-trained BERT, GPT-2, and RoBERTa checkpoints.
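
The sketch below illustrates the general idea of warm-starting a Transformer encoder-decoder from a public BERT checkpoint. It is a minimal example assuming the Hugging Face transformers library and the bert-base-uncased checkpoint, not the authors' original implementation; before fine-tuning, the generated output is not meaningful.

# Minimal sketch (assumes the Hugging Face `transformers` library, not the
# paper's original codebase): initialize a sequence-to-sequence model from
# publicly available BERT checkpoints.
from transformers import BertTokenizerFast, EncoderDecoderModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Warm-start both encoder and decoder from the same BERT checkpoint
# (a "BERT2BERT" setup); cross-attention weights are newly added and
# must be learned during fine-tuning.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)

# Special tokens the seq2seq model needs for generation.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# Toy forward/generation pass (output is untrained, for illustration only).
inputs = tokenizer("This is an example input sentence.", return_tensors="pt")
generated = model.generate(inputs.input_ids, max_length=20)
print(tokenizer.decode(generated[0], skip_special_tokens=True))

A GPT-2 or RoBERTa checkpoint could be substituted for either side in the same way; the point of the setup is that encoder and decoder weights come from independently pre-trained public checkpoints.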