mBART

Year: 2020
Journal: Transactions of the Association for Computational Linguistics
Languages: Arabic, Burmese, Chinese (simplified), Czech, Dutch, English, Estonian, Finnish, French, German, Gujarati, Hindi, Italian, Japanese, Kazakh, Korean, Latvian, Lithuanian, Nepali, Romanian, Russian, Sinhala, Spanish, Turkish, Vietnamese
Programming languages: Python
Input data: text/sentence

Output data: text (translation)

mBART is a sequence-to-sequence denoising auto-encoder pre-trained on large-scale monolingual corpora in many languages using the BART objective. It is one of the first methods to pre-train a complete sequence-to-sequence model by denoising full texts in multiple languages, whereas previous approaches had focused only on the encoder, only on the decoder, or on reconstructing parts of the text.
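As a quick illustration of how a pre-trained mBART model is applied to translation, below is a minimal sketch assuming the Hugging Face transformers library and its publicly released facebook/mbart-large-en-ro checkpoint (mBART fine-tuned for English-to-Romanian); the language codes follow that library's conventions and are not part of this entry.

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

# Load a public mBART checkpoint fine-tuned for English-to-Romanian
# translation (assumed here; any mBART MT checkpoint works the same way).
tokenizer = MBartTokenizer.from_pretrained(
    "facebook/mbart-large-en-ro", src_lang="en_XX", tgt_lang="ro_RO"
)
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-en-ro")

text = "mBART is a multilingual denoising auto-encoder."
batch = tokenizer(text, return_tensors="pt")

# mBART selects the output language by forcing the decoder to start
# with the target-language token, so generation comes out in Romanian.
generated = model.generate(
    **batch, decoder_start_token_id=tokenizer.convert_tokens_to_ids("ro_RO")
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```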
