BETO

Year: 2020
Venue: International Conference on Learning Representations
Languages: Spanish
Programming languages: Python
Input data: words/sentences

Abstract: In this paper we help bridge this gap by presenting a BERT-based language model pre-trained exclusively on Spanish data. As a second contribution, we also compiled several tasks specifically for the Spanish language in a single repository, much in the spirit of the GLUE benchmark.
