BETO
Year: 2020
Venue: International Conference on Learning Representations (ICLR)
Languages: Spanish
Programming languages: Python
Input data: words/sentences
Project website: https://github.com/dccuchile/beto
In this paper we help bridge the gap in language-specific pre-trained models by presenting a BERT-based language model pre-trained exclusively on Spanish data. As a second contribution, we compile several tasks specific to the Spanish language into a single repository, much in the spirit of the GLUE benchmark.