Arabic Bidirectional Encoder Representations from Transformers
Year: 2020
Journal: Workshop on Open-Source Arabic Corpora and Processing Tools
Languages: Arabic
Programming languages: Python
Input data: words/sentences
Project website: https://github.com/aub-mind/arabert
In this paper, we develop an Arabic language representation model to improve the state of the art in several Arabic NLU tasks. We create AraBERT based on the BERT model, a stacked bidirectional Transformer encoder.