Arabic Bidirectional Encoder Representations from Transformers

Year: 2020
Journal: Workshop on Open-Source Arabic Corpora and Processing Tools
Languages: Arabic
Programming languages: Python
Input data: words/sentences

In this paper, we develop an Arabic language representation model to improve the state of the art in several Arabic natural language understanding (NLU) tasks. We create AraBERT based on BERT, a stacked bidirectional Transformer encoder.
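The core operation of the stacked bidirectional Transformer encoder mentioned in the abstract is scaled dot-product attention, in which every token attends to every other token in both directions. The following is a minimal pure-Python sketch of that single operation for illustration only; it is not the AraBERT implementation, which uses batched tensors, multiple heads, and learned projection matrices.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    # Scaled dot-product attention: each query row attends over all
    # key rows, and the resulting weights mix the value rows.
    d_k = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out
```

Because the weights for each query sum to one, the output rows are convex combinations of the value rows; with queries equal to keys, each token attends most strongly to itself.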
