Cross-lingual language model

Year: 2019
Venue: Conference on Neural Information Processing Systems (NeurIPS)
Languages: Arabic, Bulgarian, Chinese, English, French, German, Greek, Hindi, Russian, Spanish, Swahili, Thai, Turkish, Urdu, Vietnamese
Programming languages: Python
Input data: sentences

We propose two methods to learn cross-lingual language models (XLMs): one unsupervised that only relies on monolingual data, and one supervised that leverages parallel data with a new cross-lingual language model objective.
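The supervised objective, translation language modeling (TLM), extends masked language modeling to parallel data: a sentence and its translation are concatenated into one stream, random tokens are masked in both halves, and the model can attend across languages to recover them. The sketch below illustrates only the masking step, assuming whitespace-tokenized input; the function name, the `</s>` separator, and the masking rate are illustrative and not taken from the paper's implementation.

```python
import random

MASK = "[MASK]"

def tlm_example(src_tokens, tgt_tokens, mask_prob=0.15, seed=0):
    """Build one illustrative TLM training example: concatenate a
    parallel sentence pair and randomly mask tokens in both halves,
    so a model trained on it must use cross-lingual context.
    Returns (inputs, labels); labels are None at unmasked positions.
    """
    rng = random.Random(seed)
    tokens = src_tokens + ["</s>"] + tgt_tokens  # joint token stream
    inputs, labels = [], []
    for tok in tokens:
        if tok != "</s>" and rng.random() < mask_prob:
            inputs.append(MASK)
            labels.append(tok)   # original token becomes the target
        else:
            inputs.append(tok)
            labels.append(None)  # position is not predicted
    return inputs, labels

inputs, labels = tlm_example(
    "the cat sleeps".split(), "le chat dort".split(),
    mask_prob=0.5, seed=1)
```

Because both halves share one input stream, a masked English token can be predicted from its French counterpart (and vice versa), which is what encourages aligned cross-lingual representations.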
