Conditional Transformer Language Model

Year: 2019
Journal: Salesforce (Company)
Languages: Arabic, Bulgarian, Catalan, Chinese (simplified), Chinese (traditional), Croatian, Czech, Danish, Dutch, English, Estonian, Filipino, Finnish, French, German, Greek, Hebrew, Hindi, Hungarian, Indonesian, Italian, Japanese, Korean, Latvian, Lithuanian, Malay, Norwegian, Persian, Polish, Portuguese, Romanian, Russian, Serbian, Slovak, Slovenian, Spanish, Swedish, Thai, Turkish, Ukrainian, Vietnamese
Programming languages: Python
Input data:

text

CTRL is a 1.6-billion-parameter language model for controllable text generation that can predict which subset of the training data most influenced a generated text sequence. This provides a potential method for analyzing large amounts of generated text by identifying the most influential source of training data in the model. Trained with over 50 different control codes, CTRL allows for better human-AI interaction because users can steer the content and style of the generated text, and it can be trained for multitask language generation. Finally, it can be used to improve other natural language processing (NLP) applications, either through fine-tuning for a specific task or through transfer of the representations the model has learned.
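The two ideas above — conditioning generation on a control code and attributing a generated sequence back to a training-data source — can be sketched in a few lines. This is an illustrative sketch only: the function names, the example control codes, and the log-probability scores are hypothetical and do not reflect the real CTRL API.

```python
# Sketch of CTRL-style control-code conditioning and source attribution.
# All names and numbers below are illustrative assumptions, not the real CTRL API.

def build_prompt(control_code: str, text: str) -> str:
    """Prepend a control code so the model conditions generation on it,
    the way CTRL prepends codes such as 'Wikipedia' or 'Reviews'."""
    return f"{control_code} {text}"

def attribute_source(logprob_by_code: dict) -> str:
    """Source attribution: rank control codes by the log-probability each
    assigns to a generated sequence; the highest-scoring code points to
    the training-data subset that most influenced the text."""
    return max(logprob_by_code, key=logprob_by_code.get)

if __name__ == "__main__":
    # Conditioning: the same continuation request, steered toward review-style text.
    prompt = build_prompt("Reviews", "The product arrived quickly and")
    print(prompt)  # Reviews The product arrived quickly and

    # Hypothetical per-code log-probabilities for one generated sequence.
    scores = {"Wikipedia": -42.7, "Reviews": -35.1, "News": -40.3}
    print(attribute_source(scores))  # Reviews
```

In the real model the attribution scores would come from evaluating the sequence likelihood under each control code, not from a hand-written dictionary.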
