Reformer

Year: 2020
Journal: International Conference on Learning Representations
Languages: All Languages
Programming languages: Python
Input data: text/sentences

Large Transformer models routinely achieve state-of-the-art results on a number of tasks, but training these models can be prohibitively costly, especially on long sequences. We introduce two techniques to improve the efficiency of Transformers.
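The two techniques introduced by Reformer are locality-sensitive-hashing (LSH) attention, which replaces full dot-product attention and reduces its cost from O(L^2) to roughly O(L log L) in sequence length L, and reversible residual layers, which avoid storing activations for every layer. Below is a minimal sketch of the LSH bucketing idea only, not the paper's implementation; function names, bucket counts, and dimensions are illustrative assumptions.

```python
import numpy as np

def lsh_hash(vectors, n_buckets, seed=0):
    # Angular LSH via random rotations: project each vector and take the
    # argmax over the concatenation [xR; -xR] to pick one of n_buckets.
    rng = np.random.default_rng(seed)
    d = vectors.shape[-1]
    rotations = rng.normal(size=(d, n_buckets // 2))
    projected = vectors @ rotations                    # (seq_len, n_buckets // 2)
    return np.argmax(np.concatenate([projected, -projected], axis=-1), axis=-1)

def lsh_attention_buckets(queries, n_buckets=8):
    # Group sequence positions by hash bucket; attention is then computed
    # only among positions that share a bucket (after sorting and chunking
    # in the full method), so the L x L attention matrix is never formed.
    buckets = lsh_hash(queries, n_buckets)
    groups = {}
    for pos, b in enumerate(buckets):
        groups.setdefault(int(b), []).append(pos)
    return groups

# Example: nearby (similar) query vectors tend to land in the same bucket.
seq = np.random.randn(16, 64)   # 16 positions, hypothetical model dim 64
print(lsh_attention_buckets(seq))
```

The second technique, reversible residual layers, stores only the final layer's activations and recomputes earlier ones during the backward pass, trading extra computation for memory that no longer scales with the number of layers.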
