Prefix-Tuning: Optimizing Continuous Prompts for Generation
Xiang Lisa Li, Percy Liang. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). 2021.
Related Publications
Making Pre-trained Language Models Better Few-shot Learners
Tianyu Gao, Adam Fisch, Danqi Chen. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers).
On Using Very Large Target Vocabulary for Neural Machine Translation
Sébastien Jean, Kyunghyun Cho, Roland Memisevic, Yoshua Bengio. Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers).
Addressing the Rare Word Problem in Neural Machine Translation
Thang Luong, Ilya Sutskever, Quoc Le, Oriol Vinyals, Wojciech Zaremba. Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers).
mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer
Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies.
Language Models as Knowledge Bases?
Fabio Petroni, Tim Rocktäschel, Sebastian Riedel, Patrick Lewis, Anton Bakhtin, Yuxiang Wu, Alexander Miller. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP).
Publication Info
- Year: 2021
- Type: article
- Pages: 4582-4597
- Citations: 1940
- Access: Closed
Cite This
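The page does not carry a ready-made citation, so the BibTeX entry below is assembled from the metadata listed here (authors, venue, year, pages, DOI). The entry key and the @inproceedings type follow the usual ACL Anthology convention and are assumptions, not an official export; verify against the publisher's record before use.

% entry key and type assumed (ACL Anthology convention), not taken from this page
@inproceedings{li-liang-2021-prefix,
  title     = "Prefix-Tuning: Optimizing Continuous Prompts for Generation",
  author    = "Li, Xiang Lisa and Liang, Percy",
  booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
  year      = "2021",
  pages     = "4582--4597",
  doi       = "10.18653/v1/2021.acl-long.353",
}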
Identifiers
- DOI: 10.18653/v1/2021.acl-long.353
- arXiv: 2101.00190