
OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you find this listing of citing articles useful!
Clicking an article title takes you to the article as listed in CrossRef. Clicking an Open Access link takes you to the "best Open Access location" for that article. Clicking a citation count opens this same citing-articles listing for that article. Finally, at the bottom of the page you'll find basic pagination options.
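A listing like this one can also be retrieved programmatically from the OpenAlex API, which exposes citing works through the `cites:` filter on the `/works` endpoint. The sketch below only builds the query URL; the work ID shown is a placeholder, not the real OpenAlex ID of the requested article.

```python
# Sketch: constructing an OpenAlex "citing works" query URL.
# The OpenAlex /works endpoint supports filter=cites:<work_id>,
# plus per-page and page parameters for pagination.

def citing_works_url(work_id: str, page: int = 1, per_page: int = 25) -> str:
    """Return the OpenAlex API URL listing works that cite `work_id`."""
    base = "https://api.openalex.org/works"
    return f"{base}?filter=cites:{work_id}&per-page={per_page}&page={page}"

# "W0000000000" is a placeholder ID for illustration only.
print(citing_works_url("W0000000000"))
```

Fetching that URL (e.g. with `requests.get`) returns a JSON page of citing works, 25 per page here to match the listing below.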
Requested Article:
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, et al.
(2020)
Open Access | Times Cited: 399
Showing 1-25 of 399 citing articles:
The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al‐Rfou, Noah Constant
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 1648
Efficient Transformers: A Survey
Yi Tay, Mostafa Dehghani, Dara Bahri, et al.
ACM Computing Surveys (2022) Vol. 55, Iss. 6, pp. 1-28
Open Access | Times Cited: 610
Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey
Bonan Min, Hayley Ross, Elior Sulem, et al.
ACM Computing Surveys (2023) Vol. 56, Iss. 2, pp. 1-40
Open Access | Times Cited: 607
AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, et al.
(2021)
Open Access | Times Cited: 391
AdapterHub: A Framework for Adapting Transformers
Jonas Pfeiffer, Andreas Rücklé, Clifton Poth, et al.
(2020)
Open Access | Times Cited: 390
Multi-Concept Customization of Text-to-Image Diffusion
Nupur Kumari, Bingliang Zhang, Richard Zhang, et al.
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2023), pp. 1931-1941
Open Access | Times Cited: 291
From Zero to Hero: On the Limitations of Zero-Shot Language Transfer with Multilingual Transformers
Anne Lauscher, Vinit Ravishankar, Ivan Vulić, et al.
(2020)
Open Access | Times Cited: 234
Self-Alignment Pretraining for Biomedical Entity Representations
Fangyu Liu, Ehsan Shareghi, Zaiqiao Meng, et al.
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
Open Access | Times Cited: 221
Probing Pretrained Language Models for Lexical Semantics
Ivan Vulić, Edoardo Maria Ponti, Robert Litschko, et al.
(2020)
Open Access | Times Cited: 173
Parameter-Efficient Transfer Learning with Diff Pruning
Demi Guo, Alexander M. Rush, Yoon Kim
(2021)
Open Access | Times Cited: 156
Lexicon Enhanced Chinese Sequence Labeling Using BERT Adapter
Wei Liu, Xiyan Fu, Yue Zhang, et al.
(2021)
Open Access | Times Cited: 134
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models
Phillip Rust, Jonas Pfeiffer, Ivan Vulić, et al.
(2021)
Open Access | Times Cited: 133
AdapterDrop: On the Efficiency of Adapters in Transformers
Andreas Rücklé, Gregor Geigle, Max Glockner, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 125
MasakhaNER: Named Entity Recognition for African Languages
David Ifeoluwa Adelani, Jade Abbott, Graham Neubig, et al.
Transactions of the Association for Computational Linguistics (2021) Vol. 9, pp. 1116-1131
Open Access | Times Cited: 106
Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models
Ning Ding, Yujia Qin, Guang Yang, et al.
Research Square (Research Square) (2022)
Open Access | Times Cited: 93
LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models
Zhiqiang Hu, Lei Wang, Yihuai Lan, et al.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (2023)
Open Access | Times Cited: 89
A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, G. Ramesh, Mitesh M. Khapra, et al.
ACM Computing Surveys (2025)
Open Access | Times Cited: 3
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
Edoardo Maria Ponti, Goran Glavaš, Olga Majewska, et al.
(2020)
Open Access | Times Cited: 124
Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning
Runxin Xu, Fuli Luo, Zhiyuan Zhang, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 105
UDapter: Language Adaptation for Truly Universal Dependency Parsing
Ahmet Üstün, Arianna Bisazza, Gosse Bouma, et al.
(2020)
Open Access | Times Cited: 98
XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation
Sebastian Ruder, Noah Constant, Jan A. Botha, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021), pp. 10215-10245
Open Access | Times Cited: 98
On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation
Ruidan He, Linlin Liu, Hai Ye, et al.
(2021)
Open Access | Times Cited: 96
Multilingual Speech Translation from Efficient Finetuning of Pretrained Models
Xian Li, Changhan Wang, Yun Tang, et al.
(2021)
Open Access | Times Cited: 94
Trankit: A Light-Weight Transformer-based Toolkit for Multilingual Natural Language Processing
Minh Van Nguyen, Viet Dac Lai, Amir Pouran Ben Veyseh, et al.
(2021)
Open Access | Times Cited: 84
The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al‐Rfou, Noah Constant
arXiv (Cornell University) (2021)
Closed Access | Times Cited: 70