
OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!
Clicking an article title takes you to the article as listed in CrossRef. Clicking an Open Access link takes you to the "best Open Access location". Clicking a citation count opens this same listing for that article. Basic pagination options are at the bottom of the page.
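If you would rather pull a listing like this programmatically, OpenAlex also exposes a public REST API at api.openalex.org. Below is a minimal sketch in Python using requests; the work ID is a hypothetical placeholder (not necessarily the actual OpenAlex ID of the requested article), and it assumes the standard works endpoint with the cites filter and page/per-page pagination.

```python
import requests

# Hypothetical placeholder work ID; look up the real one via
# https://api.openalex.org/works?search=AdapterHub
WORK_ID = "W0000000000"

def citing_works(work_id, page=1, per_page=25):
    """Fetch one page of works that cite `work_id`, most-cited first."""
    resp = requests.get(
        "https://api.openalex.org/works",
        params={
            "filter": f"cites:{work_id}",       # works citing the given work
            "sort": "cited_by_count:desc",       # most-cited first
            "per-page": per_page,                # page size (max 200)
            "page": page,                        # 1-indexed page number
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

data = citing_works(WORK_ID)
print(f"Showing {len(data['results'])} of {data['meta']['count']} citing articles")
for work in data["results"]:
    print(f"{work['display_name']} ({work['publication_year']}) "
          f"| Times Cited: {work['cited_by_count']}")
```

Requesting page=2 with the same parameters returns the next 25 citing articles, which is essentially what the pagination controls at the bottom of this page do.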
Requested Article:
AdapterHub: A Framework for Adapting Transformers
Jonas Pfeiffer, Andreas Rücklé, Clifton Poth, et al.
(2020)
Open Access | Times Cited: 390
Showing 1-25 of 390 citing articles:
Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey
Bonan Min, Hayley Ross, Elior Sulem, et al.
ACM Computing Surveys (2023) Vol. 56, Iss. 2, pp. 1-40
Open Access | Times Cited: 607
Visual Prompt Tuning
Menglin Jia, Luming Tang, Bor-Chun Chen, et al.
Lecture notes in computer science (2022), pp. 709-727
Closed Access | Times Cited: 594
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, et al.
(2020)
Open Access | Times Cited: 399
AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, et al.
(2021)
Open Access | Times Cited: 391
Parameter-efficient fine-tuning of large-scale pre-trained language models
Ning Ding, Yujia Qin, Guang Yang, et al.
Nature Machine Intelligence (2023) Vol. 5, Iss. 3, pp. 220-235
Open Access | Times Cited: 334
Aligning artificial intelligence with climate change mitigation
Lynn H. Kaack, Priya L. Donti, Emma Strubell, et al.
Nature Climate Change (2022) Vol. 12, Iss. 6, pp. 518-527
Closed Access | Times Cited: 241
Parameter-Efficient Transfer Learning with Diff Pruning
Demi Guo, Alexander M. Rush, Yoon Kim
(2021)
Open Access | Times Cited: 156
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models
Phillip Rust, Jonas Pfeiffer, Ivan Vulić, et al.
(2021)
Open Access | Times Cited: 133
AdapterDrop: On the Efficiency of Adapters in Transformers
Andreas Rücklé, Gregor Geigle, Max Glockner, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 125
Parameter-efficient Multi-task Fine-tuning for Transformers via Shared Hypernetworks
Rabeeh Karimi Mahabadi, Sebastian Ruder, Mostafa Dehghani, et al.
(2021)
Open Access | Times Cited: 113
Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models
Ning Ding, Yujia Qin, Guang Yang, et al.
Research Square (Research Square) (2022)
Open Access | Times Cited: 93
Self-Supervised Pretraining Improves Self-Supervised Pretraining
Colorado Reed, Xiangyu Yue, Ani Nrusimha, et al.
2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) (2022), pp. 1050-1060
Open Access | Times Cited: 79
On the Effectiveness of Parameter-Efficient Fine-Tuning
Zihao Fu, Haoran Yang, Anthony Man–Cho So, et al.
Proceedings of the AAAI Conference on Artificial Intelligence (2023) Vol. 37, Iss. 11, pp. 12799-12807
Open Access | Times Cited: 53
Efficient Methods for Natural Language Processing: A Survey
Marcos Treviso, Ji-Ung Lee, Tianchu Ji, et al.
Transactions of the Association for Computational Linguistics (2023) Vol. 11, pp. 826-860
Open Access | Times Cited: 48
Data science opportunities of large language models for neuroscience and biomedicine
Danilo Bzdok, Andrew Thieme, Oleksiy Levkovskyy, et al.
Neuron (2024) Vol. 112, Iss. 5, pp. 698-717
Open Access | Times Cited: 19
A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, G. Ramesh, Mitesh M. Khapra, et al.
ACM Computing Surveys (2025)
Open Access | Times Cited: 3
On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation
Ruidan He, Linlin Liu, Hai Ye, et al.
(2021)
Open Access | Times Cited: 96
Trankit: A Light-Weight Transformer-based Toolkit for Multilingual Natural Language Processing
Minh Van Nguyen, Viet Dac Lai, Amir Pouran Ben Veyseh, et al.
(2021)
Open Access | Times Cited: 84
UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021), pp. 10186-10203
Open Access | Times Cited: 67
Pre-trained language models with domain knowledge for biomedical extractive summarization
Qianqian Xie, Jennifer Amy Bishop, Prayag Tiwari, et al.
Knowledge-Based Systems (2022) Vol. 252, pp. 109460-109460
Open Access | Times Cited: 65
Three Things Everyone Should Know About Vision Transformers
Hugo Touvron, Matthieu Cord, Alaaeldin El-Nouby, et al.
Lecture notes in computer science (2022), pp. 497-515
Closed Access | Times Cited: 63
What to Pre-Train on? Efficient Intermediate Task Selection
Clifton Poth, Jonas Pfeiffer, Andreas Rücklé, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 62
Sustainable Modular Debiasing of Language Models
Anne Lauscher, Tobias Lueken, Goran Glavaš
(2021)
Open Access | Times Cited: 62
Lifting the Curse of Multilinguality by Pre-training Modular Transformers
Jonas Pfeiffer, Naman Goyal, Xi Lin, et al.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2022)
Open Access | Times Cited: 53
UniPELT: A Unified Framework for Parameter-Efficient Language Model Tuning
Yuning Mao, Lambert Mathias, Rui Hou, et al.
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (2022)
Open Access | Times Cited: 48