
OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!
If you click an article title, you'll navigate to the article as listed on CrossRef. If you click an Open Access link, you'll navigate to that article's "best Open Access location". Clicking a citation count will open this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
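If you'd rather pull these results yourself, the underlying data is available through the public OpenAlex REST API at api.openalex.org. Below is a minimal Python sketch that looks up the requested article by title search and then pages through its citing works, 25 per page as on this listing. The title-search lookup, and the assumption that the first hit is the right record, are mine rather than part of this page; verify the returned OpenAlex ID before relying on it.

```python
import requests

API = "https://api.openalex.org/works"

# Find the requested article by title search.
# Assumption: the first hit is the right record -- check work["id"] yourself.
resp = requests.get(API, params={"filter": "title.search:AdapterDrop", "per-page": 1})
resp.raise_for_status()
work = resp.json()["results"][0]
print(work["display_name"], "| Times Cited:", work["cited_by_count"])

# Page through the articles that cite it, 25 per page as on this listing.
# The short OpenAlex ID (e.g. "W...") is the part after the last slash of work["id"].
citing = requests.get(
    API,
    params={
        "filter": f"cites:{work['id'].rsplit('/', 1)[-1]}",
        "per-page": 25,
        "page": 1,
        "sort": "cited_by_count:desc",
    },
)
citing.raise_for_status()
for w in citing.json()["results"]:
    oa = "Open Access" if w["open_access"]["is_oa"] else "Closed Access"
    print(f"{w['display_name']} ({w['publication_year']}) | {oa} | Times Cited: {w['cited_by_count']}")
```

For deeper result sets, OpenAlex also supports cursor pagination (pass cursor=* and follow the returned next_cursor), and adding a mailto parameter with your email puts your requests in the API's faster "polite pool".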
Requested Article:
AdapterDrop: On the Efficiency of Adapters in Transformers
Andreas Rücklé, Gregor Geigle, Max Glockner, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 125
Showing 1-25 of 125 citing articles:
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, et al.
(2020)
Open Access | Times Cited: 399
AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, et al.
(2021)
Open Access | Times Cited: 391
AdapterHub: A Framework for Adapting Transformers
Jonas Pfeiffer, Andreas Rücklé, Clifton Poth, et al.
(2020)
Open Access | Times Cited: 390
Parameter-efficient fine-tuning of large-scale pre-trained language models
Ning Ding, Yujia Qin, Guang Yang, et al.
Nature Machine Intelligence (2023) Vol. 5, Iss. 3, pp. 220-235
Open Access | Times Cited: 334
Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models
Ning Ding, Yujia Qin, Guang Yang, et al.
Research Square (Research Square) (2022)
Open Access | Times Cited: 93
On the Effectiveness of Parameter-Efficient Fine-Tuning
Zihao Fu, Haoran Yang, Anthony Man-Cho So, et al.
Proceedings of the AAAI Conference on Artificial Intelligence (2023) Vol. 37, Iss. 11, pp. 12799-12807
Open Access | Times Cited: 53
Efficient Methods for Natural Language Processing: A Survey
Marcos Treviso, Ji-Ung Lee, Tianchu Ji, et al.
Transactions of the Association for Computational Linguistics (2023) Vol. 11, pp. 826-860
Open Access | Times Cited: 48
End-Edge-Cloud Collaborative Computing for Deep Learning: A Comprehensive Survey
Yingchao Wang, Chen Yang, Shulin Lan, et al.
IEEE Communications Surveys & Tutorials (2024) Vol. 26, Iss. 4, pp. 2647-2683
Closed Access | Times Cited: 23
On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation
Ruidan He, Linlin Liu, Hai Ye, et al.
(2021)
Open Access | Times Cited: 96
What to Pre-Train on? Efficient Intermediate Task Selection
Clifton Poth, Jonas Pfeiffer, Andreas Rücklé, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 62
Lifting the Curse of Multilinguality by Pre-training Modular Transformers
Jonas Pfeiffer, Naman Goyal, Xi Lin, et al.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2022)
Open Access | Times Cited: 53
Offensive language detection in Tamil YouTube comments by adapters and cross-domain knowledge transfer
Malliga Subramanian, Rahul Ponnusamy, Sean Benhur, et al.
Computer Speech & Language (2022) Vol. 76, pp. 101404-101404
Closed Access | Times Cited: 39
Parameter-Efficient Model Adaptation for Vision Transformers
Xuehai He, Chunyuan Li, Pengchuan Zhang, et al.
Proceedings of the AAAI Conference on Artificial Intelligence (2023) Vol. 37, Iss. 1, pp. 817-825
Open Access | Times Cited: 25
Differentially Private Fine-tuning of Language Models
Da Yu, Saurabh Naik, Artūrs Bačkurs, et al.
Journal of Privacy and Confidentiality (2024) Vol. 14, Iss. 2
Open Access | Times Cited: 10
Low-Parameter Federated Learning with Large Language Models
Jingang Jiang, Haiqi Jiang, Yuhan Ma, et al.
Lecture notes in computer science (2024), pp. 319-330
Closed Access | Times Cited: 9
MAD-G: Multilingual Adapter Generation for Efficient Cross-Lingual Transfer
Alan Ansell, Edoardo Maria Ponti, Jonas Pfeiffer, et al.
(2021), pp. 4762-4781
Open Access | Times Cited: 54
ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts
Akari Asai, Mohammadreza Salehi, Matthew E. Peters, et al.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (2022)
Open Access | Times Cited: 37
Parameter-Efficient Transfer Learning for Remote Sensing Image–Text Retrieval
Yuan Yuan, Yang Zhan, Zhitong Xiong
IEEE Transactions on Geoscience and Remote Sensing (2023) Vol. 61, pp. 1-14
Open Access | Times Cited: 20
1% VS 100%: Parameter-Efficient Low Rank Adapter for Dense Predictions
Dongshuo Yin, Yiran Yang, Zhechao Wang, et al.
2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2023), pp. 20116-20126
Closed Access | Times Cited: 18
AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks
Chin-Lun Fu, Zih-Ching Chen, Yun-Ru Lee, et al.
Findings of the Association for Computational Linguistics: NAACL 2022 (2022)
Open Access | Times Cited: 28
xGQA: Cross-Lingual Visual Question Answering
Jonas Pfeiffer, Gregor Geigle, Aishwarya Kamath, et al.
Findings of the Association for Computational Linguistics: ACL 2022 (2022), pp. 2497-2511
Open Access | Times Cited: 27
BLOOM+1: Adding Language Support to BLOOM for Zero-Shot Prompting
Zheng Yong, Hailey Schoelkopf, Niklas Muennighoff, et al.
(2023), pp. 11682-11703
Open Access | Times Cited: 15
Towards Adaptive Prefix Tuning for Parameter-Efficient Language Model Fine-tuning
Zhen-Ru Zhang, Chuanqi Tan, Haiyang Xu, et al.
(2023), pp. 1239-1248
Open Access | Times Cited: 14
LeXFiles and LegalLAMA: Facilitating English Multinational Legal Language Model Development
Ilias Chalkidis, Nicolas Garneau, Cătălina Goanță, et al.
(2023), pp. 15513-15535
Open Access | Times Cited: 14