
OpenAlex is an openly accessible bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you find this listing of citing articles useful!
Clicking an article title takes you to the article as listed in CrossRef, and clicking an Open Access link takes you to its "best Open Access location". Clicking a citation count opens this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
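If you would rather pull this listing programmatically, below is a minimal Python sketch against the public OpenAlex REST API. It assumes the documented `cites:` filter and standard work fields; the work ID shown is a placeholder, not the actual OpenAlex identifier of the requested article.

```python
# Minimal sketch: fetch one page of citing works from the OpenAlex REST API.
# WORK_ID is a placeholder, not the real OpenAlex ID of the IDPG paper.
import requests

WORK_ID = "W0000000000"  # placeholder OpenAlex work ID

resp = requests.get(
    "https://api.openalex.org/works",
    params={
        "filter": f"cites:{WORK_ID}",  # works that cite the requested article
        "per-page": 25,                # matches the 25-per-page listing here
        "page": 1,
    },
    timeout=30,
)
resp.raise_for_status()

for work in resp.json().get("results", []):
    title = work.get("display_name")
    year = work.get("publication_year")
    cited_by = work.get("cited_by_count")
    is_oa = (work.get("open_access") or {}).get("is_oa")
    print(f"{title} ({year}) | cited by {cited_by} | open access: {is_oa}")
```

Increasing the "page" parameter steps through the remaining results, which is what the pagination controls at the bottom of this page do.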
Requested Article:
IDPG: An Instance-Dependent Prompt Generation Method
Zhuofeng Wu, Sinong Wang, Jiatao Gu, et al.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2022)
Open Access | Times Cited: 27
Showing 1-25 of 27 citing articles:
Active Example Selection for In-Context Learning
Yiming Zhang, Feng Shi, Chenhao Tan
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2022)
Open Access | Times Cited: 49
ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts
Akari Asai, Mohammadreza Salehi, Matthew E. Peters, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2022)
Open Access | Times Cited: 37
Overview of the PromptCBLUE Shared Task in CHIP2023
Wei Zhu, Xiaoling Wang, Mosha Chen, et al.
Communications in Computer and Information Science (2024), pp. 3-20
Closed Access | Times Cited: 5
Federated Learning of Large Language Models with Parameter-Efficient Prompt Tuning and Adaptive Optimization
Tianshi Che, Ji Liu, Yang Zhou, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2023)
Open Access | Times Cited: 13
An empirical study on impact of label noise on synthetic tabular data generation
Jeong-Hoon Kim, Chao Huang, Xin Liu
Machine Learning (2025) Vol. 114, Iss. 4
Open Access
STP: Special token prompt for parameter-efficient tuning of pre-trained language models
Yaoyao Yan, Hui Yu, Da Wang, et al.
Expert Systems with Applications (2025), pp. 127665-127665
Closed Access
Parameter-efficient fine-tuning in large language models: a survey of methodologies
Luping Wang, Sheng Chen, Linnan Jiang, et al.
Artificial Intelligence Review (2025) Vol. 58, Iss. 8
Open Access
PromptGen: Automatically Generate Prompts using Generative Models
Yue Zhang, Hongliang Fei, Dingcheng Li, et al.
Findings of the Association for Computational Linguistics: NAACL 2022 (2022)
Open Access | Times Cited: 10
Unified Prompt Learning Makes Pre-Trained Language Models Better Few-Shot Learners
Feihu Jin, Jinliang Lu, Jiajun Zhang
ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (2023), pp. 1-5
Closed Access | Times Cited: 5
FLamE: Few-shot Learning from Natural Language Explanations
Yangqiaoyu Zhou, Yiming Zhang, Chenhao Tan
(2023), pp. 6743-6763
Open Access | Times Cited: 5
Late Prompt Tuning: A Late Prompt Could Be Better Than Many Prompts
Xiangyang Liu, Tianxiang Sun, Xuanjing Huang, et al.
(2022), pp. 1325-1338
Open Access | Times Cited: 7
Learned Adapters Are Better Than Manually Designed Adapters
Yuming Zhang, Peng Wang, Ming Tan, et al.
Findings of the Association for Computational Linguistics: ACL 2022 (2023), pp. 7420-7437
Open Access | Times Cited: 4
SPT: Learning to Selectively Insert Prompts for Better Prompt Tuning
Wei Zhu, Ming Tan
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2023), pp. 11862-11878
Open Access | Times Cited: 4
Automatic Design of Adapter Architectures for Enhanced Parameter-Efficient Fine-Tuning
Siya Xu, Xinyan Wen
ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (2024), pp. 12536-12540
Closed Access | Times Cited: 1
Efficient Fine-Tuning for Low-Resource Tibetan Pre-trained Language Models
Mingjun Zhou, Zhuoma Daiqing, Nuo Qun, et al.
Lecture Notes in Computer Science (2024), pp. 410-422
Closed Access | Times Cited: 1
Vector-Quantized Input-Contextualized Soft Prompts for Natural Language Understanding
Rishabh Bhardwaj, Amrita Saha, Steven C. H. Hoi, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2022), pp. 6776-6791
Open Access | Times Cited: 5
Hierarchical Prompt Tuning for Few-Shot Multi-Task Learning
Jingping Liu, Tao Chen, Zujie Liang, et al.
(2023), pp. 1556-1565
Closed Access | Times Cited: 2
SMoP: Towards Efficient and Effective Prompt Tuning with Sparse Mixture-of-Prompts
Joon-Young Choi, Junho Kim, Jun-Hyung Park, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2023), pp. 14306-14316
Open Access | Times Cited: 2
Adapter Tuning With Task-Aware Attention Mechanism
Jinliang Lu, Feihu Jin, Jiajun Zhang
ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (2023), pp. 1-5
Closed Access | Times Cited: 1
Attribute Controlled Dialogue Prompting
Runcheng Liu, Ahmad Rashid, Ivan Kobyzev, et al.
Findings of the Association for Computational Linguistics: ACL 2022 (2023), pp. 2380-2389
Open Access | Times Cited: 1
Strength in Numbers: Estimating Confidence of Large Language Models by Prompt Agreement
Gwenyth Portillo Wightman, Alexandra DeLucia, Mark Dredze
(2023), pp. 326-362
Open Access | Times Cited: 1
POND: Multi-Source Time Series Domain Adaptation with Information-Aware Prompt Tuning
Junxiang Wang, Guangji Bai, Wei Cheng, et al.
Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (2024), pp. 3140-3151
Open Access
Clothes Image Retrieval via Learnable FashionCLIP
Yuan Sun, Mingbo Zhao
Communications in Computer and Information Science (2024), pp. 290-301
Closed Access
LIPT: Improving Prompt Tuning with Late Inception Reparameterization
Yawen He, Ao Feng, Zhengjie Gao, et al.
Electronics (2024) Vol. 13, Iss. 23, pp. 4741-4741
Open Access
HiCL: Hierarchical Contrastive Learning of Unsupervised Sentence Embeddings
Zhuofeng Wu, Chaowei Xiao, V. G. Vinod Vydiswaran
(2023), pp. 2461-2476
Open Access