
OpenAlex is a bibliographic catalogue of scientific papers, authors, and institutions, accessible in open access mode and named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!
If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking the citation count will open this listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
Requested Article:
Self-Evolution Learning for Discriminative Language Model Pretraining
Qihuang Zhong, Liang Ding, Juhua Liu, et al.
Findings of the Association for Computational Linguistics: ACL 2022 (2023)
Open Access | Times Cited: 5
Showing 5 citing articles:
E2S2: Encoding-Enhanced Sequence-to-Sequence Pretraining for Language Understanding and Generation
Qihuang Zhong, Liang Ding, Juhua Liu, et al.
IEEE Transactions on Knowledge and Data Engineering (2023) Vol. 36, Iss. 12, pp. 8037-8050
Open Access | Times Cited: 20
PanDa: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation
Qihuang Zhong, Liang Ding, Juhua Liu, et al.
IEEE Transactions on Knowledge and Data Engineering (2024) Vol. 36, Iss. 9, pp. 4835-4848
Open Access | Times Cited: 8
Revisiting Token Dropping Strategy in Efficient BERT Pretraining
Qihuang Zhong, Liang Ding, Juhua Liu, et al.
(2023)
Open Access | Times Cited: 4
Give Me the Facts! A Survey on Factual Knowledge Probing in Pre-trained Language Models
Paul Youssef, Osman Alperen Koraş, Meijie Li, et al.
(2023), pp. 15588-15605
Open Access | Times Cited: 3
Zero-shot Sharpness-Aware Quantization for Pre-trained Language Models
Miaoxi Zhu, Qihuang Zhong, Li Shen, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2023), pp. 11305-11327
Open Access | Times Cited: 2