OpenAlex Citation Counts


OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
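
The same listing can be reproduced programmatically. Below is a minimal sketch, assuming the public OpenAlex REST API (the works endpoint with its `cites` filter and `page`/`per-page` pagination); the Work ID shown is a placeholder, not the actual identifier of the requested article.

```python
import requests

# Placeholder OpenAlex Work ID for the requested article; substitute the real ID.
WORK_ID = "W0000000000"

# The `cites` filter returns works whose reference lists include the given work.
params = {
    "filter": f"cites:{WORK_ID}",
    "per-page": 25,               # page size matching the 1-25 listing above
    "page": 1,                    # basic pagination, as at the bottom of this page
    "mailto": "you@example.org",  # courtesy parameter for OpenAlex's polite pool
}
resp = requests.get("https://api.openalex.org/works", params=params, timeout=30)
resp.raise_for_status()

# Print each citing article roughly in the format used on this page.
for work in resp.json()["results"]:
    authors = ", ".join(a["author"]["display_name"] for a in work["authorships"][:3])
    access = "Open Access" if work["open_access"]["is_oa"] else "Closed Access"
    print(f'{work["display_name"]} | {authors} ({work["publication_year"]}) | '
          f'{access} | Times Cited: {work["cited_by_count"]}')
```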

Requested Article:

Train No Evil: Selective Masking for Task-Guided Pre-Training
Yuxian Gu, Zhengyan Zhang, Xiaozhi Wang, et al.
(2020)
Open Access | Times Cited: 44

Showing 1-25 of 44 citing articles:

CLEVE: Contrastive Pre-training for Event Extraction
Ziqi Wang, Xiaozhi Wang, Xu Han, et al.
(2021)
Open Access | Times Cited: 91

A novel neural network model fusion approach for improving medical named entity recognition in online health expert question-answering services
Ze Hu, Xiaoning Ma
Expert Systems with Applications (2023) Vol. 223, pp. 119880-119880
Closed Access | Times Cited: 22

Data-driven building load prediction and large language models: Comprehensive overview
Yingkang Zhang, Dijun Wang, Guansong Wang, et al.
Energy and Buildings (2024) Vol. 326, pp. 115001-115001
Closed Access | Times Cited: 4

EntityBERT: Entity-centric Masking Strategy for Model Pretraining for the Clinical Domain
Chen Lin, Timothy M. Miller, Dmitriy Dligach, et al.
(2021), pp. 191-201
Open Access | Times Cited: 27

Aspect Sentiment Triplet Extraction: A Seq2Seq Approach With Span Copy Enhanced Dual Decoder
Zhihao Zhang, Yuan Zuo, Junjie Wu
IEEE/ACM Transactions on Audio Speech and Language Processing (2022) Vol. 30, pp. 2729-2742
Closed Access | Times Cited: 13

A Survey on Dropout Methods and Experimental Verification in Recommendation
Yangkun Li, Weizhi Ma, Chong Chen, et al.
IEEE Transactions on Knowledge and Data Engineering (2022), pp. 1-20
Open Access | Times Cited: 11

Teaching the Pre-trained Model to Generate Simple Texts for Text Simplification
Renliang Sun, Wei Xu, Xiaojun Wan
Findings of the Association for Computational Linguistics: ACL 2022 (2023), pp. 9345-9355
Open Access | Times Cited: 6

Studying Strategically: Learning to Mask for Closed-book QA
Qinyuan Ye, Belinda Z. Li, Sinong Wang, et al.
arXiv (Cornell University) (2020)
Open Access | Times Cited: 13

CSS-LM: A Contrastive Framework for Semi-Supervised Fine-Tuning of Pre-Trained Language Models
Yusheng Su, Xu Han, Yankai Lin, et al.
IEEE/ACM Transactions on Audio Speech and Language Processing (2021) Vol. 29, pp. 2930-2941
Open Access | Times Cited: 12

Continual Knowledge Distillation for Neural Machine Translation
Yuanchi Zhang, Peng Li, Maosong Sun, et al.
(2023), pp. 7978-7996
Open Access | Times Cited: 4

"Len or index or count, anything but v1": Predicting Variable Names in Decompilation Output with Transfer Learning
Kuntal Kumar Pal, Ati Priya Bajaj, Pratyay Banerjee, et al.
2022 IEEE Symposium on Security and Privacy (SP) (2024), pp. 4069-4087
Closed Access | Times Cited: 1

On the Influence of Masking Policies in Intermediate Pre-training
Qinyuan Ye, Belinda Z. Li, Sinong Wang, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021), pp. 7190-7202
Open Access | Times Cited: 8

A Mask-Based Logic Rules Dissemination Method for Sentiment Classifiers
Shashank Gupta, Mohamed Reda Bouadjenek, Antonio Robles‐Kelly
Lecture notes in computer science (2023), pp. 394-408
Closed Access | Times Cited: 3

Difference-Masking: Choosing What to Mask in Continued Pretraining
Alex Wilf, Syeda Akter, Leena Mathur, et al.
(2023), pp. 13222-13234
Open Access | Times Cited: 3

Fortunately, Discourse Markers Can Enhance Language Models for Sentiment Analysis
Liat Ein‐Dor, Ilya Shnayderman, Artem Spector, et al.
Proceedings of the AAAI Conference on Artificial Intelligence (2022) Vol. 36, Iss. 10, pp. 10608-10617
Open Access | Times Cited: 4

APOLLO: A Simple Approach for Adaptive Pretraining of Language Models for Logical Reasoning
Soumya Sanyal, Xu Yi‐chong, Shuohang Wang, et al.
(2023)
Open Access | Times Cited: 2

Task-guided Disentangled Tuning for Pretrained Language Models
Jiali Zeng, Yufan Jiang, Shuangzhi Wu, et al.
Findings of the Association for Computational Linguistics: ACL 2022 (2022), pp. 3126-3137
Open Access | Times Cited: 3

A Self-supervised Joint Training Framework for Document Reranking
Xiaozhi Zhu, Tianyong Hao, Sijie Cheng, et al.
Findings of the Association for Computational Linguistics: NAACL 2022 (2022)
Open Access | Times Cited: 3

Improving Low Compute Language Modeling with In-Domain Embedding Initialisation
Charles Welch, Rada Mihalcea, Jonathan K. Kummerfeld
(2020), pp. 8625-8634
Open Access | Times Cited: 3

Percy: A Post-Hoc Explanation-Based Score for Logic Rule Dissemination Consistency Assessment in Sentiment Classification
Shashank Gupta, Mohamed Reda Bouadjenek, Antonio Robles‐Kelly
(2023)
Closed Access | Times Cited: 1

Do not Mask Randomly: Effective Domain-adaptive Pre-training by Masking In-domain Keywords
Shahriar Golchin, Mihai Surdeanu, Nazgol Tavabi, et al.
(2023), pp. 13-21
Open Access | Times Cited: 1

Knowledge Graph Enhanced Language Models for Sentiment Analysis
Jie Li, Xuan Li, Linmei Hu, et al.
Lecture notes in computer science (2023), pp. 447-464
Closed Access | Times Cited: 1

Lil-Bevo: Explorations of Strategies for Training Language Models in More Humanlike Ways
Venkata S Govindarajan, Juan Diego Rodríguez, Kaj Bostrom, et al.
(2023), pp. 280-288
Open Access | Times Cited: 1

An Anchor Learning Approach for Citation Field Learning
Zilin Yuan, Borun Chen, Yimeng Dai, et al.
ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (2024), pp. 12346-12350
Open Access
