OpenAlex Citation Counts


OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the ancient Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this listing for that article. Lastly, at the bottom of the page, you'll find basic pagination options. If you'd rather pull this data programmatically, see the sketch below.
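The same information is available through OpenAlex's public REST API using the cites: filter. Below is a minimal Python sketch, assuming the requests library; the work ID shown is a placeholder, not the verified ID for this article, so look it up first (e.g. by searching the title via the API).

import requests

# Placeholder OpenAlex work ID for the requested article; not verified here.
# Look it up first, e.g. via:
#   https://api.openalex.org/works?search=logit standardization in knowledge distillation
WORK_ID = "W0000000000"

# Request the first page of citing works, 25 per page to match this listing.
resp = requests.get(
    "https://api.openalex.org/works",
    params={"filter": f"cites:{WORK_ID}", "per-page": 25, "page": 1},
    timeout=30,
)
resp.raise_for_status()
data = resp.json()

print(f"Total citing works: {data['meta']['count']}")
for work in data["results"]:
    is_oa = work.get("open_access", {}).get("is_oa", False)
    access = "Open Access" if is_oa else "Closed Access"
    print(f"{work['display_name']} ({access} | Times Cited: {work['cited_by_count']})")

Increment the page parameter (or switch to cursor pagination for large result sets) to walk past the first 25 results.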

Requested Article:

Logit Standardization in Knowledge Distillation
Shangquan Sun, Wenqi Ren, Jingzhi Li, et al.
2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2024), pp. 15731-15740
Closed Access | Times Cited: 38

Showing 1-25 of 38 citing articles:

Student-friendly knowledge distillation
Mengyang Yuan, Bo Lang, Fengnan Quan
Knowledge-Based Systems (2024) Vol. 296, pp. 111915-111915
Open Access | Times Cited: 11

Adaptive lightweight network construction method for self-knowledge distillation
Siyuan Lu, Weiliang Zeng, Xueshi Li, et al.
Neurocomputing (2025), pp. 129477-129477
Closed Access | Times Cited: 1

Quality Grading of Oudemansiella raphanipes Using Three-Teacher Knowledge Distillation with Cascaded Structure for LightWeight Neural Networks
Haoxuan Chen, Huamao Huang, Yangyang Peng, et al.
Agriculture (2025) Vol. 15, Iss. 3, pp. 301-301
Open Access | Times Cited: 1

Applications of knowledge distillation in remote sensing: A survey
Yassine Himeur, Nour Aburaed, Omar Elharrouss, et al.
Information Fusion (2024), pp. 102742-102742
Closed Access | Times Cited: 4

A Feature Map Fusion Self-Distillation Scheme for Image Classification Networks
Zhenkai Qin, Shuiping Ni, Mingfu Zhu, et al.
Electronics (2025) Vol. 14, Iss. 1, pp. 182-182
Open Access

RMKD: Relaxed Matching Knowledge Distillation for Short-Length SSVEP-Based Brain-Computer Interfaces
Zhen Lan, Zixing Li, Chao Yan, et al.
Neural Networks (2025) Vol. 185, pp. 107133-107133
Closed Access

Development of a Lightweight Model for Rice Plant Counting and Localization Using UAV-Captured RGB Imagery
Haoran Sun, Siqiao Tan, Zhengliang Luo, et al.
Agriculture (2025) Vol. 15, Iss. 2, pp. 122-122
Open Access

Consistency knowledge distillation based on similarity attribute graph guidance
Jiaqi Ma, Jinfu Yang, Fuji Fu, et al.
Expert Systems with Applications (2025), pp. 126395-126395
Closed Access

Boundary-sensitive Adaptive Decoupled Knowledge Distillation For Acne Grading
Xinyang Zhou, Wenjie Liu, Lei Zhang, et al.
Applied Intelligence (2025) Vol. 55, Iss. 6
Closed Access

Unambiguous granularity distillation for asymmetric image retrieval
Hongrui Zhang, Yi Xie, Haoquan Zhang, et al.
Neural Networks (2025), pp. 107303-107303
Closed Access

Personalized federated learning via decoupling self-knowledge distillation and global adaptive aggregation
Zhiwei Tang, Shuguang Xu, Haozhe Jin, et al.
Multimedia Systems (2025) Vol. 31, Iss. 2
Closed Access

FedDyH: A Multi-Policy with GA Optimization Framework for Dynamic Heterogeneous Federated Learning
Xuhua Zhao, Yongming Zheng, Jinjin Wan, et al.
Biomimetics (2025) Vol. 10, Iss. 3, pp. 185-185
Open Access

An accurate and efficient self-distillation method with channel-based feature enhancement via feature calibration and attention fusion for Internet of Things
Qian Zheng, Shengbo Chen, Guanghui Wang, et al.
Future Generation Computer Systems (2025), pp. 107816-107816
Closed Access

Object detection with dynamic high-/low-frequency knowledge distillation for real-world degradation
Junyi Zhao, Jinbao Li, Xinjie Chen, et al.
Alexandria Engineering Journal (2025) Vol. 124, pp. 110-120
Closed Access

A contrast enhanced representation normalization approach to knowledge distillation
Zhiqiang Bao, Di Zhu, Liang Du, et al.
Scientific Reports (2025) Vol. 15, Iss. 1
Open Access

Multi-modal Siamese Ensemble for Neovascular AMD Classification and Prediction from Optical Coherence Tomography
Samuel Richard, Marie Beurton‐Aimar
Lecture Notes in Computer Science (2025), pp. 211-221
Closed Access

Swapped logit distillation via bi-level teacher alignment
Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, et al.
Multimedia Systems (2025) Vol. 31, Iss. 3
Closed Access

Self-distillation salient object detection via generalized diversity loss
Yunfei Zheng, Jibin Yang, Haijun Tao, et al.
Pattern Recognition (2025), pp. 111804-111804
Closed Access

Cross-domain visual prompting with spatial proximity knowledge distillation for histological image classification
Xiaohong Li, Guoheng Huang, Lianglun Cheng, et al.
Journal of Biomedical Informatics (2024) Vol. 158, pp. 104728-104728
Closed Access | Times Cited: 3

Knowledge distillation from relative distribution
Pengfei Gao, Jiaohua Qin, Xuyu Xiang, et al.
Expert Systems with Applications (2025), pp. 127736-127736
Closed Access

Gap-KD: Bridging the Significant Capacity Gap Between Teacher and Student Model
Shan Huang, Wenhua Qian
Lecture Notes in Computer Science (2025), pp. 435-453
Closed Access

Knowledge in attention assistant for improving generalization in deep teacher–student models
Sajedeh Morabbi, Hadi Soltanizadeh, Saeed Mozaffari, et al.
International Journal of Modelling and Simulation (2024), pp. 1-17
Closed Access | Times Cited: 1

Few-Shot Learning Based on Dimensionally Enhanced Attention and Logit Standardization Self-Distillation
Y. Tang, Guang Li, Ming Zhang, et al.
Electronics (2024) Vol. 13, Iss. 15, pp. 2928-2928
Open Access

Provably Convergent Learned Inexact Descent Algorithm for Low-Dose CT Reconstruction
Qingchao Zhang, Mehrdad Alvandipour, Wenjun Xia, et al.
Journal of Scientific Computing (2024) Vol. 101, Iss. 1
Open Access

Page 1 - Next Page
