OpenAlex Citation Counts


OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find utility in this listing of citing articles!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.

Requested Article:

ASTDF-Net: Attention-Based Spatial-Temporal Dual-Stream Fusion Network for EEG-Based Emotion Recognition
Peiliang Gong, Ziyu Jia, Pengpai Wang, et al.
(2023), pp. 883-892
Closed Access | Times Cited: 16

Showing 16 citing articles:

Multi-modal Mood Reader: Pre-trained Model Empowers Cross-Subject Emotion Recognition
Yihang Dong, Xuhang Chen, Yanyan Shen, et al.
Communications in computer and information science (2024), pp. 178-192
Closed Access | Times Cited: 4

A neural approach to the Turing Test: The role of emotions
Rita Pizzi, Hao Quan, Matteo Matteucci, et al.
Neural Networks (2025), pp. 107362-107362
Open Access

Generalized multisensor wearable signal fusion for emotion recognition from noisy and incomplete data
Vamsi Kumar Naidu Pallapothula, Sidharth Anand, Sreyasee Das Bhattacharjee, et al.
Smart Health (2025), pp. 100571-100571
Closed Access

ChannelMix-based transformer and convolutional multi-view feature fusion network for unsupervised domain adaptation in EEG emotion recognition
Chengpeng Sun, Xiujuan Wang, Liubing Chen
Expert Systems with Applications (2025) Vol. 280, pp. 127456-127456
Closed Access

EEG-based emotion recognition using graph convolutional neural network with dual attention mechanism
Wei Chen, Yuan Liao, Rui Dai, et al.
Frontiers in Computational Neuroscience (2024) Vol. 18
Open Access | Times Cited: 1

Brant-X: A Unified Physiological Signal Alignment Framework
D X Zhang, Zhizhang Yuan, Junru Chen, et al.
Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (2024), pp. 4155-4166
Open Access | Times Cited: 1

Joint Contrastive Learning with Feature Alignment for Cross-Corpus EEG-based Emotion Recognition
Qile Liu, Zhihao Zhou, Jiyuan Wang, et al.
(2024) Vol. 29, pp. 9-17
Closed Access | Times Cited: 1

Set-pMAE: spatial-spEctral-temporal based parallel masked autoEncoder for EEG emotion recognition
Chenyu Pan, Huimin Lu, Chenglin Lin, et al.
Cognitive Neurodynamics (2024) Vol. 18, Iss. 6, pp. 3757-3773
Closed Access

Modality- and Subject-Aware Emotion Recognition Using Knowledge Distillation
Mehmet Ali Sarikaya, Gökhan İnce
IEEE Access (2024) Vol. 12, pp. 122485-122502
Open Access

EEG spatial projection and an improved 3D CNN with channel spatiotemporal joint attention mechanism for emotion recognition
Ni Yao, Haitao Su, Duan Li, et al.
Signal Image and Video Processing (2024)
Closed Access

Online Multi-level Contrastive Representation Distillation for Cross-Subject fNIRS Emotion Recognition
Zhihui Lai, Chunmei Qing, Junpeng Tan, et al.
(2024) Vol. 4, pp. 29-37
Closed Access

Correlation-Driven Multi-Modality Graph Decomposition for Cross-Subject Emotion Recognition
Wuliang Huang, Yiqiang Chen, Xinlong Jiang, et al.
(2024), pp. 2272-2281
Closed Access

Enhancing cross-subject emotion recognition precision through unimodal EEG: a novel emotion preceptor model
Yihang Dong, Changhong Jing, Mufti Mahmud, et al.
Brain Informatics (2024) Vol. 11, Iss. 1
Open Access

Page 1
