OpenAlex Citation Counts

OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to that article as listed in CrossRef. If you click an Open Access link, you'll navigate to the article's "best Open Access location". Clicking a citation count will open this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
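For readers who want to reproduce a listing like this programmatically, below is a minimal sketch using the public OpenAlex works API with its `cites` filter. The work ID shown is a hypothetical placeholder; the real OpenAlex ID of the requested article would first have to be looked up (for example by DOI), and the field names used for printing follow the OpenAlex works schema.

```python
import requests

# Hypothetical placeholder; replace with the article's real OpenAlex work ID.
WORK_ID = "W0000000000"

# Ask OpenAlex for works that cite WORK_ID, 25 per page to match this listing.
resp = requests.get(
    "https://api.openalex.org/works",
    params={"filter": f"cites:{WORK_ID}", "per-page": 25},
    timeout=30,
)
resp.raise_for_status()

# Print each citing work's title and its own citation count.
for work in resp.json()["results"]:
    print(work["title"], "| Times Cited:", work["cited_by_count"])
```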

Requested Article:

Hierarchical multimodal-fusion of physiological signals for emotion recognition with scenario adaption and contrastive alignment
Jiehao Tang, Zhuang Ma, Kaiyu Gan, et al.
Information Fusion (2023) Vol. 103, pp. 102129-102129
Closed Access | Times Cited: 25

Showing 25 citing articles:

A Review of Key Technologies for Emotion Analysis Using Multimodal Information
Xianxun Zhu, Chaopeng Guo, Heyang Feng, et al.
Cognitive Computation (2024) Vol. 16, Iss. 4, pp. 1504-1530
Closed Access | Times Cited: 23

Multimodal emotion recognition by fusing complementary patterns from central to peripheral neurophysiological signals across feature domains
Zhuang Ma, Ao Li, Jiehao Tang, et al.
Engineering Applications of Artificial Intelligence (2025) Vol. 143, pp. 110004-110004
Closed Access | Times Cited: 1

Uncertainty-Aware Graph Contrastive Fusion Network for multimodal physiological signal emotion recognition
Guangqiang Li, Ning Chen, Hongqing Zhu, et al.
Neural Networks (2025) Vol. 187, pp. 107363-107363
Closed Access | Times Cited: 1

CFDA-CSF: A Multi-Modal Domain Adaptation Method for Cross-Subject Emotion Recognition
Magdiel Jiménez-Guarneros, Gibrán Fuentes-Pineda
IEEE Transactions on Affective Computing (2024) Vol. 15, Iss. 3, pp. 1502-1513
Closed Access | Times Cited: 6

Gait Recognition Based on A-Mode Ultrasound and Inertial Sensor Fusion Systems
Xin Huang, Haoran Zheng, Ziqing Zhou, et al.
Lecture notes in computer science (2025), pp. 192-205
Closed Access

Contrastive reinforced transfer learning for EEG-based emotion recognition with consideration of individual differences
Zhibang Zang, Xiangkun Yu, Baole Fu, et al.
Biomedical Signal Processing and Control (2025) Vol. 106, pp. 107622-107622
Closed Access

From screens to scenes: A survey of embodied AI in healthcare
Yihao Liu, Xu Cao, Tingting Chen, et al.
Information Fusion (2025), pp. 103033-103033
Closed Access

CAT-LCAN: A Multimodal Physiological Signal Fusion Framework for Emotion Recognition
Ao Li, Zhao Lv, Xinhui Li
Lecture notes in computer science (2025), pp. 168-177
Closed Access

A Systematic Review on Artificial Intelligence-Based Multimodal Dialogue Systems Capable of Emotion Recognition
Luis Bravo, Ciro Rodríguez, Pedro Hidalgo, et al.
Multimodal Technologies and Interaction (2025) Vol. 9, Iss. 3, pp. 28-28
Open Access

PIKGMA: PrIori Knowledge-Guided Multimodal Alignment And Domain Adaptation For Emotion Recognition
Chenglin Lin, Huimin Lu, Songzhe Ma, et al.
(2025), pp. 137-143
Closed Access

Multi-modal sentiment recognition with residual gating network and emotion intensity attention
Yadi Wang, Xiaoding Guo, Xianhong Hou, et al.
Neural Networks (2025) Vol. 188, pp. 107483-107483
Closed Access

CD3Net: A Contrastive Diffusion Model with Domain Adaptive Data Synthetic Network for Motor Imagery and Emotion classification
Qiaoli Zhou, Xiaoyun Ye, Shurui Li, et al.
Biomedical Signal Processing and Control (2025) Vol. 108, pp. 107799-107799
Closed Access

Emotion recognition based on time-scale heterogeneity and hierarchical spatial coupling analysis of multimodal physiological signals
Zhangyong Xu, Ning Chen, Guangqiang Li, et al.
Expert Systems with Applications (2025), pp. 128035-128035
Closed Access

Emotion recognition via affective EEG signals: State of the art
Wei Meng, Fazheng Hou, Mengyuan Zhao, et al.
Neurocomputing (2025), pp. 130418-130418
Closed Access

Emotion Recognition Using EEG Signals and Audiovisual Features with Contrastive Learning
Ju-Hwan Lee, Jin Young Kim, Hyoung‐Gook Kim
Bioengineering (2024) Vol. 11, Iss. 10, pp. 997-997
Open Access | Times Cited: 3

Enhancing local representation learning through global–local integration with functional connectivity for EEG-based emotion recognition
Baole Fu, Xiangkun Yu, Guijie Jiang, et al.
Computers in Biology and Medicine (2024) Vol. 179, pp. 108857-108857
Closed Access | Times Cited: 1

A bidirectional cross-modal transformer representation learning model for EEG-fNIRS multimodal affective BCI
Xiaopeng Si, Shuai Zhang, Zhuobin Yang, et al.
Expert Systems with Applications (2024) Vol. 266, pp. 126081-126081
Closed Access | Times Cited: 1

STM-Net based spatial-temporal multi-modal fusion network for emotion recognition
Lina Li, Wenjie Deng, Shengli Liao, et al.
(2024) Vol. 21, pp. 268-268
Closed Access

A Multimodal Driver Anger Recognition Method Based on Context-Awareness
Tongqiang Ding, Kexin Zhang, Shuai Gao, et al.
IEEE Access (2024) Vol. 12, pp. 118533-118550
Open Access

DEMA: Deep EEG-first multi-physiological affect model for emotion recognition
Qiaomei Li, Donghui Jin, Jun Huang, et al.
Biomedical Signal Processing and Control (2024) Vol. 99, pp. 106812-106812
Closed Access

Correlation-Driven Multi-Modality Graph Decomposition for Cross-Subject Emotion Recognition
Wuliang Huang, Yiqiang Chen, Xinlong Jiang, et al.
(2024), pp. 2272-2281
Closed Access

Multi-view brain functional connectivity and hierarchical fusion for EEG-based emotion recognition
Baole Fu, Xiangkun Yu, Feng Wu, et al.
Measurement (2024), pp. 116046-116046
Closed Access

Enhancing emotion recognition through brain asymmetry and multi-angle fusion network
Baitao Zhou, Lin Lin, Jian Chen
Biomedical Signal Processing and Control (2024) Vol. 102, pp. 107324-107324
Closed Access
