OpenAlex Citation Counts

OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to that article as listed in CrossRef. If you click an Open Access link, you'll navigate to the work's best Open Access location. Clicking a citation count opens this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
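
For readers who would rather pull a listing like this programmatically than click through, here is a minimal sketch that queries the public OpenAlex API for works citing a given article. The work ID below is a hypothetical placeholder (the OpenAlex ID of the requested article is not shown on this page), and field names such as `cited_by_count`, `best_oa_location`, and `meta.count` reflect my reading of the API documentation rather than anything stated here.

```python
# Minimal sketch: list works that cite a given OpenAlex work.
# Assumptions: the public OpenAlex REST API at https://api.openalex.org,
# its `cites:` filter, and fields like `cited_by_count` / `best_oa_location`.
# Replace CITED_WORK_ID with the real OpenAlex ID of the requested article.
import requests

CITED_WORK_ID = "W0000000000"          # hypothetical placeholder ID
BASE_URL = "https://api.openalex.org/works"

def citing_works(work_id: str, per_page: int = 25, page: int = 1) -> dict:
    """Fetch one page of works that cite `work_id`."""
    params = {
        "filter": f"cites:{work_id}",
        "sort": "cited_by_count:desc",   # most-cited citing articles first
        "per-page": per_page,
        "page": page,
    }
    resp = requests.get(BASE_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    data = citing_works(CITED_WORK_ID)
    print(f"Total citing works: {data['meta']['count']}")
    for work in data["results"]:
        oa = work.get("best_oa_location") or {}
        print(
            f"{work['display_name']} ({work.get('publication_year')}) | "
            f"Times Cited: {work.get('cited_by_count', 0)} | "
            f"OA: {oa.get('landing_page_url', 'n/a')}"
        )
```

Bumping the `page` parameter walks through the same pages that the "Next Page" control at the bottom of this listing exposes.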

Requested Article:

STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
Uttaran Bhattacharya, Trisha Mittal, Rohan Chandra, et al.
Proceedings of the AAAI Conference on Artificial Intelligence (2020) Vol. 34, Iss. 02, pp. 1342-1350
Open Access | Times Cited: 94

Showing 1-25 of 94 citing articles:

EmotiCon: Context-Aware Multimodal Emotion Recognition Using Frege’s Principle
Trisha Mittal, Pooja Guhan, Uttaran Bhattacharya, et al.
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020), pp. 14222-14231
Open Access | Times Cited: 154

Emotion Recognition From Multiple Modalities: Fundamentals and methodologies
Sicheng Zhao, Guoli Jia, Jufeng Yang, et al.
IEEE Signal Processing Magazine (2021) Vol. 38, Iss. 6, pp. 59-73
Open Access | Times Cited: 113

Multi-task learning for gait-based identity recognition and emotion recognition using attention enhanced temporal graph convolutional network
Weijie Sheng, Xinde Li
Pattern Recognition (2021) Vol. 114, pp. 107868-107868
Closed Access | Times Cited: 100

Text2Gestures: A Transformer-Based Network for Generating Emotive Body Gestures for Virtual Agents
Uttaran Bhattacharya, Nicholas Rewkowski, Abhishek Banerjee, et al.
(2021), pp. 1-10
Open Access | Times Cited: 98

A Survey of Human Gait-Based Artificial Intelligence Applications
Elsa J. Harris, I‐Hung Khoo, Emel Demircan
Frontiers in Robotics and AI (2022) Vol. 8
Open Access | Times Cited: 68

Speech2AffectiveGestures: Synthesizing Co-Speech Gestures with Generative Adversarial Affective Expression Learning
Uttaran Bhattacharya, Elizabeth Childs, Nicholas Rewkowski, et al.
Proceedings of the 30th ACM International Conference on Multimedia (2021)
Open Access | Times Cited: 61

Unlocking the Emotional World of Visual Media: An Overview of the Science, Research, and Impact of Understanding Emotion
James Z. Wang, Sicheng Zhao, Chenyan Wu, et al.
Proceedings of the IEEE (2023) Vol. 111, Iss. 10, pp. 1236-1286
Open Access | Times Cited: 36

Target and source modality co-reinforcement for emotion understanding from asynchronous multimodal sequences
Dingkang Yang, Yang Liu, Can Huang, et al.
Knowledge-Based Systems (2023) Vol. 265, pp. 110370-110370
Closed Access | Times Cited: 34

Context De-Confounded Emotion Recognition
Dingkang Yang, Zhaoyu Chen, Yuzheng Wang, et al.
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2023), pp. 19005-19015
Open Access | Times Cited: 30

ProxEmo: Gait-based Emotion Learning and Multi-view Proxemic Fusion for Socially-Aware Robot Navigation
Venkatraman Narayanan, Bala Murali Manoghar, Vishnu Sashank Dorbala, et al.
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (2020)
Open Access | Times Cited: 54

Looking Into Gait for Perceiving Emotions via Bilateral Posture and Movement Graph Convolutional Networks
Yingjie Zhai, Guoli Jia, Yu‐Kun Lai, et al.
IEEE Transactions on Affective Computing (2024) Vol. 15, Iss. 3, pp. 1634-1648
Open Access | Times Cited: 8

Robust Emotion Recognition in Context Debiasing
Dingkang Yang, Kun Yang, M. H. Li, et al.
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2024), pp. 12447-12457
Closed Access | Times Cited: 8

Take an Emotion Walk: Perceiving Emotions from Gaits Using Hierarchical Attention Pooling and Affective Mapping
Uttaran Bhattacharya, Christian Roncal, Trisha Mittal, et al.
Lecture Notes in Computer Science (2020), pp. 145-163
Open Access | Times Cited: 44

Dynamic Emotion Modeling With Learnable Graphs and Graph Inception Network
Amir Shirian, Subarna Tripathi, Tanaya Guha
IEEE Transactions on Multimedia (2021) Vol. 24, pp. 780-790
Open Access | Times Cited: 40

A comparative review of graph convolutional networks for human skeleton-based action recognition
Liqi Feng, Yaqin Zhao, Wenxuan Zhao, et al.
Artificial Intelligence Review (2021) Vol. 55, Iss. 5, pp. 4275-4305
Closed Access | Times Cited: 40

Lively robots: robotic technologies in COVID-19
Shanti Sumartojo, Daniele Lugli
Social & Cultural Geography (2021) Vol. 23, Iss. 9, pp. 1220-1237
Closed Access | Times Cited: 35

Graph Neural Network for Spatiotemporal Data: Methods and Applications
Yun Li, Dazhou Yu, Zhenke Liu, et al.
(2024)
Open Access | Times Cited: 6

MBCFNet: A Multimodal Brain–Computer Fusion Network for human intention recognition
Zhongjie Li, Gaoyan Zhang, Shogo Okada, et al.
Knowledge-Based Systems (2024) Vol. 296, pp. 111826-111826
Closed Access | Times Cited: 6

Leveraging Activity Recognition to Enable Protective Behavior Detection in Continuous Data
Chongyang Wang, Yuan Gao, Akhil Mathur, et al.
Proceedings of the ACM on Interactive Mobile Wearable and Ubiquitous Technologies (2021) Vol. 5, Iss. 2, pp. 1-27
Open Access | Times Cited: 28

Motion Capture Sensor-Based Emotion Recognition Using a Bi-Modular Sequential Neural Network
Yajurv Bhatia, A. S. M. Hossain Bari, Gee-Sern Hsu, et al.
Sensors (2022) Vol. 22, Iss. 1, pp. 403-403
Open Access | Times Cited: 18

Improved human emotion recognition from body and hand pose landmarks on the GEMEP dataset using machine learning
Ester Martínez-Martín, Antonio Fernández‐Caballero
Expert Systems with Applications (2025) Vol. 269, pp. 126427-126427
Open Access

Multi-anchor adaptive fusion and bi-focus attention for enhanced gait-based emotion recognition
Jincheng Li, Xuejing Dai, Rong Yan, et al.
Scientific Reports (2025) Vol. 15, Iss. 1
Open Access

Emotion Recognition from Physiological Channels Using Graph Neural Network
Tomasz Wierciński, Mateusz Rock, Robert Zwierzycki, et al.
Sensors (2022) Vol. 22, Iss. 8, pp. 2980-2980
Open Access | Times Cited: 16

Graph Neural Networks in Computer Vision - Architectures, Datasets and Common Approaches
Maciej Krzywda, Szymon Łukasik, Amir H. Gandomi
2022 International Joint Conference on Neural Networks (IJCNN) (2022), pp. 1-10
Open Access | Times Cited: 15

Emotion Recognition Based on Body and Context Fusion in the Wild
Yibo Huang, Hongqian Wen, Linbo Qing, et al.
(2021), pp. 3602-3610
Closed Access | Times Cited: 20

Page 1 - Next Page
