OpenAlex Citation Counts


OpenAlex is an openly accessible bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this same kind of listing for that article. Lastly, at the bottom of the page, you'll find basic pagination options. If you prefer to retrieve a listing like this programmatically, see the sketch below.
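The OpenAlex API exposes citing works through its cites: filter on the works endpoint. The following Python sketch (using the requests library) is a minimal, illustrative example only: the work ID W0000000000 is a placeholder, not the identifier of the requested article, and the page size and sort order simply mirror this listing.

    import requests

    # Placeholder OpenAlex work ID -- replace with the ID of the cited article.
    CITED_WORK_ID = "W0000000000"

    resp = requests.get(
        "https://api.openalex.org/works",
        params={
            "filter": f"cites:{CITED_WORK_ID}",  # works whose reference lists include the cited work
            "per-page": 25,                      # same page size as this listing
            "sort": "cited_by_count:desc",       # most-cited citing articles first
        },
        timeout=30,
    )
    resp.raise_for_status()

    for work in resp.json()["results"]:
        print(work["display_name"], "| Times Cited:", work["cited_by_count"])

The same filter also works directly in a browser, e.g. https://api.openalex.org/works?filter=cites:W0000000000 (again with a placeholder ID).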

Requested Article:

Emotion Recognition from Skeletal Movements
Tomasz Sapiński, Dorota Kamińska, A. Pelikant, et al.
Entropy (2019) Vol. 21, Iss. 7, pp. 646-646
Open Access | Times Cited: 88

Showing 1-25 of 88 citing articles:

Human Emotion Recognition: Review of Sensors and Methods
Andrius Dzedzickis, Artūras Kaklauskas, Vytautas Bučinskas
Sensors (2020) Vol. 20, Iss. 3, pp. 592-592
Open Access | Times Cited: 451

Multimodal Emotion Recognition with Deep Learning: Advancements, challenges, and future directions
Geetha Vijayaraghavan, T. Mala, Das P, et al.
Information Fusion (2023) Vol. 105, pp. 102218-102218
Closed Access | Times Cited: 60

Usability Testing of Virtual Reality Applications—The Pilot Study
Dorota Kamińska, Grzegorz Zwoliński, Anna Laska-Leśniewicz
Sensors (2022) Vol. 22, Iss. 4, pp. 1342-1342
Open Access | Times Cited: 56

New Trends in Emotion Recognition Using Image Analysis by Neural Networks, a Systematic Review
Andrada-Livia Cîrneanu, Dan Popescu, Dragoș Daniel Iordache
Sensors (2023) Vol. 23, Iss. 16, pp. 7092-7092
Open Access | Times Cited: 35

A Survey of Deep Learning-Based Multimodal Emotion Recognition: Speech, Text, and Face
Hailun Lian, Cheng Lu, Sunan Li, et al.
Entropy (2023) Vol. 25, Iss. 10, pp. 1440-1440
Open Access | Times Cited: 34

Depth Sensors-Based Action Recognition Using a Modified K-Ary Entropy Classifier
Mouazma Batool, Saud S. Alotaibi, Mohammed Alatiyyah, et al.
IEEE Access (2023) Vol. 11, pp. 58578-58595
Open Access | Times Cited: 29

Machine learning for human emotion recognition: a comprehensive review
Eman M. G. Younis, Someya Mohsen, Essam H. Houssein, et al.
Neural Computing and Applications (2024) Vol. 36, Iss. 16, pp. 8901-8947
Open Access | Times Cited: 11

Real-Time Human Action Recognition with a Low-Cost RGB Camera and Mobile Robot Platform
JunWoo Lee, Bummo Ahn
Sensors (2020) Vol. 20, Iss. 10, pp. 2886-2886
Open Access | Times Cited: 54

Detection of Mental Stress through EEG Signal in Virtual Reality Environment
Dorota Kamińska, Krzysztof Smółka, Grzegorz Zwoliński
Electronics (2021) Vol. 10, Iss. 22, pp. 2840-2840
Open Access | Times Cited: 48

Learning facial expression and body gesture visual information for video emotion recognition
Wei Jie, Guanyu Hu, Xinyu Yang, et al.
Expert Systems with Applications (2023) Vol. 237, pp. 121419-121419
Closed Access | Times Cited: 20

Emerging Frontiers in Human–Robot Interaction
Farshad Safavi, Parthan Olikkal, Dingyi Pei, et al.
Journal of Intelligent & Robotic Systems (2024) Vol. 110, Iss. 2
Open Access | Times Cited: 8

Recognizing affective states from the expressive behavior of tennis players using convolutional neural networks
Darko Jekauc, Diana Burkart, Julian Fritsch, et al.
Knowledge-Based Systems (2024) Vol. 295, pp. 111856-111856
Open Access | Times Cited: 7

Identifying Emotions from Walking using Affective and Deep Features
Tanmay Randhavane, Aniket Bera, Kyra Kapsaskis, et al.
arXiv (Cornell University) (2019)
Open Access | Times Cited: 52

Two-Stage Recognition and beyond for Compound Facial Emotion Recognition
Dorota Kamińska, Kadir Aktas, Davit Rizhinashvili, et al.
Electronics (2021) Vol. 10, Iss. 22, pp. 2847-2847
Open Access | Times Cited: 37

A Survey on Different Computer Vision Based Human Activity Recognition for Surveillance Applications
Ashwin Shenoy M, N. Thillaiarasu
2022 6th International Conference on Computing Methodologies and Communication (ICCMC) (2022), pp. 1372-1376
Closed Access | Times Cited: 23

E-textiles for emotion interaction: a scoping review of trends and opportunities
Mengqi Jiang, Yimin Wang, Vijayakumar Nanjappan, et al.
Personal and Ubiquitous Computing (2024) Vol. 28, Iss. 3-4, pp. 549-577
Closed Access | Times Cited: 5

Stress Reduction Using Bilateral Stimulation in Virtual Reality
Dorota Kamińska, Krzysztof Smółka, Grzegorz Zwoliński, et al.
IEEE Access (2020) Vol. 8, pp. 200351-200366
Open Access | Times Cited: 38

Speech feature selection and emotion recognition based on weighted binary cuckoo search
Zicheng Zhang
Alexandria Engineering Journal (2020) Vol. 60, Iss. 1, pp. 1499-1507
Open Access | Times Cited: 35

Machine Learning Algorithms for Detection and Classifications of Emotions in Contact Center Applications
Mirosław Płaza, Sławomir Trusz, Justyna Kęczkowska, et al.
Sensors (2022) Vol. 22, Iss. 14, pp. 5311-5311
Open Access | Times Cited: 22

A deep learning-based approach for emotional analysis of sports dance
Qunqun Sun, Xiangjun Wu
PeerJ Computer Science (2023) Vol. 9, pp. e1441-e1441
Open Access | Times Cited: 12

Athlete body power and strength estimation using skeleton point cloud
H. Rangala, Sudath Samaraweera, K. D. Sandaruwan, et al.
Journal of the National Science Foundation of Sri Lanka (2025) Vol. 52, Iss. 4, pp. 481-492
Open Access

Improved human emotion recognition from body and hand pose landmarks on the GEMEP dataset using machine learning
Ester Martínez-Martín, Antonio Fernández‐Caballero
Expert Systems with Applications (2025) Vol. 269, pp. 126427-126427
Open Access

From needs to control: a review of indicators and sensing technologies for occupant-centric smart lighting systems
Yuxiao Wang, Xin Zhang, Hongwei Chen
Energy and Buildings (2025), pp. 115740-115740
Closed Access

Skeleton-Based Emotion Recognition Based on Two-Stream Self-Attention Enhanced Spatial-Temporal Graph Convolutional Network
Jiaqi Shi, Chaoran Liu, Carlos Toshinori Ishi, et al.
Sensors (2020) Vol. 21, Iss. 1, pp. 205-205
Open Access | Times Cited: 32

Page 1 - Next Page
