OpenAlex Citation Counts

OpenAlex is an openly accessible bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.

Requested Article:

Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception
Máté Aller, Heidi Solberg Økland, Lucy MacGregor, et al.
Journal of Neuroscience (2022) Vol. 42, Iss. 31, pp. 6108-6120
Open Access | Times Cited: 20

Showing 20 citing articles:

Multi-timescale neural dynamics for multisensory integration
Daniel Senkowski, Andreas K. Engel
Nature reviews. Neuroscience (2024) Vol. 25, Iss. 9, pp. 625-642
Closed Access | Times Cited: 19

Unimodal speech perception predicts stable individual differences in audiovisual benefit for phonemes, words and sentences
Jacqueline von Seth, Máté Aller, Matthew H. Davis
The Journal of the Acoustical Society of America (2025) Vol. 157, Iss. 3, pp. 1554-1576
Open Access | Times Cited: 1

Neural speech tracking contribution of lip movements predicts behavioral deterioration when the speaker's mouth is occluded
Patrick Reisinger, Marlies Gillis, Nina Suess, et al.
eNeuro (2025), pp. ENEURO.0368-24.2024
Open Access

Atypical audio-visual neural synchrony and speech processing in early autism
Xiaoyue Wang, Sophie Bouton, Nada Kojovic, et al.
Journal of Neurodevelopmental Disorders (2025) Vol. 17, Iss. 1
Open Access

Neural Speech Tracking Highlights the Importance of Visual Speech in Multi-speaker Situations
Chandra Leon Haider, Hyojin Park, Anne Hauswald, et al.
Journal of Cognitive Neuroscience (2023) Vol. 36, Iss. 1, pp. 128-142
Open Access | Times Cited: 7

Intelligibility improves perception of timing changes in speech
Benedikt Zoefel, Rebecca A. Gilbert, Matthew H. Davis
PLoS ONE (2023) Vol. 18, Iss. 1, pp. e0279024-e0279024
Open Access | Times Cited: 6

Neural speech tracking benefit of lip movements predicts behavioral deterioration when the speaker’s mouth is occluded
Patrick Reisinger, Marlies Gillis, Nina Suess, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2023)
Open Access | Times Cited: 5

Hearing Lips in Noise: Universal Viseme-Phoneme Mapping and Transfer for Robust Audio-Visual Speech Recognition
Yu‐Chen Hu, Ruizhe Li, Chen Chen, et al.
(2023), pp. 15213-15232
Open Access | Times Cited: 4

Decreasing hearing ability does not lead to improved visual speech extraction as revealed in a neural speech tracking paradigm
Chandra Leon Haider, Anne Hauswald, Nathan Weisz
bioRxiv (Cold Spring Harbor Laboratory) (2024)
Open Access | Times Cited: 1

Atypical Audio-Visual Neural Synchrony and Speech Processing in children with Autism Spectrum Disorder
Xiaoyue Wang, Sophie Bouton, Nada Kojovic, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2024)
Open Access | Times Cited: 1

Accumulated reserves hold back age-related neural compensation in speech-in-noise perception
Claude Alain, Lei Zhang, Bernhard Roß, et al.
Research Square (Research Square) (2024)
Closed Access

Visual Congruency Modulates Music Reward through Sensorimotor Integration
Lei Zhang, Yi Du, Robert J. Zatorre
bioRxiv (Cold Spring Harbor Laboratory) (2024)
Open Access

Pupil diameter as an indicator of sound pair familiarity after statistically structured auditory sequence
Janika Becker, Christoph W. Korn, Helen Blank
Scientific Reports (2024) Vol. 14, Iss. 1
Open Access

Lip movements and lexical features improve speech tracking differently for clear and multi-speaker speech
Chandra Leon Haider, Hyojin Park, Anne Hauswald, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2023)
Open Access

Seeing a Talking Face Matters: Gaze Behavior and the Auditory–Visual Speech Benefit in Adults' Cortical Tracking of Infant-directed Speech
S.H. Jessica Tan, Marina Kalashnikova, Giovanni M. Di Liberto, et al.
Journal of Cognitive Neuroscience (2023) Vol. 35, Iss. 11, pp. 1741-1759
Closed Access

Page 1
