Keeping in time with social and non-social stimuli: Synchronisation with auditory, visual, and audio-visual cues

Juliane J. Honisch, Prasannajeet Mane, Ofer Golan, Bhismadev Chakrabarti

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

Everyday social interactions require us to closely monitor, predict, and synchronise our movements with those of an interacting partner. Experimental studies of social synchrony typically examine the social-cognitive outcomes associated with synchrony, such as affiliation. Research on the sensorimotor aspects of synchronisation, on the other hand, generally uses non-social stimuli (e.g. a moving dot). To date, the differences in sensorimotor aspects of synchronisation with social compared to non-social stimuli remain largely unknown. The present study addresses this gap using a verbal response paradigm in which participants were asked to synchronise a ‘ba’ response in time with social and non-social stimuli, presented auditorily, visually, or audio-visually combined. For social stimuli, a video/audio recording of an actor performing the same verbal ‘ba’ response was presented; for non-social stimuli, a moving dot, an auditory metronome, or both in combination were presented. The impact of autistic traits on participants’ synchronisation performance was examined using the Autism Spectrum Quotient (AQ). Our results revealed more accurate synchronisation for social compared to non-social stimuli, suggesting that greater familiarity with, and motivation in attending to, social stimuli may enhance our ability to predict and synchronise with them. Individuals with fewer autistic traits demonstrated greater social learning, as indexed by an improvement in synchronisation performance with social vs non-social stimuli across the experiment.

Original language: English
Article number: 8805
Journal: Scientific Reports
Volume: 11
Issue number: 1
DOIs
State: Published - 22 Apr 2021

Bibliographical note

Publisher Copyright:
© 2021, The Author(s).

Funding

We would like to thank Nadyne Dunkley, Makena Peart, Hristina Mihaylova, and Dr Anthony Haffey for helping with the data collection and stimuli creation, Dagmar S. Fraser for his programming support for the stimuli presentation, and thank all the volunteers for participation and feedback. BC was supported by UK Medical Research Council (MR/P023894/1, MR/S036423/1) during this project.

Funders: Medical Research Council
Funder numbers: MR/S036423/1, MR/P023894/1
