
Augmentation of self-motion perception with synthetic auditory cues

Research output: Contribution to journal › Article › peer-review

Abstract

This study tested whether a synthetic auditory cue, designed to encode translational self-motion, can augment vestibular perception. Twenty adults sat on a motion platform and judged whether forward translations were to the left or right of straight ahead (heading discrimination). Stimuli comprised vestibular-only, auditory-only, or combined vestibular-auditory cues. The auditory cue, presented via headphones, comprised a series of beeps, with motion speed encoded by beep rate and heading direction encoded by simulating the sound to emanate from that direction. Combined-cue performance was better than vestibular-only performance. However, cue integration did not follow Bayesian predictions: the vestibular cue was overweighted. Moreover, combined-cue thresholds were better predicted by the empirically observed, rather than Bayesian-predicted, cue weights. Thus, humans can integrate synthetic auditory cues with natural vestibular cues to improve self-motion perception. However, they underweight the synthetic auditory cues. This suggests that cue weighting is determined not only by reliability, but also by inferred relevance.
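The Bayesian benchmark the abstract refers to is the standard maximum-likelihood cue-integration model, in which each cue is weighted by its reliability (inverse variance) and the combined-cue threshold is predicted from the single-cue thresholds. A minimal sketch of that model follows; the threshold values used in the example are illustrative, not data from the paper.

```python
import math

def bayesian_weights(sigma_vest: float, sigma_aud: float) -> tuple[float, float]:
    """Optimal cue weights are proportional to each cue's reliability (1/variance)."""
    r_v = 1.0 / sigma_vest**2  # vestibular reliability
    r_a = 1.0 / sigma_aud**2   # auditory reliability
    w_v = r_v / (r_v + r_a)
    return w_v, 1.0 - w_v

def predicted_combined_threshold(sigma_vest: float, sigma_aud: float) -> float:
    """Predicted combined-cue threshold; never worse than the better single cue."""
    return math.sqrt((sigma_vest**2 * sigma_aud**2) / (sigma_vest**2 + sigma_aud**2))

# Example with two equally reliable cues (hypothetical thresholds of 2.0 deg):
# equal weights, and the combined threshold improves by a factor of sqrt(2).
w_v, w_a = bayesian_weights(2.0, 2.0)
print(w_v, w_a)                            # 0.5 0.5
print(predicted_combined_threshold(2.0, 2.0))  # ~1.414
```

Overweighting the vestibular cue, as the study reports, means the empirical `w_v` exceeds this reliability-based prediction, and the observed combined thresholds then track those empirical weights rather than the optimal ones.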

Original language: English
Article number: 114885
Journal: iScience
Volume: 29
Issue number: 3
State: Published - 20 Mar 2026

Bibliographical note

Publisher Copyright:
© 2026 The Authors

Keywords

  • biological sciences
  • clinical neuroscience
  • natural sciences
  • neuroscience

