Abstract
This study tested whether a synthetic auditory cue, designed to encode translational self-motion, can augment vestibular perception. Twenty adults sat on a motion platform and judged whether forward translations were to the left or right of straight ahead (heading discrimination). Stimuli comprised vestibular-only, auditory-only, or combined vestibular-auditory cues. The auditory cue, presented via headphones, was a series of beeps in which motion speed was encoded by beep rate and heading direction was encoded by rendering the sound as emanating from that direction. Combined-cue performance was better than vestibular-only performance. However, cue integration did not follow Bayesian predictions: the vestibular cue was overweighted. Moreover, combined-cue thresholds were better predicted by the empirically observed cue weights than by the Bayesian-predicted ones. Thus, humans can integrate synthetic auditory cues with natural vestibular cues to improve self-motion perception, but they underweight the synthetic auditory cues. This suggests that cue weighting is determined not only by reliability but also by inferred relevance.
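For context, the Bayesian-optimal benchmark the abstract refers to is standard maximum-likelihood cue integration, in which each cue is weighted by its reliability (inverse variance) and the combined threshold is predicted from the single-cue thresholds. The following minimal sketch is illustrative only; the function name and example values are not taken from the paper:

```python
import math

def bayesian_combination(sigma_v, sigma_a):
    """Maximum-likelihood (Bayesian) combination of two independent cues.

    sigma_v, sigma_a: single-cue discrimination thresholds (e.g., heading
    thresholds in degrees) for the vestibular and auditory cues.
    Returns the predicted cue weights and the predicted combined threshold.
    """
    # Reliability of each cue is its inverse variance.
    r_v, r_a = 1.0 / sigma_v**2, 1.0 / sigma_a**2
    # Optimal weights are proportional to reliability and sum to 1.
    w_v = r_v / (r_v + r_a)
    w_a = r_a / (r_v + r_a)
    # Predicted combined variance is the inverse of the summed reliabilities,
    # so the combined threshold is never worse than the best single cue.
    sigma_comb = math.sqrt(1.0 / (r_v + r_a))
    return w_v, w_a, sigma_comb

# Example: two equally reliable cues -> equal weights, and the combined
# threshold improves by a factor of sqrt(2).
w_v, w_a, sigma_comb = bayesian_combination(2.0, 2.0)
print(w_v, w_a, sigma_comb)
```

The reported overweighting of the vestibular cue means the empirically fitted weight `w_v` exceeded this reliability-based prediction, and thresholds computed from the observed weights fit the data better than `sigma_comb` above.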
| Field | Value |
|---|---|
| Original language | English |
| Article number | 114885 |
| Journal | iScience |
| Volume | 29 |
| Issue number | 3 |
| DOIs | |
| State | Published - 20 Mar 2026 |
Bibliographical note
Publisher Copyright:© 2026 The Authors
Keywords
- biological sciences
- clinical neuroscience
- natural sciences
- neuroscience