Backward spatial perception can be augmented through a novel visual-to-auditory sensory substitution algorithm

Ophir Netzer, Benedetta Heimler, Amir Shur, Tomer Behor, Amir Amedi

Research output: Contribution to journal › Article › peer-review


Abstract

Can humans extend and augment their natural perceptions during adulthood? Here, we address this fascinating question by investigating the extent to which it is possible to successfully augment visual spatial perception to include the backward spatial field (a region where humans are naturally blind) via other sensory modalities (i.e., audition). We thus developed a sensory-substitution algorithm, the “Topo-Speech”, which conveys two key features of visual spatial perception: the identity of objects, through language, and their exact locations, via vocal-sound manipulations. Using two different groups of blindfolded sighted participants, we tested the efficacy of this algorithm in conveying the location of objects in the forward or backward spatial fields following ~10 min of training. Results showed that blindfolded sighted adults successfully used the Topo-Speech to locate objects on a 3 × 3 grid positioned either in front of them (forward condition) or behind their back (backward condition). Crucially, performance in the two conditions was entirely comparable. This suggests that novel spatial sensory information conveyed via our existing sensory systems can be successfully encoded to extend/augment human perceptions. The implications of these results are discussed in relation to spatial perception, sensory augmentation and sensory rehabilitation.
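As a rough illustration of how such an encoding could work, the sketch below maps a 3 × 3 grid cell to a spoken object label plus simple sound parameters. The specific mappings (column to stereo pan, row to pitch shift) and all names in the code are assumptions for illustration only; the abstract does not specify the actual Topo-Speech parameters.

```python
# Hypothetical sketch, in the spirit of the Topo-Speech idea described in the
# abstract: object identity is conveyed by a spoken word, and its location on
# a 3x3 grid by vocal-sound manipulations. The column -> pan and
# row -> pitch-shift mappings below are assumed, not taken from the paper.

from dataclasses import dataclass


@dataclass
class AuditoryCue:
    word: str           # spoken label identifying the object
    pan: float          # -1.0 (left) .. +1.0 (right), derived from grid column
    pitch_shift: float  # semitones relative to baseline voice, derived from grid row


def encode_location(word: str, row: int, col: int) -> AuditoryCue:
    """Map a 3x3 grid cell (row 0 = top, col 0 = left) to an auditory cue."""
    if not (0 <= row <= 2 and 0 <= col <= 2):
        raise ValueError("row and col must be in 0..2 for a 3x3 grid")
    pan = float(col - 1)              # left / centre / right
    pitch_shift = (1 - row) * 4.0     # higher rows -> higher pitch (assumed)
    return AuditoryCue(word=word, pan=pan, pitch_shift=pitch_shift)


# Example: an object labelled "cup" in the top-right cell would be spoken
# with a +4 semitone shift and panned fully to the right.
print(encode_location("cup", row=0, col=2))
```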

Original language: English
Article number: 11944
Journal: Scientific Reports
Volume: 11
Issue number: 1
DOIs
State: Published - 7 Jun 2021
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2021, The Author(s).
