Sleep disrupts high-level speech parsing despite significant basic auditory processing

Shiri Makov, Omer Sharon, Nai Ding, Michal Ben-Shachar, Yuval Nir, Elana Zion Golumbic

Research output: Contribution to journal › Article › peer-review

51 Scopus citations


Abstract

The extent to which the sleeping brain processes sensory information remains unclear. This is particularly true for continuous and complex stimuli such as speech, in which information is organized into hierarchically embedded structures. Recently, novel metrics for assessing the neural representation of continuous speech have been developed using noninvasive brain recordings that have thus far only been tested during wakefulness. Here we investigated, for the first time, the sleeping brain’s capacity to process continuous speech at different hierarchical levels using a newly developed Concurrent Hierarchical Tracking (CHT) approach that allows monitoring the neural representation and processing depth of continuous speech online. Speech sequences were compiled with syllables, words, phrases, and sentences occurring at fixed time intervals such that different linguistic levels corresponded to distinct frequencies. This enabled us to distinguish their neural signatures in brain activity. We compared the neural tracking of intelligible versus unintelligible (scrambled and foreign) speech across states of wakefulness and sleep using high-density EEG in humans. We found that neural tracking of stimulus acoustics was comparable across wakefulness and sleep and similar across all conditions regardless of speech intelligibility. In contrast, neural tracking of higher-order linguistic constructs (words, phrases, and sentences) was only observed for intelligible speech during wakefulness and could not be detected at all during nonrapid eye movement or rapid eye movement sleep. These results suggest that, whereas low-level auditory processing is relatively preserved during sleep, higher-level hierarchical linguistic parsing is severely disrupted, thereby revealing the capacity and limits of language processing during sleep.
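The frequency-tagging logic behind the CHT approach can be sketched in a brief simulation. This is not the authors' analysis code, and the presentation rates below (syllables at 4 Hz, words at 2 Hz, phrases at 1 Hz) are hypothetical examples chosen for illustration: if the brain parses a given linguistic level, a spectral peak should appear at that level's fixed presentation rate.

```python
# Minimal sketch of frequency-tagging analysis (hypothetical rates and
# thresholds; not the paper's actual pipeline). A simulated "EEG" signal
# contains power at the rates the brain tracks; an FFT then reveals which
# linguistic levels were parsed.
import numpy as np

fs = 100.0                      # sampling rate in Hz (hypothetical)
dur = 60.0                      # seconds of simulated recording
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(0)

def simulated_eeg(tracked_rates):
    """Gaussian noise plus a sinusoid at each tracked presentation rate."""
    sig = rng.normal(0.0, 1.0, t.size)
    for f in tracked_rates:
        sig += 2.0 * np.sin(2 * np.pi * f * t)
    return sig

def spectral_peaks(sig, candidate_rates):
    """Return the candidate rates whose power clearly exceeds the noise floor."""
    power = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(sig.size, 1 / fs)
    floor = np.median(power)    # robust estimate of the broadband noise level
    peaks = []
    for f in candidate_rates:
        idx = np.argmin(np.abs(freqs - f))
        if power[idx] > 50 * floor:
            peaks.append(f)
    return peaks

# "Awake, intelligible speech": tracking at syllable, word, and phrase rates.
awake = simulated_eeg([4.0, 2.0, 1.0])
# "Asleep": only the acoustic/syllabic rate survives.
asleep = simulated_eeg([4.0])

print(spectral_peaks(awake, [4.0, 2.0, 1.0]))   # all three rates detected
print(spectral_peaks(asleep, [4.0, 2.0, 1.0]))  # only the syllabic rate
```

Because each rate falls exactly on an FFT bin (60 s of data gives 1/60 Hz resolution), tracked levels produce sharp, easily separable peaks, which is what lets CHT read off distinct linguistic levels from a single spectrum.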

Original language: English
Pages (from-to): 7772-7781
Number of pages: 10
Journal: Journal of Neuroscience
Issue number: 32
State: Published - 2017

Bibliographical note

Funding Information:
This work was supported by the I-CORE Program of the Planning and Budgeting Committee and the Israel Science Foundation (Y.N. and E.Z.G.), an FP7 Marie Curie Career Integration Grant (Y.N. and E.Z.G.), the Israel Science Foundation (Grant 1326/15 to Y.N.), the Binational Science Foundation (BSF) (Grant 2015385 to E.Z.G.), and the Adelis Foundation (Y.N.). We thank Talma Hendler for continuing support at the Tel Aviv Sourasky Medical Center, Shani Shalgi for help setting up the EEG sleep laboratory, Shlomit Beker for assistance in setting up the experiment, Noam Amir for advising on acoustic aspects of stimulus preparation, Netta Neeman for assistance with data acquisition, Noa Bar-Ilan Regev for administrative help, and Yaniv Sela and laboratory members for suggestions.

Publisher Copyright:
© 2017 the authors.


Keywords

  • Attention
  • Entrainment
  • Sleep
  • Speech processing


