Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01n583xx59k
Full metadata record
dc.contributor.advisor: Hasson, Uri
dc.contributor.author: Regev, Mor
dc.contributor.other: Psychology Department
dc.date.accessioned: 2017-07-17T21:06:59Z
dc.date.available: 2017-07-17T21:06:59Z
dc.date.issued: 2017
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01n583xx59k
dc.description.abstract: Linguistic content can be conveyed both in speech and in writing. But how does the human brain accommodate the much later evolved skill of literacy (visual language) on top of the previously developed ability to comprehend auditory speech? This dissertation explores which neural processes are shared and which diverge across the auditory and visual language modalities. Using functional magnetic resonance imaging, we pursued four goals: 1) map the areas that reliably activate during the reading of a complex written narrative, and identify which of these regions are unique to reading rather than shared with listening; 2) map the regions that show functional similarities when reading and when listening to the same narrative; 3) examine the extent to which top-down attention modulates responses to text and speech along their processing hierarchies; and 4) identify the neural pathways through which written and spoken content, attended and unattended, is distributed through the brain. First, we used inter-subject correlation to map the extended reading network that activates reliably during text comprehension. Early sensory areas and some high-order parietal and frontal areas responded selectively to either the written or the spoken version of the narrative. By contrast, the temporal response profiles in some linguistic and extra-linguistic areas were remarkably similar for spoken and written narratives, indicating strong modality invariance of linguistic processing in these circuits. Next, we revealed a similar hierarchical modulation effect of attention on spoken and written content, in which sensory cortices processed unattended language whereas high-order cognitive cortices did not. Further, we examined the neural communication between brain regions using inter-subject functional correlation analysis, which demonstrated that although story-related responses were shared between areas in sensory cortices even when the stimulus was outside the focus of attention, top-down attention was required for the narrative-specific response to be shared with higher, more cognitive frontal and parietal areas. Overall, the results presented in this dissertation argue that our ability to extract the same information from spoken and written forms arises from a mixture of both selective and shared neural processes. This constitutes a major step toward understanding the neurophysiological and functional dynamics underlying the coexistence of written and spoken language in the brain.
dc.language.iso: en
dc.publisher: Princeton, NJ : Princeton University
dc.relation.isformatof: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: http://catalog.princeton.edu
dc.subject.classification: Cognitive psychology
dc.subject.classification: Neurosciences
dc.title: Shared and Selective Neural Processes Across the Visual and Auditory Language Modalities
dc.type: Academic dissertations (Ph.D.)
pu.projectgrantnumber: 690-2143
Appears in Collections: Psychology

Files in This Item:
File: Regev_princeton_0181D_12072.pdf (7.94 MB, Adobe PDF)

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.