While the differences between sign language and speech are significant, the underlying neural processes we use to create complex expressions for both are quite similar, new research suggests.
“This research shows for the first time that despite obvious physical differences in how signed and spoken languages are produced and comprehended, the neural timing and localization of the planning of phrases is comparable between American Sign Language and English,” says lead author Esti Blanco-Elorrieta, a doctoral student in New York University’s psychology department and the NYU Abu Dhabi Institute.
“Although there are many reasons to believe that signed and spoken languages should be neurobiologically quite similar, evidence of overlapping computations at this level of detail is still a striking demonstration of the fundamental core of human language,” adds senior author Liina Pylkkanen, a professor in New York University’s linguistics and psychology departments.
“We can only discover what is universal to all human languages by studying sign languages,” adds Karen Emmorey, a professor at San Diego State University and a leading expert on sign language.
Past research has shown that structurally, signed and spoken languages are fundamentally similar. Less clear, however, is whether the same circuitry in the brain underlies the construction of complex linguistic structures in sign and speech.
To address this question, the scientists studied the production of two-word phrases in American Sign Language (ASL) by deaf signers residing in and around New York, and in speech by hearing English speakers living in Abu Dhabi.
Signers and speakers viewed the same pictures and named them with semantically identical expressions. To gauge the participants' neurological activity during this task, the researchers deployed magnetoencephalography (MEG), a technique that maps neural activity by recording the magnetic fields generated by the brain's electrical currents.
For both signers and speakers, phrase building engaged the same parts of the brain with similar timing: the left anterior temporal and ventromedial cortices, despite different linguistic articulators (the vocal tract vs. the hands).
The researchers point out that this neurobiological similarity between sign and speech goes beyond basic structural parallels into more intricate processes: we use the same parts of the brain at the same time for the specific computation of combining words or signs into more complex expressions.
The researchers report their findings in the journal Scientific Reports.
Grants from the National Science Foundation, the National Institutes of Health, and the NYUAD Institute, as well as a La Caixa Foundation fellowship for postgraduate studies, supported the work.
Source: New York University