When we converse, our brain waves synch up; where exactly depends on the language we're using

Research by U of T Scarborough postdoc Alejandro Pérez looks at interbrain neural coupling, which is essentially how people's brains synch up when they converse. (Photos by Don Campbell)

Don Campbell

When two people converse, their brain waves manage to get in synch.

And according to a new U of T Scarborough-led study, the same happens when people speak their non-native tongue, just in different areas of the brain.

“You can tell two people are having a conversation just by looking at their brain waves, because they align with each other,” says Alejandro Pérez, a postdoc in the Centre for French and Linguistics and the Department of Psychology at U of T Scarborough.

The study adds to a growing body of research on something called interbrain neural coupling, which is essentially how two people’s brain waves synch up when they hold a conversation.

Past research co-authored by Pérez found that when two people converse, neurons firing in specific areas of the speaker’s brain are matched by a corresponding pattern of activity in specific areas of the listener’s brain. Not only are specific areas of the brain activated, there’s a pattern in the timing of that activation as well.

For this study, an international team of researchers led by Pérez wanted to see if, and where in the brain, this synchronization takes place in people speaking their non-native language.

To do this, the researchers recorded the brain activity of 60 participants (all native Spanish speakers with some proficiency in English) using electroencephalography (EEG) while pairs of them held conversations. Half of the conversations took place entirely in Spanish, while the other half took place entirely in English.
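The paper’s analysis pipeline isn’t reproduced here, but one widely used way to quantify this kind of brain-to-brain alignment is the phase-locking value, which measures how consistently the phases of two band-limited signals track each other. The sketch below, in Python with NumPy and SciPy, computes it on synthetic stand-in signals; the sampling rate, frequency band, and signals are illustrative assumptions, not details from the study.

```python
# A minimal sketch of one common interbrain-coupling measure, the
# phase-locking value (PLV), computed on synthetic stand-in signals.
# This is NOT the study's actual analysis; all parameters are assumed.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Stand-ins for one EEG channel from each conversation partner:
# a shared 10 Hz (alpha-band) rhythm plus independent noise.
shared = np.sin(2 * np.pi * 10 * t)
speaker = shared + 0.8 * np.random.randn(t.size)
listener = shared + 0.8 * np.random.randn(t.size)

def band_pass(x, low, high, fs):
    """Zero-phase band-pass filter isolating one frequency band."""
    b, a = butter(4, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, x)

def plv(x, y, low, high, fs):
    """Phase-locking value: 1 = phases perfectly aligned, 0 = unrelated."""
    px = np.angle(hilbert(band_pass(x, low, high, fs)))
    py = np.angle(hilbert(band_pass(y, low, high, fs)))
    return np.abs(np.mean(np.exp(1j * (px - py))))

print(f"alpha-band PLV: {plv(speaker, listener, 8, 12, fs):.2f}")
```

Because the two synthetic signals share a rhythm, the PLV comes out high; fully independent noise would drive it toward zero, which is the contrast such coupling analyses look for.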

What they found is that when participants conversed in their second language, the brain-wave alignment took place in different areas of the brain.

“The language used influenced the alignment of brain waves between those having the conversation, and this suggests that effective communication could be based on this interbrain neural coupling,” says Pérez.  

Pérez says the reason for this difference in pattern could come down to attention. Those conversing in a non-native language attend to the message in smaller chunks, since they must work harder to understand and produce every single word than those conversing in their native language.

He adds it could also be that native and non-native speakers have different emotional attachments to certain words, or differ in their capacity to form a mental image of a word in their first versus their second language. Non-native speakers also learn the language later in life, Pérez says, so they never quite master its structures in the same way as someone who grew up with it.

Associate Professor Philip J. Monahan and postdoc Alejandro Pérez at the Computation and Psycholinguistics Laboratory at U of T Scarborough.  

“This work gets at a foundational issue, which is pretty remarkable, and that is how our brains seem to synchronize when we hold a conversation,” says Pérez’s co-supervisor Philip J. Monahan, Associate Professor in the Centre for French and Linguistics at U of T Scarborough.

“In the past we mostly studied production and perception of speech independently of each other, often using very simple syllables or sentences. Rarely did we look at the link between the two, especially in a naturalistic way using entire conversations.”

Monahan, who is an expert on psycho- and neurolinguistics, says the research could be particularly important in multicultural societies where many people speak more than one language, often switching from one language at home to another in public.

“Being able to understand how the brain goes between different linguistic structures and copes with the differences could really help us understand how the brain supports multiple linguistic systems in the first place,” says Monahan.  

While this study looked at two languages that are relatively similar, Monahan says it would be interesting to see how the brain switches between vastly different languages, say English and Tamil.     

Pérez says it’s also conceivable that measuring brain activity could one day offer an objective measure of the quality of a conversation. It may also be able to tell us whether both parties fully understood each other, based on how well their brain waves synchronize during the conversation.

“Imagine two people talking over Skype and they’re hooked up to a simple EEG device,” he says. “The data gathered may offer some indication about the quality of the communication between two people depending on the alignment of their brain activity.” 
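Continuing the sketch above, a running version of the same measure, recomputed over short sliding windows, hints at what such a live “alignment score” might look like. The window length, frequency band, and the idea of using PLV for this at all are assumptions for illustration, not claims from the study.

```python
# A hypothetical running "alignment score": the plv() helper and
# signals from the sketch above, recomputed over 2-second windows.
# Window size, band, and interpretation are all assumptions.
window = 2 * fs  # 2-second windows (assumed)
scores = [
    plv(speaker[i:i + window], listener[i:i + window], 8, 12, fs)
    for i in range(0, t.size - window + 1, window)
]
print("windowed alignment scores:", np.round(scores, 2))
```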

The research, which is published in the journal Cortex, received funding from the Natural Sciences and Engineering Research Council (NSERC) and the Social Sciences and Humanities Research Council (SSHRC).