
During a conversation, our brains synchronize

August 29, 2024

Brains that connect? Neuroscience, which has been using artificial intelligence (AI) for several years, is modeling the mechanisms underlying our thoughts ever more precisely. Researchers can expose AI models to stimuli similar to those presented to humans, and it is the comparison of the two responses that helps characterize these mechanisms.

But in this new study, published on August 2, 2024 in the journal Neuron, two brains instead of one are analyzed and compared using an artificial intelligence tool. The results indicate that when two people converse, their brains synchronize more readily as a function of the context and the words used.

Context helps us grasp the nuance of words

"It is cold" can have several meanings: temperature or character. But knowing the context in which it is applied, the word "cold" takes on its full meaning. "We already knew that brains synchronize during social interactions, including conversations. However, the mechanisms underlying these synchronizations are not yet clearly defined.", explains Guillaume Dumas, associate professor of computational psychiatry at the Faculty of Medicine of the University of Montreal (Canada), who did not take part in the study, in which the main authors admit that they wanted to "quantify" what two brains in conversation share.


Conversation transcripts to feed AI

For this study, the neuroscientists used GPT-2, a language model that is one of the predecessors of ChatGPT. Applied to the transcripts of recorded conversations, this artificial intelligence extracts the context in which each word is used. "The novelty of this study lies in the combination of two ingredients: the use of the GPT-2 language model, as well as the monitoring of two brains instead of just one," explains François-Xavier Alario, research director at the CNRS and a specialist in cognitive psychology in Marseille, who did not participate in the study.
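
To make this concrete, here is a minimal sketch of how contextual word embeddings can be pulled from a conversation transcript with the publicly available GPT-2 model, using the Hugging Face transformers library. It illustrates the general technique described above, not the study's exact pipeline, and the example sentence is invented:

```python
# Minimal sketch: extracting contextual word embeddings from a conversation
# transcript with GPT-2 (Hugging Face `transformers`). Illustrative only,
# not the study's actual pipeline.
import torch
from transformers import GPT2TokenizerFast, GPT2Model

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
model.eval()

transcript = "I met him yesterday. Honestly, he is cold and hard to read."

# Tokenize with character offsets so each token can be traced back to a word.
enc = tokenizer(transcript, return_tensors="pt", return_offsets_mapping=True)
offsets = enc.pop("offset_mapping")[0]

with torch.no_grad():
    out = model(**enc)

# One 768-dimensional contextual vector per token (last hidden layer).
hidden = out.last_hidden_state[0]          # shape: (n_tokens, 768)

# The vector for the token covering "cold" reflects the whole preceding
# context, so "cold" said of a person and "cold" said of the weather
# end up with different embeddings.
for (start, end), vec in zip(offsets.tolist(), hidden):
    word = transcript[start:end]
    if "cold" in word:
        print(word.strip(), vec[:5])
```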

Artificial intelligence, which projects brain activity over time, makes it possible to "follow the flow of linguistic information, word by word, from one brain to another in natural conversations," the study's authors write. "LLMs (large language models, editor's note), like humans, can interpret the contextual meaning of words in real-world conversations."
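
As a rough idea of what such a model-based analysis can look like, the sketch below fits a ridge regression that predicts a simulated brain signal from per-word embeddings and scores it on held-out words. The arrays are random stand-ins; the study's actual data, encoding models, and preprocessing are more elaborate:

```python
# Hedged sketch of a model-based "encoding" analysis: predicting a brain
# signal from per-word embeddings with ridge regression. Simulated data only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_words, dim = 500, 768                     # words in the conversation, embedding size
embeddings = rng.standard_normal((n_words, dim))   # stand-in GPT-2 word vectors
true_weights = rng.standard_normal(dim)
brain = embeddings @ true_weights + rng.standard_normal(n_words)  # simulated signal

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, brain, test_size=0.2, random_state=0)

model = Ridge(alpha=10.0).fit(X_train, y_train)
# How well the word embeddings predict held-out "brain" activity (R^2 score).
print("held-out R^2:", round(model.score(X_test, y_test), 2))
```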


Twin signals in both brains, milliseconds apart

The study's conclusion: the brain signal emitted by the speaker is also emitted, a few milliseconds later, by the listener, for the same given word and in the same areas of the brain. "This work adds a stone to the common edifice between neuroscience and artificial intelligence," adds Jean-Rémi King, a CNRS researcher on secondment to Meta and a specialist in the workings of the human brain, based in Paris, who was not involved in the study. The precise nature of these laws, as well as their limits, remains to be discovered, however.
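
To illustrate how such a speaker-to-listener delay can be estimated in principle, here is a toy example (not the authors' analysis) in which a "listener" signal is a delayed, noisy copy of a "speaker" signal, and the lag that maximizes their correlation is recovered:

```python
# Toy illustration (not the study's analysis): estimating the lag at which a
# "listener" signal best matches a "speaker" signal, via lagged correlation.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
speaker = rng.standard_normal(n)            # stand-in for speaker brain activity
lag_true = 5                                # listener trails the speaker by 5 samples
listener = np.roll(speaker, lag_true) + 0.5 * rng.standard_normal(n)

def lagged_corr(x, y, lag):
    """Correlation between x and a copy of y shifted back by `lag` samples."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return np.corrcoef(x, y)[0, 1]

lags = range(0, 20)
corrs = [lagged_corr(speaker, listener, lag) for lag in lags]
best = max(lags, key=lambda l: corrs[l])
print(f"best lag: {best} samples, correlation: {corrs[best]:.2f}")
```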

More details on these experiments can be found in a dedicated article on the website of La Recherche.
