For thousands of years, philosophers have argued about the purpose of language. Plato believed it was essential for thinking. Thought "is a silent inner conversation of the soul with itself," he wrote.
Many modern scholars have advanced similar views. Starting in the 1960s, Noam Chomsky, a linguist at the Massachusetts Institute of Technology, argued that we use language for reasoning and other forms of thought. "If there is a severe deficit of language, there will be severe deficit of thought," he wrote.
As an undergraduate, Evelina Fedorenko took Chomsky's class and heard him describe his theory. "I really liked the idea," she recalled. But she was puzzled by the lack of evidence. "A lot of things he was saying were just stated as if they were facts -- the truth."
Fedorenko went on to become a cognitive neuroscientist at MIT, using brain scanning to investigate how the brain produces language. And after 15 years, her research has led her to a startling conclusion: We don't need language to think.
"When you start evaluating it, you just don't find support for this role of language in thinking," she said.
When Fedorenko began this work in 2009, studies had found that the same brain regions required for language were also active when people reasoned or carried out arithmetic. But Fedorenko and other researchers discovered that this overlap was a mirage. Part of the trouble with the early results was that the scanners were relatively crude. Scientists made the most of their fuzzy scans by combining the results from all their volunteers, creating an overall average of brain activity. Because brains differ from person to person, that averaging blurred the boundaries between neighboring circuits, making distinct networks appear to overlap.
In her own research, Fedorenko used more powerful scanners and ran more tests on each volunteer. Those steps allowed her and her colleagues to gather enough data from each person to create a fine-grained picture of an individual brain.
The scientists then ran studies to pinpoint brain circuits that were involved in language tasks, such as retrieving words from memory and following rules of grammar. In a typical experiment, volunteers read gibberish, followed by real sentences. The scientists discovered certain brain regions that became active only when volunteers processed actual language.
Each volunteer had a language network -- a constellation of regions that become active during language tasks.
"It's very stable," Fedorenko said. "If I scan you today, and 10 or 15 years later, it's going to be in the same place."
The researchers then scanned the same people as they performed different kinds of thinking, such as solving a puzzle.
"Other regions in the brain are working really hard when you're doing all these forms of thinking," she said. But the language network stayed quiet. "It became clear that none of those things seem to engage language circuits."
In a paper published Wednesday in Nature, Fedorenko and her colleagues argued that studies of people with brain injuries point to the same conclusion.
Strokes and other forms of brain damage can wipe out the language network, leaving people struggling to process words and grammar, a condition known as aphasia. But scientists have discovered that people can still do algebra and play chess even with aphasia. In experiments, people with aphasia can look at two numbers -- 123 and 321, say -- and recognize that, by using the same pattern, 456 should be followed by 654.
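The pattern in that task amounts to a digit-reversal rule. A minimal sketch of the rule, for illustration only (the function name and framing are mine, not the researchers' actual task materials):

```python
def reverse_digits(n: int) -> int:
    """Apply the pattern from the example pair (123 -> 321): reverse the digits."""
    return int(str(n)[::-1])

# Given the example pair, the rule predicts what should follow 456:
print(reverse_digits(456))  # -> 654
```

The point of the experiment is that inferring and applying such a rule is a reasoning task, yet people whose language network has been destroyed can still do it.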
If language is not essential for thought, then what is language for? Communication, Fedorenko and her colleagues argue. Chomsky and other researchers have rejected that idea, pointing out the ambiguity of words and the difficulty of expressing our intuitions out loud. "The system is not well designed in many functional respects," Chomsky once said.
But large studies have suggested that languages have been optimized to transfer information clearly and efficiently.
In one study, researchers found that frequently used words are shorter, making languages easier to learn and speeding the flow of information. In another study, researchers who investigated 37 languages found that the rules of grammar consistently place related words close to each other, making their combined meaning easier to grasp.
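The first finding can be illustrated with a toy word count; this sketch only mimics the idea, not the studies' data or methods, and the sample text is invented:

```python
# Toy illustration: in a small text sample, the most frequent words
# tend to be shorter than the rarest ones.
from collections import Counter

text = (
    "the cat sat on the mat and the dog watched the cat "
    "because the dog and the cat were friends"
)
counts = Counter(text.split())

# Average length of the five most frequent words vs. the five rarest.
most = [w for w, _ in counts.most_common(5)]
least = [w for w, _ in counts.most_common()[-5:]]
avg = lambda words: sum(map(len, words)) / len(words)
print(avg(most), avg(least))  # frequent words come out shorter on average
```

In the real studies this relationship was measured across entire languages, where shorter frequent words speed the flow of information.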
Kyle Mahowald, a linguist at the University of Texas at Austin who was not involved in the new work, said separating thought and language could help explain why artificial intelligence systems like ChatGPT are so good at some tasks and so bad at others.
Computer scientists train these programs on vast amounts of text, uncovering rules about how words are connected. Mahowald suspects that these programs are starting to mimic the language network in the human brain -- but falling short on reasoning.
"It's possible to have very fluent grammatical text that may or may not have coherent underlying thought," Mahowald said.
Fedorenko noted that many people intuitively believe that language is essential to thought because they have an inner voice narrating their every thought. But not everyone has this ongoing monologue. And few studies have investigated the phenomenon.
"I don't have a model of this yet," she said. "I haven't even done what I would need to do to speculate in this way."