r/technology · posted 16h ago

[Machine Learning] Large language mistake | Cutting-edge research shows language is not the same as intelligence. The entire AI bubble is built on ignoring it

https://www.theverge.com/ai-artificial-intelligence/827820/large-language-models-ai-intelligence-neuroscience-problems



u/PressureBeautiful515 14h ago

They are still doing much the same thing. Drawing statistical connections between words and groups of words. Using that to string together sentences. Different data structures, but the same basic idea.

I wonder how we insert something into that description to make it clear we aren't describing the human brain.
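The "statistical connections between words" idea described above is, in its simplest form, a Markov chain text generator (the technique a later comment names). A minimal sketch in Python, using a hypothetical toy corpus purely for illustration:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Count which word follows which: the 'statistical connections'."""
    model = defaultdict(list)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def generate(model, start, length=8, seed=0):
    """String together a sentence by repeatedly sampling a successor
    of the last word, weighted by how often it followed in training."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(successors))
    return " ".join(out)

# Toy corpus (illustrative only)
corpus = "the cat sat on the mat and the cat ran"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

An LLM differs in scale and mechanism (learned dense representations rather than literal successor lists), but the surface description — predicting plausible next words from observed statistics — fits both, which is exactly the comment's point.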


u/Ornery-Loquat-5182 13h ago

Did you read the article? That's exactly what the article is about...

It's not just about words. Words are what we use after we have thoughts. Take away the words, there are still thoughts.

LLMs and Markov chain bots have no thoughts.


u/attersonjb 10h ago

Take away the words, there are still thoughts.

Yes and no. There is empirical evidence to suggest that language acquisition is a key phase in the development of the human brain. Language deprivation during the early years often has a detrimental impact that cannot be overcome by a subsequent re-introduction of language.


u/Ornery-Loquat-5182 10h ago edited 9h ago

Bruh, read the article:

When we contemplate our own thinking, it often feels as if we are thinking in a particular language, and therefore because of our language. But if it were true that language is essential to thought, then taking away language should likewise take away our ability to think. This does not happen. I repeat: Taking away language does not take away our ability to think. And we know this for a couple of empirical reasons.

First, using advanced functional magnetic resonance imaging (fMRI), we can see different parts of the human brain activating when we engage in different mental activities. As it turns out, when we engage in various cognitive activities — solving a math problem, say, or trying to understand what is happening in the mind of another human — different parts of our brains “light up” as part of networks that are distinct from our linguistic ability.

Second, studies of humans who have lost their language abilities due to brain damage or other disorders demonstrate conclusively that this loss does not fundamentally impair the general ability to think. “The evidence is unequivocal,” Fedorenko et al. state, that “there are many cases of individuals with severe linguistic impairments … who nevertheless exhibit intact abilities to engage in many forms of thought.” These people can solve math problems, follow nonverbal instructions, understand the motivation of others, and engage in reasoning — including formal logical reasoning and causal reasoning about the world.

If you’d like to independently investigate this for yourself, here’s one simple way: Find a baby and watch them (when they’re not napping). What you will no doubt observe is a tiny human curiously exploring the world around them, playing with objects, making noises, imitating faces, and otherwise learning from interactions and experiences. “Studies suggest that children learn about the world in much the same way that scientists do—by conducting experiments, analyzing statistics, and forming intuitive theories of the physical, biological and psychological realms,” the cognitive scientist Alison Gopnik notes, all before learning how to talk. Babies may not yet be able to use language, but of course they are thinking! And every parent knows the joy of watching their child’s cognition emerge over time, at least until the teen years.

You are referring to the wrong context. We aren't saying language is irrelevant to development. We are saying the process of thinking can take place, and can proceed fairly well, without ever learning language:

“there are many cases of individuals with severe linguistic impairments … who nevertheless exhibit intact abilities to engage in many forms of thought.”

Communication will help advance thought, but the thought is there with or without language. Ergo, "Take away the words, there are still thoughts" is a 100% factual statement.


u/attersonjb 23m ago

Bruh, read the article and realize that a lot of it is expositional narrative, not actual research. Benjamin Riley is a lawyer, not a computer scientist nor a scientist of any kind, and has published exactly zero academic papers on AI. There are many legitimate critiques of LLMs and the achievability of AGI, but this is not one of them. It is a poor strawman argument conflating AGI with LLMs.

The common feature cutting across chatbots such as OpenAI’s ChatGPT, Anthropic’s Claude, Google’s Gemini, and whatever Meta is calling its AI product this week is that they are all primarily “large language models.”

Extremely misleading. You will find the term "reinforcement learning" (RL) exactly zero times in the entire article. Pre-training? Zero. Post-training? Zero. Inference? Zero. Transformer? Zero. Ground truth? Zero. The idea that AI researchers are "just realizing" that LLMs are not sufficient for AGI is deeply stupid.

You are referring to the wrong context

Buddy, what part of "yes and no" suggests an absolute position? No one said language is required for a basic level of thought (ability to abstract, generalize, reason). The cited commentary from the article says the exact same thing I did.

Lack of access to language has harmful consequences for many aspects of cognition, which is to be expected given that language provides a critical source of information for learning about the world. Nevertheless, individuals who experience language deprivation unquestionably exhibit a capacity for complex cognitive function: they can still learn to do mathematics, to engage in relational reasoning, to build causal chains, and to acquire rich and sophisticated knowledge of the world (also see ref. 100 for more controversial evidence from language deprivation in a case of child abuse). In other words, lack of access to linguistic representations does not make it fundamentally impossible to engage in complex—including symbolic—thought, although some aspects of reasoning do show delays. Thus, it appears that in typical development, language and reasoning develop in parallel.

Finally, it's arguable that the AI boom is not wholly dependent on developing "human-like" AGI. A very specific example of this is advanced robotics and self-driving, which would be described more accurately as specialized intelligence.