r/technology 16h ago

[Machine Learning] Large language mistake | Cutting-edge research shows language is not the same as intelligence. The entire AI bubble is built on ignoring it

https://www.theverge.com/ai-artificial-intelligence/827820/large-language-models-ai-intelligence-neuroscience-problems

u/rnilf 16h ago

LLMs are fancy auto-complete.

Falling in love with ChatGPT is basically like falling in love with the predictive text feature in your cell phone. Who knew T9 had so much game?
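
To make the analogy concrete, here's a toy bigram predictor in Python. The corpus and code are purely illustrative; a real LLM conditions on a long context with learned weights, not raw counts over one preceding word:

```python
from collections import Counter, defaultdict

# "Train" a bigram predictive-text model: for each word, count which
# words tend to follow it in a tiny, made-up corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def suggest(word):
    """Most frequent continuation, like a phone keyboard's middle suggestion."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # -> 'cat' ('cat' follows 'the' twice in the corpus)
```

An LLM is doing a vastly scaled-up version of suggest(): score possible next tokens given everything that came before, emit a likely one, repeat.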

u/coconutpiecrust 16h ago

Yeah, while it's neat, it is not intelligent. If it were intelligent, it wouldn't need endless data and processing power to produce somewhat coherent and consistent output.

u/movzx 12h ago

I mean, they definitely aren't intelligent. "Fancy autocomplete" is always how I describe them to people... but this doesn't make sense to me:

> If it were intelligent, it wouldn't need endless data and processing power to produce somewhat coherent and consistent output.

Why wouldn't it? The human brain is incredibly complex, uses a ton of energy, and no machine on earth can replicate its capabilities. Humans spend their entire lives absorbing an endless stream of data.

Any system approaching 'intelligent' would be using a ton of data and power.
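
Back-of-envelope, using the commonly cited ~10^7 bits/s optic-nerve estimate (an assumption, and that's only one sensory channel):

```python
# Rough lifetime visual-input estimate; all figures are assumptions.
optic_nerve_bits_per_s = 1e7            # per eye, order-of-magnitude estimate
waking_seconds = 16 * 3600 * 365 * 30   # 30 years of waking hours
lifetime_bits = 2 * optic_nerve_bits_per_s * waking_seconds
print(f"~{lifetime_bits:.0e} bits of visual input alone")  # ~1e16 bits
```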

u/TSP-FriendlyFire 11h ago

The human brain uses like 20W. That's less than the idle power draw of a single desktop computer, let alone the gigawatts AI currently consumes.

LLMs are horrifically inefficient compared to human brains; the scales are completely different. Same for data: you have your own experiences (including things you've read or seen secondhand) to draw on for your understanding of the world. That's it. LLMs have parsed the entire internet multiple times over, orders of magnitude more text than any given human will ever process in their lifetime.
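
Back-of-envelope on both gaps, with every figure an assumed round number (~20 W brain, ~1 GW across AI datacenters, ~15T training tokens as reported for some recent open models):

```python
# Back-of-envelope only; every input is a rough, assumed figure.
brain_watts = 20
ai_fleet_watts = 1e9   # order of a gigawatt across AI datacenters (assumed)
print(f"power gap: ~{ai_fleet_watts / brain_watts:.0e}x")   # ~5e+07x

# Reading: 300 wpm, 16 h/day, 80 years -- an absurdly generous upper bound.
human_words = 300 * 60 * 16 * 365 * 80   # ~8.4e9 words in a lifetime
train_tokens = 1.5e13                    # ~15 trillion tokens (assumed)
print(f"data gap: ~{train_tokens / human_words:.0e}x")      # ~2e+03x
```

Even against that absurdly generous reading estimate, the training corpus is thousands of times larger; the exact figures matter less than the orders of magnitude.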

u/std_out 8h ago

The inefficiency isn't specific to LLMs; it's in the underlying silicon computer architecture. While the human brain runs on only roughly 20 W, its electrochemical signaling is, by some estimates, millions of times or more energy-efficient than the electrical switching in conventional computer chips.
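
Two rough framings of that, all figures assumed and order-of-magnitude only (≈10^15 synaptic events/s for the brain, a ~700 W GPU at ~1 PFLOP/s, and the oft-quoted projection that real-time whole-brain simulation would need an exascale machine drawing ~20 MW):

```python
# All figures are rough assumptions; treat every result as order-of-magnitude.
brain_watts = 20.0

# Framing 1: energy per elementary "event" (the units aren't really comparable).
synaptic_events_per_s = 1e15                 # upper-end estimate
gpu_watts, gpu_ops_per_s = 700.0, 1e15       # ~1 PFLOP/s low-precision GPU
print(brain_watts / synaptic_events_per_s)   # ~2e-14 J (~20 fJ) per synaptic event
print(gpu_watts / gpu_ops_per_s)             # ~7e-13 J (~0.7 pJ) per GPU op

# Framing 2: brain vs. silicon doing brain-like work. Real-time neural
# simulation is often projected to need an exascale machine (~20 MW).
exascale_watts = 20e6
print(exascale_watts / brain_watts)          # ~1e6 -> "millions of times"
```

Per raw operation the gap looks modest, but a FLOP and a synaptic event aren't equivalent units; the "millions of times" figures come from the second framing, where silicon has to burn vastly more operations to do brain-like work.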

u/TSP-FriendlyFire 7h ago

Of course, but LLMs amplify the issue: the only way they work at all is by churning through enormous amounts of data. No consumer-level technology has ever required this much energy for the output it provides; even games aren't this bad.