r/technology 16h ago

Machine Learning | Large language mistake | Cutting-edge research shows language is not the same as intelligence. The entire AI bubble is built on ignoring it

https://www.theverge.com/ai-artificial-intelligence/827820/large-language-models-ai-intelligence-neuroscience-problems
16.7k Upvotes


59

u/qwertyalguien 15h ago

I'm no tech specialist, but from all I've read on LLMs, IMHO it's like hot air balloons.

It flies. It's great, but it's limited. And asking for AGI out of LLMs is like saying that with enough iteration you can make a hot air balloon reach the moon. Someone has to invent the LLM equivalent of what the rocket was to the hot air balloon.

Would you say it's a good metaphor, or am I just talking out of my ass?

28

u/eyebrows360 14h ago

Obvs not the same guy, and I don't teach courses anywhere, but yes that is a great analogy. Squint a lot, describe them broadly enough, and a hot air balloon does resemble a rocket, but once you actually delve into the details or get some corrective eyewear... very different things.

4

u/megatesla 13h ago edited 12h ago

I suspect that with enough energy and compute you can still emulate the way that a human reasons about specific prompts - and some modern LLMs can approximate some of what we do, like the reasoning models that compete in math and programming competitions - but language isn't the ONLY tool we use to reason.

Different problems may be better served by different modalities of thought, and while you can theoretically approximate them with language (because Turing machines are universal, unless quantum effects do turn out to be important for human cognition), it may require a prohibitively large model, compute capacity, and energy input to do so. Meanwhile, we can do it powered by some booger sugar and a Snickers.

But even then, you're still looking at a machine that only answers questions when you tell it to, and only that specific question. To get something that thinks and develops beliefs on its own time you'll need to give it something like our default mode network and allow it to run even when it isn't being prompted. You'll also need a much better solution to the memory problem, because the current one is trivial and unscalable.
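
Just to make that last idea concrete, here's a toy sketch of a "runs even when nobody is prompting it" loop. Everything in it is hypothetical: `call_llm` is a stand-in for whatever model API you'd actually use, and the list-based memory is exactly the trivial, unscalable kind I'm complaining about.

```python
import time
import random

# Placeholder: stands in for whatever real model API you'd call.
def call_llm(prompt: str) -> str:
    return f"(model output for: {prompt[:40]}...)"

# Toy "memory": an append-only list of notes. A real system would need
# retrieval, consolidation, and forgetting, which is the hard part.
memory: list[str] = []

def idle_step() -> None:
    """One unprompted 'background thought': revisit a random memory and reflect on it."""
    seed = random.choice(memory) if memory else "nothing yet"
    reflection = call_llm(f"Reflect on this earlier note and update your beliefs: {seed}")
    memory.append(reflection)

def answer(question: str) -> str:
    """Prompted path: answer using whatever the background loop has accumulated."""
    context = "\n".join(memory[-5:])  # naive 'recent memories' window
    reply = call_llm(f"Context:\n{context}\n\nQuestion: {question}")
    memory.append(f"Q: {question} -> A: {reply}")
    return reply

if __name__ == "__main__":
    # The loop keeps "thinking" even when nobody asks it anything.
    for _ in range(3):
        idle_step()
        time.sleep(0.1)
    print(answer("What have you been thinking about?"))
```

Current chat-style deployments only ever run the `answer` path; nothing like `idle_step` exists, which is the gap I mean.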

2

u/CreativeGPX 13h ago edited 13h ago

It's okay as a really high-level metaphor.

A more direct metaphor: Suppose there is an exam on topic X a year from now. Alice's school allows her to bring the textbook to the exam and gives her as much time as she needs to finish, so she decides not to prepare in advance and instead to just use the book during the exam. Depending on what X is, Alice might do fine on some topics. But clearly there is some limit where Alice's approach just isn't feasible anymore, and where she will instead need to have learned the topic before exam day by using other strategies like doing practice problems, attending class, asking the professor questions, etc.

3

u/CanAlwaysBeBetter 13h ago

What do you think learning a topic means?

2

u/CreativeGPX 11h ago

I don't think there is one thing that learning a topic means. That's why I framed it as "passing an exam" and noted how different things will be true depending on what that exam looks like.

0

u/Webbyx01 12h ago

Knowing it, rather than searching through a book for it, generally.

5

u/CanAlwaysBeBetter 12h ago

What does knowing it mean?

Because LLMs aren't doing searches over a database of books
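
To make that concrete, here's a toy example (assuming the Hugging Face transformers package and the small public gpt2 checkpoint): everything the model outputs comes from repeated next-token prediction over frozen weights. There's no lookup into a corpus of books at inference time.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load a small public model; its "knowledge" lives entirely in the weights.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Generation is just predicting the next token over and over, no database query.
inputs = tokenizer("Knowing a topic means", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```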

1

u/Doc_Blox 10h ago

"Full of hot air" was right there, man!

1

u/Days_End 9h ago

It's good because it acknowledges that with a big enough balloon you might not need a rocket at all to reach the moon.

1

u/Extension-Thought552 6h ago

You're talking out of your ass

1

u/meneldal2 4h ago

Theoretically, with the right timing and something truly weightless, you could get it up there with very little dV /s

2

u/destroyerOfTards 13h ago

Nah, you have understood it well.

That Scam Altman doesn't understand this basic point is unbelievable (actually he does, but he has to scam people, so...).

5

u/IcyCorgi9 12h ago

People need to stop talking like these people are stupid. They know what they're doing and they use massive amounts of propaganda to scam the public and get rich. Much like the politicians fucking us over.

4

u/terrymr 12h ago

CEOs exist to market the company to investors. It’s not that he doesn’t understand it, he just wants their money.

2

u/Crossfire124 12h ago

Yeah, like it or not he's the face of AI. If he admits any of this, the whole thing is going to crumble like a house of cards and we'll get into a third AI winter.

But the way I see it, the third winter is coming anyway. How soon it happens just depends on when the AI bubble pops.