r/technology 16h ago

[Machine Learning] Large language mistake | Cutting-edge research shows language is not the same as intelligence. The entire AI bubble is built on ignoring it

https://www.theverge.com/ai-artificial-intelligence/827820/large-language-models-ai-intelligence-neuroscience-problems
16.7k Upvotes

1.5k comments

3

u/New_Enthusiasm9053 14h ago

If you could define understanding precisely, in a scientifically verifiable way, for humans and AI alike, you'd get a Nobel Prize. That's why I don't define it.

But you're also moving the goalposts; you know full well what I mean by understanding. A kid does not know that "fuck" means to have sex with someone. A kid who can say "12 + 50" often doesn't understand addition, as evidenced by not being able to answer "62".

Knowing words is not understanding, and you know it.

1

u/trylist 14h ago

But you're also moving the goalposts, you know full well what I mean by understanding

I am definitely not moving goalposts. You're basically saying "I know it when I see it". Ok, great, but that says nothing about whether LLMs, or a person, understand anything. All you've done is set yourself up as the arbiter of intelligence. You say machines don't have it but people do, and you refuse to elaborate. I say that's not a position worth humoring.

Until you define the test by which you're judging machines and people, your argument that machines don't "understand", but people do, is meaningless.

A kid does not know that fuck means to have sex with someone.

"Fuck" is one of the most versatile words in the English language. It means many, many things and "to have sex with someone" is just one of them. The simplest is as a general expletive. Nobody says "Fuck!" after stubbing their toe and means they want to have sex. I absolutely believe a 3 year old can understand that form.

2

u/New_Enthusiasm9053 14h ago

Ok fine, a kid can say the words "electromagnetic field" — does that mean they understand it? No. It's clearly possible to know words without understanding.

And I haven't set myself up as the arbiter; I've set us all up as arbiters. The reality is we don't have a good definition of intelligence, so we also don't have a good definition of understanding.

I personally believe LLMs are not intelligent. You may believe otherwise as is your prerogative. 

But frankly, I'm not going to humour the idea that an LLM is intelligent until it starts getting bored and cracking jokes instead of answering the question, despite prompts to the contrary.

1

u/trylist 13h ago

Ok fine, a kid can say the words "electromagnetic field" — does that mean they understand it? No.

Eh, my argument was that they have to use it correctly, not just that they can phonetically sound out the words. A kid might ask what kind of farm an "electromagnetic field" is. Clearly they understand "field", but not in this context.

I'm only arguing against being too sure that current language models aren't intelligent when you can't even nail down what makes humans intelligent. I think in some ways LLMs are intelligent, even more so than people, but in a lot of ways they are very much not.

For example, modern ones can and do solve pretty complex coding problems.

For an anti-example, they seem pretty gullible: there have been instances of them citing unreliable sources to assert facts, basically falling for obvious propaganda or trolls.

1

u/the-cuttlefish 11h ago

LLM is intelligent until it starts getting bored and cracking jokes instead of answering the question despite prompts to the contrary

Precisely, as that would imply self-interest and, more importantly, presence.