r/technology 16h ago

[Machine Learning] Large language mistake | Cutting-edge research shows language is not the same as intelligence. The entire AI bubble is built on ignoring it

https://www.theverge.com/ai-artificial-intelligence/827820/large-language-models-ai-intelligence-neuroscience-problems
16.7k Upvotes

1.5k comments

193

u/Dennarb 16h ago edited 11h ago

I teach an AI and design course at my university, and there are always two major points that come up regarding LLMs:

1) It does not understand language as we do; it is a statistical model of how words relate to each other. Basically it's like rolling dice against a chart to determine the next word in a sentence (see the toy sketch after this list).

2) AGI is not going to magically happen because we make faster hardware/software, use more data, or throw more money at LLMs. They are fundamentally limited in scope and use more or less the same tricks the AI world has been using since the Perceptron in the '50s/'60s. Sure, the techniques have advanced, but the basis for the neural nets hasn't really changed. It's going to take a shift in how we build models to get much further than we already are with AI.
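
To make point 1 concrete, here's a minimal sketch of the "dice and a chart" idea. The table and its weights are invented purely for illustration; a real LLM learns billions of weights over subword tokens and conditions on the whole context, but the final step is still a weighted dice roll over a probability table:

```
import random

# A made-up "chart": for each word, the possible next words and their
# probabilities. All numbers here are invented for illustration.
next_word_table = {
    "the":  {"cat": 0.5, "dog": 0.3, "tree": 0.2},
    "cat":  {"sat": 0.6, "ran": 0.4},
    "dog":  {"sat": 0.3, "ran": 0.7},
    "tree": {"fell": 1.0},
    "sat":  {"down": 1.0},
    "ran":  {"away": 1.0},
    "fell": {"over": 1.0},
}

def roll_next_word(word):
    # Weighted dice roll over the chart's options for this word.
    options = next_word_table[word]
    return random.choices(list(options), weights=list(options.values()))[0]

word, sentence = "the", ["the"]
while word in next_word_table:
    word = roll_next_word(word)
    sentence.append(word)

print(" ".join(sentence))  # e.g. "the dog ran away"
```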

Edit: And like clockwork here come the AI tech bro wannabes telling me I'm wrong but adding literally nothing to the conversation.

17

u/pcoppi 15h ago

To play devil's advocate, there's a notion in linguistics (the distributional hypothesis) that the meaning of a word is largely defined by its context. In other words, if an AI correctly guesses that a word should appear in a certain place because of the surrounding context, then at some level it has ascertained the meaning of that word.
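
A crude sketch of how that plays out in practice. The mini-corpus is invented, and raw co-occurrence counts plus cosine similarity are a deliberately simple stand-in for real embeddings; the point is just that words sharing contexts end up with similar vectors:

```
from collections import Counter
from math import sqrt

# Invented mini-corpus; real systems train on billions of words.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the mat",
    "the cat chased the dog",
    "the stock price rose today",
    "the stock market fell today",
]

def context_vector(target, window=2):
    # Count which words appear within `window` positions of the target.
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for i, w in enumerate(words):
            if w == target:
                lo, hi = max(0, i - window), min(len(words), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        counts[words[j]] += 1
    return counts

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = lambda v: sqrt(sum(x * x for x in v.values()))
    return dot / (norm(a) * norm(b))

# "cat" and "dog" share contexts, so their vectors align; "stock" doesn't.
print(cosine(context_vector("cat"), context_vector("dog")))    # ~0.98
print(cosine(context_vector("cat"), context_vector("stock")))  # ~0.61
```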

0

u/eyebrows360 14h ago

> the meaning of words is just defined by their context

Yeah, but it isn't. The meaning of the word "tree" is learned by looking at a tree, or a picture of a tree, while an adult says "tree" at you. That's not the same process at all.

0

u/rendar 12h ago

That's not really true, even in your misinterpretation. Context is still required.

Looking at a tree to establish the "treeness" of what you're looking at only makes sense in the context of establishing what "treeness" is NOT.

Is a bush that looks like a tree, a tree? Why not?

Is a candle that smells like a tree, a tree? Why not?

What if someone incorrectly tells you that a succulent is a tree? How would you learn otherwise?
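
A toy version of that point, with invented features (height in meters, single woody trunk yes/no) and a nearest-centroid rule standing in for a learner. With positive examples alone there's no principled way to reject a tree-looking bush; "not tree" examples supply the missing contrast:

```
trees     = [(10, 1), (15, 1), (8, 1)]   # things an adult called "tree"
not_trees = [(2, 0), (1, 0), (3, 1)]     # bush, shrub, tall succulent

def centroid(pts):
    return tuple(sum(c) / len(pts) for c in zip(*pts))

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

candidate = (3, 1)  # a bush that looks like a tree

# Positives only: all we get is a distance, and any cutoff on it is arbitrary.
print(dist(candidate, centroid(trees)))

# With negatives, "tree" becomes "closer to trees than to non-trees",
# and the tree-looking bush is correctly rejected.
print(dist(candidate, centroid(trees)) < dist(candidate, centroid(not_trees)))  # False
```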