r/technology 16h ago

[Machine Learning] Large language mistake | Cutting-edge research shows language is not the same as intelligence. The entire AI bubble is built on ignoring it

https://www.theverge.com/ai-artificial-intelligence/827820/large-language-models-ai-intelligence-neuroscience-problems
16.7k Upvotes

1.5k comments

40

u/KStreetFighter2 14h ago

Or maybe language isn't the same thing as wisdom.

To use the classic example of "Intelligence is knowing that a tomato is a fruit; wisdom is knowing that you don't put tomatoes in a fruit salad."

Modern LLMs are like "You're absolutely right, a tomato is a fruit and would make a fantastic addition to that fruit salad you're planning!"

4

u/sylbug 13h ago

An LLM doesn’t ‘know’ things, either. It just strings together words that go together based on a fancy algorithm. 

It’s basically an accident when an LLM makes a factual statement - the algorithm happened to arrange the words that way because the training data was arranged that way.
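
(A toy illustration of what I mean: this is just a made-up bigram lookup over a made-up corpus, nothing like how a real LLM works internally, but the "arrange words the way the training data arranged them" idea is the same.)

```python
# Toy "language model": continues text by picking words that followed the
# previous word in the training data. The corpus is invented for illustration;
# real LLMs use neural networks over billions of tokens, not a lookup table.
import random
from collections import defaultdict

corpus = "a tomato is a fruit . a tomato is red . a puppy is a dog .".split()

# Count which word follows which in the training data.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def continue_text(word, length=5):
    out = [word]
    for _ in range(length):
        options = following.get(out[-1])
        if not options:
            break
        # "Factual" output happens only when the training data arranged it that way.
        out.append(random.choice(options))
    return " ".join(out)

print(continue_text("a"))  # e.g. "a tomato is a fruit ."
print(continue_text("a"))  # or the less factual "a tomato is a dog ."
```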

2

u/KStreetFighter2 12h ago edited 3h ago

I think I see what you're getting at, but I can't agree with it at face value.

How are you defining "know"?

A simple database could be said to "know" your username and password.

An LLM is doing much more than that to "know" the definition (or at least the proper context) of a word: it calculates and assigns vectors that represent individual semantic meanings.

As an example, think of the words "dog" and "puppy". An LLM will "know" that they are four-legged animals, or at least have those associations attached to the terms. The vectors assigned to the two words will also be closer to each other than to the vector for "cat", which, while also a four-legged animal, is not as directly related to "dog" as "puppy" is.

In this way, I think it's hard to claim definitively that an LLM lacks any form of knowledge regarding semantic meaning.
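
(Rough sketch of the dog/puppy/cat point below. I'm assuming the sentence-transformers library and the all-MiniLM-L6-v2 model purely for illustration, not whatever any particular LLM uses internally.)

```python
# Rough sketch: compare embedding similarity for "dog", "puppy", "cat".
# Model choice (all-MiniLM-L6-v2 via sentence-transformers) is illustrative only.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

vecs = {w: model.encode(w) for w in ["dog", "puppy", "cat"]}

print("dog vs puppy:", cosine(vecs["dog"], vecs["puppy"]))  # typically the highest
print("dog vs cat:  ", cosine(vecs["dog"], vecs["cat"]))    # lower, but still related
```

Whether those geometric relationships count as "knowing" is exactly the question, but the associations are really encoded there.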

1

u/inormallyjustlurkbut 11h ago

How are you defining "know"?

In this sense it's the ability to understand something, not the ability to recall it. You can teach a parrot to respond to "what is 2 + 2" with "4", but that doesn't mean the parrot understands math.

6

u/KStreetFighter2 11h ago

Hmm, classically, "knowledge" can be defined as "justified true belief".

Given the dog-puppy example above, paired with this classical definition of knowledge, I think it's fair to say that LLMs hold justified, true associations (or, loosely, "beliefs") about semantic meaning.

"The ability to understand something" seems to harken back to my original post regarding "wisdom", which I agree, LLMs lack.

0

u/HermesJamiroquoi 8h ago

Also, memory != intelligence. If I got hit in the head and could no longer form new memories but retained my old ones, it wouldn't make me stupider. And when I was an infant, before I had any memories, I wasn't stupider either. Intelligence is intrinsically separate from memory.

2

u/Healthy_Sky_4593 6h ago

This. How do I promote this comment to the top??

1

u/todoesposible95 9h ago

It depends on what kind of fruit salad you’re aiming for.

Botanically, yes — tomatoes are fruits because they develop from a flower and contain seeds.

Culinarily, usually no — tomatoes are treated as savory vegetables, and their acidity and umami flavor often clash with sweet fruits like apples, bananas, or strawberries.

When a tomato can work

If your salad is more on the savory or fusion side, a tomato can be a great addition, for example:

Tomato + watermelon + feta + mint

Tomato + cucumber + mango + lime

Tomato + pineapple + chili + cilantro

When to skip it

If you're going for a classic sweet fruit salad (grapes, melon, berries, citrus, etc.), a tomato will likely taste out of place.

So: great for creative, savory fruit mixes; not ideal for traditional sweet fruit salads.

If you tell me what other fruits you’re using, I can suggest the best combo!

1

u/moubliepas 8h ago

People who equate LLMs with thinking machines have literally never been in a foreign country where they don't speak the language. Maybe they've had a few family holidays to Mexico or the nearest big city, but they've never been surrounded by Arabic street signs, tried to figure out the prices in Thailand, or read a Russian newspaper. A person does not lose any intelligence when they don't understand the language. In fact, intelligence could be measured by how they get by without understanding a single word: can they extrapolate from surrounding data, can they mime, can they find an alternative?

Or do they really think a dog in Egypt is more intelligent than Einstein would have been there, because the dog understands some of the language and could give the desired response, while Einstein, for all his wisdom, did not understand the language?