this post was submitted on 07 Mar 2024
299 points (92.4% liked)

Memes

[–] [email protected] 16 points 5 months ago* (last edited 5 months ago) (21 children)

Oh no we are NOT doing this shit again. It's literally autocomplete brought to its logical conclusion; don't bring your stupid sophistry into this.

[–] [email protected] 1 points 5 months ago (9 children)

Your brain is just a biological system that works somewhat like a neural net. So according to your statement, you too are nothing more than an autocomplete machine.

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (5 children)

I'm starting to wonder if any of you even know how that shit works internally, or if you just take what the hype media says at face value. It literally has one purpose and one purpose alone: determine what the next word is going to be by calculating the probability of each candidate word coming next. That's it. All it does is string together a convincing sentence using those probabilities. It does not and cannot understand context.
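The next-word game described above can be sketched in a few lines. This is a toy bigram counter, nowhere near a real transformer (which conditions on the whole context with a neural net), but the training objective is the same idea: predict the next token from probabilities. All names and the corpus here are made up for illustration:

```python
from collections import Counter, defaultdict

# Toy "autocomplete": count which word follows which in a tiny corpus,
# then turn those counts into a probability distribution over next words.
corpus = "the cat sat on the mat the cat ate the fish".split()

followers = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    followers[word][nxt] += 1

def predict_next(word):
    """Return P(next word | current word) estimated from the counts."""
    counts = followers[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(predict_next("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

An LLM does the same "pick the likely next token" step, just with a learned function over the entire preceding context instead of a lookup table of counts.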

The underlying tech is really cool, but a lot of people are grotesquely overselling its capabilities. That's not to say a neural network can't eventually obtain consciousness (ultimately our brains are a union of a bunch of little neural networks working together toward a common goal), but it sure as hell isn't going to be an LLM. That's what I meant by sophistry: they're not engaging with the facts, just some nebulous ideal.

[–] [email protected] 2 points 5 months ago

I'm with you on LLMs being overhyped, although that's already dying down a bit. But regarding your claim that LLMs cannot "understand context", I've recently read an article showing that LLMs can have an internal world model:

https://thegradient.pub/othello/

Depending on your definition of "understanding", that seems to be an indicator of being more than a pure "stochastic parrot".
