pufferfischerpulver

joined 1 month ago
[–] [email protected] 4 points 16 hours ago (1 children)

I don't get the game, tbh. At first it was cozy, then it turned into work. Which I know some people like, at least. But I work enough during the day to not want to work when I play.

[–] [email protected] 1 point 19 hours ago (3 children)

I'm not sure if you're disagreeing with the essay or not? But in any case, what you're describing is in the same vein: simply repeating a word without knowing what it actually means in context is exactly what LLMs do. They can get pretty good at getting it right most of the time, but without actually being able to learn the concept and context of 'table', they will never be able to use it correctly 100% of the time, or, even more importantly for AGI, apply reason and critical thinking. Much like a child repeating a word without much clue what it actually means.

Just for fun, this is what Gemini has to say:

Here's a breakdown of why this "parrot-like" behavior hinders true AI:

  • Lack of Conceptual Grounding: LLMs excel at statistical associations. They learn to predict the next word in a sequence based on massive amounts of text data. However, this doesn't translate to understanding the underlying meaning or implications of those words.
  • Limited Generalization: A child learning "table" can apply that knowledge to various scenarios – a dining table, a coffee table, a work table. LLMs struggle to generalize, often getting tripped up by subtle shifts in context or nuanced language.
  • Inability for Reasoning and Critical Thinking: True intelligence involves not just recognizing patterns but also applying logic, identifying cause and effect, and drawing inferences. LLMs, while impressive in their own right, fall short in these areas.
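Just to make the "statistical association" point concrete, here's a toy sketch. It is nothing like a real LLM (real models use neural networks over huge corpora; the tiny corpus here is made up for illustration), but it shows the core idea: a model can predict the next word purely from co-occurrence counts, with no notion of what any word means.

```python
from collections import Counter, defaultdict

# Toy bigram model: "learns" language purely as statistics over word
# sequences. There is no representation of meaning anywhere.
corpus = "the cat sat on the table . the cup is on the table .".split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    # Pick the statistically most frequent successor -- pure pattern
    # matching over form, not understanding.
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "table"
```

The model will happily emit "table" after "the" because that pairing is frequent, not because it knows what a table is, which is exactly the parrot-like behavior described above.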
[–] [email protected] 13 points 23 hours ago (6 children)

Interesting you focus on language. Because that's exactly what LLMs cannot understand. There's no LLM that actually has a concept of the meaning of words. Here's an excellent essay illustrating my point.

The fundamental problem is that deep learning ignores a core finding of cognitive science: sophisticated use of language relies upon world models and abstract representations. Systems like LLMs, which train on text-only data and use statistical learning to predict words, cannot understand language for two key reasons: first, even with vast scale, their training and data do not have the required information; and second, LLMs lack the world-modeling and symbolic reasoning systems that underpin the most important aspects of human language.

The data that LLMs rely upon has a fundamental problem: it is entirely linguistic. All LLMs receive are streams of symbols detached from their referents, and all they can do is find predictive patterns in those streams. But critically, understanding language requires having a grasp of the situation in the external world, representing other agents with their emotions and motivations, and connecting all of these factors to syntactic structures and semantic terms. Since LLMs rely solely on text data that is not grounded in any external or extra-linguistic representation, the models are stuck within the system of language, and thus cannot understand it. This is the symbol grounding problem: with access to just a formal symbol system, one cannot figure out what these symbols are connected to outside the system (Harnad, 1990). Syntax alone is not enough to infer semantics. Training on just the form of language can allow LLMs to leverage artifacts in the data, but “cannot in principle lead to the learning of meaning” (Bender & Koller, 2020). Without any extralinguistic grounding, LLMs will inevitably misuse words, fail to pick up communicative intents, and misunderstand language.
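The symbol grounding argument can be illustrated with a small sketch (the corpus and the renaming table are invented for the example): if you systematically rename every word in a corpus, all the statistical patterns a form-only learner sees are preserved exactly, so nothing in the data can tell it which symbols refer to which things in the world.

```python
from collections import Counter

# Sketch of the grounding problem: relabeling every symbol leaves the
# co-occurrence statistics identical, so a learner that only sees form
# cannot distinguish the two vocabularies, let alone their referents.
corpus = "the cat sat on the mat".split()
renamed = {"the": "X1", "cat": "X2", "sat": "X3", "on": "X4", "mat": "X5"}
scrambled = [renamed[w] for w in corpus]

def bigram_counts(words):
    # Multiset of bigram frequencies -- all a form-only learner can use.
    counts = Counter(zip(words, words[1:]))
    return sorted(counts.values())

# The predictive statistics are indistinguishable up to relabeling.
print(bigram_counts(corpus) == bigram_counts(scrambled))  # -> True
```

In Harnad's terms, both streams are equally good "formal symbol systems"; only something outside the text (perception, action, a world model) could ground one of them.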

[–] [email protected] 1 point 23 hours ago

Donlon-san UwU

[–] [email protected] 8 points 1 day ago

It's absolutely fucking insane to me how anyone ever could post this and get elected president of the United States of America. This single post alone would mean political suicide in almost any other western democracy.

It's just - what is happening?? I'm absolutely dumbstruck by this lunacy. I cannot for the life of me understand how someone could look at this dumpster fire of a human being and say "Yes, that's who should represent me in the world and shape the policies of my everyday life." Holy fucking shit what the fuck is happening