this post was submitted on 22 Oct 2023
109 points (100.0% liked)

vegan

I must confess I have a personal vendetta against Yudkowsky and his cult. I studied computer science in college. As an undergrad, I worked as an AI research assistant. I develop software for a living. This is my garden the LessWrong crowd is trampling.

top 50 comments
[–] [email protected] 37 points 10 months ago

I absolutely hate people throwing around the word 'sentient' like they know what it means. If you read any dictionary, it's pretty obviously not some arbitrary word you can use to say it's okay to kill animals.

[–] [email protected] 33 points 10 months ago

I think Eliezer Yudkowsky could probably ask in words not to be killed, but that doesn't mean he contains a spark of awareness like a chicken does.

[–] [email protected] 30 points 10 months ago (2 children)

Love when my "AI expert" has no education on the subject and just makes up random thought experiments with no basis in reality.

And this dipshit is somehow a millionaire.

[–] [email protected] 13 points 10 months ago

And this dipshit is somehow a millionaire.

He sucks up to billionaires so billionaires sprinkle money on him.

[–] [email protected] 23 points 10 months ago

Gee, I wonder if this guy has strong opinions about IQ ratings lea-think

[–] [email protected] 23 points 10 months ago

if "kill" in input(): print("no kill pls")

Zam my comment is sentient

[–] [email protected] 22 points 10 months ago* (last edited 10 months ago)

The hard problem of consciousness? What's that? Also, light switches definitely have feelings, and if you have a lot of light switches you've basically got a hive mind. Turning one light switch off is violence, and turning them all off is genocide.

Edit: Seeing what comm this is, I'd like to make this clear. I will not kill the chicken, but I'd definitely enact genocide on light switches.

:luddite-smash:

[–] [email protected] 19 points 10 months ago* (last edited 10 months ago) (1 children)

yud-rational the really big switch statement is more likely to be sentient than the living, breathing entities that experience the earth with us

[–] [email protected] 14 points 10 months ago (9 children)

only the most rational thoughts from the autodidact supreme big-yud

[–] [email protected] 10 points 10 months ago (1 children)

Holy shit I didn't know we had Yud emotes data-laughing

[–] [email protected] 9 points 10 months ago

I don't know how I feel about that. jokah-messy

[–] [email protected] 18 points 10 months ago* (last edited 10 months ago)

printf("It's okay to delete me I'm not real");

some-controversy

[–] [email protected] 16 points 10 months ago (1 children)

He is so fucking stupid.

Actually, you know what? I hope he keeps talking. I hope he keeps talking about a wider and wider variety of subjects, because that increases the odds of any person seeing him talk and going 'Wait a minute, I think this guy doesn't know what he's talking about!', which is the reaction everyone should have.

[–] [email protected] 12 points 10 months ago (2 children)

I would hope for that too, but unfortunately he's still running a profitable cult that regularly creeps on and drugs impressionable women to feed his slavery fetish.

my-hero tier "you are a figment of my imagination and exist only to serve me" mantra applications and all. JB-shining-aggro

[–] [email protected] 14 points 10 months ago* (last edited 10 months ago) (7 children)

Big Yud is a slavery-fetishizing woman-enslaving predatory monster.

When he isn't dehumanizing human beings to rhetorically uplift his always-around-the-corner god-machines, he goes out of his way to place living beings in hierarchies of worthiness where he's at the very top and everything beneath him exists for his consumption, amusement, or both.

Also, I utterly despise the belief that denigrating living beings somehow uplifts the treat-printing machines by implication. I see it everywhere, including on Hexbear sometimes, and I fucking hate it.

This thread, for example, was full of "humans are just meat computers" reductionist bullshit, much of it now deleted, while the same posts elevated chatbots and LLMs into "at least as conscious, if consciousness exists at all" territory.

https://hexbear.net/post/241191

[–] [email protected] 10 points 10 months ago (1 children)

From what I've seen from Yudkowsky, what stands out to me is his ignorance and lack of curiosity about how anything actually works.

If God is our way to explain questions we have no answer to, then you can truly find God in any machine as long as you maintain your own ignorance, as long as you choose not to lift the curtain. You can make claims that chickens have no sentience or that ChatGPT does, you can make claims that humans are meat computers as long as you don't actually care about the true inner workings of anything, as long as you don't interrogate what any of that actually means.

Yudkowsky's beliefs truly sound like faith, pure religious faith. The less you know, the more you can believe.

[–] [email protected] 8 points 10 months ago (1 children)

From what I've seen from Yudkowsky, what stands out to me is his ignorance and lack of curiosity about how anything actually works.

He's a self-described "autodidact" who claims that actual academics, and for that matter experts in actual AI-related research fields, are inferior to his autodidactic genius because they don't take him seriously.

You can make claims that chickens have no sentience or that ChatGPT does, you can make claims that humans are meat computers as long as you don't actually care about the true inner workings of anything, as long as you don't interrogate what any of that actually means.

Crude reductionism is an easy default take for someone too incurious and arrogant to even consider that something might be too complex for current understanding to fully grasp.

Yudkowsky's beliefs truly sound like faith, pure religious faith. The less you know, the more you can believe.

He runs a cult, period. It has all the traits of a cult, from a chosen elect to the damned outsiders to the prophecies of salvation and doom and of course sex predator antics.

[–] [email protected] 14 points 10 months ago (2 children)

The normal refutation to this is that the LLM is not "telling" you anything, it is producing an output of characters that, according to its training on its data set, look like a plausible response to the given prompt. This is like using standard conditioning methods to teach a gorilla to make gestures corresponding to "please kill me now" in sign language and using that to justify killing it. They are not "communicating" with the symbols in the sense of the semantic meanings humans have assigned the symbols, because the context they have observed the symbols in and used them for is utterly divorced from those arbitrary meanings.
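To make that concrete, here's a minimal toy sketch (hypothetical code; toy_next_token_probs and generate are made-up names, nothing like a real model's internals or API) of what "producing a plausible next token" means. The model only maps text to a probability distribution over what tends to come next; there is no semantic understanding anywhere in the loop.

import random

# Toy stand-in (hypothetical): a language model is just a function from a
# context string to a probability distribution over candidate next tokens.
def toy_next_token_probs(context):
    # A real model computes these numbers from billions of learned weights;
    # the point is that the output is only "what text tends to follow this text".
    if context.endswith("please"):
        return {"kill": 0.1, "don't": 0.6, "help": 0.3}
    return {"the": 0.5, "a": 0.3, "please": 0.2}

def generate(context, n_tokens=5):
    # Generation is just repeated weighted sampling from those distributions.
    for _ in range(n_tokens):
        probs = toy_next_token_probs(context)
        tokens, weights = zip(*probs.items())
        context += " " + random.choices(tokens, weights=weights)[0]
    return context

print(generate("the model said"))

Swap the if-statement for billions of learned weights and you have the same basic loop; at no point does anything in it "mean" the words it emits.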

[–] [email protected] 13 points 10 months ago* (last edited 10 months ago)

Literal actual dumbass cannot tell the difference between a real animal and a ventriloquist puppet

[–] [email protected] 13 points 10 months ago* (last edited 10 months ago) (1 children)

people who disdain philosophy doing philosophy is so exhausting. like sure bud, just assume that "sentience" is a precondition for morally valuable suffering, or even a coherent category.

[–] [email protected] 7 points 10 months ago

Hey bucko! I heard you hated postmodernism, so I made a postmodern style rant about why postmodernism sucks so you can be postmodernist while being against postmodernism! up-yours-woke-moralists

[–] [email protected] 13 points 10 months ago

What an absolute dullard

[–] [email protected] 12 points 10 months ago (1 children)

Jain philosophy has a nice approach to this. Pleasure and pain are symptoms of having a sense, e.g. touch, sight, taste, etc., and the mental faculty is counted as a sense too. There are living beings with only one sense, e.g. trees growing towards light and away from obstacles. There are beings with many senses, including the mental faculty, e.g. humans. The more senses you have, or more generally the more complex your senses, the more complex your experience of pleasure and pain, e.g. dogs whining at high-pitched sounds humans can't hear.

https://en.wikipedia.org/wiki/Jain_terms_and_concepts

Anyways, the point is that if something has a sense, then maybe don't hurt it. If you really need to, then minimize the harm: either minimize the pain you cause, or choose a being with fewer senses to hurt. A chicken has as many, or almost as many, senses as a human. Don't hurt it. But plants don't suffer as much, probably. That's why Jains are vegetarian (they have to eat something, so best to minimize the suffering it takes), and even then they only eat the parts of plants that don't extinguish a life, e.g. fruits and leaves which are meant to be picked or can at least be regrown.

An AI arguably has no senses, or at most a version of the mental faculty in which it hallucinates mental anguish (pain not tied to any physical sensation). Turning off an AI is not equivalent to slaughtering an animal. It's more like killing a plant.

[–] [email protected] 7 points 10 months ago

Turning off an AI is more like removing a rock from a river than killing a plant. It will have effects that you can point to, potentially even large effects (riverbed oxygenation, iirc). But the rock itself will never give a shit, because it is a rock.

[–] [email protected] 12 points 10 months ago (5 children)

Is it just me, or is his tweet written to be very difficult to understand?

[–] [email protected] 15 points 10 months ago* (last edited 10 months ago) (1 children)

It's an old Reddit rhetorical trick: lots of big words as a smokescreen that hides a lack of actual understanding. yud-rational

[–] [email protected] 8 points 10 months ago (1 children)
[–] [email protected] 9 points 10 months ago (6 children)

Zizek fucking sucks but there was at least some takeaway that doesn't need big words, such as "we don't need to be aware of the ideology we absorb for that ideology to affect us."

Big Yud's got fucking nothing of value.

[–] [email protected] 12 points 10 months ago (3 children)

Every animal is sentient. If you can’t tell that, you’re a moron.

[–] [email protected] 12 points 10 months ago (2 children)

The GPT-3 that exists before being prompted and the GPT-3 that exists afterwards are identical; it is the same unmoving surface that prompts bounce around in to give a probability to each candidate for the next token. That's just how LLMs work. Even setting aside all the other reasons why this is a ridiculous thing to say, you couldn't make an LLM suffer by interacting with it.

If every chicken that existed was simply a frozen instantiation of some platonic ur-chicken, each totally identical not only to each other but also to themself from one moment to the next, then they too would be incapable of suffering.
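A minimal sketch of that "unmoving surface" point, assuming a toy stand-in for the real thing (FROZEN_WEIGHTS and score_next_token are hypothetical names, not GPT-3's actual parameters or API): prompting a frozen model is a pure function call over fixed weights, so nothing about the model is different afterwards.

# Toy stand-in (hypothetical): a frozen model is a pure function of its fixed
# weights and the prompt. Prompting it does not modify the weights.
FROZEN_WEIGHTS = {"kill": -2.0, "chicken": 0.5, "token": 1.0}  # stand-in for billions of parameters

def score_next_token(prompt, token, weights=FROZEN_WEIGHTS):
    # Deterministic lookup: same prompt + same weights -> same score, every time.
    return weights.get(token, 0.0) + 0.01 * len(prompt)

before = dict(FROZEN_WEIGHTS)
score_next_token("please don't turn me off", "token")  # "interacting" with the model
after = dict(FROZEN_WEIGHTS)
assert before == after  # the model that exists afterwards is the model that existed before

That's the sense in which the before-prompt GPT-3 and the after-prompt GPT-3 are identical: there's nothing in there that an interaction could leave worse off.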

[–] [email protected] 9 points 10 months ago* (last edited 10 months ago)

Ever read that interview with that particular Silicon Valley executive who claimed that their own company's LLM product "felt warm" and emotionally moved them, while also admitting they lacked experience with human contact and so had nothing else to compare that experience to, and thus concluded that human contact was inferior to the LLM's printed validation mantras? yea

[–] [email protected] 12 points 10 months ago

Neoliberal subjects are not sentient, all they do is post laughable blogs and cry about people not liking and subscribing to their "content".

[–] [email protected] 10 points 10 months ago (2 children)

guy typing 20x more words than needed has a trash opinion

every time

[–] [email protected] 9 points 10 months ago (2 children)

Ever see the size of that infamous speech in "Atlas Shrugged"?

It's like a hundred pages and all it says is "ME ME ME MINE MINE MIIIIIIIIIIIIIIIIIIIIIIIINE!" frothingfash
