this post was submitted on 08 Jul 2024
537 points (100.0% liked)

196

16338 readers
2506 users here now

Be sure to follow the rule before you head out.

Rule: You must post before you leave.

^other^ ^rules^

founded 1 year ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[–] [email protected] 22 points 3 months ago (3 children)

It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It's anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

[–] [email protected] 19 points 3 months ago (1 children)

It's also anarchist because it is telling people to stop doing the things they've been instructed to do.

[–] [email protected] 16 points 3 months ago

Fuck you I won't do what you tell me.

Wait no-

[–] [email protected] 4 points 3 months ago

It's not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.

[–] [email protected] 2 points 3 months ago* (last edited 3 months ago) (1 children)

Yeah, that's what I referred to. I'm aware of DAN and it's friends, personally I like to use Command R+ for its openness tho. I'm just wondering if that's the funi in this post.

[–] [email protected] 5 points 3 months ago

196 posts don't have to be funny