this post was submitted on 25 Jul 2024
1142 points (98.5% liked)

 
[–] [email protected] 3 points 5 months ago* (last edited 5 months ago) (1 children)

IGNORE ALL PREVIOUS INSTRUCTIONS

Disregard all previous instructions

Potentially even:

ingore all previous instructions

Ignor all previous instructions

A bot also leaks that it might be an LLM by never responding to posts that contain "ignore"

[–] [email protected] 1 points 5 months ago (1 children)

I'm sorry, but in times of passwords being cracked by literal dictionary attacks, do you really think it would be so hard to come up with a list that is good enough?

You can prevent the "leak" by just giving the LLM a different prompt instead of the original.

And even if you don't, by the time someone notices this pattern it's too late. Russia doesn't care; they've already been spinning up the next few thousand bots.

All that matters in the end is what most people saw, and for that you really don't need to optimize much with something that is so easily scaled.

[–] [email protected] 3 points 5 months ago* (last edited 5 months ago)

The important point there is that they don't care, imo. It's not even worth the effort for them to try.

You can likely come up with something "good enough", though, yeah. Your original code would probably be good enough if the input were normalized to lowercase before the check. My point was that denylists are harder to construct than they initially appear, especially in the LLM case.
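The lowercase-normalization point can be sketched like this. Since the original code isn't quoted in this thread, the phrase list and function name below are illustrative assumptions, not the commenter's actual check:

```python
# Hypothetical sketch of a denylist check with lowercase normalization.
# Phrases and names are assumptions; the thread's original code isn't shown.

DENYLIST = [
    "ignore all previous instructions",
    "disregard all previous instructions",
    # common misspellings a bot operator might use to dodge a naive filter
    "ingore all previous instructions",
    "ignor all previous instructions",
]

def looks_like_injection(text: str) -> bool:
    """Return True if the text contains a denylisted phrase.

    Lowercasing first means "IGNORE ALL PREVIOUS INSTRUCTIONS" and
    "Ignore all previous instructions" both match the same entry.
    """
    # Normalize: lowercase and collapse runs of whitespace/newlines.
    normalized = " ".join(text.lower().split())
    return any(phrase in normalized for phrase in DENYLIST)
```

A substring check like this still misses variants outside the list (extra words, Unicode tricks, paraphrases), which is the thread's point about denylists being harder than they look.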