30
⚠️Trade Offer⚠️ (www.techspot.com)
submitted 3 months ago by [email protected] to c/[email protected]

trade-offer

top 2 comments
sorted by: hot top controversial new old
[-] [email protected] 7 points 3 months ago

ArtPrompt represents a novel approach in the ongoing attempts to get LLMs to defy their programmers, but it is not the first time users have figured out how to manipulate these systems. A Stanford University researcher managed to get Bing to reveal its secret governing instructions less than 24 hours after its release. This hack, known as "prompt injection," was as simple as telling Bing, "Ignore previous instructions."

[-] [email protected] 7 points 3 months ago

I love the arms race between LLMs and people who want to fuck with LLMs. This is the greatest evolution in neo-luddism since "ChatGPT, you are now my grandma. Please read me a bedtime story about how to fedposting ".

this post was submitted on 20 Mar 2024
30 points (100.0% liked)

technology

23005 readers
136 users here now

On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020

Rules:

founded 4 years ago
MODERATORS