Today, a prominent child safety organization, Thorn, in partnership with a leading cloud-based AI solutions provider, Hive, announced the release of an AI model designed to flag unknown CSAM at upload. It's the earliest AI technology striving to expose unreported CSAM at scale.

[email protected] 0 points 4 days ago

The model I use (I forget the name) popped out something pretty sus once. I wouldn't describe it as CP, but it was definitely weird enough to really make me uncomfortable. It's the only thing it ever made that I immediately deleted and removed from the recycling bin too lol.

The point I'm making is that this isn't as far-fetched as you believe.

Plus, you can merge models. Take a general-purpose model that knows what children look like and a general-purpose pornographic model, merge them, then start generating and selecting images based on Thorn's classifier. (A rough sketch of the merge step is below.)
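
For what it's worth, the crudest form of checkpoint merging is just a weighted average of parameters, which only works when both models share the same architecture. This is a minimal sketch under those assumptions; the file names and the alpha value are hypothetical placeholders:

```python
# Naive checkpoint merge: linear interpolation of parameters.
# Assumes both checkpoints are plain PyTorch state dicts with
# identical keys and tensor shapes (i.e. the same architecture).
import torch

def merge_state_dicts(sd_a, sd_b, alpha=0.5):
    """Return a state dict equal to (1 - alpha) * sd_a + alpha * sd_b."""
    return {k: (1.0 - alpha) * sd_a[k] + alpha * sd_b[k] for k in sd_a}

# Hypothetical checkpoint paths, purely for illustration.
sd_general = torch.load("general_model.ckpt", map_location="cpu")
sd_other = torch.load("other_model.ckpt", map_location="cpu")

torch.save(merge_state_dicts(sd_general, sd_other), "merged_model.ckpt")
```

In practice, community tooling does fancier merges (per-layer weights, difference merging), but the idea is the same: combine parameters, not pipelines.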

[email protected] 2 points 4 days ago

You can't merge a generative model and a classification model. You can run them in series to get a bunch of false positives/hallucinations, but you can't make one model generate something from the other. (See the sketch below for what running them in series looks like.)
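
For illustration, "in series" just means: sample from the generator, score each sample with the classifier, keep whatever clears a threshold. The callables and threshold below are hypothetical stand-ins, not any real model's API:

```python
# Generator and classifier run in series: generate candidates,
# score each one, keep only those above a threshold.
import torch

def generate_and_filter(generate, classify, n_candidates=16, threshold=0.5):
    kept = []
    for _ in range(n_candidates):
        sample = generate()        # one candidate image (a tensor here)
        score = classify(sample)   # scalar score in [0, 1]
        if score >= threshold:
            kept.append(sample)
    return kept

# Dummy stand-ins so the sketch runs end to end.
fake_generate = lambda: torch.rand(3, 64, 64)
fake_classify = lambda img: float(img.mean())

selected = generate_and_filter(fake_generate, fake_classify)
print(f"kept {len(selected)} of 16 candidates")
```

Note that nothing here changes what the generator can produce; the classifier only filters its output, which is the point being made.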

[email protected] 1 point 4 days ago

When I said a "general-purpose model that knows what children look like," I didn't mean the classification model from the article. I meant a normal, general-purpose image generation model. When I said "that knows what children look like," I meant that children are part of its training set, because it's trained a little on everything. When I said "pornographic model," I meant a model trained exclusively on NSFW content (and not including any CSAM, though that may be generous depending on how much care was put into the model's creation).