this post was submitted on 17 Nov 2023
593 points (95.1% liked)


"If you’ve ever hosted a potluck and none of the guests were spouting antisemitic and/or authoritarian talking points, congratulations! You’ve achieved what some of the most valuable companies in the world claim is impossible."

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (1 children)

> questionable pictures

We need to keep distinguishing "actual, real-life child sexual abuse material (CSAM)" from "weird/icky porn". Fediverse services have been used to distribute both, but they represent very different classes of problem.

Real-life CSAM is illegal to possess. If someone posts it on an instance you own, you have a legal problem. It is an actual real-life threat to your freedom and the freedom of your other users.

Weird/icky porn is not typically illegal, but it is something many people don't want to support or be associated with. Instance owners have a right to say, "I don't want my instance used to host weird/icky porn." Other instance owners can say, "I quite like the porn that you find weird/icky — please post it over here!"

Real-life CSAM is not just extremely weird/icky porn. It is a whole different level of problem, because mere possession is illegal — it is a live legal threat to anyone who ends up with it on their computer.