cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

[–] [email protected] 9 points 6 months ago (1 children)

The basis for making CSAM illegal was that minors are harmed in its production. Before computer-generated imagery, the only way to produce pornographic images involving minors was to use real, flesh-and-blood minors. But if no minor is harmed in creating the material, what is the basis for making it illegal?

Think of it this way: if I make a pencil drawing of a minor being sexually abused, should that be treated as a criminal act? What if it's just stick figures, and I've labeled one as a minor and the others as adults? What if I produce real pornography using real adults, but cast actors who appear to be underage and tell everyone they were underage, so that people believe it's CSAM?

It seems to me that, rationally, material like this should be illegal only when real people are harmed, and legal when there is no harm. You can make an entirely reasonable argument that pornographic images created using a real person as the basis do harm the person so depicted. But what if no real person is involved at all?

This seems like a very bad path to head down.