this post was submitted on 08 Oct 2024
353 points (91.5% liked)

Technology

Well, this just got darker.

top 50 comments
[–] [email protected] 248 points 2 days ago (11 children)

This isn't surprising; it's inevitable.

If you folks knew how common pedophilic fantasies are amongst the general public, you would be shocked. Just look to cultures like Japan and Russia that don't strongly condemn such things, and you'll find it's about 15% of the population. It's only less in the West because of the near homicidal stigma attached to it that makes people vigorously hide that part of themselves.

Fortunately, this also shows that the vast majority of those people don't offend.

We also tend to define pedophilia as "anything sexual involving a minor" while reacting to it as if it means "violent rape of a toddler." So, no shit, we sexualize youth all the time. The 18-year mark is a legal and social formality, not a hard limit on human attraction. Adults will find themselves attracted to teens, and they won't reveal that, because who the fuck ever would?

If anything, the issue isn't that people have these attractions and fantasies, it is that some portion of those people can't separate fantasy from reality and are willing to hurt a child to get what they want, or they are sociopaths that consume child porn without feeling disgust for witnessing horrific child abuse.

[–] [email protected] 72 points 2 days ago (10 children)

I think the common incest fantasy in the West isn't too far removed from this either. All the actors are above age minimums, but they pretend to be step-kids or babysitters, as if those roles aren't commonly associated with children and older teens. It's clearly a form of deflection, IMO.

[–] [email protected] 5 points 1 day ago* (last edited 1 day ago)

Personally, I think the rise in incest porn has to do with the rise in social isolation. Lots of people, young men especially, are going out less and less and having more of their social interactions online. As a consequence, for a number of these men, the vast majority of the real-life interactions they get with women are with women in their own homes. And biology has a way of adapting, so I think a lot of these men are getting confusing feelings about people in their own homes, due largely to a lack of outside exposure to women.

[–] [email protected] 7 points 2 days ago (1 children)

"People are gonna be pedophiles whether we like it or not, so why are we bothering to do anything to highlight predatory behavior?"

If anything, the issue isn’t that people have these attractions and fantasies, it is that some portion of those people can’t separate fantasy from reality and are willing to hurt a child to get what they want, or they are sociopaths that consume child porn without feeling disgust for witnessing horrific child abuse.

Correct. That's what the issue is. We should definitely make sure that we aren't encouraging any kind of behavior that perpetuates the demand for CSAM.

[–] [email protected] 3 points 1 day ago

Does this though? Or does it reduce the demand for real CP?

[–] [email protected] 10 points 1 day ago

wow really, i never would have guessed/s

[–] [email protected] 13 points 2 days ago (2 children)

If there's one thing I trust chronically online incel creeps to do, it's manipulate online tools to access or create CSAM.

[–] [email protected] 2 points 1 day ago

There's people arguing here that AI-generated CP may not be that bad at all 'cus it's fictional.

I wanna blow my actual fucking head off, genuinely. This ain't even the realm of lolicon anymore. Just straight up, realistic cheese pizza.

If that's the case, then sharing all that AI Taylor Swift porn should be fine too, 'cus it's fictional. It may be of a real and public figure, but it's not REALLY her nudes!! Idk man, eughhhhhhh, all this rubs me the wrong way, no pun intended.

Imagine just looking at an online AI gallery and seeing literal AI CP, just out there, public, free to use. It gives me the impression that people only care about child abuse or CP once it involves a real child, not that its general existence is an absolute endangerment to real children if they happen to get caught in the crossfire, and that allowing people to fester in that content as a "coping mechanism" instead of getting help may just normalize shit like this or desensitize people to it. Imagine stumbling upon that shit and seeing AI-generated porn of someone who looks exactly like you, adult or child. Even worse if the person who made it knows you. Again, not real art: AI. Idk man, it all just makes me feel sick and queasy...

[–] [email protected] 1 points 1 day ago

It's a subset of Rule 34.

[–] [email protected] 164 points 3 days ago (35 children)

I actually don't think this is shocking or something that needs to be "investigated." Other than the sketchy website that doesn't secure users' data, that is.

Actual child abuse and grooming happen on social media, on chat services, and in local churches, not in a one-on-one between a user and an LLM.

[–] [email protected] 4 points 1 day ago (1 children)

Why tf are there so many people okay with people roleplaying child sexual abuse AT ALL??? Real or fake, KEEP AN EYE ON ALL OF THEM.

I don't care if it's a real child or a fucking bot; that shit is disgusting, and AI is how some pedos are able to generate CP without having to actually get their hands on children.

The fact that someone will look at this and go "Yeah, but what about the REAL child rapists, huh??" is astounding. Mfcker, if a grown-ass adult is willing to make a bot that is prompted to act like a horny toddler, then what exactly is stopping them from looking at real children that way?

Keep in mind, I'm not talking about lolicon, fuck that. I'm talking about people generating images of realistic or semi-realistic children to use as profiles for sex bots. I'm talking about AI. I'VE ACTUALLY SEEN PEOPLE DO THIS; someone actually did this with my character recently. They took the likeness of my character and began generating porn with it using prompts like "Getting fcked on the playground, wet psy, little girl, 6 year old, 2 children on playground, slut..."

Digital or not, this shit still affects people; it affects people like me. These assholes deserve to be investigated for even attempting this kind of shit on the clearnet.

And before you ask: the character that belonged to me looks really young because I look really young. I have severe ADHD, which makes me mentally stunted or childish, and that gets reflected in my OCs and fursonas. This person took a persona, an extension of me PERSONALLY, lowered her age on purpose, and made porn of her. That fuckin' hurts, dude, especially after I'd spoken about how close these characters are to me. I'm aware it could be a troll, but honest to god, the prompt they used was demonstrably specific and detailed. Some loser online drawing Kanna's feet hurts me way less than someone using AI to generate faux CP and then roleplaying with those same bots or prompts. What hurts me more is that some AIs have no restrictions to stop people from generating images like this. I don't wanna see shit like this become commonplace or "fine" to do. Keep tabs on individuals like this, 'cus they VERY WELL could be using the likenesses/faces of REAL children for AI CP, and that's just as bad.

[–] [email protected] 1 points 1 day ago (1 children)

Im not talking about Lolicon, fuck that.

I think that this is ironic and a poor choice of words. It's almost a pun.

[–] [email protected] 1 points 1 day ago

DAMN- YOU RIGHT 😭

[–] [email protected] 27 points 2 days ago

It's the "burn that witch" reaction.

See how they hate pedophiles and not child rapists.

The crowd wants to feel its power by condemning (and lynching if possible) someone.

I'd rather investigate those calling for "investigation" and further violation of the privacy of people who, for all we know, have committed no crime.

That's the freedom-of-speech territory of yelling "fire" in a crowded theater and the Rwandan "thousand hills" radio; you know the argument.

[–] [email protected] 12 points 2 days ago* (last edited 2 days ago) (4 children)

I mean, a lot of women were raped as children, sadly.

So it's pretty realistic for an AI gf to talk about her past trauma of child sexual abuse. I don't think we should be upset about this.

We need to talk about rape culture to end rape culture.

[–] [email protected] 1 points 1 day ago

Wait, is that what this article is about, and not the people programming them to do shit like this??

[–] [email protected] 12 points 2 days ago (1 children)

Do you think that these AI girlfriends are trained to be as realistic as possible?

[–] [email protected] 30 points 2 days ago

Not at all surprising, but also, it is an AI.

[–] [email protected] 35 points 2 days ago (1 children)

Wait… so you mean to tell me that predatory simps are using AI incorrectly? Man… if only someone had called this years ago; something could have been done to minimize it!

Who knew that unchecked growth could lead to negative results?!

[–] [email protected] 26 points 2 days ago (6 children)

But they did; AI Dungeon got nerfed so badly you could only have happy adventures with it.

[–] [email protected] 72 points 3 days ago

insert surprised pikachu face here

[–] [email protected] 29 points 2 days ago (1 children)

A bit off topic... but from my understanding, the US currently doesn't have a single federal agency responsible for AI regulation. However, there is an agency for child abuse protection: the National Center on Child Abuse and Neglect, within the Department of Health and Human Services.

If AI girlfriends generating CSAM is how we get AI regulation in the US, I'd be equally surprised and appalled.
