this post was submitted on 13 Apr 2024
459 points (98.3% liked)

Privacy


Here's a non-paywalled link to an article published in the Washington Post a few days ago. It's great to see this kind of thing getting some mainstream attention. Young children have not made an informed decision about whether they want their photos posted online.

[–] [email protected] 105 points 4 months ago (3 children)

Interesting how many of the mentions are of people worried about AI who only share photos in closed groups on Instagram/Facebook. I'm not sure that's actually keeping the photos away from AI.

[–] [email protected] 35 points 4 months ago

I think a large part of their concern is AI-altered photos generated by an individual.

[–] Blizzard 18 points 4 months ago

Came here to say this. If you upload pictures to Instagram, they are already being processed by Facebook ("Meta"). If you have an online backup of your photos in Google/Apple cloud, they are already being processed too.

[–] [email protected] 7 points 4 months ago (1 children)

The problem with posting pictures of kids in closed groups is that pervs will just join those groups because those groups have what they're looking for. You're basically making it easier for them.

It's not that parents are afraid of their kids being part of a training set, though that is a bad thing in and of itself. It's more about all of these AI undressing app ads showing up on every social media site, which show just how much of a wild-west situation this currently is and how in demand this brand of sexual exploitation is.

Predators are already automating the process so that certain Instagram models get the AI undressing treatment as soon as they upload an exploitable pic. Pretty trivial to do at scale with Instaloader, GroundingDINO, SAM, and SD. Those pics are hosted outside of Instagram where victims have no power to undo the damage. Kids will get sexually exploited in this process, incidentally or intentionally.

[–] [email protected] 3 points 4 months ago

I believe by closed groups they mean a family or friends chat with like 5 people.

Although I personally wouldn't share too much in those groups either.