this post was submitted on 27 Jan 2024

cross-posted from: https://mbin.grits.dev/m/mews/t/22301

White House calls for legislation to stop Taylor Swift AI fakes

[–] [email protected] 26 points 9 months ago (3 children)

This will be interesting.

How do you write legislation that stops AI nudes but not Photoshopping or art? I am not at all sure it can be done. And even if it can, will it withstand a free speech test in court?

[–] [email protected] 12 points 9 months ago

I think it's not feasible to stop or control it, for several reasons:

  1. People are motivated to consume AI porn.
  2. There is no barrier to creating it.
  3. There is no cost to create it.
  4. There are multiple generations of people who have already shared the source material needed to create it.

We joke about Rule 34, right? If you can think of it, there is porn of it. It's now pretty straightforward to fulfil the second part of that, irrespective of the thing you thought of. Those pics of your granddad in his 20s in a navy uniform? Your high school yearbook picture? Six shots of your younger sister shared by an aunt on Facebook? Those are just as consumable by AI as Tay Tay is.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago)

You write legislation that bans all three, because there is no difference between generating, Photoshopping, or drawing lewds of someone without their consent.

Banning this on an individual level would be impossible, so you let the platforms that host it get sued.

We already have the technology to detect whether an image is NSFW and whether it depicts a celebrity. Twitter is letting this happen on purpose.
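
Just to make that concrete: roughly this kind of screening can be bolted together from off-the-shelf parts today. The sketch below is illustrative only — the Hugging Face model name, the 0.8 threshold, and the celebrity face database are assumptions, not anything X/Twitter is known to run.

```python
# Minimal sketch of automated screening: an off-the-shelf NSFW classifier
# plus face matching against known public figures. Model name, thresholds,
# and the face database are illustrative assumptions.
import face_recognition                      # dlib-based face matching
from transformers import pipeline            # Hugging Face image classification

# Assumed open-source NSFW detector; any image-classification model
# exposing an "nsfw" label would slot in here.
nsfw_classifier = pipeline("image-classification",
                           model="Falconsai/nsfw_image_detection")

def should_flag(image_path, known_celebrity_encodings, nsfw_threshold=0.8):
    """Return True if the image looks NSFW and contains a known celebrity's face."""
    # 1. NSFW check: read off the score the classifier gives its "nsfw" label.
    scores = {r["label"].lower(): r["score"] for r in nsfw_classifier(image_path)}
    if scores.get("nsfw", 0.0) < nsfw_threshold:
        return False

    # 2. Celebrity check: compare every detected face against the known set.
    image = face_recognition.load_image_file(image_path)
    for encoding in face_recognition.face_encodings(image):
        if any(face_recognition.compare_faces(known_celebrity_encodings,
                                               encoding, tolerance=0.6)):
            return True   # NSFW + recognized face -> hold for human review
    return False
```

None of this is exotic; the hard part is the policy about what to do with matches, not the detection itself.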

> The images spread across X in particular on Wednesday night, with one hitting 45 million views before being taken down. The platform was slow to respond, with the post staying up for around 17 hours.

It's hard to pretend it wasn't reported by Taylor's fans many times during those 17 hours, or that the moderators didn't know about the image within half an hour of it being posted.

[–] [email protected] 0 points 9 months ago (1 children)

If the image is even slightly convincing, it's essentially just defamation with digital impersonation thrown in. Yeah, that might catch Photoshop edits in its net, but you'd need to be a DAMN good artist to get caught in it as well.

[–] [email protected] -1 points 9 months ago

So what level is slightly convincing?

What about people who happen to look like someone famous?

What level of accuracy is necessary?

If I label some random blonde AI-generated porn “Taylor Slow”, does that count?

They are both blonde, after all.
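
And to make the "what level of accuracy" question concrete: face-matching tools don't answer yes or no, they hand back a continuous distance that someone has to cut off at an arbitrary threshold. A toy sketch with the face_recognition library (file names are placeholders, not real data):

```python
# "How similar is similar enough?" Face matchers return a continuous distance;
# the cutoff is a policy choice, not a fact about the image.
# File names are placeholders for illustration.
import face_recognition

known = face_recognition.face_encodings(
    face_recognition.load_image_file("taylor_swift.jpg"))[0]
candidate = face_recognition.face_encodings(
    face_recognition.load_image_file("random_blonde_ai_image.jpg"))[0]

# 0.0 means identical faces; ~0.6 is the library's conventional "same person" cutoff.
distance = face_recognition.face_distance([known], candidate)[0]
print(f"face distance: {distance:.2f}")
print("counts as her?", distance < 0.6)  # where you draw this line is the whole debate
```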