this post was submitted on 20 Nov 2024
133 points (96.5% liked)

Technology

top 50 comments
[–] [email protected] 34 points 6 days ago* (last edited 6 days ago) (46 children)

I have mixed feelings about this prosecution of AI deepfakes.

Obviously people should have protection against becoming victims of this, and perpetrators should be held accountable.

But the line “feds are currently testing whether existing laws protecting kids against abuse are enough to shield kids from AI harms” would set an incredibly dangerous precedent, because those laws are mostly designed for actual physical sex crimes.

As wrong as it is to create and distribute AI-generated sexual imagery of non-consenting people, it is not even remotely as bad as actual rape or distributing real photos.

[–] [email protected] 41 points 6 days ago* (last edited 6 days ago)

I don't think you're on the right track here. There are definitely existing laws in most states covering 'revenge porn', creating sexual media of minors, Photoshop porn, and other things that are very similar to AI-generated deepfakes. In some cases AI deepfakes fall under existing laws, but often they don't. Or, because of how the law is written, they exist in a legal grey area that will be argued in the courts for years.

Nowhere is anyone suggesting that making deepfakes should be prosecuted as rape; that's just complete nonsense. The question is where new laws need to be written, or existing laws updated, to make sure AI porn is treated the same as other illegal uses of someone's likeness to make porn.

[–] [email protected] 6 points 5 days ago

In some jurisdictions, public urination can put you on a sex offender registry.

It wouldn't even matter if you're trying to be discreet and just have to go but there are no public washrooms around.

[–] [email protected] 21 points 6 days ago (1 children)

Title is misleading?

An AI-generated nude photo scandal has shut down a Pennsylvania private school. On Monday, classes were canceled after parents forced leaders to either resign or face a lawsuit potentially seeking criminal penalties and accusing the school of skipping mandatory reporting of the harmful images.

Classes are planned to resume on Tuesday, Lancaster Online reported.

So the school is still in operation.

[–] [email protected] 8 points 6 days ago* (last edited 6 days ago)

Shut down for one day at least.

[–] [email protected] 5 points 6 days ago (2 children)

Ars Technica doesn't cite its sources? All it has are links to more Ars Technica articles.

The above article says,

In the US, the feds are currently testing whether existing laws protecting kids against abuse are enough to shield kids from AI harms.

...but it doesn't cite any sources. There's an embedded link, back to Ars Technica. What the fuck?

[–] [email protected] 3 points 6 days ago

The article refers to "cops" and "feds". The overall tone of the writing sounds like a high school student wrote it.
