[–] [email protected] 18 points 1 month ago (20 children)

generative AI makes it very easy for anyone to flood the internet with generated text, audio, images, and videos.

And? There's already way too much data online to read or watch all of it. We could just move to a "watermark" system where everyone takes credit for their contributions. Things without watermarks could just be dismissed, since they have as much authority as an anonymous comment.
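To make the "watermark" idea concrete: one way it could work (my own illustration, not something proposed in the thread) is an ordinary digital signature. An author signs what they publish, anyone can check it against that author's published public key, and anything unsigned gets treated like an anonymous comment. A minimal sketch in Python, assuming the `cryptography` package:

```python
# Hedged sketch of author-level "watermarking" via digital signatures.
# Library choice and variable names are illustrative assumptions.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The author keeps the private key; the public key is published under their name.
author_key = Ed25519PrivateKey.generate()
author_pub = author_key.public_key()

post = "the actual text / image bytes / video bytes".encode()
signature = author_key.sign(post)  # attached to the post as its "watermark"

# Anyone can verify the authorship claim against the published public key.
try:
    author_pub.verify(signature, post)
    print("signed by the claimed author")
except InvalidSignature:
    print("no valid watermark -- treat it like an anonymous comment")
```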

[–] [email protected] 12 points 1 month ago (2 children)

I am waiting for people to start getting both public and hidden authentication tattoos, so they can prove generative images aren't actually them.

[–] [email protected] 3 points 1 month ago (1 children)

How would that work?

AIs learn from existing images; they could just as well learn to reproduce a tattoo and link the pattern to a person's name. Recreating it from different angles would require more training data, but it would get there eventually.

[–] [email protected] 4 points 1 month ago

For public ones, depending on what people started getting, it'd really strain the AIs. You could go one of two ways, and probably different people would get both.

Something very uniform but still unique, like a QR code kind of deal; AIs would hallucinate the crap out of that. Or abstract patterns, like the ones people use to change the apparent shape of their face to defeat facial recognition.

For private ones, just don't ever get it photographed; any image showing that area without it would probably be fake.
