
Follow-up to last week's story:

https://lemmy.ml/post/16672524

EDIT1: Politicians expect to be exempt.

EDIT2: Good news: Vote has been postponed due to disagreements.

[–] BrikoX 35 points 2 months ago (1 children)

How about the false positives? You want your name permanently associated with child porn because someone fucked up and ruined your life? https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse

The whole system is so flawed that it has a success rate of something like 20-25%.

Or how about this system being adopted for anything else? Guns? Abortion? LGBT-related issues? Once something gets implemented, it's there forever and expansion is inevitable. And each subsequent government will use it for its own agenda.

[–] [email protected] -4 points 2 months ago (2 children)

They say the images are merely matched against pre-determined images found on the web. You're talking about a different scenario, where AI detects inappropriate content in an image.

[–] [email protected] 5 points 2 months ago (1 children)

Change one pixel and suddenly it doesn't match. Do the comparison based on similarity instead and now you're back to false positives.
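
To illustrate the first half of that: a rough sketch (not anything from the proposal; "photo.jpg" is a placeholder filename) of an exact fingerprint, here a SHA-256 over the decoded pixels, which flips completely when a single pixel is nudged.

```python
import hashlib
from PIL import Image  # pip install Pillow

# "photo.jpg" is a placeholder filename.
original = Image.open("photo.jpg").convert("RGB")

tweaked = original.copy()
r, g, b = original.getpixel((0, 0))
tweaked.putpixel((0, 0), ((r + 1) % 256, g, b))  # nudge one channel of one pixel

h_original = hashlib.sha256(original.tobytes()).hexdigest()
h_tweaked = hashlib.sha256(tweaked.tobytes()).hexdigest()

print(h_original == h_tweaked)  # False: one changed pixel breaks an exact match
```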

[–] [email protected] -2 points 2 months ago* (last edited 2 months ago)

My guess was that this law was going to permit something as simple as pixel matching. Honestly, I don't imagine they can codify anything more sophisticated into law. Companies don't want false positives either, if only for the sake of their profits.

[–] [email protected] 3 points 2 months ago (1 children)

Matched using perceptual hash algorithms that have an accuracy between 20% and 40%.

[–] [email protected] 2 points 2 months ago (1 children)

Is there a source stating that they're going to require these?

[–] [email protected] 4 points 2 months ago (1 children)

Unfortunately, I couldn't find a source stating it would be required. AFAIK it's been assumed that they would use perceptual hashes, since that's what various companies have been suggesting/presenting, like Apple's NeuralHash, which was reverse engineered. It's also the only somewhat practical approach, since exact matches could easily be circumvented by changing one pixel or mirroring the image.

Patrick Breyer's page on Chat Control has a lot of general information about the EU's proposal.
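
To make the perceptual-hash idea concrete, here's a toy "average hash" sketch. This only illustrates the general technique, not NeuralHash or anything specified in the proposal; "photo.jpg" is a placeholder filename, and the edited copy is just a downscaled version of the original.

```python
from PIL import Image  # pip install Pillow

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Toy perceptual hash: shrink to size x size grayscale, one bit per pixel above the mean."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# "photo.jpg" is a placeholder. A lightly edited copy (here just downscaled)
# usually stays within a few bits of the original, so a threshold match still fires.
img = Image.open("photo.jpg")
edited = img.resize((img.width // 2, img.height // 2))

distance = hamming(average_hash(img), average_hash(edited))
print(distance)        # small for near-duplicates
print(distance <= 10)  # similarity threshold: where matching happens, and where false positives creep in
```

The threshold is the trade-off: set it near 0 and trivial edits slip through; loosen it and unrelated images start getting flagged, which is the false-positive problem from the EFF article above.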

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago)

Stupid regulation, honestly. Exact matches are implementable, but anything beyond that... Aren't they basically banning e2ee at this point?

Now I see why Signal would shut down in the EU.