According to software engineer and blogger Paul Biggar, however, one often-overlooked detail of the methods employed by the Lavender system is the involvement of the messaging platform WhatsApp. A major determining factor in the system’s identification of targets is simply whether an individual is in a WhatsApp group containing another suspected militant.

Aside from the inaccuracy of the method and the moral question of targeting Palestinians based on shared WhatsApp groups or social media connections, the practice also casts doubt on the platform’s claim to be privacy-focused and to guarantee “end-to-end” encryption for messages.

Arguing that this makes WhatsApp’s parent company, Meta, complicit in Israel’s killing of “pre-crime” suspects in Gaza, Biggar accused the company of directly violating international humanitarian law, as well as its own public commitment to human rights.

These revelations are the latest evidence of Meta – formerly Facebook – aiding in the suppression of Palestinian and pro-Palestinian voices, with the platform long having been criticised for taking significant steps to shut down dissent against Israeli and Zionist narratives. Those measures have included permitting adverts promoting a holocaust against Palestinians and even attempting to flag the word ‘Zionist’ as hate speech.

Questioning the accuracy of the report, a WhatsApp spokesperson told MEMO: “We have no information that these reports are accurate. WhatsApp has no backdoors and we do not provide bulk information to any government. For over a decade, Meta has provided consistent transparency reports and those include the limited circumstances when WhatsApp information has been requested. Our principles are firm – we carefully review, validate and respond to law enforcement requests based on applicable law and consistent with internationally recognized standards, including human rights.”

[–] [email protected] 62 points 4 months ago (1 children)

They really don't provide enough to back up the insane claims they're making. I would take all this with a massive grain of salt as it's most likely bullshit wartime propaganda designed to stir people up.

[–] [email protected] 27 points 4 months ago (2 children)
[–] [email protected] 14 points 4 months ago (1 children)

“Mistakes were treated statistically,” a source who used Lavender told +972. “Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know statistically that it’s fine. So you go for it.” [...]
During the first few weeks of the war, officers were allowed to kill up to 15 or 20 civilians for every lower-level Hamas operative targeted by Lavender; for senior Hamas officials, the military authorized “hundreds” of collateral civilian casualties, the report claims.

I fucking hate people. Especially those who don't need to use violence but choose to do so anyway.

[–] [email protected] 7 points 4 months ago (1 children)

even if you don’t know for sure that the machine is right, you know statistically that it’s fine. So you go for it.

This kind of shit is why I have zero faith that AI will be used responsibly in basically any field. Mfs are already using it to avoid taking responsibility for potential war crimes ffs.

[–] [email protected] 4 points 4 months ago

Same. That's the biggest thing I'm seeing from orgs all over: using AI as a kind of "appeal to authority" to justify shitty behavior that they've been wanting to do all along. If they didn't want to act like this, they would double-check, adjust, or correct the results.

The AI gives them a headless authority to point to, a way of saying that they were just following orders.

[–] [email protected] 4 points 4 months ago* (last edited 4 months ago)

Best article (and probably the first to bring this up):

https://blog.paulbiggar.com/meta-and-lavender/

A little-discussed detail in the Lavender AI article is that Israel is killing people based on being in the same Whatsapp group [1] as a suspected militant [2]. Where are they getting this data? Is WhatsApp sharing it?