
> When an autonomous weapon targets a school and kills 50 kids, who gets charged with the war crime?

When a human in a plane drops a bomb on a school full of kids, we don't charge anyone with a war crime. Why would we start charging people with war crimes when we make the plane pilotless?

The autonomy of these killer toys is always overstated. As front-line trigger pullers, they're great. But they still need an enormous support staff, a deployment team, and IT support. If you want to blame someone for releasing a killer robot into a crowd of civilians, it's not like you have a shortage of people to indict. It's no different from trying to figure out who takes the blame for throwing a grenade into a movie theater. Everyone from the mission commander down to the guy who drops a kill marker on the digital map has the potential to be indicted.

But nobody is going to be indicted in a mission where the goal was to blow up a school full of children, because why would you do that? The whole point was to murder those kids.

Israelis already have an AI-powered target-to-kill system, after all.

> But in 2021, the Jerusalem Post quoted an intelligence official saying Israel had just won its first “AI war” – an earlier conflict with Hamas – using a number of machine learning systems to sift through data and produce targets. In the same year a book called The Human–Machine Team, which outlined a vision of AI-powered warfare, was published under a pseudonym by an author recently revealed to be the head of a key Israeli clandestine intelligence unit.
>
> Last year, another +972 report said Israel also uses an AI system called Habsora to identify potential militant buildings and facilities to bomb. According to the report, Habsora generates targets “almost automatically”, and one former intelligence officer described it as “a mass assassination factory”.
>
> The recent +972 report also claims a third system, called Where’s Daddy?, monitors targets identified by Lavender and alerts the military when they return home, often to their family.

Literally the entire point of this system is to kill whole families.