this post was submitted on 10 May 2024
294 points (99.7% liked)

Not The Onion

12390 readers
780 users here now

Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Comments must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

founded 1 year ago
[–] [email protected] 150 points 6 months ago (13 children)

I was involved in discussions 20-some years ago when we were first exploring the idea of autonomous and semiautonomous weapons systems. The question that really brought it home to me was “When an autonomous weapon targets a school and kills 50 kids, who gets charged with the war crime? The soldier who sent the weapon in, the commander who was responsible for the op, the company who wrote the software, or the programmer who actually coded it up?” That really felt like a grounding question.

As we now know, the actual answer is “Nobody.”

[–] [email protected] 18 points 6 months ago

We don't even charge people when they blow up schools and hospitals with drone strikes now. Why would this be any different?

[–] [email protected] 15 points 6 months ago

To be fair, the answer to the question "when somebody kills a school bus full of kids, who gets charged with a war crime?" was always "nobody."

[–] [email protected] 13 points 6 months ago (1 children)

As we now know, the actual answer is ~~“Nobody.”~~ the 50 kids who get designated as "terrorists" afterwards.

FTFY - it's the American way.

[–] [email protected] 4 points 6 months ago (1 children)

No, they were terrorists the whole time /s

[–] [email protected] 1 points 6 months ago

They are now.

[–] [email protected] 9 points 6 months ago* (last edited 6 months ago) (1 children)

That's also a legal issue with autonomous cars.

Autonomous cars can also get into basically the trolley problem. If an accident is unavoidable, but the car can swerve and kill its own passenger to avoid killing more people in a larger wreck, should it? And would that end up as more liability for whoever takes the blame?

[–] [email protected] 2 points 6 months ago (1 children)

The owner or lessee of the car is responsible. Think of the car as a dog that bit a child.

[–] [email protected] 4 points 6 months ago (3 children)

Are we talking truly autonomous vehicles with no driver, or today's "self-driving-but-keep-your-hands-on-the-wheel" type cars?

In the case of the former, it should be absolutely the fault of the manufacturer.

[–] [email protected] 8 points 6 months ago

When an autonomous weapon targets a school and kills 50 kids, who gets charged with the war crime?

When a human in a plane drops a bomb on a school full of kids, we don't charge anyone with a war crime. Why would we start charging people with war crimes when we make the plane pilotless?

The autonomy of these killer toys is always overstated. As front-line trigger pullers, they're great. But they still need an enormous support staff, a deployment team, and IT support. If you want to blame someone for releasing a killer robot into a crowd of civilians, it's not like you have a shortage of people to indict. No different than trying to figure out who takes the blame for throwing a grenade into a movie theater. Everyone from the mission commander down to the guy who drops a Kill marker on the digital map has the potential for indictment.

But nobody is going to be indicted in a mission where the goal was to blow up a school full of children, because why would you do that? The whole point was to murder those kids.

Israelis already have an AI-powered target-to-kill system, after all.

But in 2021, the Jerusalem Post reported an intelligence official saying Israel had just won its first “AI war” – an earlier conflict with Hamas – using a number of machine learning systems to sift through data and produce targets. In the same year a book called The Human–Machine Team, which outlined a vision of AI-powered warfare, was published under a pseudonym by an author recently revealed to be the head of a key Israeli clandestine intelligence unit.

Last year, another +972 report said Israel also uses an AI system called Habsora to identify potential militant buildings and facilities to bomb. According to the report, Habsora generates targets “almost automatically”, and one former intelligence officer described it as “a mass assassination factory”.

The recent +972 report also claims a third system, called Where’s Daddy?, monitors targets identified by Lavender and alerts the military when they return home, often to their family.

Literally the entire point of this system is to kill whole families.

[–] [email protected] 6 points 6 months ago

Collateral damage.

[–] [email protected] 6 points 6 months ago (4 children)

To be fair, they are specifically testing AI aiming, not AI firing. Firing is still up to an operator.

[–] [email protected] 7 points 6 months ago* (last edited 6 months ago)

For now. The goal would obviously be to have a fully autonomous machine.

[–] [email protected] 2 points 6 months ago
[–] [email protected] 1 points 6 months ago

We already have heat seeking missiles.

[–] [email protected] 3 points 6 months ago

Yep. That's exactly like every AI system employed by the IDF.

[–] [email protected] 3 points 6 months ago* (last edited 6 months ago) (1 children)

Guns don't kill people. Autonomous robot dogs do.

[–] [email protected] 2 points 6 months ago

Do autonomous killer robot dogs fall under the Second Amendment?

[–] [email protected] 82 points 6 months ago (1 children)

Wow! They're finally building the torment nexus from the book 'Don't Build The Torment Nexus'.

[–] [email protected] 6 points 6 months ago

I'm SO excited! The torment nexus sounds fucking awesome! I can't wait!

[–] [email protected] 33 points 6 months ago (2 children)

They know Robocop isn't a training manual, right?

[–] [email protected] 10 points 6 months ago

Yeah, like 1984 isn't supposed to be stroke fiction.

[–] [email protected] 3 points 6 months ago

Wait what??

[–] [email protected] 29 points 6 months ago (1 children)

This definitely can't go badly.

Can it?

No.

[–] [email protected] 6 points 6 months ago

Well, yes, it can go bad. I think they forgot the self-replication mechanisms.

What? Humans as a species suck. All hail the AI overlords.

[–] [email protected] 25 points 6 months ago

Terminator theme intensifies

[–] [email protected] 15 points 6 months ago

Oh shit I forgot to turn off the kill all mode! Hey Betsy, do you know where Bobby the gun dog is? Just tell them not to move, I'm on my way. Oh they moved? Ok I'm on my way, no need to tell them but we need to tell their families at some point after this fiscal year probably. Yeah we'll see. Ok hold on, just need my approach suit of armor....

[–] [email protected] 15 points 6 months ago (1 children)

Isn't this an episode of black mirror?

[–] [email protected] 8 points 6 months ago

Yup, it sure is. S4E5 Metalhead

[–] [email protected] 13 points 6 months ago (1 children)

Get the Super Soakers ready, and fill them with saltwater!

Electronics really don't like saltwater...

[–] Kalkaline 13 points 6 months ago (1 children)

Nah, you'll want some electromagnets and Faraday cages to disable them. Salt water is too easy to protect against.

[–] [email protected] 2 points 6 months ago (1 children)

I think bullets would work fine

[–] Kalkaline 2 points 6 months ago (2 children)

This assumes your reactions are better than a machine's. You need something that disables the machine passively.

[–] [email protected] 5 points 6 months ago

Like an artillery shell?

[–] [email protected] 1 points 6 months ago

Bucket of paint?

[–] [email protected] 10 points 6 months ago* (last edited 6 months ago) (1 children)

To combat criticism, the White House has announced a new line of products, the TERRIfiERS, which are live, deaf terriers carrying AI-aimed rifles. President Ivanushka has delightfully boasted about their friendliness and reduced reliance on intricate moving parts, decreasing manufacturing water emissions by 41.8%.

The TERRIfiERS will be released for civilian use for Big Hunting on April 18. It is expected that this move will increase competition among magazine manufacturers.

[–] [email protected] 2 points 6 months ago (1 children)

Whoa there, be careful! This guy sounds like he’s from the real Onion

[–] [email protected] 1 points 6 months ago (1 children)

For a blissful second you made me think the article was from The Onion. Then I looked and... nope. This is the world we live in.

[–] [email protected] 2 points 6 months ago

Yup, and these are the hands we're given.

[–] [email protected] 8 points 6 months ago* (last edited 6 months ago)

I think the economics of sending in conscripts is always going to outweigh the benefit of sending in tech dog super soldiers.

[–] [email protected] 5 points 6 months ago (1 children)
[–] [email protected] 2 points 6 months ago

I thought about the exact same thing. Really scary!

[–] [email protected] 4 points 6 months ago

Totally unrelated video about normal robot dogs without guns on them

https://youtube.com/shorts/HZ69XsHyXzQ

[–] [email protected] 2 points 6 months ago

Oh cool a free gun delivery service
