this post was submitted on 18 Oct 2024
783 points (98.5% liked)


The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration said in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

[–] [email protected] 21 points 1 month ago (4 children)

Exactly. The current rate is 80 deaths per day in the US alone. Even if we had self-driving cars proven to be 10 times safer than human drivers, we’d still see 8 news articles a day about people dying because of them. Taking this as 'proof' that they’re not safe is setting an impossible standard and effectively advocating for 30,000 yearly deaths, as if it’s somehow better to be killed by a human than by a robot.
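
A quick back-of-the-envelope calculation makes the trade-off concrete (a sketch using the figures above; the 10x safety factor is the hypothetical from this comment, not a measured number):

```python
# Back-of-the-envelope comparison using the comment's figures:
# 80 US traffic deaths per day, and a hypothetical 10x safety factor.

human_deaths_per_day = 80      # figure cited in the comment
safety_factor = 10             # hypothetical "10 times safer"

robot_deaths_per_day = human_deaths_per_day / safety_factor
human_deaths_per_year = human_deaths_per_day * 365
lives_saved_per_year = (human_deaths_per_day - robot_deaths_per_day) * 365

print(f"Self-driving deaths per day:  {robot_deaths_per_day:.0f}")   # 8
print(f"Human-driver deaths per year: {human_deaths_per_year:,}")    # 29,200
print(f"Lives saved per year:         {lives_saved_per_year:,.0f}")  # 26,280
```

So by those numbers, rejecting a 10x-safer system in favor of the status quo costs on the order of 26,000 lives a year.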

[–] [email protected] 9 points 1 month ago (1 children)

If you get killed by a robot, it simply lacks the human touch.

[–] [email protected] 7 points 1 month ago (3 children)

If you get killed by a robot, you can at least die knowing your death was the logical option and not a result of drunk driving, road rage, poor vehicle maintenance, panic, or any other of the dozens of ways humans are bad at decision-making.

[–] [email protected] 7 points 1 month ago* (last edited 1 month ago)

It doesn't even need to be logical, just statistically reasonable. You're literally a statistic anytime you interact w/ any form of AI.

[–] [email protected] 4 points 1 month ago

Or the result of cost cutting...

[–] [email protected] 2 points 1 month ago

or a flipped comparison operator, or a "//TODO test code please remove"
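
To make that concrete, here's a minimal hypothetical sketch (not real autonomy code) of how a single flipped comparison operator inverts a safety check:

```python
# Hypothetical emergency-braking check, to illustrate the bug class.

def should_brake(distance_m: float, stopping_distance_m: float) -> bool:
    # Correct: brake when the obstacle is within stopping distance.
    return distance_m <= stopping_distance_m

def should_brake_buggy(distance_m: float, stopping_distance_m: float) -> bool:
    # Flipped operator: brakes only when the obstacle is comfortably far away.
    return distance_m >= stopping_distance_m

print(should_brake(10.0, 40.0))        # True  -- obstacle inside stopping distance
print(should_brake_buggy(10.0, 40.0))  # False -- fails in exactly the case that matters
```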

[–] [email protected] 1 points 1 month ago

The problem with this way of thinking is that there are ways to eliminate these accidents without eliminating self-driving cars. By dismissing the concern, you are saying nothing more than that it isn't worth exploring the kinds of improvements that would save lives.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

"10 times safer than human drivers", (except during specific visually difficult conditions which we knowingly can prevent but won't because it's 10 times safer than human drivers). In software, if we have replicable conditions that cause the program to fail, we fix those, even though the bug probably won't kill anyone.

[–] [email protected] 0 points 1 month ago

But they aren't and likely never will be.

And how are we to correct for a lack of safety then? With human drivers you obviously discourage dangerous driving through punishment. Who do you punish in a self-driving car?