this post was submitted on 09 Jul 2024
191 points (93.6% liked)

A Boring Dystopia

9403 readers
775 users here now

Pictures, Videos, Articles showing just how boring it is to live in a dystopic society, or with signs of a dystopic society.

Rules (Subject to Change)

--Be a Decent Human Being

--Posting news articles: include the source name and exact title from the article in your post title

--Posts must have something to do with the topic

--Zero tolerance for Racism/Sexism/Ableism/etc.

--No NSFW content

--Abide by the rules of lemmy.world

founded 1 year ago
top 48 comments
[–] [email protected] 166 points 1 month ago (3 children)

China has a long history of siding with vehicles running people over.

[–] [email protected] 10 points 1 month ago

Not sure if you're referring to Tiananmen or the fun Chinese practice of the double tap

[–] [email protected] 3 points 1 month ago

This never happened

  • PRC
[–] [email protected] 38 points 1 month ago (1 children)

Person crosses street when they shouldn't.

Car lightly taps them and stops.

Person is not injured.

Person is stupid.

I think regulation is important, but this isn't news.

[–] [email protected] -2 points 1 month ago (1 children)

Yeah. The person was on 'FA' and now (barely) on 'FO'.

What happened to looking both ways and being wary of the things that could crush you?

[–] [email protected] 3 points 1 month ago (1 children)

What do those letters mean?

[–] [email protected] 3 points 1 month ago

Fuck around - find out

[–] [email protected] 31 points 1 month ago (4 children)

Whether or not to run over the pedestrian is a pretty complex situation.

[–] [email protected] 36 points 1 month ago

what was the social credit score of the pedestrian?

[–] [email protected] 6 points 1 month ago

To be fair, you have to have a pretty high IQ to run over a pedestrian

[–] [email protected] 5 points 1 month ago (1 children)

Right?

I saw "in a complex situation" and thought "what's complex? Person in road = stop"

[–] [email protected] 4 points 1 month ago* (last edited 1 month ago) (2 children)

Well yes and no.

First off, ignoring the pitfalls of AI:
There is the issue at the core of the Trolley problem. Do you preserve the life of a loved one or several strangers?

This translates to: if you know the options when you're driving are:

  1. Drive over a cliff / into a semi / other guaranteed lethal thing for you and everyone in the car.
  2. Hit a stranger but you won't die.

What do you choose as a person?

Then, we have the issue of how to program a self-driving car on that same problem. Does it value all life equally, or is it weighted to save the life of the immediate customer over all others?

Lastly, and really the likely core problem, is that modern AI isn't capable of full self-driving, and the current core architecture will always have a knowledge gap, regardless of the size of the model. 99% of the time, these systems can only do things that are represented in their training data. So if they don't recognize a human or obstacle, in all the myriad forms we can take and ways we can move, they will ignore it. The remaining 1% is hallucinations that occasionally end up being beneficial by chance. But, particularly for driving, if it's not in the model they can't do it.
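
To make that weighting question concrete, here is a minimal, purely illustrative sketch; the function, names, and numbers are hypothetical and not taken from any real vehicle's software:

```python
# Hypothetical sketch of an outcome-scoring function. Lower cost is "better".
# The single constant occupant_weight is where the ethical choice hides:
# 1.0 values all lives equally, anything above 1.0 biases the car toward
# protecting its own passengers.

def outcome_cost(occupant_harm: float, pedestrian_harm: float,
                 occupant_weight: float = 1.0) -> float:
    return occupant_weight * occupant_harm + pedestrian_harm

# Option 1: swerve off the cliff (lethal for occupants, pedestrian unharmed).
# Option 2: continue and hit the pedestrian (occupants unharmed).
swerve = outcome_cost(occupant_harm=1.0, pedestrian_harm=0.0)
continue_on = outcome_cost(occupant_harm=0.0, pedestrian_harm=1.0)

print(swerve, continue_on)  # equal at weight 1.0; any higher weight picks option 2
```

The whole "customer vs. everyone else" debate collapses into the value of that one parameter.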

[–] [email protected] 8 points 1 month ago (1 children)

We are not talking about a "what if" situation where it has to make a moral choice. We aren't talking about a car that decided to hit a person instead of a crowd. Unless this vehicle had no brakes, it doesn't matter.

It's a simple "if person, then stop" not "if person, stop unless the light is green"

A normal, rational human doesn't need a complex algorithm to decide to stop if little Stacy runs into the road after a ball at a zebra/crosswalk/intersection.

The ONLY consideration is "did they have enough time/space to avoid hitting the person"
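
As a concrete version of that one consideration, here is a minimal sketch of the "enough time/space to stop" check using the standard reaction-plus-braking-distance formula; the speed, reaction time, and deceleration values are illustrative only:

```python
# Stopping distance = reaction distance + braking distance:
#   d = v * t_reaction + v^2 / (2 * a)
# Values below are illustrative, not tuned to any real vehicle.

def stopping_distance_m(speed_mps: float,
                        reaction_time_s: float = 0.5,
                        deceleration_mps2: float = 6.0) -> float:
    return speed_mps * reaction_time_s + speed_mps ** 2 / (2 * deceleration_mps2)

def could_have_avoided(gap_m: float, speed_mps: float) -> bool:
    # The only consideration named above: was the gap to the pedestrian
    # larger than the distance the car needed to come to a stop?
    return gap_m > stopping_distance_m(speed_mps)

# Example: ~30 km/h (8.3 m/s) with a pedestrian stepping out 20 m ahead.
print(could_have_avoided(gap_m=20.0, speed_mps=8.3))  # True: room to stop
```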

[–] [email protected] 5 points 1 month ago* (last edited 1 month ago)

The problem is:
Define person.

A normal, rational person does have a complex algorithm for stopping in that situation. The trick is that the calculation is subconscious, so we don't think of it as complex.

Hell even just recognizing a human is so complex we have problems with it. It's why we can see faces in inanimate objects, and also why the uncanny valley is a thing.

I agree that stopping for people is of the utmost importance. Cars exist for transportation, and roads exist to move people, not cars. The problem is that from a software POV, ensuring you can define a person 100% of the time is still a postdoctoral-research-level issue. Self-driving cars are not ready for open use yet, and anyone saying they are is either delusional or lying.
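
A toy illustration of why "define person" is the hard part, and what a conservative policy looks like; everything here (the Detection type, the labels, the threshold) is hypothetical and not any real perception stack:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "plastic bag", "unknown"
    confidence: float   # 0.0 - 1.0, from some upstream classifier

def should_brake(in_path: list[Detection], min_confidence: float = 0.8) -> bool:
    # Conservative rule: brake for anything that might be a person, and for
    # anything the model cannot confidently classify at all. This is the
    # opposite of the failure mode described above, where an unrecognized
    # human is simply ignored.
    for d in in_path:
        if d.label == "pedestrian" or d.confidence < min_confidence:
            return True
    return False

print(should_brake([Detection("unknown", 0.2)]))       # True: can't tell what it is, so stop
print(should_brake([Detection("plastic bag", 0.95)]))  # False: confidently not a person
```

All of the research-level difficulty lives upstream, in producing `label` and `confidence` reliably in the first place.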

[–] [email protected] 0 points 1 month ago

Just a lil ml posting

[–] [email protected] 30 points 1 month ago (1 children)

Isn't China the place where they make sure you're dead when they hit you? Backing up and running over you multiple times.

[–] [email protected] 6 points 1 month ago

It's not that bad anymore.

[–] [email protected] 26 points 1 month ago (2 children)

Why are social media opinions any factor in this discussion?

[–] [email protected] 4 points 1 month ago (3 children)

It's beneficial to know what the general public thinks about issues?

[–] [email protected] 6 points 1 month ago

I don't think "posts on social media" are a good indicator of what the public thinks anymore, if they ever were. The number and reach of bot or bought accounts are disturbingly high.

[–] [email protected] 3 points 1 month ago (1 children)

Social media aren't "the general public"

[–] [email protected] 0 points 1 month ago (1 children)

Do you have a better way of interviewing Chinese Nationals for Western media?

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

Between the terrible demographic distribution, the absolute sewage that social media is, and the bots that make up more than half the content, if you want to know what the general public thinks you could not choose a worse source.

[–] [email protected] 1 points 1 month ago

...they asked on social media.

[–] [email protected] 23 points 1 month ago* (last edited 1 month ago) (1 children)

Reading the comments, I get the impression that most people didn't actually read the article, which says that a woman was barely touched and not injured by a self-driving car while crossing the street against a red light.

There is barely any "news" here, let alone sides to take, as the car correctly halted as soon as possible after noticing the pedestrian's unforeseeable move.

I am perfectly aware that self-driving technology still has numerous problems, corroborated by the incidents reported from time to time, but if anything this article seems like proof that these cars will at least not crush to death the first pedestrian who makes a funky move.

[–] [email protected] 2 points 1 month ago

Why read the article when I can repeat lies and make tired jokes about social credit scores because China bad?

[–] [email protected] 17 points 1 month ago* (last edited 1 month ago)

The car has perfect social credit, the 'human' failed to yield at a crosswalk once in 2003.

You do the math, idiotic westerners suck with our superior inverse-logic.

[–] [email protected] 6 points 1 month ago (1 children)

Switch the word 'car' with 'gun'... and it would be America!

[–] [email protected] 1 points 1 month ago
[–] [email protected] -3 points 1 month ago* (last edited 1 month ago) (3 children)

How does "driverless cars hitting people is so incredibly rare that a single instance of it immediately becomes international news" at all signify "boring dystopia"? If anything we should be ecstatic that the technology to eliminate the vast majority of car deaths is so close and seems to be working so well.

Don't let perfect be the enemy of ridiculously, insanely amazing.

[–] [email protected] 5 points 1 month ago* (last edited 1 month ago)

I think the hook of the story is people backing the non-living, non-conscious vehicle instead of the injured, living, conscious human.

I'm all for automating transportation, but if the importance of that convenience outweighs our ability to empathize, we're in for a real sad century.

[–] [email protected] 5 points 1 month ago (2 children)

Yeah that was my thought too... driverless cars don't need to never fuck up, they need to fuck up less than humans do. And we fuck up a LOT.

[–] [email protected] 2 points 1 month ago* (last edited 1 month ago) (1 children)

I'd argue they need to fuck up less than the alternative means of transport we could be transitioning to if we weren't so dead-set on being car dependent. So dead-set, in fact, that we're allowing ourselves to be made complacent by billion-dollar companies peddling entirely new technology to excuse the death and destruction they've wrought on our environment and social fabric and continue to perpetuate, instead of demanding new iterations of the old, safer, more affordable, more efficient, but unfortunately less profitable tech that our country sold out to those same monied interests to dismantle.

[–] [email protected] 1 points 1 month ago

I mean, I'm on board with the fuck-cars reasoning, but also recognize that we'll never make it happen except by our own extinction. And we're speedrunning that shit. Let's take whatever improvements we can realistically get, be it cars or whatever else, and hit what's left of Earth's ability to support life as comfortably as possible. If that includes running over fewer people by using R2D2 to cart us around vs our own monkey brains... cool! If it's something better, extra cool! I'll take progress wherever I can get it.

[–] [email protected] 1 points 1 month ago (1 children)

Exactly. As early as the technology still is, it seems like it's already orders of magnitude better than human drivers.

I guess the arbitrary/unfeeling impression of driverless car deaths bothers people more than the "it was just an accident" impression of human-caused deaths. Personally, as long as driverless car deaths are significantly rarer than human-caused deaths (and it already seems like they are much, much rarer), I'd rather take the lower chance of dying in a car accident, but that's just me.

[–] [email protected] 3 points 1 month ago (1 children)

I think the problem right now is that driverless cars are still way worse than human drivers in a lot of edge cases. And when you have so many people driving every day, you end up with a lot of edge cases.

[–] [email protected] 1 points 1 month ago

That's probably true, but their handling of edge cases will only get better the more time they spend on the roads, and it already looks like they're significantly safer than humans under normal circumstances, which make up the vast majority of the time spent on the road.

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago) (2 children)

I see you're not familiar with the trend of autonomous vehicles hitting pedestrians and parked cars. ~~They've been completely banned~~ They were suspended from San Francisco after many, many incidents. So far their track record is inferior to humans' (see Tesla Autopilot, Waymo, and Cruise), so you don't need to worry about perfect.

[–] [email protected] 2 points 1 month ago (1 children)

As someone who was literally just in San Fran, the driverless cars are not only a thing, but they're booked out days in advance, so idk where you're getting your info from

[–] [email protected] 0 points 1 month ago (2 children)
[–] [email protected] 2 points 1 month ago

I was in 2 of them in Phoenix, June 18th/19th EST. Took one home from the bar, and one to go pick up the rental car from the bar to drop off at the airport.

[–] [email protected] 1 points 1 month ago

Here's a picture of one driving around a couple weeks ago

Self Driving in San Fran

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago) (1 children)

In December, Waymo safety data, based on 7.1 million miles of driverless operations, showed that human drivers are four to seven times more likely to cause injuries than Waymo cars.

From your first article.

Cruise, which is a subsidiary of General Motors, says that its safety record "over five million miles" is better in comparison to human drivers.

From your second.

Your third article doesn't provide any numbers, but it's not about fully autonomous vehicles anyway.

In short, if you're going to claim that their track record is actually worse than humans, you need to provide some actual evidence.

Edit: Here's a recent New Scientist article claiming that driverless cars "generally demonstrate better safety than human drivers in most scenarios" even though they perform worse in turns, for example.

[–] [email protected] -2 points 1 month ago (1 children)

If you just look at pure numbers, sure, you can make it sound good. When you go look at the types of accidents, it's pretty damning. Waymo and Cruise both have a history of hitting parked cars and emergency vehicles. Tesla Autopilot is notorious for accelerating into the back of parked emergency vehicles.

The issue is not the overall track record on safety but how AV accidents almost always involve doing something incredibly stupid that any competent, healthy person would not.

I'm not personally against self-driving cars once they're actually as competent as a human at reading their surroundings, but we're not there yet.

[–] [email protected] 3 points 1 month ago

The issue is not the overall track record on safety but how AV accidents almost always involve doing something incredibly stupid that any competent, healthy person would not.

As long as the overall number of injuries/deaths is lower for autonomous vehicles (and as you've acknowledged, that does seem to be what the data shows), I don't care how "stupid" autonomous vehicles' accidents are. Not to mention that their safety records will only improve as they get more time on the roads.