this post was submitted on 30 Oct 2024
508 points (88.7% liked)

Technology


OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.

How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?

[–] [email protected] 24 points 15 hours ago (3 children)

Is there video that actually shows it "keeps going"? The way that video loops, I can't tell what happens immediately after.

[–] [email protected] 5 points 10 hours ago (1 children)

The driver's tweet says it kept going, but I didn't find the full video.

[–] [email protected] 2 points 10 hours ago (1 children)

Inb4 it actually stopped with hazards like I've seen in other videos. Fuck Elon and fuck Tesla's marketing of self-driving, but I've seen people reach far for karma hate posts on Tesla sooooooo ¯\_(ツ)_/¯

[–] [email protected] 1 points 7 hours ago

"there was no Danger to my Chasis"

[–] [email protected] 19 points 15 hours ago (1 children)

the deer is not blameless. those bastards will race you to try and cross in front of you.

[–] [email protected] 15 points 14 hours ago (2 children)

Finally someone else familiar with the most deadly animal in North America.

[–] [email protected] 13 points 13 hours ago

I'd give the moose the top spot. Maybe not in sheer number of deaths, but I'd much rather have an encounter with a deer than a moose.

Though for sheer numbers, I wouldn't give that to deer either; that spot would go to humans, though I admit that's a bit pedantic.

[–] [email protected] 4 points 14 hours ago

yeah well I've hit about $15k worth of them over the years

[–] [email protected] 203 points 21 hours ago (54 children)

The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.

How are these people always such pathetic suckers?

[–] [email protected] 8 points 11 hours ago

I’d go even further and say most driving is an edge case. I used a 30-day trial of Full Self-Driving and the results were eye-opening. Not how it performed, which was pretty much as expected, but where it went wrong.

Full self driving did very well in “normal” cases, but I never realized just how much of driving was an “edge” case. Lane markers faded? No road edge but the ditch? Construction? Pothole? Debris? Other car does something they shouldn’t have? Traffic lights not aligned in front of you so it’s not clear what lane? Intersection not aligned so you can’t just go straight across? People intruding? Contradictory signs? Signs covered by tree branches? No sight line when turning?

After that experiment, it seems like “edge” cases are more common than “normal” cases when driving. Humans just handle them without thinking about it, but the car needs more work here.

[–] [email protected] 129 points 21 hours ago (2 children)

I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.

[–] [email protected] 48 points 21 hours ago (3 children)

Same in Kansas. I was in a car that hit one in the '80s, and I see them often enough that I had to avoid one crossing a busy interstate highway last week.

Deer are the opposite of an edge case in the majority of the US.

[–] [email protected] 19 points 19 hours ago* (last edited 19 hours ago) (1 children)

Putting these valid points aside, we're also all just taking for granted that the software would have properly identified a human under the same circumstances... This could very easily have been a much more chilling outcome.

[–] [email protected] 31 points 20 hours ago

Being a run-of-the-mill fascist (rather than one of those in power) is actually an incredibly submissive position; they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a "snowflake liberal" by comparison.

[–] [email protected] 22 points 16 hours ago (3 children)

For the 1000th time, Tesla: don't call it "Autopilot" when it's nothing more than cruise control that needs constant attention.

[–] [email protected] 10 points 15 hours ago

It is an autopilot (a poor one, but still one) that legally calls itself cruise control so Tesla doesn't have to take responsibility when it inevitably breaks the law.

[–] [email protected] 36 points 18 hours ago (6 children)

Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.

The real question isn't whether Tesla is better or worse in any one situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where they are worse.

Tesla claims they are better overall, but they may not be telling the truth. One would think regulators have data on this, but they aren't talking about it.
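
For what it's worth, the comparison that would settle this is a per-mile rate rather than raw counts, since the fleet and human drivers cover wildly different mileage. Here's a minimal sketch of that normalization; the fleet numbers are made-up placeholders, and only the human baseline (roughly 1.3 to 1.4 fatalities per 100 million vehicle miles, per NHTSA) is a real, approximate figure:

```python
# Per-mile comparison sketch. The fleet numbers are hypothetical placeholders;
# only the human baseline is a real (approximate) NHTSA figure.

HUMAN_FATALITIES_PER_100M_MILES = 1.37  # approximate US rate, 2020

def fatalities_per_100m_miles(fatalities: int, miles_driven: float) -> float:
    """Normalize a raw fatality count to a per-100-million-mile rate."""
    return fatalities / (miles_driven / 100_000_000)

# Hypothetical fleet data, purely for illustration.
fleet_rate = fatalities_per_100m_miles(fatalities=5, miles_driven=300_000_000)

print(f"Fleet rate: {fleet_rate:.2f} per 100M miles")
print(f"Human rate: {HUMAN_FATALITIES_PER_100M_MILES:.2f} per 100M miles")
print("fleet looks safer" if fleet_rate < HUMAN_FATALITIES_PER_100M_MILES
      else "fleet looks worse")
```

That per-mile number, broken down by conditions, is presumably what regulators hold and Tesla doesn't publish.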

[–] [email protected] 6 points 14 hours ago (1 children)

Humans are also bad drivers who get edge cases wrong all the time.

It would be so awesome if humans only got the edge cases wrong.

[–] [email protected] 3 points 10 hours ago

I've been able to get demos of autopilot in one of my friend's cars, and I'll always remember autopilot correctly stopping at a red light, followed by someone in the next lane over blowing right through it several seconds later at full speed.

Unfortunately "better than the worst human driver" is a bar we passed a long time ago. From recent demos I'd say we're getting close to the "average driver", at least for clear visibility conditions, but I don't think even that's enough to have actually driverless cars driving around.

There were over 9M car crashes with almost 40k deaths in the US in 2020, and that would be insane to just decide that's acceptable for self driving cars as well. No company is going to want that blood on their hands.
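
As a rough sense of scale, a common back-of-the-envelope argument uses the "rule of three": after N event-free miles, the 95% upper confidence bound on the event rate is about 3/N. With the approximate US baseline (and ignoring crash severity, road mix, and weather), showing a lower fatality rate takes an enormous amount of driving:

```python
# "Rule of three": after N fatality-free miles, the 95% upper confidence bound
# on a fleet's fatality rate is roughly 3 / N.
HUMAN_RATE_PER_MILE = 1.37 / 100_000_000  # approximate 2020 US rate

# Fatality-free miles needed before that bound drops below the human baseline.
miles_needed = 3 / HUMAN_RATE_PER_MILE
print(f"~{miles_needed / 1e6:.0f} million fatality-free miles")  # ~219 million
```

And that only bounds fatalities; every fatal crash in the fleet's record pushes the required mileage higher still.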

[–] [email protected] 20 points 18 hours ago (2 children)

Tesla claims they are better overall, but they may not be telling the truth. One would think regulators have data on this, but they aren't talking about it.

https://www.reuters.com/business/autos-transportation/nhtsa-opens-probe-into-24-mln-tesla-vehicles-over-full-self-driving-collisions-2024-10-18/

The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.

It sure seems like they aren't being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren't telling the truth.

[–] [email protected] 11 points 16 hours ago (1 children)

It sure seems like they aren't being very forthcoming with their data between this and being threatened with fines last year for not providing the data. That makes me suspect they still aren't telling the truth.

I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn't you be shoving that into every single selling point you have? Why wouldn't that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla's FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?

[–] [email protected] 4 points 13 hours ago

If the Cybertruck were so safe in crashes, they would be begging third parties to test it so they could smugly lord their third-party-verified crash test data over everyone else.

But they don't, because they know it would be a repeat of smashing the bulletproof window on stage.

[–] [email protected] 29 points 18 hours ago (3 children)

The autopilot knows deer can't sue

[–] [email protected] 5 points 14 hours ago

What if it kills the deer out of season?

[–] [email protected] 21 points 17 hours ago (1 children)

Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.
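
The usual counterargument is redundancy: a detection one sensor misses in bad light can still be caught by another. Here's a toy sketch of that idea, with invented confidence values and a made-up fusion rule purely for illustration (not any vendor's actual pipeline):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    source: str       # "camera", "radar", or "lidar"
    confidence: float  # 0.0 - 1.0

def obstacle_confirmed(detections: list[Detection], strong: float = 0.6) -> bool:
    """Toy fusion rule: accept an obstacle if any one sensor is confident,
    or if two different sensor types weakly agree."""
    any_strong = any(d.confidence >= strong for d in detections)
    weak_sources = {d.source for d in detections if d.confidence >= 0.3}
    return any_strong or len(weak_sources) >= 2

# A low-contrast deer at night: weak camera return, strong radar return.
night_deer = [Detection("camera", 0.2), Detection("radar", 0.8)]
camera_only = [d for d in night_deer if d.source == "camera"]

print(obstacle_confirmed(night_deer))   # True  - radar catches it
print(obstacle_confirmed(camera_only))  # False - nothing left to cross-check
```

Drop every modality except the camera and you lose that cross-check, which is exactly what a deer on a dark road exercises.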

[–] [email protected] 11 points 17 hours ago* (last edited 16 hours ago) (1 children)

I mean, to be honest... if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than get juuuuust perfectly booped into the air to crash through the windshield and into your face.

Official advice I heard many times. Prolly doesn't apply if you are going slow.

Edit: Read further down. This advice is effing outdated, disregard. -_- God I am happy I've never had to put it to the test.

[–] [email protected] 6 points 13 hours ago

Haven't read down yet, but I bet odds are a bit better if you let go of the brake just before impact, to raise the front up a bit.

[–] [email protected] 10 points 15 hours ago (2 children)

It doesn't have to not kill people to be an improvement; it just has to kill fewer people than people do.

[–] [email protected] 5 points 13 hours ago (3 children)

True in a purely logical sense, but assigning liability is a huge issue for self-driving vehicles.

[–] [email protected] 77 points 22 hours ago* (last edited 22 hours ago) (2 children)

Only keeping the regular cameras was a genius move to hold back their full autonomy plans

[–] [email protected] 40 points 21 hours ago (2 children)

The day he said that "ReGULAr CAmErAs aRe ALl YoU NeEd" was the day I lost all trust in their implementation. And I'm someone who's completely ready to turn over all my driving to an autopilot lol
