The only reason it hasn't been updated is that the trucking industry has fought it tooth and nail.
They want to use public infrastructure but won't even "contribute" back by at least making their trucks not insanely deadly to everyone else. Those videos show that a 35 mph crash is basically a recipe for decapitating the front passengers of a sedan.
The disdain truckers and the trucking industry have for the general public is disgusting. I've seen lots of trucks with giant spiked lug nuts - the only purpose of which is to shred the side of a passenger car if the trucker makes contact with said car. Literally weaponizing their vehicles.
It looks like it may have stayed in place and sliced through the car. In most other cars, it'd hit a dense engine before passengers; in a Tesla, that space is a trunk.
Engines are not crash structures in gasoline vehicles. It’s the frame rails that are designed to soften the impact. Having more empty space in front of the passengers actually makes it easier to dissipate energy over a greater distance (and thus more time) before parts start to intrude on the passenger compartment.
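A rough back-of-the-envelope illustration of that point (the crush distances below are made up, and this is just the constant-deceleration approximation):

    # Same 35 mph stop, different amounts of crush distance in front of the occupants.
    v = 35 * 0.44704                      # 35 mph in m/s (~15.6 m/s)
    g = 9.81
    for crush_m in (0.3, 0.6, 1.2):       # hypothetical crush distances
        decel = v ** 2 / (2 * crush_m)    # constant-deceleration approximation
        print(f"{crush_m} m of crush: ~{decel / g:.0f} g average deceleration")

Doubling the distance over which the cabin decelerates roughly halves the average g-load, which is why the empty space matters more than whatever happens to fill it.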
I was initially a bit confused as to how the crash occurred in a parking lot while the couple traveled southbound on an Interstate, but then noticed it's a parking lot of a rest stop, ie just off the side of the highway. I think this is it here:
And here is where the number 3 lane splits into two lanes. Definitely looks like the lane-follow assist decided to keep right, which put them in the exit lane for the rest stop.
I agree with your location contender: Google Street View has a recent pass from May 2022. I believe the two small, triangular, lighter-colored trees next to the two taller, darker ones with exposed trunks visible in the article photo are visible from here: https://goo.gl/maps/yXNWpyqy4RgpUChC7
Because the "two triangluar trees" in the photo are actually three trees (observe them from further north on southbound lane), we can line up the direction the photo was taken from almost exactly. The photo must almost certainly be on a line from the triangluar trees through here: https://goo.gl/maps/snxfWCeMXeKoNdWJ9
(Google Street View of the nearby southbound rest stop/lot itself, i.e. not from the southbound lane, is dated 2011, so the trees are different.)
My own speculation:
If that is the case, and given the history of "autopilot" being confused by white lines into concluding that the car must surely be in a lane (e.g. driving right into the wedges at offramps), it's possible the 'double offramp' design of the southbound parking lot/rest stop here would be particularly challenging for an automated system.
Is this the same one from the other day? As I understood it, there wasn't any talk of autopilot being involved, beyond the obvious involvement of a Tesla. E.g., at the present time, is there anything beyond the brand that distinguishes this crash from someone crashing a Ford or some such?
Okay, I'm confused by this reporting. As far as I can tell the article posted got its information from a Daily Mail article which in turn doesn't cite a source. The related AP bulletin only says:
> The National Highway Traffic Safety Administration confirmed Friday that it sent a Special Crash Investigations team to probe the Wednesday crash into the back of a semi-trailer at a rest area near Gainesville.
The reason I find the article confusing is that, as mentioned in the article, the NHTSA already has an open investigation, moving towards a recall, into Tesla crashes related to autopilot.
That investigation cited 37 cases where it appears the cars malfunctioned, neither prompting the driver nor taking preventive action like braking until it was too late to avoid the crash, and in some cases not at all.
If you weren't following that story, a quick summary:
- The government received 11 reports of Teslas striking first responders with autopilot engaged.
- This prompted them to investigate all reports they had received about Tesla crashes involving autopilot, regardless of whether a first responder was involved.
- From the hundreds of crashes where this has happened, they went through and eliminated cases where something other than the technology played a role. This included things like the driver using the system in environments where it's not supported, the driver not reacting to vehicle prompts, the other driver being partially at fault, inclement weather, or simply not having enough data to draw a definitive conclusion.
- After ruling out every case with these other factors in play, they ended up with 37 cases that appeared to be vehicle malfunctions: cases on the highway, with the system operating as intended, in clear weather conditions, with an apparently attentive driver, with other traffic behaving normally, where the car still crashed, did not take action to prevent the crash, and did not prompt the driver until it was too late for them to take evasive action (in some cases never prompting them at all).
It doesn't seem like this case in Florida has been added to that investigation yet.
So what is this 37 number that they're reporting?
Did they mean to say, "this could be similar to the other 37 cases" or by total coincidence is it referencing some other group of 37 crashes that the NHTSA is investigating?
I'm confused but suspecting that the posted news article is just wrong.
Somewhere (presumably the Daily Mail) a journalist searched to see how many times a crash like this has happened and the search for something like "NHTSA investigates tesla crash how many times" returned the ongoing investigation with 37 confirmed cases.
Because to be clear: the NHTSA has investigated almost 400 Tesla autopilot crashes similar to this one, and (as far as I can tell) this crash has not been added to the open engineering analysis that includes those 37 other cases.
Of course if there's something I'm missing here I'd love for someone to clear it up.
They will write absolute fiction and publish it as fact.
For instance, they prepared two articles for the outcome of a high profile murder case, one for each verdict. Then they published the wrong one temporarily when the verdict was announced. Anybody can make a mistake though, right? Well, in this case, the wrong article included quotes about the guilty verdict (that didn’t happen) and detailed descriptions of how people reacted to the verdict. These were all completely made up before the verdict had even been announced.
It might surprise you to learn that most large publications do this and will use placeholder quotes (or at least they're supposed to be placeholders). Not going to defend the Daily Mail though; they're absolute trash and deserve the criticism. In all the content I've encountered, it was very, very clear what was placeholder; even if it was published you'd know something was off. I've also seen a publication that had safeguards against accidental publication if there was any placeholder copy in the content.
> Highway Patrol Lt PV Riordan told the portal that the authorities are looking into whether the automated features were involved in the crash. The report further suggests that previous autopilot-related crashes were linked to the Teslas' cameras being confused by flashing lights and reflectors on stopped emergency vehicles
A bit disingenuous to say that they don't know if autopilot or any automated feature was involved AT ALL, and then proceed to insinuate that Tesla's cameras and autopilot are so bad they might as well be responsible.
The prior history is a rationale for why one might investigate whether it is involved.
Most news stories will mention any prior notable incidents involving the subject they’re talking about, even if it is not yet determined whether they are connected.
> A bit disingenuous to say that they don't know if autopilot or any automated feature was involved AT ALL, and then proceed to insinuate that Tesla's cameras and autopilot are so bad they might as well be responsible.
Oddly, NHTSA and others have noted that Fool Self Driving will cut itself off right before an impact or accident, sometimes only a second before impact. Tesla can also kick drivers off of FSD Beta for opaque reasons, including for being a journalist. Several experts and users have noted the many flaws and errors with FSD.
Weird.
I for one don't appreciate being part of their testing program on public roads with drivers who take "Full Self Driving" at name value.
I'm not going to claim to know how weird driver error can be - but if you look at the image of the crash scene, it is extremely unusual. It would require very rapid acceleration in a parking lot and very high speeds.
I dabble in image recognition. I could see how the signage on the back of the truck might have looked like an overhead sign much further away. This is the sort of "whoops, sorry" accident ML will make. I'd still like to see deaths by people driving normal cars/normal cars on the road vs. deaths in automatic cars/automatic cars on the road, by the hour on the road. If it's less than real drivers then I say charge on.
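To sketch why that confusion is even plausible (a rough illustration of monocular scale/distance ambiguity, not a claim about how Tesla's actual pipeline works): without depth information, angular size alone can't distinguish a small, near object from a large, far one.

    import math

    def angular_size_deg(height_m, distance_m):
        # Angle subtended by an object of a given height at a given distance.
        return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

    # A ~2.5 m tall trailer rear seen from 50 m away...
    print(angular_size_deg(2.5, 50))    # ~2.9 degrees
    # ...subtends the same angle as a ~5 m overhead sign twice as far away.
    print(angular_size_deg(5.0, 100))   # ~2.9 degrees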
> deaths by people driving normal cars/normal cars on the road vs. deaths in automatic cars/automatic cars on the road, by the hour on the road
These are not qualitatively the same. Drivers have a lot of control over their individual accident rate, by driving more dangerously or more safely. I want to know if the automatic system is safer than me, not just safer than the average.
And to state the obvious: the appropriate regulatory reaction to a death caused by AI is not "charge on" but investigation and improvement.
Since you may be on the receiving end of a self-driving system gone awry, having them simply be better than average would be a good thing. Especially if we require them to always be better than average as they begin to proliferate.
> I'd still like to see deaths by people driving normal cars/normal cars on the road vs. deaths in automatic cars/automatic cars on the road, by the hour on the road
Since "deaths using autopilot" is a statistical bucket unique to Tesla no such comparison can be made.
We have no equivalent death numbers for "normal cars" operated either 100% manually under autopilot-appropriate conditions or with primitive cruise control. If you just make the comparison against automotive fatalities in general, it's a bullshit comparison, since it would include all driving conditions.
>I'd still like to see deaths by people driving normal cars/normal cars on the road vs. deaths in automatic cars/automatic cars on the road, by the hour on the road
This is not enough; you will also need to consider:
- type of car and what safety systems it has (compared with a car with an intelligent safety system, the stats might look completely different, maybe even worse for Tesla)
- age of the car, which will probably correlate a lot with the above
- the road: we know a Tesla will fail to drive on hard road sections, so it would only be fair to make comparisons on the same easy road sections
- IMPORTANT: you need to count the cases where the Tesla's human driver intervened and saved the day. These incidents won't appear as crashes, but they would have been crashes if the human was not there; you would need to ask Elon to be transparent and publish this data (see the sketch after this list). From what we see in YouTube videos, the human driver had to intervene a few times an hour to recover from critical situations
- optionally, produce a statistic where you exclude drunk drivers, people on their phones, and people who fell asleep. These cases could be fixed with cheaper tech, so I would be curious whether the cheaper tech would be safer; it's just not sexy enough. (Or, in fact, as I suspect, self-driving is not about more safety but about having a private driver, and safety is just an inconvenience; without it, Elon would have removed the Beta label from FSD.)
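As a sketch of that intervention point, here is roughly how counting driver "saves" would change the picture; every number below (crash counts, intervention counts, the chance a takeover actually prevented a crash, miles driven) is invented for illustration:

    # Hypothetical adjustment of crash rates to include driver 'saves'.
    observed_crashes = 273            # assumed: crashes reported with the system engaged
    critical_interventions = 5_000    # assumed: driver takeovers in critical situations
    p_crash_without_takeover = 0.5    # assumed: chance a takeover actually prevented a crash
    million_miles = 3_500             # assumed: total miles driven with the system, in millions

    naive_rate = observed_crashes / million_miles
    adjusted_rate = (observed_crashes
                     + critical_interventions * p_crash_without_takeover) / million_miles
    print(f"crashes per million miles, naive:    {naive_rate:.2f}")
    print(f"crashes per million miles, adjusted: {adjusted_rate:.2f}")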
More sensor input, especially at different frequencies which are going to have different characteristics and so different capabilities and limitations regarding what they detect, is certainly going to provide more data. Since LIDAR is on the order of $1000 per vehicle now, it shouldn't really be a cost concern, but the engineering to incorporate and properly merge that additional sensor input is not insignificant. Tesla has certainly been aggressive in its "FSD" market push, and the quantity of accidents absolutely should be a concern. If the NHTSA determines that the system as-is is sufficient (that is, at least better than the average), then they should say so - but the reverse holds true too.
Or really any kind of radar to fall back on: if something is detected, we should probably stop. A lot of Tesla folk will go on about radar causing phantom braking, but I have a vision-only Y and get it just as much as I did in my radar-equipped 3.
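A minimal sketch of the kind of fallback rule being described, with every threshold and the sensor interface invented for illustration (this is not anyone's actual stack):

    from dataclasses import dataclass

    @dataclass
    class RangeReading:
        distance_m: float   # distance to the nearest return in the forward corridor
        valid: bool         # whether the sensor produced a usable measurement

    def stopping_distance_m(speed_mps, reaction_s=0.5, decel_mps2=6.0):
        # Reaction-time travel plus constant-deceleration braking distance.
        return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

    def should_emergency_brake(radar: RangeReading, speed_mps, margin_m=5.0):
        # Fallback: if a forward return is inside stopping distance plus margin, brake.
        if not radar.valid:
            return False
        return radar.distance_m < stopping_distance_m(speed_mps) + margin_m

    # Example: ~67 mph (30 m/s) with a return 60 m ahead -> brake.
    print(should_emergency_brake(RangeReading(distance_m=60.0, valid=True), speed_mps=30.0))

The tension is exactly the one described above: set the margin generously and you get more unnecessary braking; set it tightly and you risk driving into the occasional stopped truck.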
It’s per distance and not per time, and number of crashes and not number of deaths, and the roads where autopilot can be enabled are usually safer and easier, but it is indeed safer: https://www.tesla.com/VehicleSafetyReport
The issue with comparing these numbers is that the distribution of quality and safety of roads is not the same for the data collected by Tesla vs. "human drivers".
It is easy to claim that, in the relatively safe environment of interstates and highways, Teslas did better per 1M km than human drivers did everywhere else.
TL;DR: even if this is truly your only metric for judging these systems, you cannot simply compare these numbers.
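A toy example of the mix effect being described (every rate and mileage split below is invented purely for illustration):

    # Hypothetical crash rates per million miles, by road type.
    highway_rate_human, city_rate_human = 0.5, 2.0
    # Assume the automated system is 20% worse than humans on *both* road types.
    highway_rate_ap, city_rate_ap = 0.6, 2.4

    # But its miles are overwhelmingly highway miles; human miles are not.
    ap_overall    = 0.95 * highway_rate_ap    + 0.05 * city_rate_ap
    human_overall = 0.50 * highway_rate_human + 0.50 * city_rate_human
    print(ap_overall, human_overall)   # 0.69 vs 1.25

In this toy setup the system is worse on every individual road type yet still looks better in the headline number, simply because its miles are concentrated on the safest roads.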
Why would it be? You can’t let a small handful of deaths grind progress to a halt, especially when the alternative is the hundreds of thousands of deaths we see on the roads today. Take precautions, learn from mistakes, but don’t shut everything down.
Like the "I dabble in image processing" lead in (which like saying 'I've been known to brush my teeth' then launching into a discussion of removing the thyroid via arthroscopic surgery through the lip)
Then the tone of the rest of the comment, so light and cheery with "whoops sorry!"
And then the implication that a system where a classifier confusing a truck for an overhead sign kills people is OK* as long as cherry-picked stats check out (*for those of us in the space without arbitrary restrictions on our sensor suites because a shyster tweeted "we only need cameras", it wouldn't be OK).
-
I work in the AV space and, despite staking my livelihood on this problem space, I haven't managed to pick up such a cavalier attitude.
Human lives are not fungible. If I gave you a button that saved 8 random lives and killed 4 others would you press it?
Except Tesla removed radar from their cars, a relatively inexpensive and simple mechanism that significantly reduces their fatality rate.
It’s a false dichotomy here, and it’s frustrating, especially when they had implemented a check in their software and then removed it to save money on a $50,000 car.
Yeah, I think that’s a huge mistake. I would have hoped Elon had enough foresight to see that volume production would make LIDAR cheaper. He has painted himself into a corner. However, he seems like he could play off changing his mind on that decision pretty easily, so I hope they go that route at some point. The only problem is that it would make a lot of the Teslas sold so far incapable of doing FSD, so I don’t see that happening anytime soon.
If I had to guess and be mean, even though I like Elon Musk, I'd say that was one of his directives. I liked watching Elon talk about rocket technology, but one day I stumbled upon his ideas about software development at PayPal and it made me cringe.
This kind of point comes up a lot in these discussions. I'd suggest a slight alternative:
"Death per million person kilometers, broken down by car age and road type."
The expectation of death in modern cars during highway driving should be very low. Given that tools like Autopilot are only installed on new cars and only enabled in highway driving, treating all miles of driving equally risks spurious comparisons...
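As a sketch of how that stratified metric could be computed (the table below is entirely made-up data standing in for crash and exposure records that nobody outside the manufacturers actually has):

    import pandas as pd

    df = pd.DataFrame([
        # system,      car_age, road_type, deaths, million_person_km
        ("autopilot", "new",   "highway",  3,      4000),
        ("human",     "new",   "highway",  5,      9000),
        ("human",     "old",   "highway",  9,      8000),
        ("human",     "new",   "city",     4,      3000),
        ("human",     "old",   "city",    12,      5000),
    ], columns=["system", "car_age", "road_type", "deaths", "million_person_km"])

    rates = df.groupby(["system", "car_age", "road_type"])[["deaths", "million_person_km"]].sum()
    rates["deaths_per_million_person_km"] = rates["deaths"] / rates["million_person_km"]
    print(rates["deaths_per_million_person_km"])

The comparison that matters is then cell by cell (new cars on highways vs. new cars on highways), not the fleet-wide totals.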
An article from 2018 [1] attempted an even stricter apples-to-apples comparison. The author’s estimates of fatality rates for Autopilot-driven Teslas vs. human-driven Teslas suggested that Autopilot was more deadly than human drivers, but still within the same order of magnitude.
The author also called Musk’s comparison of Autopilot’s fatality rate to NHTSA’s fatality rate an “apples-to-aardvarks” comparison because NHTSA’s statistic includes bicycles, pedestrians, motorcycles, and buses.
Here are two major situations the NHTSA was involved in that affected Ford, Toyota, Honda, and other non-Tesla companies:
----
"On March 6, 2000, NHTSA began a preliminary inquiry and on May 2, NHTSA began an investigation (PE00-020) concerning the high incidence of tire failures and accidents of Ford Explorers and other light trucks and SUVs fitted with Firestone Radial ATX, ATX II, and Wilderness tires"
"On June 23, 2014, auto manufacturers BMW, Chrysler, Ford, Honda, Mazda, Nissan, and Toyota announced they were recalling over three million vehicles worldwide due to Takata Corporation-made airbags. The reason was that they could rupture and send debris flying inside the vehicle. This was in response to a US National Highway Traffic Safety Administration (NHTSA) investigation that was initiated after the NHTSA received three injury complaints."
-----
Just because this is a current investigation and Elon Musk likes to play the victim while being one of the richest people in the world, doesn't mean there's some bias against Tesla.
Oh, that was huge. Airbag inflators are traditionally powered by sodium azide, but to reduce costs Takata used an ammonium nitrate propellant, which is less stable. That did not end well.
I don't know the details other than that it has to do with the long-term climate the car was in.
I posted because many people now simply claim bias or "it's political" whenever they are investigated by a government organization. If we allow that to happen, then it's a free pass for corruption to flourish.
Lots of cars have been in the news when it is suspected that defects have caused crashes. Pick a manufacturer and they’ve probably had a model marred by safety scandal.
100,000 regular cars didn't make the decision to kill their occupants though.
Drunk drivers kill people, so we try to prevent drunk driving. If cars are occasionally killing people due to a design flaw, they should be investigated.
Don't worry though - Tesla will say that autopilot deactivated 10 ms before impact and that it's all the driver's fault, and people will go back to ignoring it.
If drivers make deadly mistakes at X rate, and software makes deadly mistakes at Y rate where Y < X, should we recall the software? What if Y is substantially less than X but still not zero?
The complication with that is that software makes different mistakes than drivers do. And sometimes, software refuses to drive, in which case, their accident rates are subject to selection bias. For cars which can be operated by software or a driver, you’d want to compare for an exact equivalent scenario.
The problem is that Y is more uniform but X is not. Good drivers make significantly fewer mistakes than the average. For these people, using the software potentially increases risk by orders of magnitude.
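A quick back-of-the-envelope version of that point, with all rates invented:

    # Hypothetical fatal-mistake rates per million miles.
    average_driver = 1.5    # X: fleet-wide human average, dragged up by the worst drivers
    careful_driver = 0.2    # a cautious, sober, attentive driver
    software       = 1.0    # Y: the software, roughly uniform across users

    print(software < average_driver)   # True: the software beats the *average* human
    print(software < careful_driver)   # False: it's several times riskier than the careful driver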
The autopilot may deactivate itself in such a situation (it tends not to like uncertainty), but as far as I know everyone will still consider that a crash under autopilot.
At 37 investigations, shouldn't we have these things pulled off the roads?
At minimum, we should be demanding an update to the entire fleet disabling Autopilot and FSD entirely until the company's relatively rogue practices can be brought under control.
> At 37 investigations, shouldn't we have these things pulled off the roads?
Just for comparison's sake, it would be interesting to know how many safety investigations other makes have had in a similar timeframe (I don't know where to find such numbers). It's hard to know if 37 is a lot or standard.
I understand there are a lot of Elon fans here who think he's Jesus… but even one death or accident due to a bad design or bug should have already forced Tesla into bankruptcy… you cannot beta test with people's lives… proclaim it's autopilot and the safest system, and then when someone dies because of it, explain it away with a what-if comparison… I have a Tesla, and this exact scenario happened to me: instead of staying on the freeway it just took the parallel exit road and was about to crash, but I took over control at the very last second and avoided a horrible crash… I also reported it to the NHTSA… so I have very little tolerance for anyone who wants to suck off Elon… tell me one car company with faulty software/hardware whose cars are still running and which was not forced to recall or fix the parts… so stop with the bullshit comparison about every car being pulled off the road… Elon has been playing fast and loose with the law, and I hope it catches up to him.
Yesteryesteryesterday, 53 comments https://news.ycombinator.com/item?id=32023337