Hacker News

Which is why pilots are still in control of the plane while autopilot is active. Even while it is engaged, there is still a "pilot flying", and pilots remain responsible for scanning gauges and readouts and verifying the autopilot is doing what they expect. They do not just turn on autopilot and goof off.


Just like Tesla's Autopilot? Drivers are supposed to be "flying", scanning gauges and the road ahead to ensure the autopilot is doing what they expect...

I think people who say that "autopilot" is a bad name for this feature don't really understand what an autopilot does.


The text at the top of the homepage for Tesla Autopilot is this:

> Full Self-Driving Hardware on All Cars

> All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.

Whatever theory you have for Tesla's naming of their feature, it doesn't match with their marketing.


You are reading into that text more than you should. Autopilot and full self-driving are two separate features. Autopilot can be used today. Full self-driving can be purchased today but won't be activated until some unknown future date. Those features are separate configurable options when purchasing the car that come with their own prices. Tesla makes it clear enough that any owner should know those are two separate features. The text you highlighted is simply promising that any car purchased today can activate full self-driving down the line with no additional hardware costs.


> You are reading into that text more than you should.

The page's title is "Autopilot | Tesla". It is the first result for "tesla autopilot" in search results. And "autopilot" appears 9 times on the page. So if that's not an intentional attempt to mislead consumers into conflating Autopilot with "full self-driving", then what would such an attempt look like, hypothetically?


Is it crazy to hold a driver to a higher standard than simply Googling "Tesla autopilot" and only reading the first paragraph of the first result? If you read that entire page, the difference between autopilot and full self-driving is clear. If you read the car's manual, the difference is clear. If you look at the configurator for the car, the difference is clear when you have to pay $3,000 extra for full self-driving. I am not sure how any responsible Tesla owner could think that this is only a single feature.


>Is it crazy to hold a driver to a higher standard than simply Googling "Tesla autopilot" and only reading the first paragraph of the first result?

That is damn crazy. You should consider your users to be complete idiots when improper use of the thing can endanger lives.

Why are we even debating this?


> Is it crazy to hold a driver to a higher standard than simply Googling "Tesla autopilot" and only reading the first paragraph of the first result?

For this standard to matter, it would have to apply to every driver. Should drivers who never google "Tesla autopilot" (let alone those who do and read on into the section about the autopilot feature) be punished with death in a two-ton metal trap?


I really don't see how this is different from other features of a car, like cruise control. It is up to the driver to educate themselves about cruise control. It was not part of my driver education class. There were no questions about it on the tests to get my license. I didn't learn how it worked until my 20s, when I first owned a car with cruise control and read that car's manual. I don't think anyone would have blamed the manufacturer if I had killed myself because I didn't understand how cruise control worked or because I used it improperly.


Is it crazy to ask a car-making company to create 2 separate webpages when describing completely different systems?


It isn’t crazy to ask that, but I think it is crazy to view failing to create two pages as an intentional attempt to deceive or as something that absolves drivers of their own responsibilities.


Hypothetically, it might say that all Tesla cars being built today have both the hardware and software to make full self-driving possible.

It doesn’t; what it does say is perfectly clear to me.


That would, however, be a lie.


Yes, I know.

It remains an answer to "what would such an attempt [to intentionally mislead consumers] look like, hypothetically?"


If autopilot crashes at a lower rate than the average driver, they are correct, but autopilot+attentive driver would still be better than either alone.


The current thread is about what Tesla means by use of "autopilot". The parent commenter was telling us that Tesla only intends for it to have the same meaning as it does in aviation. My response is pointing out how Tesla seems to imply "autopilot" involves "full self-driving".


At what point is an attentive driver expected to notice that autopilot has silently failed? The video linked at the top of the thread has a <1 second interval between the car failing to follow its lane, and plowing into a concrete barrier.

This is actually harder to do than just driving the car.


An airplane under autopilot is not a second or two away from a fatal crash.


The fatal impact may not be seconds away but the event that sets in motion the series of actions that results in that fatal impact may take only seconds.


The issue is how long between the problem first manifesting itself and a crash becoming inevitable. Note that even as short a time as ten seconds is an order of magnitude longer than one second. There are at most only a few rare corner cases in aviation where proper use of the autopilot could take the airplane within a minute of an irretrievable situation that would not have occurred under manual control.


I'm not a pilot so I do not know the most dangerous situations when flying under autopilot. What I was trying to emphasize is that even under autopilot airplanes require constant attention. My understanding is that if the pilot or co-pilot leaves the cockpit for any reason the remaining pilot puts on an oxygen mask in case of decompression because the time frame before blackout is so tiny. The point is that autopilot in aviation is a tool that can be employed by pilots but cannot function safely on its own. From this viewpoint Tesla's Autopilot is accurately named although the public does not have the same understanding.


There are a lot of things in aviation that are done out of an abundance of caution (and rightly so) rather than because flights are routinely on the edge of disaster. Depressurization is not an autopilot issue, and putting on a mask is not the same as constant vigilance. Even when not using autopilot, pilots in cruise may be attending to matters like navigation and systems management that would be extremely dangerous if performed while driving.

Personally, I do not think calling Tesla's system 'autopilot' is the issue, but your claim that it is accurate is based on misunderstandings about the use of autopilots in aviation. It is not the case that their proper use puts airplanes on the edge of disaster were it not for the constant vigilance of the pilots.


If the pilots are not flying, then it can be just a short time away from a crash, like when the pilot is not paying attention and, by the time the autopilot can no longer fly, no longer has enough situational awareness to take over.

http://www.slate.com/blogs/the_eye/2015/06/25/air_france_fli...


That is very much an outlier, and if it were at all relevant to the issue it would further weaken your case, as these three pilots had several minutes to sort things out. Questioning the assumptions underlying the assumed safety of airplane autopilot use can only weaken the claim that Tesla's 'autopilot' is safe.


This isn't a debate about dictionary definitions, it's a debate about human behavior.

People who say that "Autopilot" is a bad name for this feature aren't basing it on an imperfect understanding of what autopilot does in airplanes. They're basing it on how they believe people in general will interpret the term.


And Tesla is relying on Joe Sixpack's Hollywood understanding of autopilot functionality. They're very well aware of that.


So you’re saying that Tesla drivers are educated only by marketing materials and ignore what the car says every time they engage the Autopilot feature?


They are saying that Tesla drivers are not superhumans, just average, everyday, garden-variety human beings...

The funny thing is that the same people who argue for self-driving tech by saying "humans will do dumb shit" are the ones who justify Tesla by saying "humans should not do stupid things (like ignoring the car's warnings)"...


They are expected to remain in control of the plane; that doesn't mean they always do: https://timesofindia.indiatimes.com/india/Pilots-sleep-as-fl...


Uhhhhh, I think you are misinformed.

They aren't going to literally fall asleep, but much of the time pilots are reading a book and not directly paying attention in the way a driver must.



