
I’m always amazed when people talk about using Tesla self driving features. Every time I get in a Tesla the driver has the screen showing this little block version of how it perceives what’s around it. It always misses stuff, cars jump in and out, things randomly vanish. Maybe that UI is buggy or doesn’t really show what the self driving would use, but I can’t imagine seeing that and then wanting to let the car drive itself.


There's a probabilistic nature to how the image sensor data is interpreted.

The neural network processes the image sensor data to detect and classify objects. It doesn't hold a perfect, static representation of each object; it's constantly updating its estimates frame by frame. That jiggly jitter you see is basically the system refining its predictions in real time.

Or at least that's how I understand it...
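A toy sketch of that flicker (my own illustration with made-up confidence numbers, not anything from Tesla's actual stack): when per-frame class scores sit near a decision boundary, the frame-by-frame label flips back and forth, while averaging the scores over a short window gives a stable answer.

```python
# Hypothetical per-frame softmax scores for one detected object.
# Confidence hovers around 50/50, so the argmax label flickers.
frames = [
    {"car": 0.52, "truck": 0.48},
    {"car": 0.47, "truck": 0.53},
    {"car": 0.55, "truck": 0.45},
    {"car": 0.49, "truck": 0.51},
    {"car": 0.56, "truck": 0.44},
]

# Naive per-frame decision: take the top class each frame.
per_frame_label = [max(f, key=f.get) for f in frames]
# Flickers between "car" and "truck" from frame to frame.

# Temporal smoothing: average the scores over the window,
# then take the top class once.
avg = {
    cls: sum(f[cls] for f in frames) / len(frames)
    for cls in frames[0]
}
smoothed_label = max(avg, key=avg.get)
# The averaged scores favor "car", so the label stays put.
print(per_frame_label, smoothed_label)
```

Real trackers do something fancier than a plain average, but the idea is the same: raw per-frame outputs are noisy, and the on-screen visualization may be closer to those raw outputs than to the smoothed track.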


Other companies have put quite a lot of effort into perception stability because it has a large effect on downstream quality. It's hard to estimate derivatives of position, like velocity and acceleration, if your position estimates are unstable.
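A quick numeric sketch of that point (my own toy example, with a constant-velocity target and a small artificial measurement error): differencing jittery positions amplifies the jitter, while smoothing the positions first tames the velocity estimate.

```python
# A car moves at exactly 1.0 m per frame, but the perception stack
# reports its position with a 0.3 m error that flips sign each frame.
true_positions = [float(t) for t in range(10)]
noise = [0.3 if t % 2 == 0 else -0.3 for t in range(10)]
measured = [p + n for p, n in zip(true_positions, noise)]

# Naive finite-difference velocity: v[t] = x[t] - x[t-1].
# The 0.6 m position jitter shows up at full strength, so the
# estimate swings between 0.4 and 1.6 m/frame around the true 1.0.
raw_velocity = [b - a for a, b in zip(measured, measured[1:])]

# Smooth positions first with an exponential moving average
# (alpha chosen arbitrarily for the demo), then difference.
alpha = 0.3
smoothed = [measured[0]]
for x in measured[1:]:
    smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
smooth_velocity = [b - a for a, b in zip(smoothed, smoothed[1:])]

def spread(vs):
    """Range of the velocity estimates: smaller is more stable."""
    return max(vs) - min(vs)

print(spread(raw_velocity), spread(smooth_velocity))
```

The smoothing buys stability at the cost of lag, which is one reason production stacks tend to use a proper filter with a motion model (e.g. a Kalman filter) rather than blind averaging.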


> That jiggly jitter you see is basically the system refining its prediction in real-time.

GP is saying that a "car that jumps in and out" and "things that randomly vanish" do not look very refined. Just like missing a freaking moving train doesn't look very refined.



