Meh, there's no way their hardware is equivalent to what Google or the like put on their self driving cars. So I don't think any algorithm can realistically learn everything based on highly limited information. I feel like it's much more likely to enable some safety functionality than anything approaching real autonomy.


@toomuchtodo: For the Google vehicles, LIDAR is the primary sensor. They're not identical in the slightest.


@agildehaus

Elon and Tesla believe that their array of ultrasound sensors, GPS, and front-facing camera will be sufficient without the addition of an expensive roof-mounted LIDAR scanner. Google has under 100 self-driving vehicles in the field collecting data; Tesla is now making 2500 vehicles/week that will be collecting data.

My comment stands. The sensor suites are the same except for the Velodyne LIDAR device. Google has Street View data; Tesla will have a substantial fleet of vehicles collecting data toward improving their autopilot system.

Vast amounts of mapping data do you no good if you have to test your scenarios in simulated environments (as Google asked the California DMV to allow).


Their hardware is almost identical to Google's, except for the point cloud laser scanner on top of the vehicle.


For the Google car, LIDAR is the primary source of accurate data. Saying the rest is almost identical is like saying a car without an engine has almost identical hardware because it has everything else.


Can you describe what a LIDAR scanner can detect that Tesla's onboard ultrasound sensors and camera cannot?



