At Tesla Autonomy Day, Elon Musk and other executives this afternoon explained the company’s self-driving plans, including a “Robotaxi” self-driving ride-sharing network launching next year. During Q&A, the CEO pointed out flaws in the autonomous approaches taken by competitors, implicitly including Waymo, though he did not name those companies explicitly.
Musk has long touted that Tesla’s self-driving advantage comes from a large fleet of vehicles already on the road that records situations and provides training data to improve its neural networks. The company’s approach to autonomous vehicles is primarily focused on computer vision, or using cameras — just like humans — to recognize and understand the world.
This is in contrast to other efforts, like Waymo’s, that rely on a suite of sensors. The Alphabet division uses a system made up of LIDAR, vision, and radar, with CEO John Krafcik once equating this to how human beings work:
A single integrated system means that all the different parts of our self-driving technology work together seamlessly. Like a person’s own five senses, our sensors are more useful and more powerful when we put them all together.
LIDAR
The Tesla CEO has been very outspoken and critical of LIDAR usage for autonomy, going so far as to call it “lame” and a “fool’s errand” today.
“In cars, it’s freaking stupid. It’s expensive and unnecessary. And as Andrej was saying, once you solve vision, it’s worthless. So you have expensive hardware that is worthless on the car.”
According to Tesla, solving computer vision, so that every object on the road can be recognized through real-time image analysis alone, is enough. Musk also compared the expensive sensors to an appendage at one point during the presentation.
LIDAR builds a 3D model of an object, but that does not necessarily mean the system understands what it is looking at. This is the “crutch” Tesla describes: a LIDAR-only system does not know whether something on the road is a plastic bag or a rubber tire, leaving it with a less information-rich picture of the environment.
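To make that distinction concrete, here is a minimal, hypothetical Python sketch of the argument (not code from Tesla or Waymo): a LIDAR cluster yields only position and size, while the semantic label has to come from a separate vision model, represented here by a stand-in function.

```python
import numpy as np

def lidar_cluster_geometry(points):
    """Summarize a cluster of 3D LIDAR returns: where it is and how big it is."""
    return {
        "centroid_m": points.mean(axis=0),                     # position of the object
        "extent_m": points.max(axis=0) - points.min(axis=0),   # rough size
    }

def camera_label(image_crop):
    """Stand-in for a trained vision network that assigns a semantic label to pixels."""
    return "plastic bag" if image_crop.mean() > 0.5 else "tire"

# A bag and a tire can produce LIDAR clusters of similar size and position...
cluster = np.random.rand(200, 3) * np.array([0.4, 0.4, 0.3])
print(lidar_cluster_geometry(cluster))       # geometry only: no notion of what the object is
# ...so deciding whether the object can safely be driven over still falls to vision.
print(camera_label(np.random.rand(64, 64)))
```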
However, on the cost front, Waymo has been using an in-house sensor suite since 2017. By moving away from off-the-shelf components, the company was able to cut the cost of its LIDAR units from $75,000 in 2009 to $7,500. Earlier this year, it began selling the custom-built sensor to customers in robotics and security, using those economies of scale to make its own units cheaper to manufacture.
Meanwhile, Waymo has said in the past that LIDAR allows it to “detect more objects and see them at a higher resolution.”
The detail we capture with our custom LiDAR is so high that not only can we detect pedestrians all around us, but we can tell which direction they’re facing. This is incredibly important as it helps us more accurately predict where someone will walk next.
Waymo’s vehicles do use computer vision, but one of its biggest arguments for multiple sensors is challenging lighting, where cameras alone struggle:
So, our custom vision system — which allows us to see things like traffic lights and stop signs — is comprised of 8 vision modules each using multiple sensors, plus an additional, forward-facing, super high resolution multi-sensor module, enabling 360-degree vision. With this resolution, we can detect small objects like construction cones far away even when we’re cruising down a road at high speed. And with a wide dynamic range we can see in a dark parking lot, or out in the blazing sun — or any condition in between.
HD mapping
Another of Musk’s critiques targeted HD mapping, or creating a very high-resolution scan of an area before deploying cars. Waymo first builds a detailed picture of that area and categorizes “interesting features,” like driveways, fire hydrants, and intersections. By knowing the “permanent features of the road,” Waymo can focus on moving objects like other vehicles and pedestrians.
This level of detail helps our car know exactly where it is in the world. As our cars drive autonomously on the road, our software matches what the car sees in real-time with the maps we’ve already built, allowing the car to know its position on the road to within 10cm of accuracy. That means we don’t have to rely on GPS technology, or a single point of data such as lane markings, to navigate the streets.
Waymo cars will detect construction, road closures, or other changes, and send them back to the fleet so all vehicles have an updated map. Musk considers high-precision GPS maps a “really bad idea,” producing a “system [that] becomes extremely brittle” because it is too dependent on the map and cannot adapt to changes.
“We briefly barked up the tree of high precision lane lines and then realized that it was a huge mistake, and reversed it out. It’s not good.”
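As a rough illustration of the map-matching idea Waymo describes, the Python sketch below (a simplified assumption, not Waymo’s implementation) frames localization as finding the offset that best aligns live landmark observations with their surveyed positions in the prior map:

```python
import numpy as np

def localize_against_map(map_landmarks, observed_relative):
    """Estimate the vehicle's 2D position: where the map says each landmark is,
    minus where the car currently sees it relative to itself, averaged over matches."""
    return (map_landmarks - observed_relative).mean(axis=0)

# Surveyed positions of a hydrant, a driveway corner, and a stop sign (map frame, meters).
prior_map = np.array([[12.0, 3.5], [20.0, -1.0], [25.0, 4.0]])
# The same features as currently seen from the car (vehicle frame, meters).
observations = np.array([[2.05, 3.48], [10.02, -1.03], [14.97, 3.99]])

print(localize_against_map(prior_map, observations))  # ~[10.0, 0.0]: the car's estimated position
```

A production system would solve for full pose against thousands of features, but the dependence is the same one Musk calls “brittle”: if the prior map no longer matches reality, the match degrades until the fleet receives an update.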
Simulation
Waymo in particular is a big proponent of simulation. It often touts how its Carcraft system has driven seven billion miles in simulation, compared to 10 million miles on public roads as of last October. Musk argues that simulation alone cannot catch up to all the miles Tesla vehicles have driven, given the sheer number of weird incidents that happen on the road.
“The world is very weird. It has millions of corner cases, and if somebody can produce a self-driving simulation that accurately matches reality that in itself is a monumental achievement of human capability. They can’t, there’s no way.”