The announcement that Tesla is now equipping every car coming off its assembly line with hardware that the automaker believes is sufficient to enable full self-driving capability has been met with skepticism among industry watchers and has left the market mostly unimpressed.
Tesla’s stock price fell by 2% after the announcement, which indicates that the market either doubts Tesla can achieve level 5 full autonomy with the new hardware, or doesn’t understand the implications of having fully autonomous capabilities. The latter is unlikely, considering the value of self-driving technology to automakers has been broadly understood for a few years now.
The former is more likely the case here, since we have been told that lidar sensors are required for full autonomy and redundancy ever since self-driving vehicle development became mainstream in the industry, and Tesla famously does not use the laser-based sensor.
The decision is the latest example of the Tesla Autopilot team’s motto: ‘don’t let the perfect be the enemy of the better’. The idea is that by aiming for unattainable perfection with autonomous driving systems, you are delaying the deployment of systems better than human drivers and therefore potentially “killing people with statistics”.
That’s why Tesla is getting to market first by benchmarking its self-driving car against human drivers and not against other systems, like lidar-based ones. If Tesla can prove its system is reliably better at driving than humans, even if it’s just 50% or 100% better (not a perfect system, as companies like Volvo have been aiming for), it would have something infinitely more valuable than any other self-driving program, since it would actually be in vehicles available today and would therefore be making a difference.
Volvo is a great comparison point since the company sits at the polar opposite, literally aiming for a perfect system. The Sweden-based automaker has been heavily investing in autonomous driving technologies and has partnered with Uber and Autoliv to bring them to market. It sees autonomous driving as an important part of its goal of having no serious injuries or deaths in its new cars by 2020.
The current hardware Volvo is testing for full autonomy through its ‘Drive Me’ program consists of 360-degree camera coverage, including a triple front-facing camera, like Tesla’s new sensor suite, and 360-degree ultrasonics, again like Tesla. Both automakers also use Nvidia’s Drive PX 2 computing platform, but that’s where the similarities end: Volvo has 7 radar antennas against Tesla’s single antenna, and of course, it has a lidar sensor.
Volvo’s sensor and hardware suite undoubtedly provides more redundancy than Tesla’s, but it is also much more expensive and complex to integrate into a vehicle with sensor fusion, which is why Volvo doesn’t expect the version for full autonomy to be available for another 4 years.
In the meantime, Tesla is getting to market by benchmarking its system against human drivers, and it believes that its current hardware suite, combined with improvements to its ‘Tesla Vision’ software, can be at least twice as safe as human drivers, which is far from perfect, but should still be safe enough to earn the right to drive.
As it turns out, humans are not that great at driving cars. Globally, our driving mistakes kill over 1 million people per year. We really need to come down off our pedestal and realize that we don’t have any physical attribute that makes us particularly good at monitoring our driving environment and reacting to it, except maybe our brain, which is not bad at processing images.
That’s why Tesla CEO Elon Musk is referring to self-driving as a software problem, not a hardware problem, and one that Tesla hopes to solve throughout the next year by improving on its ‘Tesla Vision’ software, aka the brain of Tesla’s autonomous driving system.
As for the hardware, Tesla determined that its sensors are more accurate than the sensors of a human-based driving system (that’s your eyes), since they constantly look in six different directions and can detect objects hidden behind other objects thanks to radar.
Redundancy is another matter that is hard to grasp. Of course, fully redundant systems are preferable, but again: “don’t let the perfect be the enemy of the better”. Tesla determined that the failure rate of its sensors is actually better than the failure rate of human eyes, and the same goes for its onboard supercomputer, powered by Nvidia’s Drive PX 2 computing platform, versus human brains.
If the hardware’s failure rate over time is lower than a human’s, and it provides more information, including on wavelengths not visible to humans, then the hardware suite can theoretically support a self-driving system better than a human driver.
That’s the hardware installed in all Tesla vehicles from now on.
Where Tesla will have a great advantage over human drivers is experience. Tesla currently builds and delivers cars at a rate of ~25,000 per quarter. Assuming the company doesn’t grow during the next year, which is unlikely based on its history but let’s assume it anyway for the sake of simplicity, it will have at least 100,000 cars with its new hardware suite for full autonomy on the road by this time next year.
By running its ‘Tesla Vision’ software on all those cars in ‘shadow mode’, just to test the system and collect data, Tesla’s software will get more driving experience in one day than the average human driver gets in two lifetimes.
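A rough back-of-envelope calculation shows why that claim is plausible. The fleet size comes from the article’s own assumption; the per-car daily mileage and lifetime-mileage figures below are illustrative assumptions on my part, not numbers from Tesla:

```python
# Back-of-envelope: fleet data collection vs. one human driver's lifetime.
# FLEET_SIZE follows the article's ~100,000-car estimate; the other two
# inputs are illustrative assumptions, not figures from Tesla.

FLEET_SIZE = 100_000             # cars with the new hardware after one year
MILES_PER_CAR_PER_DAY = 30       # assumed average daily driving per car
HUMAN_LIFETIME_MILES = 800_000   # assumed lifetime mileage of an average driver

fleet_miles_per_day = FLEET_SIZE * MILES_PER_CAR_PER_DAY
lifetimes_per_day = fleet_miles_per_day / HUMAN_LIFETIME_MILES

print(f"Fleet miles per day: {fleet_miles_per_day:,}")          # 3,000,000
print(f"Human driving lifetimes per day: {lifetimes_per_day}")  # 3.75
```

Even with conservative assumptions, the fleet logs millions of miles per day, on the order of several human driving lifetimes, which is consistent with the “two lifetimes” figure above.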
In conclusion, I think the idea that Tesla can’t build a self-driving system safer than a human on its new hardware suite needs to be dispelled. But while it can be done, the company still has to actually do it, hopefully within the timeline set by CEO Elon Musk during the announcement last week.
It would be useful to have a visual representation of what Tesla Vision can see from the data collected through the new hardware suite. Hopefully, Tesla will release more content showing the capabilities of the system, not only footage like the video below but also the inner workings, and let the public gain some confidence in it.