Tesla has released a new video of what Autopilot’s neural net can see, as well as new images of other systems powering Autopilot, including a simulator with a Tesla Cybertruck in it and more.
If we are to believe Elon Musk, 2020 is going to be the year Tesla releases its full self-driving system built on Autopilot.
The company has made a lot of progress in recent years, most recently with the release of its Software v10 update, but there's still a lot of work to be done.
In order to do that work, Tesla is growing its already large Autopilot and AI team led by Andrej Karpathy, Tesla’s director of AI and computer vision.
As part of its hiring effort, the automaker created a new landing page for Autopilot team recruitment, and it includes some interesting comments and images about the program:
Tesla describes its effort in autonomy and artificial intelligence:
We develop and deploy autonomy at scale. We believe that an approach based on advanced AI for vision and planning, supported by efficient use of inference hardware is the only way to achieve a general solution to full self-driving.
The page also lists the different departments engineers can work in within the Autopilot and self-driving team:
- Hardware
- Neural Networks
- Autonomy Algorithms
- Code Foundations
- Evaluation Infrastructure
When it comes to hardware, Tesla released its Hardware 3.0 self-driving computer last year and claimed a 21x improvement in frames-per-second processing over the previous-generation Autopilot hardware.
As soon as Tesla announced the new hardware, it also talked about even better future generations, and now it is trying to attract new engineers to work on them:
Build silicon chips that power our full self-driving software from the ground up, taking every small architectural and micro-architectural improvement into account while pushing hard to squeeze maximum silicon performance-per-watt. Perform floor-planning, timing, and power analyses on the design. Write robust, randomized tests and scoreboards to verify functionality and performance. Implement compilers and drivers to program and communicate with the chip, with a strong focus on performance optimization and power savings. Finally, validate the silicon chip and bring it to mass production.
Tesla is also actively recruiting engineers and programmers to help develop its neural nets.
On the landing page, Tesla released a new video of what the Autopilot neural net can see:
In a recent interview on the Third Row Tesla podcast, CEO Elon Musk said that Tesla is currently undergoing “a significant foundational rewrite in the Tesla Autopilot.”
As part of the rewrite, Musk says that the “neural net is absorbing more and more of the problem.”
Building the code is one thing, but Tesla also has to extensively test it:
Build open- and closed-loop, hardware-in-the-loop evaluation tools and infrastructure at scale, to accelerate the pace of innovation, track performance improvements and prevent regressions. Leverage anonymized characteristic clips from our fleet and integrate them into large suites of test cases. Write code simulating our real-world environment, producing highly realistic graphics and other sensor data that feed our Autopilot software for live debugging or automated testing.
The automaker released an interesting image for its evaluation infrastructure recruitment with a glimpse of its simulator:

In the top-left image, you can see Tesla's simulator, which appears to include many of Tesla's vehicles, including the new Cybertruck.
To apply, candidates only need to provide their name and email and describe some of the "exceptional work" they have done.
Musk says that Tesla will soon release the "feature-complete" version of its Autopilot system, which can handle intersections. He believes it will lead to a full self-driving system by the end of the year.