At this point, 8 years after Google put a spotlight on self-driving technology, there are over two dozen somewhat serious companies with autonomous driving programs at different stages of development.

Tesla’s Autopilot is among the most well-known and arguably one of the most exciting since it’s already powering features in vehicles owned by customers. For better or worse, it lets people experiment with some aspects of it, and through those experiments, we now get a look at the Autopilot’s debug mode – giving insight into the back end of Tesla’s semi-autonomous system.

Tesla’s second-generation Autopilot is quite complex, but in short, it consists of a computer vision technology called Tesla Vision that uses images fed from 8 cameras around the vehicle (currently mainly the 3 front-facing cameras) in order to steer the vehicle with the help of GPS and radar data.

With the data gathered through its entire fleet, Tesla is also building “high-precision maps”, and its vehicles can download “tiles” based on their location and use them to steer themselves more accurately.
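Tesla hasn’t published how its tiles are indexed, but the general idea of turning a GPS coordinate into a map-tile key is well established. As a purely illustrative sketch, here is the standard “slippy map” (Web Mercator) tiling math used by OpenStreetMap – not Tesla’s actual scheme:

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Map a GPS coordinate to a slippy-map tile index (Web Mercator).

    This is the standard OpenStreetMap tiling scheme, shown only to
    illustrate how a vehicle could turn its position into a tile key;
    Tesla's real tile format and indexing are not public.
    """
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom  # tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# A car would request the tile covering its current fix, e.g.:
print(latlon_to_tile(37.4419, -122.1430, 15))
```

At higher zoom levels each tile covers a smaller patch of road, so a vehicle only needs to fetch the handful of tiles along its route rather than a whole regional map.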

At any given time, Autopilot uses one of these technologies, or a fusion of them, in order to operate. The Tesla Vision system can steer by either following a lead vehicle or detecting lane markings.
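The report doesn’t say how Autopilot decides which input to trust at a given moment, but a per-frame selection between lane lines, a lead vehicle, and map data can be sketched abstractly. Everything below – the function name, the confidence thresholds, the fallback order – is invented for illustration, not Tesla’s actual logic:

```python
# Hypothetical sketch of how a driver-assist system might pick its
# steering reference each frame. Names and thresholds are assumptions;
# Tesla's real fusion logic is not public.

def pick_steering_source(lane_conf, lead_conf, map_tile_available):
    """Return which input to steer from, given per-frame confidences (0..1)."""
    if lane_conf >= 0.7:
        return "lane_lines"       # clear markings: steer from vision
    if lead_conf >= 0.7:
        return "lead_vehicle"     # no markings, but a car ahead to follow
    if map_tile_available:
        return "map_tiles"        # fall back to high-precision map data
    return "disengage"            # nothing reliable: hand control back

# e.g. faded paint but a truck ahead:
print(pick_steering_source(0.3, 0.9, True))
```

The debug overlay described below is interesting precisely because it exposes this kind of decision: it shows which source the system is actually relying on as you drive.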

Tesla’s Autopilot debug mode, which Tesla Motors Club member ‘verygreen’ managed to hack, tells us exactly which of those inputs the system is using to make its decisions. He posted his latest discoveries from the system in an interesting thread on the forum.

It shows some Autopilot settings currently unavailable to Tesla owners (picture credits to ‘verygreen’):

Of course, ‘Augmented Vision’ caught everyone’s attention, especially after all the talk about heads-up displays, but the options in the tab don’t tell us a lot about it:

verygreen noted that it should “be displaying a video feed of some sort”, but he can’t make it work on his car.

As we previously noted in reports about Tesla owners hacking their vehicles, Tesla has one software build that it pushes to all its vehicles which is then limited on the user’s end. For example, a development vehicle in Tesla’s internal fleet could have the same software build as verygreen’s but with access to the functions that he is seeing in the debug mode.

When driving with the debug mode enabled, he can see in real time the information that Autopilot is using, like the GPS data and map tiles:

He even posted a video of the debug mode as he was driving his Model S. You can see what the Autopilot is seeing in real time: