Hackers show how Tesla Autopilot can be tricked

Keen Security Lab, a group of security researchers backed by Tesla investor Tencent, released a new report about Tesla’s Autopilot system and how to trick it.

We have previously reported on Keen Lab’s multiple “white hat” efforts to hack Tesla vehicles.

They have managed to take control of Tesla vehicles on several occasions, and their research has led Tesla to strengthen its software security.

Now they are turning their attention to Tesla’s Autopilot system in a new research paper released today.

Instead of directly hacking the Advanced Driver Assistance System (ADAS) software, Keen Lab explored how the system could be tricked with misleading visual inputs.

At first, they simply tried to trick the automatic wipers, which rely on Autopilot's front-facing camera and computer vision system, by showing the camera images of water, which triggered the wipers:

Tesla wasn’t impressed by Keen Lab’s demonstration and commented:

“This research was demonstrated by displaying an image on a TV that was placed directly in front of the windshield of a car. This is not a real-world situation that drivers would face, nor is it a safety or security issue. Additionally, as we state in our Owner’s Manual, the ‘Auto setting [for our windshield wipers] is currently in BETA.’ A customer can also elect to use the manual windshield wiper setting at any time.”

The team then ramped things up with a far more dangerous trick, using stickers on the road to interfere with the lane recognition system:

“Tesla Autopilot recognizes lanes and assists control by identifying road traffic markings. Based on the research, we proved that by placing interference stickers on the road, the Autopilot system will capture this information and make an abnormal judgement, which causes the vehicle to enter the reverse lane.”
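In other words, the stickers act as extra “lane evidence” that the vision system folds into its lane estimate. As a rough, hypothetical sketch of that general idea (not Keen Lab's actual method, and nothing like Tesla's real lane detector), the toy Python example below fits a straight line to bright road markings and shows how a few small bright patches can pull the fitted lane sideways:

# Hypothetical sketch only: a toy lane "detector" that fits a straight line
# to bright road markings, showing how a few extra bright patches (the
# "interference stickers") pull the fitted lane sideways. This is not
# Keen Lab's code and not Tesla's algorithm.
import numpy as np

def detect_lane(image, threshold=0.5):
    # Crude lane estimate: least-squares line x = a*y + b through bright pixels.
    ys, xs = np.nonzero(image > threshold)
    a, b = np.polyfit(ys, xs, 1)
    return a, b

# Synthetic 100x100 "road" with one genuine lane marking at column 40.
road = np.zeros((100, 100))
road[:, 40] = 1.0
a, b = detect_lane(road)
print(f"clean road:    slope {a:+.3f}, lane at bottom of frame x = {a * 99 + b:.1f}")

# Add three small bright "stickers" drifting toward column 70.
attacked = road.copy()
for y, x in [(20, 50), (50, 60), (80, 70)]:   # (row, column) of each sticker
    attacked[y:y + 3, x:x + 3] = 1.0
a, b = detect_lane(attacked)
print(f"with stickers: slope {a:+.3f}, lane at bottom of frame x = {a * 99 + b:.1f}")
# A handful of extra markings tilts the fitted lane toward the stickers,
# which is the general failure mode the researchers exploited with far
# more sophistication on the real system.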

Here’s a video of how the Autopilot system reacts to the trick:

Tesla also responded to this test, saying that it would be up to the driver to correct the situation:

“In this demonstration the researchers adjusted the physical environment (e.g. placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when Autopilot is in use. This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”

Here’s Keen Security Lab’s full Autopilot report:

[Embedded Scribd document: Keen Security Lab's Tesla Autopilot report (id 404017043)]



