Fake signals can fool self-driving cars into seeing danger ahead when there isn't any. The vulnerability could bring a self-driving car to a halt in the middle of the road when it believes another car or pedestrian has suddenly appeared in view. It would take only one prankster to make a self-driving car swerve or stop to avoid a non-existent threat.
Hackers are not as dangerous a threat to self-driving cars as someone nearby with a handful of inexpensive electronics.
Security researcher Jonathan Petit, principal scientist at software security company Security Innovation, has determined that you can fool lidar (the laser-ranging sensor common on autonomous vehicles) by sending “echoes” of fake cars and other objects as laser pulses. Petit’s attack works by deceiving the car’s sensors rather than by breaking into its software.
All one needs is a low-power laser, a basic computing device (an Arduino kit or Raspberry Pi is enough) and the right timing; good aim isn’t even required. In his proof-of-concept attack, Petit managed to spoof objects from as far as 100 meters (330 feet) away, without accurately focusing his laser beam on the lidar unit.
Petit Has Unearthed a Gaping Security Vulnerability in Lidar Sensors
Petit began by recording pulses from a commercial Ibeo Lux lidar unit. Because the pulses were neither encoded nor encrypted, he could simply replay them later to fool the unit into believing objects were there when they weren’t. “The only tricky part was to be synchronized, to fire the signal back at the lidar at the right time,” said Petit. “Then the lidar thought that there was clearly an object there.”
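Petit’s “only tricky part,” the synchronization, comes down to simple time-of-flight arithmetic: a lidar infers range from how long a pulse takes to return, so an attacker who replays a recorded pulse after a chosen delay controls the distance the unit reports. Here is a minimal sketch of that math (the function names and numbers are illustrative, not Petit’s actual tooling):

```python
# Time-of-flight arithmetic behind a lidar replay attack (illustrative only;
# these helpers are hypothetical, not from Petit's proof of concept).
C = 299_792_458.0  # speed of light in m/s

def echo_delay(phantom_distance_m: float) -> float:
    """Delay after the lidar's outgoing pulse at which to fire a recorded
    pulse so the unit computes the chosen phantom range (round trip: 2d/c)."""
    return 2.0 * phantom_distance_m / C

def perceived_distance(delay_s: float) -> float:
    """Range the lidar infers from a pulse arriving delay_s after emission."""
    return C * delay_s / 2.0

# To conjure a "car" 20 m ahead, replay the pulse about 133 ns after emission.
delay = echo_delay(20.0)
print(f"{delay * 1e9:.1f} ns")      # ≈ 133.4 ns
print(perceived_distance(delay))    # 20.0 (meters)
```

The nanosecond scale of these delays is why a microcontroller such as an Arduino suffices: the attacker only needs consistent timing relative to the lidar’s own pulse train, not high computing power.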
Vulnerability May Be Plugged
There’s no guarantee that this will be a major issue if and when self-driving cars become commonplace. Petit’s technique works only so long as lidar units’ pulses aren’t encrypted or otherwise obscured. While that’s true of many commercial systems at the moment, it’s possible that production-ready vehicles will lock things down.
Wake-Up Call For Self-Driving Car Manufacturers
This latest car hack is yet another signal that carmakers have a lot of work ahead of them if they’re going to secure their robotic rides. Industries that have historically been far removed from data security, auto manufacturing among them, now need to start taking it seriously.