Fake Signals Fool Self-Driving Cars Into Falsely Seeing Danger

Google Car Can Be Fooled By Fake Signals

Fake signals fool self-driving cars into thinking there is danger ahead when there isn’t. The vulnerability could halt self-driving cars in the middle of the road when they believe another car or person has appeared in view suddenly. It would only take one prankster to make a self-driving car swerve or stop to avoid a non-existent threat.

Hackers are not as dangerous a threat to self-driving cars as someone nearby with a handful of inexpensive electronics.

Security researcher Jonathan Petit, principal scientist at software security company Security Innovation, has determined that you can fool LIDAR (the laser-ranging sensor common on autonomous vehicles) by sending "echoes" of fake cars and other objects via laser pulses. Petit's attack works by fooling a car's sensors, rather than by exploiting weak software security.

All one needs is a low-power laser, a basic computing device (an Arduino kit or Raspberry Pi is enough) and the right timing — one doesn't even need good aim. Petit managed to spoof objects from as far as 100 meters (330 feet) away in his proof-of-concept attack. He didn't even need to focus his laser beam accurately on the LIDAR unit.

Petit has unearthed a gaping security vulnerability in LIDAR sensors

Fake Signals Fool Self-Driving Cars' LIDAR Systems Into Falsely Seeing Danger

Petit began by recording pulses from a commercial Ibeo Lux LIDAR unit. Discovering the pulses weren't encoded or encrypted, he could simply replay them at a later time to fool the unit into believing objects were there when they weren't. "The only tricky part was to be synchronized, to fire the signal back at the Lidar at the right time," said Petit. "Then the Lidar thought that there was clearly an object there."
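The timing Petit describes is the crux of the attack: a LIDAR unit converts the round-trip time of its own pulse into a distance, so a spoofed "echo" fired after the right delay registers as an object at whatever range the attacker chooses. The sketch below is purely illustrative (the article gives no code or hardware details; the function name and the 20-meter figure are ours), showing only the delay arithmetic a replay device would have to hit.

```python
# Illustrative timing math behind a LIDAR replay attack.
# Assumption: the unit measures distance as (round-trip time * c) / 2,
# so a fake echo delayed by 2*d/c reads as an object at distance d.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def echo_delay_seconds(fake_distance_m: float) -> float:
    """Delay after the LIDAR's outgoing pulse at which to fire the
    spoofed echo so the unit computes the desired distance."""
    return 2 * fake_distance_m / SPEED_OF_LIGHT

# To make the unit "see" an object 20 m ahead, the spoofed pulse must
# arrive roughly 133 nanoseconds after the real pulse goes out.
delay_ns = echo_delay_seconds(20.0) * 1e9
print(round(delay_ns, 1))
```

Nanosecond-scale timing sounds demanding, but it is well within reach of cheap microcontroller-driven laser hardware, which is consistent with Petit's point that an Arduino or Raspberry Pi class device suffices.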

Vulnerability May Be Plugged

There’s no guarantee that this will be a major issue if and when self-driving cars become commonplace. Petit’s technique works only so long as LIDAR units’ pulses aren’t encrypted or otherwise obscured. While that’s true of many commercial systems at the moment, it’s possible that production-ready vehicles will lock things down.

Wake Up Call For Self-Driving Car Manufacturers

This latest car hack is yet another signal that car makers have a lot of work ahead of them if they're going to secure their robotic rides. Industries such as auto manufacturing, historically far removed from data security, now need to start taking it seriously.


About Author

Bill is an Associate Producer at Horsepower Broadcasting as well as our Operations Analyst. He personally oversees nearly all of the myriad interviews with our automotive celebrity guests. He handles scheduling, contacts, press releases, press passes and everything in between. His keen intellect is awe-inspiring and he is a true academician in every sense of the term.

Many of our business-related blogs and posts are created and written by him, as business is the category Bill is most familiar with. He has a wide range of interests in business-related subjects including marketing, sales, finance, motivation, leadership, banking, technology and leading-edge thinking.

His input and contributions to the show are invaluable and we are grateful he is a part of our team.
