
Phantom Images Can Fool Tesla’s Autopilot

October 12, 2020

Homeland & Cyber Security

Wired — Safety concerns over automated driver-assistance systems like Tesla’s usually focus on what the car can’t see, like the white side of a truck that one Tesla confused with a bright sky in 2016, leading to the death of a driver.

But a group of BGU researchers has been focused on what autonomous driving systems might see that a human driver doesn’t—including “phantom” objects and signs that aren’t really there, which could wreak havoc on the road.

The BGU researchers have spent the last two years experimenting with those “phantom” images to trick semi-autonomous driving systems.

They previously revealed that they could use split-second light projections on roads to successfully trick Tesla’s driver-assistance systems into automatically stopping without warning when its camera sees spoofed images of road signs or pedestrians.

In new research, they’ve found they can pull off the same trick with just a few frames of a road sign injected on a billboard’s video. And they warn that if hackers hijacked an internet-connected billboard to carry out the trick, it could be used to cause traffic jams or even road accidents while leaving little evidence behind.

Dr. Yisroel Mirsky

“The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that’s dangerous,” says Dr. Yisroel Mirsky of BGU’s Cybersecurity Research Center and a postdoctoral researcher at Georgia Tech. “The driver won’t even notice at all. So somebody’s car will just react, and they won’t understand why.”

In their first round of research, published earlier this year, the team projected images of human figures onto a road, as well as road signs onto trees and other surfaces.

In this latest set of experiments, the researchers injected frames of a phantom stop sign on digital billboards, simulating what they describe as a scenario in which someone hacked into a roadside billboard to alter its video.

They found that, with just a few frames of altered video, they could again trick a Tesla or cause a Mobileye driver-assistance device to give the driver mistaken alerts.
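As a rough illustration of that scenario only, the sketch below splices a sign image into roughly half a second of frames in a video clip. The file names, timing, overlay size, and position are assumptions made for the example, not details taken from the research.

```python
# Hedged sketch: overlay a "phantom" sign image onto ~0.5 s of frames in a video,
# simulating the kind of brief billboard modification described above.
import cv2

def inject_phantom(video_in, video_out, sign_path, start_s=5.0, duration_s=0.5):
    cap = cv2.VideoCapture(video_in)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(video_out, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

    sign = cv2.imread(sign_path)               # phantom road-sign image (assumed file)
    sign = cv2.resize(sign, (w // 4, h // 4))  # scale to a corner-sized patch
    first = int(start_s * fps)                 # first frame to modify
    last = first + int(duration_s * fps)       # ~half a second of frames

    i = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if first <= i < last:
            # Overlay the sign in the top-right corner for the brief window only.
            frame[0:sign.shape[0], w - sign.shape[1]:w] = sign
        writer.write(frame)
        i += 1

    cap.release()
    writer.release()
```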

They also experimented with finding spots in a video frame that would attract the least notice from a human eye, going so far as to develop their own algorithm for identifying key blocks of pixels in an image so that a half-second phantom road sign could be slipped into the “uninteresting” portions.
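One simple way to approximate that idea, offered here only as a sketch and not as the researchers’ published algorithm, is to score fixed-size blocks of a frame by local texture and pick the flattest one as the least noticeable place to embed a sign:

```python
# Hedged sketch: rank fixed-size pixel blocks by local variance and return the
# lowest-scoring (least textured) block as a candidate "uninteresting" region.
import cv2
import numpy as np

def least_noticeable_block(frame_bgr, block=64):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    h, w = gray.shape
    best, best_score = None, None
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = gray[y:y + block, x:x + block]
            score = patch.var()  # flat, low-detail blocks score lowest
            if best_score is None or score < best_score:
                best, best_score = (x, y), score
    return best  # top-left corner of the flattest block
```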

Ben Nassi

The team speculates that the phantom attacks could be carried out as an extortion technique, as an act of terrorism, or for pure mischief. “Previous methods leave forensic evidence and require complicated preparation,” says BGU Ph.D. student Ben Nassi. “Phantom attacks can be done purely remotely, and they do not require any special expertise.”

“If you implement a system that ignores phantoms if they’re not validated by other sensors, you will probably have some accidents,” says Nassi. “Mitigating phantoms comes with a price.”

Read more in Wired >>