A team of researchers from Ben-Gurion University of the Negev and Georgia Tech attempted to trick Tesla's Autopilot system into perceiving "phantom objects" as real ones. What they found was that the attack was surprisingly easy to pull off.
A cheap projector displaying false speed limit signs in trees, or shining a Slenderman-like figure onto the road, can force Autopilot to change its behavior: it adjusts speed to match the phantom "road signs" and slows down for what it thinks might be a pedestrian (never mind that the car still drives over the projection).
This suggests that computer vision systems still have a long way to go before their perceptual capabilities are robust. In the meantime, drivers whose cars are equipped with Tesla's Autopilot, or a similar system, are exposed to this kind of attack.
“We show how attackers can exploit this perceptual challenge to apply phantom attacks … without the need to physically approach the attack scene…”
More details about this study over at Popular Mechanics.
(Image Credit: Cyber Security Labs @ Ben Gurion University/ YouTube)