Hackers Trick Tesla Model S Autopilot Into Making Obstacles ‘Disappear’

Researchers fool Tesla Model S Autopilot system

The debate over the reliability of autonomous vehicles has been going on for a long time. Now another concern has surfaced: the danger of hackers successfully manipulating the sensors and software a self-driving car depends on.

Researchers from the University of South Carolina, Zhejiang University in China, and the Chinese security firm Qihoo 360 have found a potential flaw in Tesla’s Autopilot semi-autonomous driving system. The team exploited weaknesses in the system’s sensors that lead researcher and USC Professor Wenyuan Xu said “highly motivated people” could use “to cause personal damage or property damage.”

The group, which used off-the-shelf equipment to carry out the attacks, will present its findings at the DEF CON hacking conference in Las Vegas.

According to Wired, “Tesla’s autopilot detects the car’s surroundings three different ways: with radar, ultrasonic sensors, and cameras. The researchers attacked all of them, and found that only their radar attacks might have the potential to cause a high-speed collision. They used two pieces of radio equipment—a $90,000 signal generator from Keysight Technologies and a VDI frequency multiplier costing several hundred dollars more—to precisely jam the radio signals that the Tesla’s radar sensor, located under its front grill, bounces off of objects to determine their position. The researchers placed the equipment on a cart in front of the Tesla to simulate another vehicle.”

“When there’s jamming, the ‘car’ disappears, and there’s no warning,” Prof. Xu says.
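The principle behind the radar attack can be sketched in a few lines of Python. The numbers below are illustrative assumptions, not real radar figures or the team’s actual method: a simple threshold detector stops seeing the echo once the jammer’s power dominates the receiver’s background.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative, made-up power levels (linear units), not real radar figures.
NOISE_FLOOR = 1.0       # baseline receiver noise power
ECHO_POWER = 25.0       # power of the genuine echo from the obstacle
DETECTION_SNR = 10.0    # detector declares "object" above this ratio

def object_detected(jammer_power: float) -> bool:
    """Threshold detector: is the echo still visible above noise + jamming?"""
    background = NOISE_FLOOR + jammer_power + abs(rng.normal(0, 0.1))
    return ECHO_POWER / background > DETECTION_SNR

print(object_detected(jammer_power=0.0))    # True: the obstacle is seen
print(object_detected(jammer_power=100.0))  # False: it 'disappears', no warning
```

In the real attack the “jammer” was a precisely tuned signal generator rather than broadband noise, but the effect on the detector is the same: the genuine return is no longer distinguishable from the background.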

The ultrasonic sensors are far easier to target, though only at low speeds. To tamper with the Model S’ self-parking and Summon features, the researchers used roughly $40 worth of equipment: an Arduino board, an ultrasonic transducer, and a function generator. They plan to demonstrate how they can fool a self-parking Tesla into completely missing an obstacle in its path.

To jam the Model S’ ultrasonic sensors, which Teslas use to detect nearby objects for features such as self-parking, the researchers built a DIY ultrasonic jammer around the Arduino board. The jammer flooded the sensors with sound, drowning out the real echoes bouncing off an obstacle and removing it from Autopilot’s view. Had the researchers let it, the Tesla would have collided with the obstacle.
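A similar toy model, again with made-up numbers rather than anything from the researchers’ rig, shows how swamping an ultrasonic sensor hides an obstacle: the time-of-flight echo that normally yields a distance is buried in jammer noise, so the sensor finds no clean return.

```python
import numpy as np
from typing import Optional

rng = np.random.default_rng(1)

SPEED_OF_SOUND = 343.0   # m/s
FS = 100_000             # sample rate in Hz (illustrative)
MARGIN = 0.5             # an echo must stand out this far above background

def measure_distance(obstacle_m: float, jammer_amp: float = 0.0) -> Optional[float]:
    """Toy time-of-flight ranging: return a distance, or None if no clean echo."""
    t = np.arange(0, 0.02, 1.0 / FS)               # 20 ms listening window
    trace = 0.02 * rng.normal(size=t.size)         # receiver noise
    trace += jammer_amp * rng.normal(size=t.size)  # jammer floods the band
    echo_idx = int(2 * obstacle_m / SPEED_OF_SOUND * FS)
    trace[echo_idx] += 1.0                         # the genuine echo pulse
    background = np.median(np.abs(trace))
    peaks = np.flatnonzero(np.abs(trace) > background + MARGIN)
    if peaks.size != 1:                            # nothing, or a wall of noise
        return None                                # sensor reports no object
    return peaks[0] * SPEED_OF_SOUND / (2 * FS)

print(measure_distance(1.0))                   # ~1.0 m: obstacle detected
print(measure_distance(1.0, jammer_amp=2.0))   # None: the echo is masked
```

The same denial effect is what the Arduino-based jammer exploits: keep the transducer transmitting constantly, and the sensor never finds the echo it is listening for.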

The Tesla’s camera systems, however, proved the most resistant to attack. The researchers shined lasers and LEDs at the cameras to blind them, and Xu’s team managed to kill a few pixels on the camera sensors. But when they tried to block a camera outright, Autopilot simply blacked out and warned the driver to take the wheel.

For Prof. Xu, the point is less to break the system than to push Tesla into adding protections to Autopilot.

“I don’t want to send out a signal that the sky is falling, or that you shouldn’t use autopilot. These attacks actually require some skills,” Xu told Wired. “[Tesla] need to think about adding detection mechanisms as well. If the noise is extremely high, or there’s something abnormal, the radar should warn the central data processing system and say ‘I’m not sure I’m working properly.'”
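Xu’s suggestion amounts to a plausibility self-check on the sensor’s noise floor. Here is a minimal sketch of that idea; the `RadarFrame` interface, baseline, and margin are hypothetical, not Tesla’s actual design:

```python
from dataclasses import dataclass

@dataclass
class RadarFrame:
    noise_floor_db: float   # measured in-band noise for this frame
    echo_count: int         # returns above the detection threshold

# Assumed baseline; a real system would calibrate this per sensor.
NORMAL_NOISE_DB = -90.0
NOISE_MARGIN_DB = 20.0

def radar_health(frame: RadarFrame) -> str:
    """Flag frames whose noise floor is abnormally high, per Xu's idea:
    instead of silently reporting 'no object', the sensor tells the
    central data processor it may not be working properly."""
    if frame.noise_floor_db > NORMAL_NOISE_DB + NOISE_MARGIN_DB:
        return "DEGRADED: possible jamming, warn central data processing"
    return "OK"

print(radar_health(RadarFrame(noise_floor_db=-88.0, echo_count=3)))  # OK
print(radar_health(RadarFrame(noise_floor_db=-45.0, echo_count=0)))  # DEGRADED
```

A degraded flag like this would let Autopilot hand control back to the driver instead of silently reporting an empty road.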

Acknowledging the work of Prof. Xu and his team, Tesla told Wired in a statement: “We have reviewed these results with Wenyuan’s team and have thus far not been able to reproduce any real-world cases that pose risk to Tesla drivers.”

Source: Wired
