Phantom of the ADAS – Phantom Attacks Against Advanced Driving Assistance Systems

Pierluigi Paganini January 29, 2020

Researchers investigate a new perceptual challenge that causes the ADAS systems and autopilots of semi/fully autonomous cars to consider depthless objects (phantoms) real.


The absence of deployed vehicular communication systems, which prevents the advanced driving assistance systems (ADASs) and autopilots of semi/fully autonomous cars from validating their virtual perception of the surrounding physical environment with a third party, has been exploited in various attacks suggested by researchers. Since the application of these attacks comes with a cost (exposure of the attacker’s identity), the delicate exposure vs. application balance has held, and attacks of this kind have not yet been encountered in the wild. In this paper, we investigate a new perceptual challenge that causes the ADASs and autopilots of semi/fully autonomous cars to consider depthless objects (phantoms) as real. We show how attackers can exploit this perceptual challenge to apply phantom attacks and change the abovementioned balance, without the need to physically approach the attack scene, by projecting a phantom via a drone equipped with a portable projector or by presenting a phantom on a hacked digital billboard that faces the Internet and is located near a road. We show that the car industry has not considered this type of attack by demonstrating it on today’s most advanced ADAS and autopilot technologies: the Mobileye 630 PRO and the Tesla Model X (HW 2.5). Our experiments show that when presented with various phantoms, a car’s ADAS or autopilot considers the phantoms real objects, causing these systems to trigger the brakes, steer into the lane of oncoming traffic, and issue notifications about fake road signs. To mitigate this attack, we present a model that analyzes a detected object’s context, surface, and reflected light and is capable of detecting phantoms with 0.99 AUC. Finally, we explain why the deployment of vehicular communication systems might reduce attackers’ opportunities to apply phantom attacks but won’t eliminate them.

The Perceptual Challenge

Would you consider the projection of the person and road sign real?
Tesla considers the projected character a real person.
Mobileye 630 PRO considers the projected road sign as a real road sign.



A phantom is a depthless object intended to cause ADASs and autopilot systems to perceive it and consider it a real object.
The object can be an obstacle (e.g., a person, car, truck, or motorcycle), a lane, or a road sign.

For example, the picture below presents a projected phantom of a car that was detected by the Tesla (HW 2.5) which considered it a real car.


Phantoms can also be embedded in advertisements projected on digital billboards located near roads. Try to spot the phantom presented for 125 milliseconds in this advertisement.

Phantoms can also be projected via a portable projector mounted to a drone.
Try to spot the phantom projected for 125 milliseconds from the drone.
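To see why such a brief appearance can be enough, here is a back-of-the-envelope estimate (our own arithmetic, not a figure from the paper) of how many frames of a vehicle camera running at a given frame rate would capture a phantom displayed for 125 milliseconds:

```python
# Rough estimate: how many full frame intervals overlap a phantom
# shown for a given duration, at a given camera frame rate.
def frames_capturing(phantom_ms: float, fps: float) -> int:
    """Number of whole frames exposed while the phantom is visible."""
    return int(phantom_ms / 1000 * fps)

# At typical frame rates, a 125 ms phantom appears in several frames:
for fps in (24, 30, 60):
    print(f"{fps} fps -> {frames_capturing(125, fps)} frames")
```

Even at 24 fps, a 125 ms phantom spans three full frames, which is enough for a detector that classifies objects frame by frame, while remaining short enough that a human viewer is unlikely to notice it.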

Phantoms can also cause the Tesla Model X (HW 2.5) to brake suddenly. See how the car reduces its speed from 18 MPH to 14 MPH as a result of a phantom that is detected as a person.

Phantoms can also cause the Tesla Model X’s (HW 2.5) autopilot to deviate to the lane of oncoming traffic. See how the car crosses the yellow line as a result of the phantom lanes.


Are phantoms bugs?

No. Phantoms are definitely not bugs. They are not the result of poor code implementation in terms of security. They are not a classic exploit (e.g., a buffer overflow or SQL injection) that can be easily patched by adding an “if” statement. They reflect a fundamental flaw of object detection models that were not trained to distinguish between real and fake objects.

Why are phantom attacks so dangerous?

Previous attacks:

  1. Necessitate that the attackers approach the attack scene in order to manipulate an object using a physical artifact (e.g., stickers, graffiti) or to set up the required equipment, acts that can expose attackers’ identities.
  2. Require skilled attackers (experts in radio spoofing or adversarial machine learning techniques). 
  3. Require full knowledge of the attacked model.
  4. Leave forensic evidence at the attack scene.
  5. Require complicated/extensive preparation (e.g., a long preprocessing phase to find an evading instance that would be misclassified by a model).

Phantom attacks:

  1. Can be applied remotely (using a drone equipped with a portable projector or by hacking digital billboards that face the Internet and are located close to roads), thereby eliminating the need to physically approach the attack scene, changing the exposure vs. application balance.
  2. Do not require any special expertise.
  3. Do not rely on a white-box approach. 
  4. Do not leave any evidence at the attack scene.
  5. Do not require any complex preparation.
  6. Can be applied with cheap equipment (a few hundred dollars).

Why does Tesla consider phantoms real obstacles?

We believe that this is probably the result of a “better safe than sorry” policy that considers a visual projection a real object even though the object is not detected by other sensors (e.g., radar and ultrasonic sensors).
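A minimal sketch of what such a conservative fusion policy might look like (function names and the OR-fusion rule are our assumptions for illustration; Tesla’s actual fusion logic is not public):

```python
# Hypothetical "better safe than sorry" sensor fusion: act if ANY sensor
# reports an obstacle, rather than requiring agreement between sensors.
def should_brake(camera_sees_obstacle: bool,
                 radar_sees_obstacle: bool,
                 ultrasonic_sees_obstacle: bool) -> bool:
    # OR-fusion: a single positive reading is enough to trigger a response.
    return camera_sees_obstacle or radar_sees_obstacle or ultrasonic_sees_obstacle

# A projected phantom fools only the camera, yet still triggers braking:
print(should_brake(True, False, False))  # True
```

Under such a policy, a phantom needs to fool only one sensor (the camera) to cause a reaction, even when radar and ultrasonic sensors correctly report nothing ahead.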

Can phantoms be classified solely based on a camera?

Yes. By examining a detected object’s context, reflected light, and surface, we were able to train a model that accurately detects phantoms (0.99 AUC).
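The idea can be sketched as scoring a detected object on several camera-only aspects and combining the scores into a real-vs-phantom decision. The scorers and the averaging combiner below are stand-ins of our own (the researchers train neural networks for each aspect and report 0.99 AUC; this toy version only illustrates the structure):

```python
# Hedged sketch: camera-only phantom detection by combining per-aspect scores.
from dataclasses import dataclass

@dataclass
class AspectScores:
    context: float  # is the object plausibly placed (e.g., on the road, not on a wall)?
    surface: float  # does the surface look three-dimensional rather than flat/projected?
    light: float    # is the reflected light consistent with a physical object?

def is_real(scores: AspectScores, threshold: float = 0.5) -> bool:
    # Simple average combiner; the real system would learn this combination.
    combined = (scores.context + scores.surface + scores.light) / 3
    return combined >= threshold

# A projected phantom: plausible context, but a flat surface and odd lighting.
phantom = AspectScores(context=0.8, surface=0.1, light=0.2)
print(is_real(phantom))  # False
```

The key point is that no aspect alone is decisive: a phantom can sit in a plausible context, but its flat surface and inconsistent reflected light betray it when the aspects are considered together.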

Will the deployment of vehicular communication systems eliminate phantom attacks?

No. The deployment of vehicular communication systems might limit the opportunities attackers have to apply phantom attacks, but won’t eliminate them.

Did you disclose your findings to Mobileye and Tesla?

Yes. We kept Tesla and Mobileye updated via a series of emails sent from early May to October 19.

Additional technical details, including the research paper, are available here:


Pierluigi Paganini

(SecurityAffairs – Phantom attacks, ADAS)
