Self-driving cars can be fooled by displaying virtual objects


Self-driving cars are one of the coolest innovations of the 21st century, and for good reason: you could finally sleep through your daily commute (not recommended, though).


However, as with any piece of technology, they come with flaws. Many have been discovered before, but one has only recently come to light.


See: Self-Driving Cars Can Be Tricked Into Misreading Street Signs


The attack was explored by a group of researchers from Ben-Gurion University of the Negev. Their tests targeted two commercial advanced driver-assistance systems (ADASs), a Tesla Model X (versions HW 2.5 and HW 3.0) and a Mobileye 630, in front of which "phantom" objects were displayed.


These phantoms were not physical objects at all, merely projected images of them, yet both systems detected them as real obstacles and reacted by stopping.

Examples of how this was done include a virtual road sign and an image of a pedestrian displayed using a projector or a digital billboard. The Tesla reacted to a phantom within 0.42 seconds, while the Mobileye 630 reacted even faster, within 0.125 seconds.






Attackers could exploit this maliciously to cause traffic jams and abrupt stops, which could result in accidents.


In response to this, the researchers believe countermeasures should be deployed, such as a camera-based check that would be able to detect when a perceived object is not physically real.
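The article does not detail how such a check would work, but the general idea of validating that a detection corresponds to a physical object can be illustrated. The sketch below is a hypothetical, simplified plausibility check in Python; the function names, thresholds, and use of depth data are assumptions for illustration, not the researchers' actual countermeasure.

```python
# Illustrative sketch only: a toy plausibility check that cross-references a
# camera detection against depth data before allowing an emergency stop.
# Names, thresholds, and data structures here are hypothetical.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str             # e.g. "pedestrian" or "stop_sign"
    confidence: float      # classifier confidence, 0.0 - 1.0
    median_depth_m: float  # median depth (metres) inside the bounding box
    depth_variance: float  # depth spread inside the box; flat surfaces are near zero


def is_likely_phantom(det: Detection,
                      min_confidence: float = 0.6,
                      min_depth_variance: float = 0.05) -> bool:
    """Return True if the detection looks like a flat, projected image.

    A real pedestrian or sign has measurable 3D structure, so depth values
    inside its bounding box vary. A phantom projected onto a road surface,
    wall, or billboard is essentially flat, so its depth variance collapses.
    """
    if det.confidence < min_confidence:
        return True  # too uncertain to justify a hard stop
    if det.depth_variance < min_depth_variance:
        return True  # flat region: likely a projection or on-screen image
    return False


def should_emergency_stop(det: Detection) -> bool:
    # Only brake for detections that pass the phantom plausibility check.
    return not is_likely_phantom(det)


if __name__ == "__main__":
    projected = Detection("pedestrian", 0.91, median_depth_m=12.0, depth_variance=0.01)
    real = Detection("pedestrian", 0.88, median_depth_m=12.0, depth_variance=0.90)
    print(should_emergency_stop(projected))  # False: flagged as phantom
    print(should_emergency_stop(real))       # True: treated as a real obstacle
```

In practice such a check would combine several cues (depth, context, surface texture, reflected light) rather than a single variance threshold, but the sketch captures the basic idea of refusing to act on detections that lack physical plausibility.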
