Emergency vehicle lighting can interfere with a car’s automated driving system

Tesla, which disbanded its public relations team in 2021, did not respond to WIRED’s request for comment. The camera systems the researchers used in their tests were manufactured by HP, Pelsee, Azdome, Imagebon, and Rexing; none of these companies responded to WIRED’s request for comment either.

Although the NHTSA acknowledges problems with “some advanced driver assistance systems,” the researchers are careful to note that they do not know whether the emergency-light effect they observed has anything to do with Tesla’s Autopilot incidents. “I don’t claim to know why Teslas crash into emergency vehicles,” Nassi says. “I don’t know if that’s still a weak point.”

The researchers’ experiments also tested only image-based object recognition. Many automakers use additional sensors, including radar and lidar, to detect obstacles on the road. A smaller group of developers, Tesla among them, argues that image-based systems, supplemented by sophisticated artificial intelligence training, can power not only driver assistance systems but fully autonomous vehicles. Last month, Tesla CEO Elon Musk said the automaker’s vision-based system would enable self-driving cars next year.

In fact, how a system responds to flashing lights depends on how each automaker designs its automated driving system. Some may choose to “tune” their technology to react even to objects it is not entirely sure are obstacles. In the extreme, that choice produces “false positives,” where a car might brake sharply for, say, a toddler-shaped box. Others may tune their systems to react only when they are very confident that what they see is an obstacle. At the other extreme, that choice could mean a car fails to brake before colliding with another vehicle because it never recognizes the vehicle as an obstacle at all.
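The trade-off described above boils down to where a designer sets a detection-confidence threshold. The sketch below is purely illustrative; the function, values, and threshold logic are assumptions for the sake of the example, not any automaker’s actual code.

```python
# Illustrative sketch of the tuning trade-off: a single confidence
# threshold decides whether a detected object triggers braking.
# All names and numbers here are hypothetical.

def should_brake(detection_confidence: float, threshold: float) -> bool:
    """Brake only when the detector's confidence exceeds the threshold."""
    return detection_confidence >= threshold

# A cautious tuning (low threshold) brakes for ambiguous objects,
# risking false positives such as the toddler-shaped box:
print(should_brake(0.35, threshold=0.3))   # True: hard brake for a box

# A conservative tuning (high threshold) ignores ambiguous objects,
# risking a missed vehicle (a false negative):
print(should_brake(0.35, threshold=0.9))   # False: no braking at all
```

Real systems weigh many signals rather than one score, but the tension is the same: lowering the bar invites phantom braking, raising it invites missed obstacles.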

Researchers from BGU and Fujitsu have developed a software fix for the emergency-flasher problem. Called “Caracetamol,” a portmanteau of “car” and the painkiller “paracetamol,” it is designed to avoid the “seizure” problem by being trained specifically to detect vehicles with activated emergency lights. The researchers say it improves the accuracy of object detectors.
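The article does not include the training details, but the general idea of hardening a detector against flashing lights can be sketched as data augmentation: exposing the model, during training, to frames with simulated flasher glare. Everything below is a hypothetical illustration, not the Caracetamol implementation.

```python
import random

# Hypothetical augmentation sketch: brighten training frames on a
# periodic cycle to mimic an emergency flasher, so a detector learns
# to recognize vehicles despite brightness spikes. Frames are modeled
# as flat lists of pixel intensities in [0, 1] for simplicity.

def simulate_flasher(pixels: list[float], phase: int, period: int = 4,
                     boost: float = 0.6) -> list[float]:
    """Brighten the frame when the simulated flasher is 'on'."""
    if phase % period < period // 2:  # flasher on for half the cycle
        return [min(1.0, p + boost) for p in pixels]
    return list(pixels)  # flasher off: frame unchanged

def augment_clip(frames: list[list[float]]) -> list[list[float]]:
    """Apply one flasher cycle with a random starting phase to a clip."""
    start = random.randrange(4)
    return [simulate_flasher(f, start + i) for i, f in enumerate(frames)]
```

A detector trained on clips augmented this way sees the same vehicles both washed out and normally lit, which is the robustness property the researchers describe.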

Earlence Fernandes, an assistant professor of computer science and engineering at the University of California, San Diego, who was not involved in the research, said the work seemed sound. “Just as a person can be temporarily blinded by the hazard lights, a camera in an advanced driver assistance system can also be temporarily blinded,” he says.

For researcher Bryan Reimer, who studies vehicle automation and safety at the MIT AgeLab, the paper points to larger questions about the limitations of AI-based driving systems. Automakers need “repeatable, robust validation” to uncover blind spots like vulnerability to emergency lights, he says. He fears that some automakers are “pushing the technology faster than they can test it.”
