A series of accidents in which Tesla vehicles collided with emergency vehicles despite the active Autopilot system has given the US traffic safety agency NHTSA no rest. The agency is now taking a closer look at the software.

The US traffic safety agency NHTSA has been scrutinizing the Autopilot system installed in Tesla models for several years. Now it has expanded its investigation into a series of rear-end collisions. Since the investigation began in August, the agency has identified six further incidents in which Teslas with Autopilot engaged crashed into emergency vehicles parked on the side of the road, on top of the eleven such accidents originally counted. The most recent crash happened in January.

The investigation is now being broadened, among other things through the evaluation of additional data, as the NHTSA announced in a document published on Thursday (local time). The agency is also looking at more than 100 Autopilot accidents that did not involve emergency vehicles. The probe will further examine to what extent the electric-car maker's system increases the risk of human error. The NHTSA sees indications that in around 50 of the accidents under investigation, the drivers reacted inadequately to the traffic situation.

Investigation already followed a fatal accident in 2016

Tesla itself points out to customers that Autopilot is only an assistance system, so the person in the driver's seat must keep their hands on the steering wheel at all times and be ready to take control at any moment. Nevertheless, drivers repeatedly rely entirely on the system. Tesla tightened its safeguards a few years ago: the software detects when the driver's hands are not on the wheel and emits warning tones after a short time. According to the NHTSA, the current Autopilot investigation covers an estimated 830,000 vehicles across all four current model series from the years 2014 to 2022.

The NHTSA had already examined the Autopilot system after a fatal accident in 2016. At that time, a driver died when his Tesla crashed under the trailer of a semi truck crossing the road. The NHTSA concluded that the system had worked correctly within its capabilities, but that the human driver had relied on it too much. Autopilot had not recognized the trailer with its white side and had not braked; the driver did not react either.

Musk: Tesla Autopilot makes driving safer

In the current investigation, the NHTSA noted that in all of the rear-end collisions, the fire trucks and ambulances were clearly recognizable, not least because their emergency lights were switched on. Tesla released a software update in September of last year that was meant to enable Autopilot to recognize vehicles with such distinctive flashing lights even in difficult lighting conditions. The NHTSA subsequently questioned why the update had not been declared a recall.

Tesla boss Elon Musk has always emphasized that Autopilot makes driving safer and helps avoid accidents. For several months, the company has been letting selected beta testers try out the next version of the software, which adds functions for city traffic. Many videos are circulating on the web in which the software makes mistakes, and the NHTSA has already requested information about the testing on public roads.

Since February, the NHTSA has also been investigating Tesla over reports of phantom braking, triggered by 354 complaints within nine months about the Autopilot system suddenly and unexpectedly applying the brakes. The agency has also requested information from other car manufacturers about their assistance systems.