Tesla Autopilot and Full Self-Driving Linked to Hundreds of Crashes and Dozens of Deaths – NHTSA

by alex

The Technology section is published with the support of Favbet Tech

The US National Highway Traffic Safety Administration (NHTSA) investigated 956 accidents involving Tesla electric vehicles equipped with the Autopilot and Full Self-Driving (FSD) features. The investigation covered only incidents that occurred between January 2018 and August 2023; the total number of accidents over the period was higher.

NHTSA began an investigation after several incidents in which Tesla vehicles crashed into stationary emergency vehicles parked on the side of the road. Most of these incidents occurred after dark when the vehicle's software ignored safety precautions, including warning lights, strobe lights, cones, and arrows.

These accidents, some of which also involved other vehicles, killed 29 people. There were also 211 crashes in which the Tesla's “frontal plane struck a vehicle or obstacle in the path.” These accidents, which were often the most serious, resulted in 14 deaths and 49 injuries.

In its investigation, the agency found that Autopilot—and, in some cases, FSD—was not designed to keep the driver engaged. Tesla says it warns its customers to stay alert when using Autopilot and FSD, meaning keeping their hands on the wheel and their eyes on the road. But NHTSA says that in many cases drivers became overly complacent and lost focus, and when the moment to react came, it was often too late.

The agency found that in 59 crashes, Tesla drivers had enough time, “5 seconds or more,” to react before crashing into another object. In 19 of these crashes, the hazard was visible for 10 seconds or more before impact. By reviewing crash logs and data provided by Tesla, NHTSA found that in most crashes analyzed, drivers did not brake or steer to avoid the danger.

NHTSA also compared Tesla's Level 2 (L2) automation features to similar products in other manufacturers' vehicles. Unlike other systems, Autopilot takes the driver out of the loop rather than assisting them, which "discourages" drivers from staying involved in the task of driving. Tesla is an industry outlier in its approach to L2 technology because of the mismatch between the weak driver-engagement requirements and Autopilot's permitted operating envelope. Even the brand name "Autopilot" is misleading to consumers: it leads drivers to believe the system is more capable than it actually is, whereas other manufacturers use words like "assist."

NHTSA concludes that drivers using Autopilot or the more advanced system, Full Self-Driving, "were not sufficiently engaged in the driving task" and that Tesla's technology "did not adequately ensure that drivers maintained their attention on the driving task."

NHTSA acknowledges that its study may be incomplete due to "gaps" in Tesla's telemetry data, which could mean there are many more accidents involving Autopilot and FSD than the agency was able to identify.

