In an ongoing saga surrounding Tesla’s Autopilot system, the electric vehicle manufacturer has reportedly made a request to the National Highway Traffic Safety Administration (NHTSA) to redact information regarding the usage of driver-assistance software during vehicular crashes. This revelation, as reported by The New Yorker, adds yet another layer of complexity to an already contentious relationship between Tesla, its advanced driving systems, and federal regulators.
According to an NHTSA spokesperson, Tesla’s request for redaction was grounded in a claim that the data in question contained confidential business information. This assertion was made under the protection of the Vehicle Safety Act, which legally obligates the NHTSA to treat such claims as confidential unless a formal legal process is initiated to challenge them. As of the time of reporting, Tesla had not responded to inquiries from Insider or The New Yorker regarding this matter.
This request for redaction comes amid a backdrop of numerous controversies that have surrounded Tesla’s Autopilot and Full Self-Driving systems in recent years. The Autopilot system, designed to assist drivers primarily on highways, is a standard feature in all Tesla vehicles. In contrast, Full Self-Driving is a beta add-on that carries a substantial $15,000 fee. Tesla markets this feature as more advanced, enabling vehicles to change lanes, recognize traffic signs and signals, and park autonomously. However, Tesla explicitly states that a “fully attentive driver” should remain in control when either of these features is engaged.
Concerns about the safety of these systems have grown. In June, The Washington Post reported a total of 736 crashes involving Teslas in Autopilot mode since 2019, resulting in 17 fatalities. The former deputy administrator of the NHTSA, Steven Cliff, disclosed that data indicated Tesla vehicles were involved in a disproportionately high number of accidents involving emergency vehicles, although it remained inconclusive whether human drivers or Tesla’s software bore responsibility.
The NHTSA initiated a probe into the role of Tesla’s Autopilot in 30 crashes that occurred between 2016 and 2021 and resulted in 10 fatalities. Subsequently, another investigation was launched into 11 crashes since 2018 in which Teslas collided with vehicles at first-responder sites, causing injuries and fatalities. These investigations encompassed a wide range of Tesla models from the 2014 through 2021 model years, totaling approximately 830,000 vehicles, according to an updated statement from the NHTSA in June 2022.
Further complicating matters, a Department of Justice criminal investigation into Tesla’s Autopilot and Full Self-Driving features was confirmed by the company in February. The NHTSA confirmed that “multiple investigations remain open,” hinting at the complexity and gravity of the situation.
As the investigations into Tesla’s driver-assistance systems continue, the dynamics between regulators, the automaker, and the broader public remain highly scrutinized, reflecting the ongoing challenges and debates surrounding the development and deployment of advanced autonomous driving technologies.
Source: Business Insider