U.S. investigating whether Tesla’s Autopilot recall did enough to keep drivers attentive
The U.S. National Highway Traffic Safety Administration (NHTSA) is investigating whether Tesla’s recall of its Autopilot driving system last year did enough to ensure that drivers remain attentive on the road.
According to documents published on its website on Friday, Tesla has reported an additional 20 crashes involving Autopilot since the recall. These incidents, along with tests conducted by the agency, have raised concerns about the efficacy of the recall. The recall affected over 2 million vehicles, nearly all of the vehicles Tesla had sold at that time.
The NHTSA initiated the recall following a two-year investigation into Autopilot’s driver monitoring system, which detects torque on the steering wheel from the driver’s hands. The investigation was prompted by several cases of Teslas on Autopilot colliding with emergency vehicles parked on freeways.
The remedy for the recall involves an online software update to enhance warnings to drivers. However, the agency noted in its documents that there have been instances of crashes after the update, and Tesla attempted to address these issues with additional software updates following the initial recall fix. It is unclear if these subsequent updates were effective.
The NHTSA stated that its investigation will examine why these updates were not included in the recall or deemed sufficient to address a defect that poses a significant safety risk.
Tesla was contacted early Friday for comment, but there was no immediate response.
NHTSA indicated that Tesla reported the 20 crashes involving vehicles that had received the recall software fix. The agency requires Tesla and other automakers to report crashes involving partially and fully automated driving systems. NHTSA said it would assess the recall, including the effectiveness and scope of Autopilot’s controls in addressing misuse, confusion, and use in areas where the system is not intended to function.
Tesla has reportedly allowed owners to choose whether to opt in to certain parts of the recall remedy, with the option to reverse some aspects.
Safety advocates have expressed ongoing concerns that Autopilot, which can maintain a vehicle in its lane and a safe distance from objects ahead, was not designed for use on roads other than limited-access highways.
The investigation follows a recent incident in which a Tesla, potentially operating on Autopilot, struck and killed a motorcyclist near Seattle. This has raised questions about whether the recall adequately ensures that Tesla drivers using Autopilot remain attentive.
After the April 19 crash in a suburban area about 15 miles northeast of Seattle, the driver of a 2022 Tesla Model S, who was using Autopilot, told a Washington State Patrol trooper that he had been looking at his cellphone while the vehicle was in motion. The 56-year-old driver was arrested for investigation of vehicular homicide.
Authorities have not yet independently confirmed if Autopilot was active at the time of the crash.
On Thursday, NHTSA concluded its original Autopilot investigation, citing the recall and the new probe of its effectiveness. The agency said Tesla’s driver engagement system was insufficient given Autopilot’s permissive operating capabilities, a mismatch that led to the recall Tesla agreed to last year after NHTSA found the driver monitoring system to be defective.
The system alerts drivers if it fails to detect torque from their hands on the steering wheel, a method experts consider ineffective. While many newer Teslas have cameras that can monitor the driver, they are not effective at night, and independent tests show that Autopilot can still be used even if the cameras are covered.
Shortly after the recall, The Associated Press reported that experts suggested the fix relied on technology that might not be effective. Research by NHTSA, the National Transportation Safety Board, and other investigators indicates that merely measuring torque on the steering wheel does not ensure that drivers are paying enough attention. Experts argue that night-vision cameras are necessary to monitor drivers’ eyes and ensure they are focused on the road.
Michael Brooks, executive director of the Center for Auto Safety, said NHTSA is examining where Tesla permits Autopilot to be used. The company does not restrict its use, even though it was designed for limited-access freeways. Brooks suggested that Tesla relies on the vehicle’s computers, rather than on maps showing its location, to determine whether Autopilot can be engaged.
Brooks questioned why Autopilot is still allowed to engage in areas where it wasn’t designed to operate if the car knows it’s in such an area. He suggested that NHTSA could seek civil fines and additional fixes from Tesla.
In government filings regarding the December recall, Tesla said the online software update would increase warnings and alerts to drivers to keep their hands on the steering wheel.
NHTSA initiated its Autopilot crash investigation in 2021 after receiving 11 reports of Teslas using Autopilot striking parked emergency vehicles. In documents explaining the end of the investigation, NHTSA stated that it identified 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths.
Tesla offers two partially automated systems, Autopilot and a more advanced “Full Self-Driving,” but the company asserts that neither can drive independently despite their names. NHTSA’s investigative documents indicate 75 crashes and one death involving “Full Self-Driving,” though it is unclear if the system was at fault.
CEO Elon Musk has long claimed that “Full Self-Driving” will enable a fleet of robotaxis to generate income for the company and owners by utilizing the electric vehicles when they would otherwise be parked. Musk has touted self-driving vehicles as a growth driver for Tesla since the introduction of “Full Self-Driving” hardware in late 2015. The system is being road-tested by thousands of owners.
In 2019, Musk promised a fleet of autonomous robotaxis by 2020 and said the vehicles would appreciate in value. Instead, the robotaxis have been delayed year after year while owners test the system and gather road data for Tesla’s computers, and vehicle prices have dropped rather than appreciated.
Tesla emphasizes that neither system can operate independently and that drivers must be prepared to take control at any time. During a recent earnings conference call, neither Musk nor other Tesla executives said when they expect Tesla vehicles to drive as well as humans. Musk expressed confidence that Tesla will soon surpass human driving reliability and suggested that those who doubt the company’s ability to solve autonomy should not invest in it.