US auto safety regulators on Wednesday said they had identified a 12th crash involving a Tesla using advanced driver assistance systems in an incident with an emergency vehicle, and demanded the automaker answer detailed questions about its Autopilot system.
The National Highway Traffic Safety Administration (NHTSA) said on August 16 that it had opened a formal safety probe into Tesla's driver assistance system, Autopilot, after 11 crashes. The probe covers 765,000 US Tesla vehicles built between 2014 and 2021.
The 12th crash occurred in Orlando on Saturday, NHTSA said. The agency sent Tesla a detailed 11-page letter on Tuesday with numerous questions it must answer as part of the investigation.
Tesla's Autopilot handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods. Tesla says Autopilot enables vehicles to steer, accelerate and brake automatically within their lane.
Tesla did not respond to a request for comment. The company could face civil penalties of up to $115 million (roughly Rs. 840 crores) if it fails to fully respond to the questions, NHTSA said.
Tesla shares closed down 0.2 percent at $734.09 (roughly Rs. 53,630) on Wednesday.
On Saturday, the Florida Highway Patrol said the car of a Florida trooper who had stopped to assist a disabled motorist on a major highway was struck by a Tesla that the driver said was in Autopilot mode. According to a police report released on Wednesday, the trooper "narrowly missed being struck as he was outside of his patrol car."
NHTSA said earlier it had reports of 17 injuries and one death in the 11 crashes. A December 2019 crash of a Tesla Model 3 left a passenger dead after the vehicle collided with a parked fire truck in Indiana.
NHTSA's request for information asks Tesla to detail how the system detects and responds to emergency vehicles, as well as flashing lights, road flares, cones, and barrels, and to describe the impact of low-light conditions.
NHTSA said previously that most of the 11 incidents occurred after dark.
Tesla in July introduced an option for some customers to subscribe to its advanced driver assistance software, dubbed Full Self-Driving capability. Tesla said the current features "do not make the vehicle autonomous."
NHTSA is seeking information on the "date and mileage at which the Full Self Driving (FSD) option was enabled" for all vehicles, along with all consumer complaints, field reports, crash reports and lawsuits.
The agency also wants Tesla to explain how it prevents the system from being used outside the areas where it is intended to operate.
Among the detailed questions, NHTSA also asked Tesla to explain "testing and validation required prior to the release of the subject system or an in-field update to the subject system, including hardware and software components of such systems."
NHTSA said Tesla must respond to its questions by October 22 and must disclose plans for any changes to Autopilot within the next 120 days.