Despite that acknowledgment by the company, as the federal agency pushes for answers about the accident and whether the Autopilot system failed to work properly, Tesla officials continue to say that the technology is safe.
Tesla's website notes that Autopilot must be activated by the driver before it can be used, and advises that "the system is new technology and still in a public beta phase."
Tesla faces fines of up to $21,000 per day, to a maximum of $105 million, if it fails to meet NHTSA's deadlines for responding to the questions about the Autopilot crash.
Tesla has disputed allegations that its SEC filings since the Autopilot crash have been misleading. The company has also been asked to outline the specific types of collision scenarios its automatic emergency braking (AEB) system is designed to prevent, along with any known limitations.
Though it was reported that the driver of the vehicle had been watching a "Harry Potter" film when the crash happened, many quickly blamed the carmaker for calling the driver-assistance feature "Autopilot" even though, as Gizmodo pointed out, it does not offer fully automated driving.
The USA road safety watchdog is investigating Tesla's deployment of the technology.
On Tuesday, the traffic safety body released a letter it had sent to Tesla detailing its investigation into the accident. Under optimal conditions, Autopilot allows hands-free cruising on some sections of highway, but it is worlds apart from a true self-driving car.
Tesla Motors Inc.'s (NASDAQ:TSLA) first Autopilot fatality in Florida has led several regulatory authorities around the world to raise their eyebrows.
The driver, who wasn't identified by police, said the Model X veered off the right side of the road and struck wooden posts holding a cable railing, Montana Highway Patrol Trooper Jade Shope said.
At least three recent accidents, one of them fatal, have involved drivers using Autopilot, though specifics of how involved the drivers were in piloting the cars at the time are not available. If no force on the wheel is detected, or a sharp turn is, the vehicle is programmed to gradually reduce speed, stop and turn on the emergency lights, Tesla said in a statement. "As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel," said Tesla. "He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway." No citation was issued to the driver because the trooper believed any citation would be void if the car was operating on Autopilot, as the driver claimed.
Tesla, in response to the Fortune article, noted in a blog post that the May 7 accident was reported to NHTSA on May 16, when the agency had just started its investigation. If the probe finds defects in Tesla's system, the agency could seek a recall.
The reported SEC investigation is the latest in a string of worrisome developments - including two other Autopilot incidents - for the highly regarded automaker and its popular chief executive, Elon Musk.
The agency has asked for details of all updates made to Autopilot since it was enabled a year ago, as well as information about forthcoming changes.