Tesla Autopilot feature linked to hundreds of crashes and 14 deaths

The system was found to give drivers a false sense of security, according to US safety regulators

Tesla’s Autopilot, an advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars, has been linked to hundreds of crashes and over a dozen deaths in the latest report by US auto-safety regulators, published on Thursday.

The US Transportation Department’s National Highway Traffic Safety Administration (NHTSA) said that its investigation into Tesla’s Autopilot had identified at least 14 fatal crashes in which the feature was involved.

During its three-year investigation, which began in 2021, the agency examined nearly 1,000 reported crashes that occurred between 2018 and August 2023. It found that misuse of the Autopilot system had caused at least 14 accidents resulting in fatalities and “many more involving serious injuries.”

NHTSA’s Office of Defects Investigation (ODI) found evidence that Tesla’s “weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities,” which resulted in a “critical safety gap.”

Of the 956 crashes examined, officials identified Autopilot-related trends in about half.

Of those 467 crashes, ODI identified 211 in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These accidents, often the most severe, resulted in 14 deaths and 49 injuries. Over a hundred of the incidents also involved roadway departures in which Autosteer, a component of Autopilot, was “inadvertently disengaged by the driver’s inputs,” the report said.

READ MORE: Tesla signs deal with Indian conglomerate – media

The investigators concluded that drivers using Autopilot, or the system’s more advanced Full Self-Driving feature, “were not sufficiently engaged in the driving task.” Tesla’s technology “did not adequately ensure that drivers maintained their attention on the driving task,” the NHTSA said.

The investigation also found that the electric carmaker’s claims about Autopilot’s capabilities did not match reality.

The NHTSA raised concerns that Tesla’s Autopilot name “may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation.”

US safety authorities said on Friday that they had opened another investigation into Tesla’s largest-ever recall, issued in December, which covered over 2 million US vehicles, or essentially all of the company’s vehicles on US roads.

The recall was ordered by the NHTSA over Tesla’s software update, which is designed to limit the use of the Autopilot feature. The company plans to unveil its robotaxi on August 8.
