Tesla faces new NHTSA probe into its Autopilot driver assistance system after three people died in a crash involving a Model S

Electric vehicle maker Tesla is under investigation by the U.S. government over its Autopilot driver assistance system after three people were killed and three injured in a Model S crash in California this month. The National Highway Traffic Safety Administration (NHTSA) said Wednesday it has opened an investigation into the fatal crash, which may have involved the company's advanced driver assistance system.

The crash of a 2022 Tesla Model S that struck construction equipment in Newport Beach last week is one of 35 crashes involving Tesla vehicles that the NHTSA has investigated since 2016 in which advanced driver assistance systems such as Autopilot were suspected of being in use.

A total of 14 deaths have been reported across these Tesla investigations, including the three recent ones. The NHTSA confirmed a new investigation into the May 12 crash of a Tesla Model S that killed its three occupants and injured three workers when it struck construction equipment along the Pacific Coast Highway in Newport Beach. Newport Beach police declined to say Wednesday whether the Tesla was on Autopilot at the time of the crash, saying the investigation is still under way.

Tesla's Autopilot and other driver assistance systems that take over certain tasks from the driver are drawing increasing scrutiny. Tesla says on its website that Autopilot provides driver assistance by allowing vehicles to automatically steer, accelerate and brake, but that it requires active driver supervision and does not make the vehicle autonomous. The NHTSA notes that no vehicle on sale today is capable of driving itself, and that drivers must remain attentive at all times.

Last June, the NHTSA ordered automakers to report any crash on public roads involving a vehicle with a fully autonomous system or a partially automated driver assistance system. Partial automation systems can keep a vehicle centered in its lane and maintain a safe distance from the vehicle ahead. According to the NHTSA, the data can reveal whether there are common crash patterns associated with these systems.

Each year, the NHTSA dispatches special teams to conduct more than 100 in-depth investigations into distinctive real-world crashes, with the goal of quickly producing findings that the automotive safety community can use to improve the safety of these systems.

Since 2016, the NHTSA has opened 35 special crash investigations involving Tesla vehicles in which advanced driver assistance systems were suspected of being in use; in three of those cases, the use of Autopilot was ruled out. Separately on Wednesday, the NHTSA said it had opened another special investigation into an April crash in Florida involving a 2016 Tesla Model X that resulted in minor injuries and may have involved the use of an advanced driver assistance system.

In August, the NHTSA said it had opened a formal preliminary evaluation of possible Autopilot defects after identifying at least a dozen crashes in which Tesla models struck emergency vehicles. That investigation is still ongoing.

The NHTSA is also investigating crashes involving two Volvos, a Navya shuttle, two Cadillacs, a Lexus and a Hyundai. One of the Volvo crashes involved an Uber self-driving test vehicle that struck and killed a pedestrian in Arizona in March 2018.

This week's investigation is the second blow for Tesla, coming after security consultant Sultan Qasim Khan revealed a hack that can unlock Tesla Model 3 and Model Y cars and start them without the physical key. Khan said the car's entry system can be tricked into believing the owner is near the vehicle; it then unlocks the doors and starts the motor, allowing a thief to drive off in an expensive Tesla without leaving a scratch on the paint.

Source: NHTSA (1, 2)

And you?

What is your opinion on this?
Do you think the names Autopilot and FSD confuse drivers? Should they, in your opinion, be changed?

See also:

Tesla tells U.S. lawmakers Autopilot requires "constant monitoring", reinforcing criticism that the name misleads drivers

The federal road safety agency says Tesla must now report Autopilot-related crashes to the government or face fines

Tesla Autopilot: US opens formal investigation after 11 crashes between Teslas and emergency vehicles

Thirty Tesla crashes involving driver assistance systems, in which 10 people have died since 2016, are under investigation in the United States

The US government has asked Tesla why people can play video games in a moving car; the feature is under review
