Tesla faces regulatory problems due to accidents with its driverless cars. Photo: The Washington Post.
The US government has opened an investigation into accidents involving Tesla cars using Autopilot, after an independent safety board repeatedly requested one while denouncing the automaker for allowing unproven technology onto public roads.
The National Highway Traffic Safety Administration (NHTSA), part of the Department of Transportation, announced Monday that it had opened an investigation into eleven Tesla crashes that required the presence of first responders.
The crashes, which left 17 people injured and one dead, all involved a vehicle with Autopilot or Traffic-Aware Cruise Control engaged, and occurred at scenes where control measures such as cones and illuminated arrows were clearly displayed, the agency reported.
The investigation marks an escalation of the scrutiny of Tesla’s Autopilot program, which had previously been led by the National Transportation Safety Board (NTSB), an independent body. Unlike the NTSB, NHTSA has the power to order vehicles off the road if it finds a defect, and to impose a regulatory framework on the entire industry.
The inquiry is dated Aug. 13 but was revealed Monday, and covers virtually all Tesla vehicles sold in the U.S. market in recent years, some 765,000 cars, according to NHTSA. The regulator reported that several Tesla models had “encountered first responder scenes and subsequently struck one or more vehicles involved in those scenes.”
Shares of the automaker fell more than 4 percent on news of the investigation.
China’s vehicle safety regulator reported in June that Tesla was voluntarily recalling some 300,000 cars due to a flaw in the Autopilot program, although Tesla was later able to fix the problem.
The investigation follows fierce criticism of federal regulators by the NTSB, which has denounced what it called a “hands-off approach to regulation [that] generates a potential risk to drivers and other road users.”
Elon Musk, Tesla’s chief executive, has also come under fire for suggesting that the program, an advanced driver assistance system designed to handle road tasks such as maintaining speed or keeping the car in its lane, is capable of taking full control of the vehicle.
Earlier this year, NTSB chairman Robert Sumwalt said that Tesla had released a “beta version” of the program for testing on public roads without supervision.
Musk has faced criticism for his aggressive marketing of the program and for the decision to name it Autopilot, even though it is in reality only a Level 2 driver assistance system, far short of the maximum Level 5, at which a car can drive itself.
“NHTSA reminds the public that no commercially available motor vehicle today is capable of driving itself,” the agency said Monday. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for the operation of their vehicles.”
The federal investigation will evaluate both Tesla’s artificial intelligence technology and the systems the company uses to monitor drivers and ensure they are paying attention and ready to take full control of the car when necessary. In 2016, Tesla introduced additional warnings for drivers after the first fatal accident involving the use of Autopilot.
“To uphold the agency’s core safety mission and better understand the causes of certain Tesla crashes, NHTSA has opened a preliminary evaluation of the Tesla Autopilot system and the technologies and methods used to monitor, assist, and enforce the driver’s engagement while Autopilot is in use,” NHTSA said.
“Certain advanced driver assistance features can promote greater safety by helping drivers avoid collisions and mitigate the severity of collisions that cannot be avoided, but as with any other technology and equipment in a motor vehicle, drivers must use them correctly and responsibly.”
Missy Cummings, an engineering professor at Duke University, said the NHTSA assessment highlights a fundamental problem with Tesla’s technology. “The vehicle’s visual system is not capable of responding to anything out of the ordinary, and that by definition includes any emergency situation.”
Consumer groups that have criticized Tesla’s use of AI and called for more regulation responded favorably to the announcement of the investigation. However, Cummings said systems like Tesla’s make judgments based on probabilities, and therefore present a fundamental problem for regulators trying to ensure the technology is safe.
The investigation covers most of the vehicles produced by Tesla since 2014, including its Model Y, Model X, Model S and Model 3.
Musk did not immediately respond to a request for comment.
Copyright – The Financial Times Limited 2021