Does Tesla's Autopilot feature mislead drivers?
Tesla has been asked to brief the US Senate committee that oversees auto safety about the May crash that killed the vehicle's occupant while the car was in Autopilot mode.
Consumer safety watchdogs now say that Tesla may be moving too fast with the introduction of its Autopilot software.
Since the crash, Consumer Reports magazine has urged Tesla to disable the system's hands-free operation, saying that marketing the mechanism as "Autopilot" may lead drivers to believe that they do not have to retain control of the car, despite Tesla's assertions to the contrary.
"We're deeply concerned that consumers are being sold a pile of promises about unproven technology,鈥 Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports said in the statement. "'Autopilot' can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time."
The crash that led the US Senate to summon Tesla for testimony occurred on May 7. Driver Joshua Brown was killed when Autopilot failed to recognize a turning tractor trailer in its path.
Some analysts suggest that calls to disable Autopilot are premature pending a conclusive investigation into the crash.
"Despite an ," writes tech journalist Yoni Heisler for BGR, a consumer electronics publication,聽"it鈥檚 far too soon to say with any certainty that Tesla鈥檚 Autopilot software has been the direct cause of any specific crash."
In a blog post about the accident, Tesla wrote that "neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," leading to Mr. Brown's death.
Autopilot uses several tools, including cameras and radar, to survey the vehicle's surroundings and perform functions such as braking at stoplights and changing lanes on the highway. Yet, despite its name, Tesla says that the mechanism is intended to reduce the burden of driving on the vehicle's operator, not remove it completely.
Several consumer watchdogs, however, say that the name creates a false sense of security, and that Tesla should scale back its Autopilot program before more people get hurt. Consumer Reports asked Tesla to change the name of the program, which Tesla says is still in public beta testing.
Two other recent crashes have increased public scrutiny of Autopilot. Most recently, a Pennsylvania crash involving an Autopilot-enabled vehicle prompted a US National Highway Traffic Safety Administration (NHTSA) investigation.
Tesla's chief executive officer, Elon Musk, wrote in a tweet that while the driver in the Pennsylvania crash claimed Autopilot was turned on, the investigation showed that it was not. And had it been turned on, Mr. Musk said, the car would not have crashed.
A Montana crash on Sunday, however, occurred while the Autopilot feature was turned on. The driver's hands were not on the wheel.
This week, NHTSA announced that it wanted records of how often Autopilot warned drivers to keep their hands on the wheel.
Consumer Reports also says that Tesla should never have allowed consumers to purchase vehicles with a feature that, as the company itself says, is not yet out of beta testing.
"Consumers should never be guinea pigs for vehicle safety 'beta' programs," said Ms. MacCleery. "At the same time, regulators urgently need to step up their oversight of cars with these active safety features. NHTSA should insist on expert, independent third-party testing and certification for these features, and issue mandatory safety standards to ensure that they operate safely."
Tesla announced that while the company appreciates feedback from publications such as Consumer Reports and Computerworld, it will make decisions about the future of its products, including Autopilot, based on "real-world data."