Tesla reins in Autopilot after 'fairly crazy' drivers misuse self-driving tech
Self-driving cars are here. They're not just being developed in labs or tested on closed courses: partially autonomous vehicles are cruising down public highways at this very moment, carrying people to and from work. And though the shift from cars driven by humans to cars driving humans has barely begun, it's gone pretty smoothly so far.
During a quarterly financial call on Wednesday, Tesla chief executive officer Elon Musk said his company鈥檚 introduction of self-driving features to its Model S electric car has been mostly successful. The self-driving mode, known as Autopilot, was made available to Model S owners via an over-the-air software update last month, and is currently active in about 40,000 vehicles.
"We're very aware of many accidents that were prevented by Autopilot, and not aware of any that were caused by Autopilot," Mr. Musk told reporters on the call. (A video of one of those prevented accidents went viral last week: hundreds of thousands of people watched footage of a Tesla automatically braking when a car veered into its lane near Seattle.)
Still, Musk said, the company is aware of some "fairly crazy videos" of people using Autopilot irresponsibly, such as activating it on roads that the software isn't yet ready to handle.
"We will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things," Musk said, as reported by CarThrottle. That probably means Autopilot won't activate unless the driver keeps at least one hand lightly on the steering wheel. Tesla suggests that drivers keep their hands on the wheel at all times, but the Autopilot software doesn't currently enforce that suggestion.
Musk conceded that Autopilot isn't perfect right now, but said that it will get better over time.
"It was described as a beta release. The system will learn over time and get better, and that's exactly what it's doing," he said on the call. "It will start to feel quite refined within a couple of months."
Most people are used to the open-beta model for software, in which a product ships in a not-quite-perfect state and is updated over time based on user feedback. (Gmail famously remained in "beta" for more than five years as developers refined the software based on how people were using it.) But this iterative approach has only recently been applied to autos, and Tesla and other automakers who offer partially autonomous features for their cars have to make sure the software is safe, even in its early stages.
So far, accidents involving self-driving cars have mostly been the fault of human drivers, which suggests that drivers can be reasonably confident about letting their cars take the wheel.