Tesla Goes On Offensive As NHTSA Wants Answers

Even as the National Highway Traffic Safety Administration yesterday posted a nine-page letter requiring Tesla Motors to provide detailed information about the Autopilot system implicated in a fatal collision in May, executives including CEO Elon Musk went on the offensive, claiming that the technology is misunderstood by drivers and will save lives.

“Musk, in an interview, said the company is planning an explanatory blog post that highlights how Autopilot works and what drivers are expected to do after they activate it. ‘A lot of people don’t understand what it is and how you turn it on,’ Mr. Musk said,” write Mike Ramsey, Mike Spector and Jonathan Bach for the Wall Street Journal. The company “has no plans to disable” the feature, they report.

“Tesla’s co-founder pushed hard to launch the Autopilot feature as soon as possible because ‘we knew we had a system that on balance would save lives,’” they continue. “While many auto makers offer systems that rely on automatic braking, steering assist or adaptive cruise control to aid drivers, Tesla’s system steers the car more actively than similar systems and the company has marketed it more aggressively.”

In the New York Times, an executive who was authorized to speak only on condition of anonymity “said drivers needed to be aware of road conditions and be able to take control of the car at a moment’s notice — even though he said Autopilot’s self-steering and speed controls could operate for up to three minutes without any driver involvement,” write Bill Vlasic and Neal E. Boudette.

“With any driver assistance system, a lack of customer education is a very real risk,” the executive told them.

The NHTSA letter, which is dated July 8 and was sent to Matthew Schwall, Tesla’s director of field performance engineering, seeks information from Tesla about Autopilot and why it failed to detect a tractor-trailer that crossed in front of a Model S sedan on May 7 in Williston, Fla., killing former Navy SEAL Joshua Brown, 40, of Canton, Ohio, according to the Associated Press.

“Much of the letter seeks information on how the system works at intersections with crossing traffic, but it also asks Tesla to describe how the system detects ‘compromised or degraded’ signals from cameras and other sensors and how such problems are communicated to drivers,” report the AP’s Matt Volz and Tom Krisher.

Meanwhile, a Model X owner whose car swerved off a Montana road and hit a post over the weekend tells CNN Money that he would buy another Tesla. The driver, who would only identify himself by his last name — Pang — said he does not know if the accident was the car's fault or his and is “eager to talk to Tesla and learn why the car swerved off a narrow Montana road,” Chris Isidore and Gwen Sung report.

Tesla, in a statement, basically blamed the Mandarin-speaking driver who, it said, did not comply with alerts in English to put his hands on the wheel. He was given a ticket for reckless driving.

“The accident is the third serious crash apparently tied to the self-driving feature. That's calling the safety of such automatic driving features into question, just as they're being incorporated into more and more cars on the road,” Isidore and Sung write.

Meanwhile, RBC Capital Markets’ Joseph Spak and colleagues “argue that … there’s a ‘massive public perception vs. reality issue that must be overcome,’” Barron’s Ben Levisohn reports.

“It’s an easy story to write, have a ‘hot take’ and grab a lot of eyeballs. The death is clearly unfortunate and tragic — we don’t mean to belittle that,” Spak says. “And we don’t doubt that there could be some increased scrutiny regarding Tesla’s Autopilot system as well as increased public skepticism (and negative sentiment) on the progression of autonomous driving. But as the disruptive change that is autonomous driving becomes more and more of a possibility, there is a massive public perception vs. reality issue that must be overcome.”

Bottom line: “the technology is net beneficial to society.”

As compelling a selling point as that may be, it’s going to take more than a few blog posts to win over regulators and driver’s seat skeptics.

2 comments about "Tesla Goes On Offensive As NHTSA Wants Answers".
  1. Michael Strassman from Similarweb, July 13, 2016 at 10:03 a.m.

    I say this as someone who can be skeptical about technology. The bottom line is that the technology will be held to a standard of near-perfection, no matter how much better it performs than humans. The truth is that people--consumers and regulators--are uncomfortable with giving up control of a car for emotional reasons. Someone losing their life due to their own mistake is tragic, but when a machine is to blame, we think the person could have saved themselves and that the death could have been averted. The reality, however, is that gross negligence causes a lot of accidents that could be averted, whether with today's technology or with what will exist five years from now. Driverless cars will become viable, but people will never be entirely comfortable even if fatalities fall to 10% of today's highway toll, because when someone you know dies there will always be the nagging feeling that they could have avoided the crash. Some day all cars will likely be networked and automated, and highway deaths will be reduced by 90+%, but we still won't like the idea of not being in control, myself included. However, I will enjoy my commute spent reading the paper and drinking a cup of coffee instead of hammering my steering wheel while caught in traffic.

  2. Tom Muscarello from DePaul University, July 13, 2016 at 1:41 p.m.

    Here is an article from May 2014 entitled "Lazy Humans Shaped Google's New Autonomous Car": https://www.technologyreview.com/s/527756/lazy-humans-shaped-googles-new-autonomous-car/

    Essentially, Google removed the human co-pilot because of human nature. With use, the humans became overconfident in the car's autonomous abilities and weren't as attentive. Musk and company understand the technology but apparently not this human behavior. With experience of such a car's "autonomous" abilities, people will get too complacent.

    That Tesla navscreen alert that appears whenever autonomous drive is activated is probably being read and processed the same way people deal with all the acceptance agreements on software and websites. Drivers are supposed to take control when alerted to a hazard. The only way that really works is if they are already in full control when the emergency arises. How quickly can you refocus on the road and the looming emergency at 70 mph when your eyes, and brain, are focused on a paper or a video? Not fast enough. Google learned this without accidents at speed.

    Tesla could also learn from the misfortune of other manufacturers. Remember the Toyota "unintended acceleration" issue of a few years back? It was the result of a few problems that involved human, not technology, mistakes. One problem was pedal misapplication, usually involving an elderly driver. Stick-shift drivers rarely have such problems because they know how to work the brake and accelerator pedals simultaneously with one foot, and how to use the transmission to adjust engine and car speed and to help slow the car. It is a matter of being in control of the car in a way that automatic transmissions don't require. The other problem was piles of floor mats or carpet remnants that weren't anchored and thus jammed under the accelerator pedal. The floor mat problem would seem to be a matter of common sense, which apparently isn't that common.

    It didn't matter that these problems were the result of driver errors. Toyota had to adjust the pedals of all affected cars and instruct drivers on the appropriate way to install floor mats. And it was fined for what amounted to the incompetence of its customers.

