Analysts speak up on Tesla AutoPilot incidents

By Junko Yoshida

Tesla's AutoPilot system requires drivers to stay alert even while AutoPilot mode is engaged. What happened in the recent accidents? Experts speak up.

I came across a report about a Tesla AutoPilot crash. It appeared on the Tesla Motors Club's site, posted by a Tesla fan who was planning to purchase a car.

The user's post on the website's forum read:

I was on the last day of my 7-day deposit period. I was really excited about the car. So I took my friend to a local Tesla store and we went for a drive. AP [AutoPilot] was engaged. As we went up a hill, the car was NOT slowing down approaching a red light at 50 mph. The salesperson suggested that my friend not brake, letting the system do the work. It didn't. The car in front of us had come to a complete stop. The salesperson then said, "brake!" Full braking didn't stop the car in time and we rear-ended the car in front of us HARD. All airbags deployed. The car was totaled. I have heard from a number of AP owners that there are limitations to the system (of course) but, wow. The purpose of this post isn't to assign blame, but I mention this for the obvious reason that AP isn't autonomous and it makes sense to have new drivers use this system in very restricted circumstances before activating it in a busy urban area.

Thankfully, nobody got hurt. This post got no traction in the media. No reporter appears to be following it up (except for this publication). This could have been easily filed under the rubric, “minor accidents,” the sort of news we all ignore.

[Tesla crash cr]
*_Figure 1_: Tesla's Autopilot system has been in the news recently due to user-related accidents.*

However, this accident, and more so the subsequent discussions in the Tesla Motors Club forum, intrigued me for two reasons.

First, it’s a reminder that it ain’t easy to curb drivers’ appetite to “test the limits” of so-called AutoPilot, despite the carmaker’s stern warnings.

The key case in point is Tesla’s first fatal accident, which took place in Florida last May. After splurging on such a well-regarded, expensive automobile, who wouldn’t want to test its limits in driving and brag about it? The inevitable result is a clash between Tesla’s prudence and human nature.

Second, AutoPilot is still a work in progress. New technologies keep emerging, allowing the automaker to continue improving it via software updates.

I was amazed to see so many posts by other Tesla users — all apparently very knowledgeable. They discussed the limitations of radar, the problems AutoPilot has handling hills, and the differences between software versions 7.1 and 8.0.

If this isn’t “inside baseball,” what is? I’d hate to think that an average driver needs to do this much homework to really understand why AutoPilot just doesn’t work in certain situations and why it isn’t autonomous.

But first things first. I had to find out whether the accident described on the Tesla Motors Club forum actually happened. I had no corroboration, and I'm still too fussy to believe every blog I see.

It took Tesla a few days, but the company finally got back to me Thursday, with the following statement.

“After speaking with our customer and the Tesla employee, we determined that this accident was the result of a miscommunication between the individuals inside the car.”

The accident happened in the Los Angeles area. The vehicle that crashed on the test drive was running software version 8.0.

As expressed in the statement above, Tesla stressed that this accident was not the result of the driver intentionally testing the limits of AutoPilot, but rather of a miscommunication inside the car.

Tesla's version 8.0 software, according to the company, offers "the most significant upgrade to Autopilot," among other things. It uses "more advanced signal processing to create a picture of the world using the onboard radar."

To be clear, radar had already been added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but it was only meant to be a supplementary sensor to the primary camera and image-processing system, the company explained.

At the time of the version 8.0 release, Tesla made it clear that it had changed its mind: "We believe radar can be used as a primary control sensor without requiring the camera to confirm visual image recognition."

To Tesla's credit, the company isn't dismissing the incident out of hand. The company is painfully aware — since the Florida crash — of the intense scrutiny now focused on Tesla's AutoPilot, compared with other carmakers' similar Level 2 systems.

[Autopilot screencap 1 cr]
*_Figure 2_: In Tesla’s Autopilot system, the Autosteer feature is enabled via the Driver Assistance tab in the vehicle settings.*

Tesla has been publicly insistent that the company has been “continuously” educating its customers when it comes to AutoPilot.

Tesla offers what it calls the “Autosteer feature,” which must be enabled through the “Driver Assistance” tab in vehicle settings, before the driver starts using AutoPilot. The user must agree to a dialogue box that describes the system as “Beta” and instructs the user to pay attention to the road at all times.

Every time the feature is subsequently activated, the user gets fresh reminders, both visual and audible, of Tesla's liability agreement. Under the new software version 8.0, failure to comply results in a rapidly escalating sequence of intrusive alerts, followed by the deactivation of Autosteer.
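
As a rough illustration of that escalating-alert flow (the specific levels and timings below are my assumptions; Tesla has not published its logic), the behavior can be sketched as a simple escalation function:

```python
import enum

# Sketch of an escalating hands-off warning flow. The thresholds and alert levels
# are illustrative assumptions, not Tesla's published implementation.

class Alert(enum.Enum):
    VISUAL = 1      # on-screen "hold the steering wheel" reminder
    AUDIBLE = 2     # chime layered on top of the visual warning
    DISENGAGE = 3   # Autosteer deactivates for the remainder of the drive

def next_alert(hands_on_wheel, seconds_ignored):
    """Pick the alert to raise based on how long the driver has ignored the warnings."""
    if hands_on_wheel:
        return None
    if seconds_ignored < 10:
        return Alert.VISUAL
    if seconds_ignored < 25:
        return Alert.AUDIBLE
    return Alert.DISENGAGE

print(next_alert(False, 30))  # Alert.DISENGAGE
```

The real system presumably relies on steering-wheel torque sensing and richer timing, but the escalate-then-lock-out shape is what the company describes.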

This all sounds good. But here's a question: are all these bells and whistles sufficient to thwart a cavalier driver willing to risk his own life (let alone the lives of others who share the road)?

Level 2 automation

It’s important to note that Tesla calls the system an example of Level 2 automation, although it actually does a whole lot more. It combines adaptive cruise control, automatic steering, automatic lane changes, and automatic emergency steering. It does just about everything the driver would normally do under routine highway conditions.

On one hand, Tesla’s software requires drivers to agree that they’ll “remain engaged and aware” and keep their hands on the steering wheel while using AutoPilot. The company makes it clear that in the event of an accident, drivers are still responsible.

[Tesla Autosteer cr]
*_Figure 3_: Tesla’s Autopilot software requires drivers to agree that they remain engaged and keep hands on the steering wheel.*

On the other hand, the company never makes it clear — either explicitly or implicitly — that the car relies on a technology that will continue to improve, and at any given time, might not prove advanced enough to be fully trusted with people’s lives.

Tesla might be putting too much faith in human drivers, trusting them to understand the nuances of divergent underlying messages.

Tesla, however, maintains that the company has been thoroughly explicit that its technology gets better with time. Further, every time Tesla's software is updated, customers receive release notes detailing exactly what has been updated in their cars. Tesla firmly believes that over-the-air software upgrades are a selling point and a major advantage of Tesla ownership.

In short, from Tesla's viewpoint, the limits of the system are clearly articulated throughout the user interface and the owner's manual. Tesla made it clear that in the case of this test-drive crash, the car's UI alerted the driver to brake and asked him to take control, but the driver failed to do so because of the miscommunication inside the car.

What could have prevented this?

For this report, I interviewed several automotive industry analysts to get their takeaways on Tesla's test-drive crash.

First, practically everyone I talked to squarely blamed the sales rep. Angelos Lakrintis, an analyst at Strategy Analytics, said, "Tesla's sales rep was at fault here."

He said, "The operator [driver] had time to stop the car but he didn't because the sales representative wanted to showcase the traffic jam assist of the AutoPilot system." The sales rep disregarded the fact that the first version of AutoPilot is designed only for highway use.

Mike Demler, senior analyst at the Linley Group, agreed. “They did this on a test drive? What salesperson would allow such a thing, let alone encourage it? I don’t know about Tesla, but when I test drove my BMW I was barely given 5 minutes to go up on a highway for one exit and back.”

OK. The dealer’s the villain who took a highway feature and put it on Maple Street.

But why did AutoPilot adjust so poorly? What — if anything — in Tesla’s technology failed to see imminent danger in this situation?

Phil Magney, founder and principal advisor of Vision Systems Intelligence, said, “The biggest problem seems to be stationary objects. Updates to version 1 (8) are supposed to address this problem with changes to the radar detection. Obviously, the camera did not work either in this case.”

Magney added, “The driver was going 50 MPH and you need a braking distance more than 50 meters at this pace. Radar may not have detected in time. Also, it was driving on a hill. Field of view may have been an issue as well for both the camera and the radar.”
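
As a rough sanity check on the scale of that number (my own back-of-the-envelope arithmetic, not Magney's or Tesla's; the deceleration and reaction-time values are assumptions), the total stopping distance from 50 mph does work out to well over 50 meters:

```python
# Back-of-the-envelope stopping distance at 50 mph. The 6 m/s^2 deceleration and
# 1-second reaction time are illustrative assumptions, not measured Tesla figures.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_distance_m(speed_mph, reaction_time_s=1.0, decel_ms2=6.0):
    """Reaction distance plus braking distance: d = v*t + v^2 / (2*a)."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_time_s + v ** 2 / (2 * decel_ms2)

print(round(stopping_distance_m(50), 1))  # ~64.0 m with these assumptions
```

Harder braking shrinks the distance, but at 50 mph there is little margin once a stopped vehicle is flagged late.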

Demler agreed. “Driving uphill could easily cause problems, because depending on the relative incline the mirror-mounted camera could overshoot other vehicles, and be looking just at the sky. The radar is in the grill, but it might also overshoot if the car ahead was on a more level portion of the hill.”
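
To get a feel for Demler's point, here is a toy crest-geometry calculation (the grade, distances, and mounting heights are my assumptions for illustration, not Tesla sensor specifications). It estimates how far below a forward sensor's optical axis a stopped car appears when the ego vehicle is still on the incline and the car ahead sits on a more level stretch past the crest:

```python
import math

# Toy crest geometry: the ego car is still climbing a grade while the stopped car
# sits on a flatter stretch beyond the crest. How far below the forward sensor's
# optical axis (roughly parallel to the inclined road) does that car's roof appear?
# All figures are illustrative assumptions, not Tesla sensor specifications.

def angle_below_axis_deg(grade_deg, dist_to_crest_m, dist_past_crest_m,
                         sensor_height_m=1.3, target_height_m=1.5):
    g = math.radians(grade_deg)
    # Work in the inclined-road frame: beyond the crest the road bends down by grade_deg.
    forward = dist_to_crest_m + dist_past_crest_m * math.cos(g)
    drop = dist_past_crest_m * math.sin(g) - (target_height_m - sensor_height_m)
    return math.degrees(math.atan2(drop, forward))

# A car 30 m past the crest, seen from 20 m before it, on a 6-degree grade:
print(round(angle_below_axis_deg(6.0, 20.0, 30.0), 1))  # ~3.4 degrees below the axis
```

A few degrees of off-axis error may be enough to push a stopped car toward the edge of the camera's view or out of a narrow radar beam mounted low in the grille, which is consistent with the analysts' hill hypothesis.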

Lakrintis noted, "Lots of factors may have caused this." He speculated that the width and color of the car in front, reflections off its surfaces, and the possibility that the sun was shining directly into the windshield, blocking the camera's field of view, could all have played a role.

Tesla, however, contests the speculations above. The company told us that the AutoPilot operated exactly as designed in this situation by alerting the driver to brake and asking him to take control. Tesla says the driver failed to do so because of a miscommunication inside the car.

What could have helped prevent this crash?

Aside from updates to the AutoPilot V.1, Magney pointed out, “Enhanced capabilities such as drivable path and/or highly detailed maps may have helped since you could separate safe stationary objects from unsafe when they appear within the drivable path.” He added, “Although not used in Tesla’s case, an AI based solution could have prevented this (in theory) because the more you train the better you get with edge cases.” In both cases, Magney qualified, “This assumes the sensors are working.”
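
To make the "drivable path" idea concrete, here is a minimal sketch of the kind of filtering Magney describes (my illustration, not his or Tesla's implementation; the polygon and track coordinates are made up): stationary detections are treated as braking targets only if they fall inside a polygon representing the drivable corridor ahead.

```python
# Minimal drivable-path filter: keep a stationary detection as a braking target only
# if it falls inside the drivable-corridor polygon. Coordinates are in the vehicle
# frame (x forward, y lateral, meters) and are made-up illustrative values.

def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

drivable_path = [(0, -1.8), (60, -1.8), (60, 1.8), (0, 1.8)]   # straight lane ahead
stationary_tracks = [(45.0, -0.2), (40.0, 3.5)]                # in-lane car, roadside object

threats = [t for t in stationary_tracks if point_in_polygon(t[0], t[1], drivable_path)]
print(threats)  # only the in-lane detection at (45.0, -0.2) survives
```

In a real system the corridor would come from lane detection or a detailed map rather than a hard-coded rectangle, which is Magney's point about highly detailed maps.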

While noting that “AutoPilot is excellent at slow traffic jams in both highways and other roads,” Strategy Analytics’ Lakrintis suggested, “A good solution is for Tesla to try and use the AutoPilot only on geo-fenced areas such as highways and interstates. In that way the operator will not be able to engage the AutoPilot in other areas.”
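
A geofencing gate along the lines Lakrintis suggests could be as simple as checking the road class at the car's GPS position before allowing Autosteer to engage. The sketch below is purely illustrative; the map-lookup helper and the allowed road classes are my assumptions, not anything Tesla has described:

```python
# Illustrative geofencing gate: only allow Autosteer where a map lookup says the car
# is on a highway-class road. The lookup function and road classes are hypothetical.

ALLOWED_ROAD_CLASSES = {"motorway", "interstate", "trunk"}

def may_engage_autosteer(lat, lon, lookup_road_class):
    """lookup_road_class(lat, lon) -> road-class string from an ADAS map (hypothetical)."""
    return lookup_road_class(lat, lon) in ALLOWED_ROAD_CLASSES

# Stubbed example: a lookup that reports a residential street blocks engagement.
print(may_engage_autosteer(34.05, -118.24, lambda lat, lon: "residential"))  # False
```

The hard part, of course, is keeping an accurate, current map of which roads qualify.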

Magney put in the last word. “This emphasizes the ‘education’ part of NHTSA’s new Highly Automated Vehicle guidelines. Users, dealers, warnings, owner’s manuals, marketing literature… All this needs to be addressed to better align people’s perception of what an L2 system is not!”
