When they develop cars in their pursuit of automotive autonomy, engineers guide their work based on SAE’s automated-driving Levels described in the J3016 standard. But where is the manual, or a user’s guide, that consumers can study before they make a purchasing decision and understand their responsibilities for different types of ADAS and AV models?
There’s no such book.
The auto and tech industries default to using SAE Levels when they communicate the features and capabilities of automated vehicles to the media and consumers.
But here’s the thing. Marketers can’t resist doing what marketing has taught them to do: twist, bend or blur the intended meaning of SAE Levels to create narratives that suit their purposes. Consumers are left to their own devices to interpret each SAE Level — or what they think Tesla’s Autopilot can do — and to test cars’ road savvy.
Some automotive engineers and industry experts, however, have been struggling with how to explain Level 3. At Level 3, when a vehicle requests help, a human driver must take over. The pitfalls of this machine-to-human handover are well documented in human factor science.
In this light, Egil Juliussen, an independent auto industry analyst, now a columnist (Egil’s Eye) for EE Times, flatly told us, “I think we should skip L3 altogether.”
It turns out that SAE itself has been revising its levels yet again — this would be the third time — according to a source close to SAE. Members of the group already discussed and quietly approved clarifications on Level 3 that have yet to be published.
A User’s Guide to Vehicle Automation Modes
Phil Koopman, co-founder of Edge Case Research and professor at Carnegie Mellon University, called SAE J3016 an “engineering standard by engineers and for engineers.” While SAE Levels serve the engineering community, Koopman suggested in a newly posted blog that it’s time to examine the “Levels” from an entirely different perspective.
How about creating categories for consumers, for a change, and articulating the roles and responsibilities of the occupants of a self-driving car?
During an interview with EE Times, Koopman said, “I think we may be better off with a different framework for just ordinary audiences.”
In his proposed “User’s Guide to Vehicle Automation Modes,” Koopman created four categories of vehicle operation: Assistive, Supervised, Automated, and Autonomous.
Assistive
A licensed human driver drives, and the vehicle assists.
The technology’s job is to help the driver do better by improving the vehicle’s ability to execute the driver’s commands and by reducing the severity of any impending crash. Generally this maps to SAE Level 1 and some portions of SAE Level 2.
Supervised
The vehicle drives, but a human driver is responsible for ensuring safety.
An effective driver monitoring system is required to ensure the driver is able to take over when required. Tesla “Autopilot” and GM Super Cruise are examples of Supervised operating modes. Generally this maps to SAE Levels 2 and 3.
Automated
The vehicle performs the complete driving task.
In this “automated” operation mode the vehicle does the driving, but a responsible human is still the “captain of the ship” for handling everything except the driving. Generally this maps to SAE Levels 4 & 5.
Autonomous
The vehicle is fully capable of operating with no human monitoring.
If something goes wrong, the vehicle is entirely responsible for alerting humans that it needs assistance, and for operating safely until that assistance is available.
Achieving safety will depend on the autonomous vehicle being able to handle everything that comes its way, for example according to the UL 4600 safety standard. Generally this maps to SAE Levels 4 & 5.
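Taken together, the four modes and their rough SAE mappings can be summarized in a small sketch. This is purely illustrative: the mode names and mappings come from Koopman’s Guide as described above, while the data structure itself is an assumption, not part of his proposal.

```python
from enum import Enum

class OperatingMode(Enum):
    """The four operating modes from Koopman's proposed User's Guide."""
    ASSISTIVE = "human drives; vehicle assists"
    SUPERVISED = "vehicle drives; human supervises and ensures safety"
    AUTOMATED = "vehicle drives; human handles non-driving safety tasks"
    AUTONOMOUS = "no human monitoring required"

# Approximate mapping to SAE J3016 Levels, per the Guide's descriptions
SAE_MAPPING = {
    OperatingMode.ASSISTIVE: "Level 1 and parts of Level 2",
    OperatingMode.SUPERVISED: "Levels 2 and 3",
    OperatingMode.AUTOMATED: "Levels 4 and 5",
    OperatingMode.AUTONOMOUS: "Levels 4 and 5",
}
```

Note that Automated and Autonomous map to the same SAE Levels; what distinguishes them is the human role, not the underlying technology, which is precisely the point of the Guide.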
Why new guidelines?
While stressing that the proposed User’s Guide is not meant to replace the engineering use of SAE Levels, Koopman made the case for new guidelines on vehicle automation modes because the industry itself has been confusing the market with terms like “Level 2+” and “Level 3+,” neither of which is defined by SAE.
The Guide isn’t a straight revision of the SAE Levels. Koopman built three distinctive facets into the new guideline.
First, rather than discussing features and underlying technologies for each category, Koopman focused on the driver’s role and responsibility in overall vehicle operation. “The emphasis should be around the user,” Koopman said, instead of designing categories “around the technical stuff.”
Second, the user’s guide addresses gaps in the SAE Levels by considering “the safety relevant tasks a human driver does beyond actual driving.”
For example, can a self-driving car make sure a kid in the back seat is buckled into a seat belt? If a driverless EV’s batteries happen to catch fire, does a passenger know how to respond? A mature adult with a driver’s license might be able to deal with such issues, but a kid shoved into a driverless car by her mother to go to her piano lesson might be ill-equipped for such an emergency.
These are scenarios “commonly overlooked,” said Koopman, “but they happen.”
He said, “I am fine with the SAE standard not including them because they are out of its scope. But I am not fine if you call your vehicle ‘Level 5’ just because it does everything the SAE standard says it should do.” There’s more to safety, and your AV must have a plan, he noted.
Leveraging his hallmark #didyouthinkofthat questioning, Koopman argues that categories for vehicle automation modes must include “who handles non-driving safety issues.”
Third, the new Guide broaches liability issues, which are largely absent from SAE Levels. Koopman wrote that the Guide’s four rubrics “provide a more straightforward way to describe potential human driver liability”:
- Assistive: as with conventional human driving.
- Supervised: the human driver is responsible for safety unless the vehicle behaves so erratically that a reasonable human driver could not intervene.
- Automated: the human driver is not responsible for driving errors but is responsible for non-driving aspects of safety such as passenger behavior, proper cargo loading, post-crash management, and so on.
- Autonomous: there is no human driver to blame for mistakes.
EE Times found the addition of liability issues to the Guide’s categories the most important and useful. When we asked Koopman, he cautioned: “I prefer to pair that with drivers’ responsibility. Failure to carry out your responsibility leads to liability.” He added, “I don’t want to make this about blame. I wanted to make it clear, this is about you. [The Guide says] you’re in a car, here’s your job, and here’s what you have to do to keep everyone safe.”
After all, “if you are confused about your responsibility,” said Koopman, “It’s hard to do the right thing.”
The User’s Guide also touches on the issue commonly known in the aviation world as “mode confusion.” Koopman noted in his blog that “a single vehicle [can] employ various operational modes across its Operational Design Domain (ODD).”
What’s important, he says, is that at “any particular time the vehicle and the driver both understand that the vehicle is in exactly one of the four operational modes so that the driver’s responsibilities remain clear.”
Here’s a simplified example for multiple mode operations: The same car might operate as “Autonomous” in a specially equipped parking garage. It can go “Automated” on limited access highways, “Supervised” on designated main roads, and “Assistive” at other times. Koopman stressed, “In such a car it would be important to ensure that the human driver is aware of and capable of performing accompanying driver responsibilities when modes change.”
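Koopman’s multi-mode example reduces to a lookup from road segment to exactly one operating mode. Here is a minimal sketch under that reading; the segment names and the lookup function are illustrative assumptions, with the fallback to Assistive following the example above.

```python
# Illustrative sketch: one vehicle operating in different modes
# depending on which part of its ODD it is currently in.
ODD_MODES = {
    "equipped_parking_garage": "Autonomous",
    "limited_access_highway": "Automated",
    "designated_main_road": "Supervised",
}

def current_mode(road_segment: str) -> str:
    """Return exactly one operating mode for the current road segment.

    Everywhere outside the listed segments, the vehicle falls back to
    Assistive operation, as in the article's example.
    """
    return ODD_MODES.get(road_segment, "Assistive")
```

The key property, per Koopman, is that the function is total and single-valued: every segment yields exactly one mode, so the driver’s responsibilities are never ambiguous.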
Koopman wrote in his blog:
We think it would benefit consumers and other stakeholders if discussions regarding vehicle automation capabilities encompassed a driver point of view using the terms: Assistive, Supervised, Automated, and Autonomous. That could help reduce confusion and even loss of life caused by misunderstanding the responsibilities of the driver in different operational modes.
During the interview, Koopman told EE Times, “Mode confusions can be a big deal.” He explained, “It’s really important for the driver to know what mode the vehicle is in. And it must be right in your face,” stipulating that this is your mode and this is your responsibility. A little indicator wouldn’t cut it.
The car should also watch the driver. When it tells the driver that it’s going to drop out of fully automated mode, it must track the driver’s reactions. Assume there is a ten-second warning. The car waits to see if you’re back. If you miss the ten-second deadline, the car must say, “OK. We’re going to pull over and wait for you to get it together.”
A vehicle that simply dumps a problem into the driver’s lap is unacceptable, said Koopman.
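The takeover sequence described above (warn the driver, wait for a response, then fall back to a safe stop) can be sketched as a simple loop. The ten-second deadline comes from Koopman’s example; the function name and the driver-readiness check are hypothetical, not any real vehicle API.

```python
import time

TAKEOVER_DEADLINE_S = 10.0  # the ten-second warning from Koopman's example

def request_takeover(driver_is_ready, deadline_s=TAKEOVER_DEADLINE_S, poll_s=0.5):
    """Warn the driver, then wait up to deadline_s for them to take over.

    driver_is_ready stands in for a driver-monitoring check (eyes on the
    road, hands on the wheel). If the deadline passes, the vehicle must
    not dump the problem into the driver's lap: it pulls over and waits.
    """
    elapsed = 0.0
    while elapsed < deadline_s:
        if driver_is_ready():
            return "driver_in_control"
        time.sleep(poll_s)
        elapsed += poll_s
    return "pull_over_and_wait"
```

The essential design choice is that the fallback branch is always defined: the vehicle never simply gives up when the driver fails to respond.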
The market reaction?
Although it’s too early to know how the industry would respond to the new proposal, Koopman told EE Times, “I think this helps the industry. Because if the [automotive] industry wants to build trust, it is important to communicate the driver’s responsibility clearly.”
While Koopman declined to name names, many players in the AV industry are aware that “there are certain companies who want to overstate the capabilities of their cars, and there are other companies who are unhappy with that,” he observed.
In his view, there should be a clear dividing line. “If you’re going to blame the human, it’s clearly [when a driver is driving a car in] ‘supervised’ mode, but not ‘automated.’” He added, “And if it’s ‘automated,’ you’re not allowed to blame the human.” The User’s Guide gives a “crystal clear cutting line.”
The upshot is, “If the companies really want to build something ‘automated,’ they can now claim a distinction without getting lost in the murk of the Level 3 debate,” said Koopman.
Juliussen told EE Times that the User’s Guide makes sense. One caveat, however, is that the terms “automated” and “autonomous” sound too close for ordinary consumers to really understand the differences, he said. “But I can’t come up with better terminologies either.”
Level designations for certain vehicles have been controversial. “Carmakers can have a system with L4, L5 capability while keeping a driver behind the steering wheel. In that case, the car isn’t really L4 or L5. It would be essentially a L2 vehicle,” argued Amnon Shashua, Mobileye’s CEO, during a recent CES presentation.
Meanwhile, Colin Barnden, lead analyst at Semicast Research, went on the record in a recent EE Times on Air podcast interview, saying, “The Society of Automotive Engineers should withdraw its levels of driving automation as described in J3016.” He explained that J3016 has provided cover for the race to Level 5. “It doesn’t say anything about liability. It doesn’t really address the issues around safety. Particularly with liability, I’m now seeing the real world moving towards assisted driving, where the human is liable at all times, and autonomous driving, where the machine driver, the robodriver, [is liable].”
The interview with Colin Barnden starts at 16:40.