Driver monitoring system (DMS) technology, seemingly relegated to the back burner, is sneaking back into safety discussions.
The growing awareness that DMS can improve automotive safety is driving new regulations and safety ratings for new cars. The European Parliament has updated its General Safety Regulation (GSR) for type-approval requirements, while Euro NCAP (the European New Car Assessment Programme) is completing its DMS test protocols. Delayed two years, the five-star crash rating program will start testing in 2024.
That’s not all, though.
The automotive industry faces sobering evidence that the longer a driver uses partial automation, the more careless he or she becomes.
Certainly, automakers want drivers to rely on the new bells and whistles they tout as safety features in advanced driver-assistance systems (ADAS). However, drivers’ over-reliance on automation is breeding unsafe behavior. DMS sits smack-dab in the middle of the precarious man/machine relationship.
A looming question is whether DMS can mediate this troubled marriage.
In a recent EE Times interview, Bryan Reimer, research scientist at MIT’s AgeLab, cited his long-standing advocacy for “the need to fuse DMS data with active safety systems.” Reimer was an architect of the research by the Insurance Institute for Highway Safety (IIHS) that showed the adverse impact of the partial automation deployed in Level 1 and 2 cars.
The goal, noted Reimer, is “to maximize safety in a way that leverages advantages of both the automation and human.” To ensure that the joint system [vehicle and human] is better than a human-only system, “One does need to think about developing strong data.”
“ADAS and DMS must be connected,” stressed Colin Barnden, Semicast Research’s lead analyst. What DMS is detecting in the driver must translate into such ADAS actions as braking and steering, Barnden noted.
“Integration of internal and external sensing provides new opportunities for safety enhancement,” acknowledged Mike Lenné, senior vice president for human factors & aftermarket solutions at Seeing Machines. He cited “two areas of value” in connecting ADAS and DMS. “They include a) understanding the real-world safety risks to ensure the right human states are detected; and b) using DMS to threshold ADAS features.”
A driver might, for example, have his hands on the wheel, but is he zoning out?
Among the lingering challenges is “understanding how the [DMS] features should be adjusted to a range of driving scenarios/environments,” said Lenné. “For example, how do you threshold distraction signals for urban vs. rural driving?”
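Lenné's thresholding question can be made concrete with a small sketch. This is purely illustrative, assuming a hypothetical DMS that tracks whether the driver's gaze is on the road: the glance-time limits, function names, and context labels below are invented, not taken from any shipping system.

```python
# Illustrative sketch only: how a DMS might tighten or relax its
# off-road-glance threshold by driving context. All names and numbers
# are hypothetical.
from dataclasses import dataclass

# Hypothetical maximum continuous off-road glance time, in seconds,
# before a distraction warning fires.
GLANCE_LIMITS = {
    "urban": 1.5,    # dense traffic and pedestrians: warn sooner
    "rural": 2.5,    # sparser environment: tolerate longer glances
    "highway": 2.0,
}

@dataclass
class GazeSample:
    t: float        # timestamp, seconds
    on_road: bool   # gaze classified as on the forward roadway

def distraction_warning(samples: list[GazeSample], context: str) -> bool:
    """Return True if the driver's eyes left the road for longer than
    the context-dependent limit in any continuous stretch."""
    limit = GLANCE_LIMITS[context]
    glance_start = None
    for s in samples:
        if not s.on_road:
            if glance_start is None:
                glance_start = s.t
            if s.t - glance_start >= limit:
                return True
        else:
            glance_start = None
    return False
```

Under these made-up numbers, the same two-second off-road glance would trigger a warning in urban driving but pass unremarked on a rural road.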
Add to this the issue of carmakers “having the courage to make the HMI / vehicle response appropriate for the severity of risk,” said Lenné. “The severity of the safety risk should dictate the HMI and vehicle response.”
A ‘base version’ of DMS
We shouldn’t get ahead of ourselves. Euro NCAP, for example, is “taking a very pragmatic, quick approach,” noted Richard Schram, technical director at Euro NCAP, during an Affectiva-hosted web event about “Advanced Safety Beyond Driver Monitoring.” Schram stressed that Euro NCAP wants “a base version [of DMS] on every vehicle” first; a technology roadmap then has to follow. “If you look for the perfect [DMS] system, we will never get there.”
For this base system, are we talking about camera-only, vision-based DMS?
Given DMS test protocols under development at Euro NCAP, the focus is squarely on vision-based systems.
Martin Krantz, Smart Eye CEO, quipped, “Steering wheel torque sensors [used in Tesla, for example] are sometimes classified as DMS… but they are becoming unfashionable.”
Talking with EE Times, Jungo Connectivity CEO Ophir Herbst acknowledged, “DMS and occupant monitoring system (OMS) technologies are based on 2D cameras… We actually have some companies that work with us (and above our stack) to measure the ‘cognition’ or ‘state of mind’ of the driver. OEMs in general are looking at additional sensors, but they realize cameras provide the most ROI in terms of various features. Cost matters.”
Eyesight Technologies, which has recently renamed itself Cipia, remains adamant that vision is the leading method for driver monitoring.
Tal Krzypow, vice president of products, told EE Times that vision “can detect eye closure, gaze, and facial expressions (among other capabilities), which are the most direct indicators for a driver’s drowsiness and distraction.”
He added, “Especially for distraction, which is one of the leading causes for accidents, there’s no substitute for tracking the visual attention of the driver.”
Krzypow did not dismiss other sensors such as radar for monitoring respiration, but he stressed, “At the moment, the market is very much driven by regulatory requirements, which are best addressed by vision.”
What about sensors for the next-generation DMS?
Even though vision-based DMS is a priority for many OEMs, DMS developers won’t stop looking for sensors that could add new dimensions to what they detect.
Smart Eye CEO Krantz painted this broader picture. “At CES 2020, we presented a radar-based breathing detector.” Like many DMS technology suppliers also looking for occupant monitoring solutions, Krantz said, “We are also pursuing complete in-cabin sensing opportunities where one or two wide-angle cameras keep track of the whole cabin and measure for body posture, child left behind, seat belts etc.” He added, “We are also extracting facial expressions of all the passengers and mapping it to the basic emotions. The same is done in terms of identifying everyone in the car.” Krantz believes the car’s interior will be monitored “with a multi-modal sensor package.” Smart Eye plans “to continue to develop our software so that it handles multiple sensor modalities.”
Some tier ones such as Valeo are reportedly exploring radar for “living being detection” (e.g., babies, small dogs) left behind in a vehicle.
Time-of-flight (ToF) sensors might also have a role in DMS. A good example is a partnership between ADI and Jungo announced earlier this year. Asked why, Jungo CEO Herbst explained that ToF sensors allow better accuracy for algorithms such as face recognition, gesture, and occupancy/pose. He added, “In some cases, ToF sensors can provide redundancy – such as detecting hands off wheel.” ToF sensors will become available in DMS “this upcoming year,” he noted.
Another school of thought doubts that adding sensory modalities in DMS will solve such thorny issues as detecting driver drowsiness. Calling drowsiness “the toughest nut to crack,” Seeing Machines’ Lenné said, “We looked at brain activity, heart rate activity, respiration.” Among physiological sensors studied were some developed and designed by academic institutions in Australia and Britain. “We found all of these other sensors incredibly noisy,” he noted. Also, “there are 101 reasons why your heart rate or respiration rate can change.”
With a DMS mandate approaching reality, Seeing Machines has reiterated its message that the “eyes have it” and “eyes don’t lie.” Lenné emphasized broad-based R&D findings that “if you really want to know what’s going on with someone’s attention and cognitive state, you’ve got to study their eyes and their face.”
What and how should Euro NCAP test?
The first priority in DMS testing is “to put the system in a car,” noted Smart Eye’s Krantz. “Many systems that work on the bench don’t work in the harsh environment of real car driving.”
Some real-world surprises that upset DMS technologies include “direct sunlight, street lights, which create strobing effects,” noted Semicast’s Barnden. Drivers wearing masks and sunglasses could also confuse a DMS.
What then are “must-check” DMS features? Many suppliers declined to say, citing their involvement with Euro NCAP to develop tests. But getting down to fundamentals, Smart Eye’s CEO stressed the “availability of signals.”
“It’s important to keep track of both the accuracy and the availability of the fundamental signals, such as head pose, eye lid opening, gaze direction etc. If these signals are not of high integrity it will not be possible to build a warning application on top, which is both sensitive and has a low frequency of false warnings at the same time.”
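Krantz's point about signal integrity can be sketched as a simple gate: a warning application should run only on frames where every fundamental signal is both present and confidently measured. The frame layout, signal names, and confidence floor below are hypothetical illustrations, not any vendor's actual API.

```python
# Hypothetical sketch: gate driver-state warnings on the availability
# and confidence of the fundamental signals, so low-integrity frames
# never feed the warning logic. All names and thresholds are invented.

MIN_CONFIDENCE = 0.8  # hypothetical per-signal confidence floor

def signals_valid(frame: dict) -> bool:
    """A frame is usable only if every fundamental signal is present
    and reported with sufficient confidence."""
    required = ("head_pose", "eyelid_opening", "gaze_direction")
    for name in required:
        sig = frame.get(name)
        if sig is None or sig["confidence"] < MIN_CONFIDENCE:
            return False
    return True

def maybe_warn(frame: dict, classify) -> str:
    """Run the warning classifier only on high-integrity frames;
    otherwise report degraded availability rather than guessing."""
    if not signals_valid(frame):
        return "signal_unavailable"  # suppress warnings, flag availability
    return classify(frame)           # e.g., "ok", "drowsy", "distracted"
```

The design choice is the one Krantz describes: without this gate, a noisy gaze estimate feeds straight into the warning layer, and the system must choose between missed detections and false alarms.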
Krantz added, “Glancing at the cluster should not be mixed up with a microsleep, for example.”
Describing Euro NCAP’s test protocols as a “work in progress,” Krantz said, “We are confident that it will reflect the need for being sufficiently advanced so that it will not miss out on measuring the most important features of a DMS system.”
Jungo’s CEO Herbst listed likely requirements:
- Identify head-pose-based distraction, e.g., glancing at a phone or adjusting the infotainment
- Identify microsleep and other sleep events
- Identify long-term drowsiness via multiple signals (eyes, yawns, electroencephalogram, steering wheel)
- Various light conditions (day, night, direct sun)
- Various accessories (glasses, sunglasses, hat, mask)
In the end, OEMs want NCAP-compliant systems to be cost effective, said Herbst: “DMS that can run on one of their existing compute options (infotainment, cluster).” Carmakers “will also look at availability of additional features such as software updates.” Further, tier ones will want a solution that easily fits their preferred chipsets, he added. “Regulators do not dictate technology. They will focus on use and edge cases.”
Is there a drowsy dummy?
This is an odd question, but it addresses how Euro NCAP could test DMS inside a vehicle. Is there such a thing as a drowsy dummy?
Seeing Machines’ Lenné said, “Our view is that a person/driver needs to be actively engaged with a system under test, either in a simulator or test track setting. How this will be implemented is not clear at this stage – but there will likely need to be some agreement on levels of drowsiness.”
On microsleep, Smart Eye’s Krantz said, “You should warn not only for microsleeps, which comes very late in the process of falling asleep, but also for severe drowsiness stages where the driving performance goes down drastically.” He suggested, “Put drivers through the same scenario (long boring drive during the wee hours) in both simulator and in real driving where they self-assess their drowsiness during regular time intervals. The difference is that in the simulator you can proceed until they actually fall asleep.”
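Krantz's argument that warnings must come well before a microsleep is often operationalized with PERCLOS, a widely studied metric measuring the percentage of time the eyes are mostly closed over a sliding window. The staging below is a sketch under that assumption; the stage boundaries and names are hypothetical, not Smart Eye's or anyone else's calibration.

```python
# Sketch of staged drowsiness warnings using PERCLOS (percentage of
# eyelid closure over a time window). Stage boundaries are invented
# for illustration.

def perclos(eyelid_openings: list[float], closed_below: float = 0.2) -> float:
    """Fraction of samples in which the eye is mostly closed.
    eyelid_openings: per-frame openness in [0, 1] over, say, a
    one-minute sliding window."""
    if not eyelid_openings:
        return 0.0
    closed = sum(1 for o in eyelid_openings if o < closed_below)
    return closed / len(eyelid_openings)

def drowsiness_stage(window: list[float]) -> str:
    """Escalate well before a microsleep: react to sustained eyelid
    droop, not only to a fully closed eye."""
    p = perclos(window)
    if p >= 0.40:
        return "microsleep_risk"    # near-continuous closure
    if p >= 0.15:
        return "severe_drowsiness"  # warn now, per Krantz's point
    if p >= 0.075:
        return "early_drowsiness"
    return "alert"
```

A microsleep-only detector would fire only in the top branch; the intermediate stages are what let the system intervene while driving performance is degrading but before the eyes actually close.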
In reality, tests appear more complicated than Krantz and Lenné implied. As Cipia’s Krzypow noted, “Drowsiness (as opposed to eye closure) poses a double challenge: it is hard to measure and hard to simulate.” Nobody has yet explained how Euro NCAP plans to meet the challenge.
How can DMS stop annoying the driver?
When all is said and done, the nagging weak link in DMS technology is that when drivers get annoyed by too many false positives, they turn off the DMS, rendering it useless.
Jungo’s CEO acknowledged, “Indeed too many false alarms would be annoying.” At Jungo, “We limit alerts to clear distraction events, such as not looking at the road, using phone, smoking and the like. We do not try to assess today the ‘cognitive’ state of the driver, such as looking at the road but day-dreaming. This is indeed too error-prone.”
Smart Eye’s CEO Krantz said, “The key to minimize the error rate is to have very high precision in the fundamental signals.” He added, “There need to be well-designed and thoroughly tested applications that translate the fundamental signals into valuable driver state information and finally a well-rounded warning strategy that informs the drivers without annoying them.”
Seeing Machines’ Lenné agreed that driver acceptance is critical. “Designing for the human is so important. You need to understand the behavior and associated risk, such as distraction, and build that understanding into the features so that it ‘makes sense’ to a driver.” He went on, “The amount of false positive alerts depends on other factors including the reliability of signal availability, which is also key.”
MIT’s Reimer, stressing problems that overly active driver-alert systems could present to consumers, noted a General Motors example.
“GM seems to have developed a multi-stage alerting approach with Super Cruise that addresses this through a combination of reinforcement (silent cues) and escalating alerts.”
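The escalation ladder Reimer describes can be sketched as a simple lookup from time-inattentive to alert stage. The stage names and onset times below are invented for illustration; they do not describe Super Cruise's actual calibration.

```python
# Hypothetical sketch of a multi-stage escalation ladder: gentle cues
# first, escalating only if the driver stays disengaged. Stages and
# timings are invented, not GM's.

ESCALATION = [
    ("silent_cue", 2.0),      # e.g., light-bar color change
    ("visual_alert", 4.0),    # cluster warning
    ("audible_alert", 6.0),   # chime or voice prompt
    ("vehicle_action", 8.0),  # e.g., begin a controlled slowdown
]

def alert_stage(seconds_inattentive: float) -> str:
    """Pick the highest stage whose onset time has been exceeded.
    The counter resets to zero (stage 'none') once the DMS sees the
    driver's attention return."""
    stage = "none"
    for name, onset in ESCALATION:
        if seconds_inattentive >= onset:
            stage = name
    return stage
```

The point of the ladder is exactly the annoyance trade-off discussed above: brief lapses earn only silent reinforcement, and the intrusive interventions are reserved for sustained disengagement.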
In theory, recognizing the driver’s state (“he is paying attention”) and connecting it to the DMS alert system can mitigate the annoyance factor.
But Reimer said, “I am not aware of any systems that are addressing cognitive absorption or mind wandering issues. This is an active area of research in our group as well as several OEMs and suppliers we work with.”
DMS on the road today
EE Times undertook this article to gauge the mindset among leading DMS suppliers and to learn about their engineering efforts to develop safer driver monitoring systems.
But in the real world, as Semicast Research’s Barnden often says, “It’s incredibly easy to do driver monitoring so poorly.”
Consider a promotional video clip about the Hyundai Genesis’ Forward Attention Warning (FAW) system.
The narrator warns about “some situations in which FAW may not operate properly… for instance, if the driver is wearing polarized lenses, heavy eye makeup, or has their face obscured by their hair or a hat, or if the view is blocked by the position of the steering wheel.”
He concludes: “Remember, the Forward Attention Warning system does not substitute for proper and safe driving. It’s the responsibility of the driver to always check the speed and distance to the vehicle ahead.”
So, although the system fails to fully monitor the driver, the carmaker can tick off the DMS box, save on R&D expenses, and pass the buck back to the lady in the car with glasses and a hat.