Being right about something occasionally is fun, and gives you a little dopamine rush of pleasure, a pleasure that I personally encounter extremely rarely. I feel like I’m right about something today, so mark your calendars, but it’s not something I’m actually pleased to be right about: People both don’t understand and aren’t using Level 2 (L2) semi-automated driving assist systems well or safely. I’m being bold and saying I’m right about my thoughts on this – which I’ve been ranting about for quite a while – because a study from the Insurance Institute for Highway Safety (IIHS) just dropped, and it comes to essentially the same conclusions: Significant percentages of people who have cars with L2 semi-automated driver assist systems overestimate the capabilities of the systems, and as a result use them in potentially dangerous ways. With these systems becoming more and more common, it’s time we addressed this bluntly and honestly.
The study, called Habits, Attitudes, and Expectations of Regular Users of Partial Driving Automation Systems, is based on responses to questions from a pool of 604 respondents who own cars with one of the three most common L2 semi-automated driving systems, broken down like this: General Motors Super Cruise (200), Nissan/Infiniti ProPILOT Assist (202), and Tesla Autopilot (202).
What Level 2 Driving Automation Systems Actually Do
Now, before I get into the results of this study, let’s just make sure we’re all on the same page regarding exactly what each of these semi-automated driver-assist systems actually does. Fundamentally, they all control speed and braking, dynamically keeping a set distance from the car ahead like adaptive cruise control, and they do some steering, keeping the car centered in its lane and following road curves and so on. Some systems, like Tesla’s Autopilot and GM’s Super Cruise, do a good deal more, allowing for lane changes and some degree of navigating on streets other than highways, and so on.
What all these systems have in common is that they are very much not actually self-driving systems, and the person in the driver’s seat must remain alert and ready to take control at any moment, because any of these systems could disengage or make a poor decision at any time, with zero warning. This fact, that the driver must remain constantly alert even when the semi-automated system appears to be doing most of the work of driving, is at the heart of the problem this study reveals, and, frustratingly, is something that researchers who study automation have known about since at least 1948.
The Problem
The problem even has a name, taken from N.H. Mackworth’s 1948 study called The Breakdown of Vigilance during Prolonged Visual Search: the Vigilance Problem.
Essentially, the vigilance problem is not a technological one, it’s a human one. Given a system that operates almost independently but requires passive oversight by a human, that human will not be good at paying attention to the system, and as a result will not likely be ready to take action when needed. In the case of these semi-automated driving systems, the situation is actually worse, because the naming and marketing and perception of these systems suggests more capabilities than they actually have, which makes people even less vigilant.
Nissan’s ProPILOT Assist is a bit of an exception to this, as its marketing and implementation do a lot less to suggest capability, and the numbers from this IIHS study back this up, as we’ll see soon.
What Drivers Think They Need To Do When Using Automated Driver Assist Systems
In fact, let’s look at some numbers right now, starting with this really basic question: Are drivers using these systems comfortable with not paying attention to the road while the systems are engaged? Here’s what the study found:
Over half of Super Cruise users (53%), 42% of Autopilot users, and only 12% of ProPILOT Assist users said they were comfortable letting the partial driving automation drive the vehicle without having to watch what was happening on the road.
Holy crap, over half of Super Cruise users didn’t think they needed to “watch what was happening on the road”? It’s worth noting that Super Cruise is the only one of these three systems that requires no physical contact with the steering wheel when engaged, instead relying on a camera-based system to track the driver’s eyes.
Tesla owners at 42% aren’t much better, but Nissan owners actually are, at only 12%.
The study explains a bit:
Over-trusting either hands-free (Schneider et al., 2022) or hands-on-wheel partial automation (Victor et al., 2018) can lead drivers to not intervene even when they see a hazardous situation forming in front of them because they incorrectly believe the system can handle more than it was designed to do. One question this study could not answer was whether hands-free driving capability is more likely to give users the impression that the system is more functionally capable and safer than a hands-on-wheel system. Nonetheless, more than half of Super Cruise users in the current study said they were comfortable letting the system drive itself without having to watch what was happening on the road, compared with approximately 40% of Autopilot users and only 12% of ProPILOT Assist users. Super Cruise users were also far more likely than the other two groups to say that most non-driving-related activities were safe to do while using the system.
What The IIHS Study Reveals Drivers Are Doing Instead Of Driving
So what the hell are these “non-driving-related activities” that people are doing instead of paying attention to the road? Here’s what the study found:
Some of these seem like no big deal: eating and/or drinking is something we’ve all done while driving, same with talking to passengers, talking on a cellphone call via Bluetooth, or looking at scenery. Most of those are things people have been doing since before cars even shifted themselves, let alone drove themselves in any sort of way.
Other activities on the chart are a lot more alarming: per the survey, 49% of Super Cruise users, 44% of Autopilot users, and 19% of ProPILOT Assist users say they’ve been texting, which isn’t great. But 19% of Tesla Autopilot users have apparently used a laptop or tablet? And 20% watch videos? 18% read a book? What the hell? And 10% of Autopilot users admitted to sleeping? Sleeping!
The fuck is wrong with these people?
That chart notes the difference in what people do with the systems on and off, and comes to the very obvious conclusion that drivers are much more likely to do stupider things when their semi-automated systems are on:
The present study supports the finding of previous research that partial driving automation facilitates engagement in non-driving-related activities (e.g., Dunn et al., 2021; Noble et al., 2021; Reagan et al., 2021); however, the attitudes, expectations, and habits around individual activities and system design safeguards vary among drivers depending on the system they use.
Interestingly, Super Cruise users were the most likely “to characterize phone and other peripheral device use, watching videos, grooming, reading, and having hands off the wheel as safe to do while using the system,” which I would suspect is related to the fact that Super Cruise does not require contact with the steering wheel. ProPILOT users were the least likely.
Attention Reminders In Level 2 Systems
All these systems do have Attention Reminders: if the system thinks you’re not paying adequate attention, measured by either steering wheel torque sensors or eye-tracking cameras, it gives the driver a warning, and, if that warning isn’t heeded, the semi-automated driving assist system may be suspended or disabled.
The study surveyed drivers’ “perceived annoyance” at these reminders:
Ignoring these warnings can get a driver locked out of the assist systems if they don’t respond to prompts quickly enough, and we get some fascinating details of the lockout scenarios Super Cruise users described, maybe with more specificity than you’d expect:
Moreover, seven Super Cruise users mentioned eating as one of the activities that led to their lockout experience—most of them specifically described taking their hands off the wheel to eat—but none of the Autopilot users cited this reason. Illustrating the challenges of eating while driving, two Super Cruise users said that they had dropped their food, in one case a taco and in another case a hamburger, and had to let go of the wheel to retrieve their meals. Fifty-four percent of Super Cruise and Autopilot users who experienced lockouts said they were at least somewhat annoyed these lockout events happened.
Look at that; IIHS has recorded two dropped food incidents, one a taco-class incident and one a burger-class incident. Fascinating.
In the attention reminder context again we see differences between the systems, though for the most part, drivers seem to understand the need for these reminders. Nissan ProPILOT Assist users got the fewest reminders, and the study has some good insight into why (emphasis mine):
A substantial proportion of ProPILOT Assist users reported never having received attention reminders, which raises the possibility that interactions with the system’s lane-centering feature might also contribute to these group differences. ProPILOT Assist’s lane-centering support remains active while the driver steers within the lane; this characteristic is a component of cooperative steering, or shared haptic control. This design philosophy encourages the driver to actively participate in the steering task (Marcano et al., 2021), helping to reinforce the driver’s role in the relationship and improving the driver’s sense of agency and willingness to intervene whenever necessary or desired (Wen et al., 2019).
The Benefit of Cooperation
Unlike Autopilot or Super Cruise, Nissan’s system is designed to be cooperative, and doesn’t disconnect when the driver gives steering, throttle, or brake inputs. It’s not modal, like the other systems, and as such the driver is never in a position to “give up” control of the car to the machine. While this seems like a less advanced way of doing things, I think for semi-automated L2 systems it makes so much more sense, because the real problems arise when the driver thinks the machine has more control than it actually does. If you never imply that the car is in complete control with the driver only monitoring, and instead always keep the driver an active participant in the loop, then many of the vigilance problem issues simply won’t happen.
The study notes this difference between ProPILOT Assist and the other two systems:
In contrast, Autopilot’s lane-centering support deactivates whenever the driver exceeds a (relatively small) threshold amount of steering torque. Super Cruise’s lane-centering support temporarily suspends and only automatically reactivates once the driver has stopped steering and the system has regained the necessary information to position itself within the lane. It is unclear to what extent these design differences influence driver behavior, but the temporary suspension or deactivation of system support might make drivers more reluctant to participate in the driving over time (Banks & Stanton, 2015).
Perhaps counterintuitively, the more work the semi-automation system does, the less it seems to make drivers want to take control when needed, out of fear of the system disengaging. While I don’t think this study has definitively proven this to be the case, it nevertheless is a pretty alarming concept: Driver reluctance to take control so as not to cause the system to disengage has a lot of potential to be dangerous. There should be no hesitation for drivers to take control if they feel uncomfortable in any way.
The Unexpected Effects Of Camera-Based Driver Monitoring Systems
While driver monitoring systems are getting better, even the camera-based ones are not foolproof, and this study also noted Tesla drivers using a water bottle to fool steering wheel torque sensors. Additionally, camera-based driver monitoring systems that don’t require holding the wheel have another disadvantage in that they further the illusion that the system is more capable than it actually is, which permits dangerous behavior like not looking at the road (emphasis mine again):
While steering torque monitoring alone is a poor basis for managing driver behavior (Lin et al., 2018), the current study’s findings indicate that camera-based driver monitoring on its own is not a silver bullet either. The longer a driver looks away from the road the greater their crash risk (Klauer et al., 2006; Yang et al., 2021). Even though Super Cruise uses camera-based monitoring to know where the driver is looking, many Super Cruise users reported being more likely to look away from the road for extended periods while using the system than the drivers in the other two groups. It is unclear why this is the case, but, compared with Autopilot and ProPILOT Assist users, Super Cruise users were also more likely to say that looking away from the road is safe to do while using the system than during unassisted driving and to say that they can do it better and more often with the automation’s support. Clearly, the expectations and attitudes of many of these drivers do not reflect an accurate understanding of the system’s limits.
The study also notes that we’re still in the early stages of adoption of semi-automated systems, and as a result these findings may be skewed to specific “early adopter” behaviors:
As Lin et al. (2018) noted, the scarcity of vehicles equipped with partially automated systems in the registered vehicle fleet means that studies such as this one are presently capturing behavior and perceptions of early adopters. It is unclear how user attitudes and expectations will evolve as system designs change and the technology becomes more widespread.
What I find alarming about the early adopter idea is that of anyone, early adopters should be the most aware of a system’s technical abilities and limitations, shouldn’t they, since they’re so damn interested? Or, could this just mean that the eagerness of an early adopter is enough to make them victims of their own wishful thinking about the capabilities of these systems?
Perception And Marketing Is A Big Deal
Of course, a lot of the issue is with how these systems are marketed and portrayed; names like Autopilot and a heavy emphasis on hands-free driving lead to a lot of overestimation of what these things can actually do:
Worryingly, some drivers appear to have a false sense of security about how they are meant to use the technology and what it is designed to do. Misunderstanding the user’s roles and responsibilities corresponds with a higher degree of engagement in nondriving-related activities compared with driving without assistance. This confusion is likely influenced by system design, as some systems are more likely to give drivers the impression that they are more functionally capable than they are.
There’s also some interesting data about the demographics of the groups in the study, none of which is all that surprising: GM’s system, showing up mostly on Cadillacs, skews old. Teslas skew younger and overwhelmingly male, and the Nissans are the group that has demographics closest to just the general population. Again, not shocking.
What is a bit shocking – well, actually, it’s not shocking, because what’s happening is exactly what we’ve known happens with systems that do most of the work and still somehow try to demand constant vigilance: it doesn’t work. Maybe alarming is a better word, because these systems are already on the roads, and they’re continuing to be built and sold, and they are all, I think, fundamentally, inherently, broken.
The Takeaway
This isn’t something that needs a technological fix, because it’s not a technological problem. It’s a human problem, and it doesn’t matter how many over-the-air updates you send out or what revision the latest Tesla Full Self-Driving Beta is up to, because if any of these systems still demand that a person be ready to take over without warning, they’re doomed.
The better the system is at driving, the worse it will be at keeping people ready and alert, so there’s a bit of a paradox here, too. The more featured the system, the more advanced it is, the more it does, the less it demands from the driver, the worse it actually is. Because if it still may need you to intervene at any moment, the less engaged you are, the worse the result will be if anything should go wrong.
That’s why of all of these, Nissan seems to be on the most workable track: keep the interaction between human and machine cooperative, not one or the other in charge at any given time, and you can still get the safety and reduced-stress benefits of semi-automated assisted driving with less of the inherent dangers caused by vigilance problem-related issues.
Level 2 semi-automated systems are flawed, and I think the IIHS made my point for me, so thanks.
I live in Southeast Kansas. Unlike most of Kansas we have rolling hills, patches of black oak forests, and fairly decent roads. The only real piece of long-distance road is US-400, which runs from Joplin, MO to Wichita, KS and points further west.
I drive the US400 route about three times a month both ways. I own a 2019 Cadillac CT6 with all manner of things that I have yet to understand let alone master. Among those was a button on the steering wheel that activated a form of lane control. On my last trip I decided to give it a try.
My findings were that while it worked fairly well when the road markings were clear, it really fell apart when they weren’t. Worse still, it didn’t deactivate but continued to wander in the direction of last input. Finally, the necessity of constantly monitoring the system was far more exhausting than simply driving the car like I learned in 1961. Self driving my ass!
I think I’m most concerned about the 6% of Tesla drivers that indicate that they sleep while driving when Autopilot is OFF! And why are Tesla drivers so sleepy?
I think as a society we not only have a distracted driving problem, but we also have a reading comprehension problem.
I have grown accustomed to certain automation helping me:
My climate control maintains a temperature I like, keeping my focus on the road.
Voice commands allow me to dial the probably two people I call when driving without looking at my phone.
Cruise control keeps me from going much faster than intended. Frankly, I should use it more.
Of course, those bits of automation largely improve my focus, rather than allow complacency (arguably, cruise control is a mixed bag, since it can lead to some level of complacency). I do also use lane keep assist some, though I find myself fighting it more often than helped by it in normal use.
I actually really like the level of driver’s assistance on my new truck. Adaptive Cruise, emergency front braking, lane keep assist, blind spot detection, and cross traffic warning. I didn’t think I ever needed those things, but they do help take some fatigue out of long drives. While these features do require active driver participation, I do worry that over time I’ll become more dependent on them. When I go back and drive my 2017, I have to remember I don’t have park sensors or blind spot indicators. I feel like 20 years ago, I was much better at navigating unknown areas with just a quick look at a map and the position of the sun. Now it seems like I always have the GPS up and I don’t really pay attention to the spatial relationships between places. It all kind of makes me want to take my analog Jeep and go live in the North woods for a while.
Not that Im anyone who matters…
But I think its great… that you see the difference in how you drove in your ’17. Imagine the difference if you drove a vehicle 12yrs OLDER than your 17….
I agree with everything here and I am so glad that Autopian is welcoming to us “get off my lawn types.” Just don’t tell the advertisers…
Not surprised by this at all. Hell… there are plenty of drivers not paying attention in vehicles without driver assist tech.
Personally, I’m not really interested in any of these systems at any price until they become good enough that I can legally take a nap in the back seat while the car drives me to my destination.
It frankly doesnt matter to me.. if OL Musky’s company and his laptop on wheels is the one more prone to “problems”, or Nissans, or GMs.
HECK, it wouldnt matter if its Hyundai, Honda, Toyota, BMW, MB, Audi etc etc etc etc. I dont believe they are intrinscally different. They do the same type of bs “job”… but dont do it to a level thats half decent or reliable. —-
Im not for either company in this discussion.. Im frankly against them all for this tech bs. Neither is doing a decent “service” in the ability to drive. They each have their issues of varing degree. Preferring one over the other.. would allow me to believe X company does it better.
Toss in the Low Lease periods and vehicles involved.. and Id rather just not ever buy a new car.. again.
Too much damn bullshit involved in being a decent human being. Dont need tech bs in my car.. attached to my device.
6% of Tesla Autopilot users sleep with the system OFF?
This is something we human factors researchers have been pointing out for decades. This degradation in attention, as well as behavioural adaptation, has been present ever since the introduction of ABS (which made people drive with shorter headways).
As an example, research shows that people driving with adaptive cruise control have significantly longer response times to unexpected events.
We are dealing with a catch-22 here, as we are relieving the driver of the task of actually driving, enabling distraction and inattention, and then have cameras and other systems assess driver status in order to warn them when they are doing things that we already knew they would do.
Quite frankly, the only systems that have been proved to actually improve safety are the systems that do not provide comfort, but focus on saving the driver in critical cases, such as collision warning, auto braking, and lane departure prevention.
Great article Jason! I couldnt yet read it all but i get the gist: Lots of morons misusing these systems.
The biggest surprise to me so far is the Super Cruise data. There are a whole lot more imbeciles in that brand than I expected
In defense of all these Darwin recipient wannabes…
The L2 automated systems, along with basics like AEB, lane keep assist, and other “defenses,” will almost certainly do a better job at not killing and maiming others, while saving the idiots, than if the idiots were in full control to begin with.
I still believe a vehicle should require a full road test with a qualified instructor, on all the features of a vehicle, before a user is allowed to actually drive it on public roads.
Just because we LIKE reading the manuals, and learning HOW the systems work, certainly doesn’t make us the majority we SHOULD be.
I’m finding this article a bit too long and detailed to read while I’m driving my Tesla, particularly as I’m ready for my nap.
The whole Super Cruise reveal with the “We Will Rock You” hand-clapping was more than a little off-putting to me. “Let’s have a party in our three-ton-plus truck pulling a load” didn’t seem to be promoting responsible driving behavior. I haven’t seen the commercial for a while and kind of hope they pulled it because of backlash.
I can’t wrap my head around anyone thinking that it is even remotely acceptable to sleep while driving. These people should have their cars crushed and be handed a bus pass.
WOW, this article is just way too long.
Maybe your attention span is just too short.
These types of ambiguities and consequent misunderstandings are among the reasons I prefer to drive older cars so that I may know with great clarity that they are trying to kill me at all times.
How are 2% of the GM owners, 2% of the Nissan owners and an astonishing 6% of the Tesla owners managing to sleep while their car’s driver assist is OFF? How did they manage to live to fill out the survey? Every story I’ve heard about falling asleep behind the wheel involves hitting something or flipping the vehicle within moments. I’m assuming they could get their error bars lower than that with a sample of around 600 people.
Where is sleeping, filming porn, or doing personal hygiene?
Sleeping is right in the middle and personal grooming is right below that. I’m assuming “watching videos” and/or “using apps on phones” includes all porn related activities.
What gets me is the 6% of Tesla drivers (as well as the 2% of Cadillac and Nissan drivers) that sleep with the systems disengaged!!!
“Sleep when the system is disengaged” suggests either poor questionnaire design, sloppy responses, or both. It’s gotta be the “answer this question FALSE” check question. Those people are not paying attention, just randomly giving answers.
Driving automation needs to make me a better driver, not worse. I’m already a hyper vigilant driver, but even I make mistakes. Round off drivers’ rough edges, don’t take over for them.
I believe psychology is the opposite.
Ford reduced the roof crush safety of its F250s because it had the “rollover control” tech bs. Turns out, the tech bs really makes us all poor. Ford used the rollover tech bs, as a bandaid for cost and roof crush strength.
Driver Automation… is pushing “us” all to use “our devices” MORE because we are becoming complacent in driving.
Same thing has happened with backing up. The types of vehicles that we have been pushed into err walked up that ladder has removed the ability to see.. so theyve added a bandaid.. that camera.
Tech isnt a compliment.. it takes over.
Great that the bulk of the vehicles (GM & Tesla at least) are either super heavy & fast or giant behemoth SUVs – exactly what should be driven by a bunch of distracted / confused morons.
Darwin award winners!
Now if only their “self-driving” car would generously agree not to crush my Miata.
Honestly, NHTSA has really dropped the ball on this. At the risk of sounding like a luddite, I am very much in the “Level 5 or Nothing” camp when it comes to releasing these systems for Consumers. As in, it has to be a system that has proven itself to be as or more capable than a human driver, and we’ve figured out all the accompanying regulatory/insurance issues.
Some people may argue “Well, there’s plenty of crappy drivers on the roads without these systems that can cause accidents or injuries. Why punish people using Level 2/3 autonomy properly because of a few bad apples?” As this data demonstrates, 1) Plenty of people do abuse it and 2) Even people who use it in good faith are at risk of becoming complacent/distracted over time. It’s a niche technology for now and not a pervasive issue yet, but I’d seriously worry if the automakers keep pushing it to a larger audience before we have Level 5 (if that’s even possible in the near term, which I have doubts about). To me, what we have now really isn’t worth the downside risk.
“At the risk of sounding like a luddite”
You say that like it might be a bad thing…
I’ve had the same smartphone for five years and love my ‘05 Wrangler in part because it has no power locks, power windows or any other sophisticated conveniences. Less shit to break. So I am pretty close to one lol
Glad to know… Im not the only one.
I think I have divorced my “smartphone”. Ive re-classified it as what it is.. an idiot d e v i c e . (Smartphone suggests that its a PHONE, although there are no…moving pieces = Device.)
I take car pics with it.. bout it. I dont use it in the car, its face down, 4′ from my driving position.
I have a 05 Element.. which is as far removed from reality than I ever thought.
“a system that has proven itself to be as or more capable than a human driver, and we’ve figured out all the accompanying regulatory/insurance issues.”
I think we have long passed that capability threshold technically but lack the political will to implement it.
That may be true for some of the top tier systems like Waymo. One of the biggest questions though remains liability. I’d be curious at this stage whether even if a system was deemed “Level 5” ready without the disclaimer that you’re the second line of defense you’d get a private insurance company to insure that. At least for a non-exorbitant amount. Will be interesting to see for sure.
The kind of systems that are on robo taxis are far too expensive for consumer vehicles and that is one of the main reasons it will be a long time before consumer level vehicles will be available with level 4 systems.
Reminds me of the Will Smith “I, Robot” movie.
All vehicles were not just Level 5, but being driven and guided by an AI that could coordinate between vehicles.
Oh, and Will Smith’s character’s boss calling him a lunatic for manually driving.
Turns out generations of demonizing government for doing its job and starving it at every turn is bad for public safety
I agree 1000%. I think its total HORSESHIT that the NATIONAL HIGHWAY TRANSPORT SAFETY ADMINISTRATION… is becoming toothless… and that the IIHS *INSURANCE INSTITUTE* of Highway Safety is the one.. thats becoming the go-to. — Since when is a “Operation” like the IIHS.. one that is involved in INSURANCE and telling us what to pay for X and Y.
Thats… just fucked up. IIHS. Screw them.
>But 19% of Tesla Autopilot users have apparently used a laptop or tablet? And 20% watch videos? 18% read a book? What the hell? And 10% of Autopilot users admitted to sleeping?
Based on the attitudes of people who own Teslas with that capability, yeah, that tracks. I’m surprised the numbers aren’t higher.
I’m not surprised. I talked to a tesla owner the other day who admitted in city traffic (when he lived where rush hour exists) he would often be on his laptop.
The real numbers probably ARE higher. This is self-reported behavior, and reporting is likely to be like when you ask people how much they drink (“only a couple of drinks”, but those are water glasses full of whiskey).
This article certainly brings up one very important question: What kind of monster eats a taco in a car?
It depends on the amount of lettuce and tomatoes on the taco. Jack in the box tacos, very little lettuce thus easy cleanup. Del Taco, not a chance in hell.
@Carl Nichols – Dude!!! It’s Tuesday!!!
Soft tacos wrapped in foil are a perfectly cromulent road-trip food. There are plenty of relatively dry tacos, e.g., fajitas, al pastor, barbacoa, bacon and egg. Just replace the salsa with fresh chopped jalapeños to avoid drippage.
1000% this. Now pull over and eat it outside of the car!
Donut Media did a video on this and the taco consumption wasn’t that messy. Popsicles and bone-in wings seemed to be the worst offenders.
Did they also include different types of doughnuts?
“Holy crap, over half of Super Cruise users didn’t think they needed to “watch what was happening on the road?” ”
Probably because it’s actually a trustworthy system, not a beta test that someone insists isn’t one.
I still wouldn’t, myself personally, but I’d certainly feel lightyears more comfortable with it doing it than Autopilot.
Tesla Autopilot is more advanced than Super Cruise. FSD is the one that is problematic. Autopilot is just traffic-aware cruise control.
Which one would you put your life into?
Which one would you want your family or kids sitting in… by themselves?