The Insurance Institute for Highway Safety just released the results of two month-long studies it undertook to determine how drivers interact with semi-automated driving systems, specifically Volvo’s Pilot Assist and Tesla’s Autopilot. Both are Level 2 semi-automated driver-assist systems, which means they require constant human attention even as the automated system handles a significant portion of the driving task. Unsurprisingly for anyone who has ever met a human being, the studies found that drivers who use these systems are far more likely to focus on things other than driving. Because no shit. Of course they are.
For each study, a group of drivers (14 for the Tesla study, 29 for the Volvo one) was given access to cars with semi-automated driving systems; these drivers had no previous experience with such systems, and they used the driver-assist features whenever possible for a period of a month. The big takeaway: both systems have a number of safeguards in place to keep the driver’s attention on the road, and what people proved especially good at was learning exactly how those attention systems nagged, and how to get around the safeguards so they could focus on non-driving tasks anyway.
You know, like people do.
Here’s what the results of the Volvo study had to say:
Results: Participants exhibited higher odds of visual-manual distraction or driving hands-free in period 2 when using Pilot Assist relative to manual driving, but patterns differed noticeably across the groups. Pilot Assist use among groups A and B was associated with higher odds in the second period relative to the first, whereas group C exhibited a high level of visual-manual distraction and hands-free driving when using Pilot Assist throughout the 4 weeks of data collection.
Discussion: Our findings suggest that drivers show reliably less attention to the road with the versions of partial automation we tested compared with manual driving that develops with time or is evident relatively early during drivers’ initial trips with the automation. The results support arguments for driver-monitoring solutions that ensure adequate attention to the road. Differences among the samples in patterns of behavior change highlight the need to study factors that may further modify how drivers adapt their behavior when using partial automation such as driving exposure, willingness to use automation, and iterations of partial automation that differ in functional performance.
… and here are the Tesla results:
Results: We found that drivers learn to internalize safeguard sequences and discover windows of opportunity to do non-driving-related activities. People learned to respond quicker to alerts, leading to fewer escalated sequences in the latter half of the study. However, drivers also spent more time engaging in non-driving-related activities and glancing off-road, which corresponded with more initial alerts of the attention reminder sequence as time went on. Prolonged disengagement culminated in 16 lockouts across the sample, although in general, drivers responded faster and had fewer lockouts over time.
Conclusion: Our findings demonstrate the human ability to learn system constraints and thus illustrate that it is possible to shape safer driving behavior with robust safeguards. User-centric design considerations for driver support strategies are presented in this paper.
In some ways, the conclusions here make me perversely proud to be a human being. You give us rules and constraints and tell us there’s something we can’t do, and you can bet your ass we’ll figure out some way around it. Of course, we often do this even when it directly endangers our own well-being, as it definitely does here, where the thing being weaseled out of is paying attention while a semi-automated system pilots your car along at a mile a minute.
The humans behind the wheels of both sets of cars found ways to engage in distracted, non-driving behaviors. The Volvo study found that drivers were distracted 30% of the time, which is alarmingly high, and in the Tesla study, almost 4,000 driver-attention warnings were issued over roughly 12,000 total miles, which works out to about one warning every three miles and is equally alarming.
All of this just reinforces a belief I’ve held for a long time: Level 2 semi-automated driving systems – pretty much the only type available on the market today (unless you count the few Level 3 systems, which are even worse) – are inherently flawed, not for technological reasons, but because of how humans work.
If a human is in a system where automated machinery is doing most of the work and their job is to monitor things, they will get distracted. This has been well understood since Mackworth’s famous 1948 study “The Breakdown of Vigilance during Prolonged Visual Search,” which demonstrated that intense, prolonged vigilance over a mostly independently operating system is simply not something people are good at. Monitoring a car that’s mostly driving itself is exactly that kind of task, and the danger is that semi-automated driving systems are by no means perfect. Even though they’re very impressive and getting better all the time, they still make mistakes, which is why we require a human to pay attention in the first place.
Sometimes it’s not entirely the fault of the distracted drivers, though; a shocking number of users of automated driving systems think their cars are far more capable of driving on their own than they really are.
Both studies found that drivers will learn to do whatever a semi-automated driving system requires of them to “prove” they’re paying attention, whether that’s nudging the steering wheel or keeping their eyes pointed in a certain direction. But they learn what the system wants only so it will let them keep engaging in non-driving, distracted activities, which the studies note included grooming, eating, and, of course, looking at phones.
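To picture how that gaming works, here’s a toy model of the kind of escalation sequence the Tesla study describes. The stage names, count, and behavior are invented for illustration; the real safeguards are more involved, but the learned response is the same:

```python
# Toy model of a driver-attention safeguard: alerts escalate when the
# driver doesn't respond, and ignoring everything ends in a lockout.
# Stage names and escalation order are invented for illustration.
STAGES = ["visual reminder", "audible alert", "forced slowdown"]

def run_sequence(responds_at):
    """responds_at: the escalation stage at which the driver finally
    responds; None means they ignore every stage."""
    for i, stage in enumerate(STAGES):
        print(f"escalation {i}: {stage}")
        if responds_at is not None and responds_at <= i:
            print("driver responded; sequence resets")
            return "reset"
    print("lockout: feature disabled for the rest of the drive")
    return "lockout"

# A driver who has learned the system answers the very first nag,
# resets the sequence, and gets a fresh window for their phone.
run_sequence(responds_at=0)
```

Once a driver knows the first nag always comes before anything punitive, the obvious move is to answer that nag promptly and then go right back to not driving, which is exactly the “fewer escalated sequences, more initial alerts” pattern the study found.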
This could mean that more robust driver-monitoring systems are needed, though my money will always be on humans figuring out ways around those, too. Fundamentally, I think Level 2 systems are just incompatible with how human beings actually, realistically operate, and the effort would be better spent developing robust Level 4 systems, geofenced to the areas where they’re needed most.
We’ve basically heard all of this before, and still nothing has really been done. I wonder how many more similar IIHS studies we’ll see with essentially the same conclusions before anyone decides any of this matters.
It’s not you, tech. It’s us. We’re the problem.
Anybody who has used driver-assistance systems and is at all conscious of how differently they drive while those systems are active already knows this. Sadly, that describes a very small segment of the population.
Even lane keeping (for the couple of weeks I left it on in my new truck, before getting terminally annoyed with it) makes me more inclined to distraction. There’s definitely an element of “Oh, I can look away for a little while because the truck will keep me on the road” that sets in terrifyingly quick. And I’m someone who is aware of that risk, yet it still happened to me a couple of times. It’s no wonder non-car people with more advanced assistance systems fall prey to that on a regular basis.
Speaking of Level 4 automation though, the “may or may not have pedals and steering wheel” bit should be struck from that automation-levels graphic, imo.
Every vehicle on public roads should be capable of being driven out of the way by a human if it breaks down or there is an emergency.
It’s also how the tech is named, marketed, and explained to the user. I’ve learned one immutable truth: people will find the least-effort way to accomplish any task. Most of the time that’s OK; the consequences aren’t often life-threatening. But when you name or brand a product Autopilot, it’s not unreasonable for buyers/drivers to assume it’s just that. That very same term has been used for decades in aviation, and we all associate it with a plane flying itself. I’d be willing to bet only licensed pilots knew its limitations for most of that time.
When I was a Linux sysadmin, if we had a task that was mundane, time-consuming, and repetitive, we automated it with scripts as much as possible, like the toy example below. If you give drivers a system that “automates” driving, they will put forth the minimum amount of manual effort to accomplish that task. The problem here is the potential consequences of a disengaged driver, combined with the true limits of the technology, which are more often than not poorly understood by the consumer.
Unless and until truly automated cars are available and reliable, this will continue to be an issue.
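To give a flavor of what I mean by automating the mundane stuff, here’s a toy version of the kind of script we’d write; the mount points and threshold are made up for illustration:

```python
#!/usr/bin/env python3
# Toy sysadmin automation: check disk usage and complain only when a
# threshold is crossed, so nobody has to stare at df output all day.
# The mount points and threshold here are invented for illustration.
import shutil

THRESHOLD = 0.90  # alert when a filesystem is 90% full

def check_disk(path):
    usage = shutil.disk_usage(path)
    fraction_used = usage.used / usage.total
    if fraction_used >= THRESHOLD:
        print(f"WARNING: {path} is {fraction_used:.0%} full")

for mount in ("/", "/var", "/home"):
    try:
        check_disk(mount)
    except FileNotFoundError:
        pass  # that mount point doesn't exist on this box
```

The whole point is that the human does the bare minimum and trusts the automation, which is fine for disk monitoring and very much not fine at 70 mph.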
Idiots will always outwit the idiotproofers.
“Never argue with an idiot. They will drag you down to their level and beat you with experience.” Also, what should society do with them? They are everywhere. It’s like a Venn diagram of Idiocracy and WALL-E, and humanity is at the center.
The question there is, who is the bigger idiot? The idiot that outsmarted the idiotproofers’ system? Or the idiotproofer that designed a system outwitted by idiots?
“People Don’t Pay Attention When Using Autopilot Or Other Driver Assistance Systems”
This should come as no surprise to anyone who read Jason’s book.
Often the folks who think the automated systems are more capable than they really are have been drinking the Musk Kool-Aid. The man has made his living overselling “self-driving” systems.
Another take from Ars Technica: https://arstechnica.com/cars/2024/09/tesla-autopilot-and-other-assists-increase-distracted-driving-study-finds/
Dr. Gitlin is a thoughtful car writer and Ars’ tech angle is a bit outside the mainstream.
I increasingly find people tailgating in France — it used to be really bad, then got much better, but in the last two years or so it’s been getting worse again.
And judging from the cars doing it, it is because their “adaptive cruise control” is on and not very good.
Machine or not, there is no way at dual-carriageway speeds that a vehicle of 1.5 tonnes or more will stop in an emergency in 10 metres.
Time to do the bag of crisps out of the sunroof trick again…
Hopefully one will smear on the Lidar sensors or whatever they use.
Not the bag though, that’s littering. Just a handful of crisps. You can say you were feeding the birds.
And these people knew they were being watched. We all know those who aren’t are orders of magnitude worse. Hell, I’m bad about it with my car’s traffic jam assist.
What’s this I see with the little robots driving cars? Why, it’s a photo-illustration! In the Autopian?? Keep ’em coming!
Two thoughts:
First, what happens as these systems age and fail? Hopefully they’re robustly designed to be 100% fail-safe: if any fault exists, the ADAS completely and overtly disengages all control inputs until the fault is confirmed to be corrected (a toy sketch of what I mean is below). Does the car brick in the meantime?
Second, who is liable for accident damage involving ADAS? Ultimately, the insurance industry and the courts will decide liability, but will OEMs be completely risk-free?
Exciting theatre, just a bummer that the experiment is running on public roadways. I never agreed to participate in OEMs’ experiments.
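To spell out that first point, here’s roughly the fail-safe behavior I’d want, as a hypothetical sketch (the AdasController class, states, and fault messages are all invented; this is not any OEM’s actual logic):

```python
from enum import Enum, auto

class AdasState(Enum):
    ACTIVE = auto()
    DISENGAGED = auto()  # overtly off; the human drives

class AdasController:
    """Hypothetical fail-safe gate: any detected fault forces a full,
    overt disengagement, and the system stays off until the fault is
    confirmed corrected. The car itself stays drivable throughout."""

    def __init__(self):
        self.state = AdasState.ACTIVE

    def report_fault(self, description):
        # Fail safe, not fail silent: drop all automated control inputs
        # and tell the driver loudly, rather than limping along.
        self.state = AdasState.DISENGAGED
        print(f"ADAS disengaged ({description}): take full manual control")

    def confirm_fault_corrected(self):
        # Re-enable only after an explicit, confirmed repair.
        self.state = AdasState.ACTIVE

controller = AdasController()
controller.report_fault("camera self-test failed")
assert controller.state is AdasState.DISENGAGED
```

The key property is that a fault can only ever push the system toward overt disengagement, never leave it silently half-working, and the car remains manually drivable the whole time.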
My hot take – driving is mostly a habitual activity. I don’t think most of us are actively focusing on what we are doing when we are driving.
How many times have you found yourself driving someplace you frequently go, like work, by accident because you were talking to someone and got distracted? But you were still probably driving safely and following the rules of the road because that’s your habit.
If a system has the potential to help a driver develop bad driving habits, then it’s going to happen for some drivers, and that means more driver self-vigilance is needed. Personally, I’d like to see more systems that help drivers drive better, to offset the semi-autonomous, only-works-in-certain-situations systems.
Maybe once a year and it terrifies the shit out of me every time. I always feel like crap that I let myself get distracted like that. But… I’m pretty critical of myself. Like, to an unhealthy degree, probably. I might be an outlier and should not be counted.
My wife and I used to call it being on autopilot. Your brain and body follow the routine as your conscious thoughts are elsewhere. The ability to do it is both terrifying and amazing.
Sometimes I would say to myself, I don’t recall the last 7 miles I drove, but somehow I got here.
The automotive industry is continuing to design solutions to problems it has created itself.
In its drive to “relieve” the driver of heavy burdens such as pressing an accelerator pedal and manoeuvring a steering wheel, it adds technology. And guess what? The driver ends up out of the loop!
Then, let’s solve that by adding a driver-monitoring system to make sure the driver is paying full attention, even though removing that full attention was the whole point of the driver-assistance features…