The latest edition of AAA’s annual autonomous vehicle sentiment survey is out, and folks, people are not happy about these things. According to the survey, trust in autonomous cars is down, fear is up, and advanced driver assistance systems are still confusing. Time to break down the numbers and discuss what might be at play here.
Let’s start with the headline figure: 68 percent of Americans who responded to the survey say they are afraid of fully autonomous cars, up 13 percentage points year-over-year. Afraid, huh?
In addition, the number of respondents who said they trust autonomous vehicles fell from 15 percent to 9 percent year-over-year. Twenty-three percent of respondents remain unsure, down seven percentage points year-over-year. Add it all up, and the statistics suggest that Americans are cooling to the concept of “self-driving cars,” such as they are.
Their fears aren’t unfounded. The current technology is less than stellar, with several high-profile incidents involving autonomous vehicles occurring over the past year. From repeated instances of Cruise self-driving vehicles blocking roads in San Francisco, to a crash at an unprotected left turn (normally referred to by drivers as a left turn), to Waymo topping NHTSA’s list for autonomous vehicle crashes, it hasn’t been a great year or so for autonomous vehicles.
While the failings of existing technology don’t necessarily foreshadow how future systems will perform, reports of autonomous vehicles malfunctioning may put a damper on how palatable autonomous vehicles are. I can’t blame anyone for reading reports of Cruise cars holding up traffic and deciding autonomous vehicles aren’t for them. I’d be properly annoyed if the path to my destination were completely obstructed by autonomous reworked Chevrolet Bolts.
That’s on the “autonomous test car” side. Things are equally not-great on the consumer tech side.
While it also hasn’t been a great year for misleadingly named Level 2 advanced driver assistance systems, that doesn’t tell the whole story about ADAS. According to the AAA survey, 22 percent of respondents believe that systems like Tesla’s Autopilot, Nissan’s ProPilot Assist, and Volvo’s Pilot Assist can make a car drive itself, when in reality none of them can. The past 12 months have been shaky for some of these driver assistance systems, with Tesla’s FSD Beta under federal investigation by the Department of Justice and subject to a recall, and an NHTSA report finding that both Tesla’s and Honda’s advanced driver assistance systems were active in a considerable number of crashes.
However, fears and skepticism toward autonomy aren’t mirrored by a decreased desire for advanced driver assistance features. Six out of 10 survey respondents reportedly want SAE Level 2 driver assistance in their next vehicle, which seems unusual given that those same respondents are cooling on autonomous vehicles. If the current tech is confusing crap, which most of it is, why desire it?
Well, some of the systems out there, like GM’s Super Cruise, work well in slower highway traffic and feature great eye-tracking and smooth hand-offs that help ensure drivers keep monitoring the road safely. If options are out there to reduce fatigue and improve focus on monotonous freeway stretches, it shouldn’t be surprising that consumers desire them.
While it’s too soon to say for sure if the tide is turning against self-driving vehicles, this year’s AAA report paints a dismaying picture. It’s possible that some people are realizing that the bar for safe autonomous driving is incredibly high and that it will take years of work to reach that goal, so they aren’t willing to trust autonomous vehicles just yet. It’s equally likely that some people just never will trust self-driving cars.
Either way, fear is on the rise.
- Why Do Autonomous Cars Keep Shutting Down On This One San Francisco Street?
- A Cruise Autonomous Vehicle Got Into A Strange Crash Last Month That Resulted In Injuries
- Newly Released Video Of Thanksgiving Day Tesla Full Self-Driving Crash Demonstrates The Fundamental Problem Of Semi-Automated Driving Systems
- Tesla Has The Most Crashes In NHTSA’s First Advanced Driver Assist Study But What That Means Isn’t Clear
- Alleged Whistleblower Levels Safety Claims At GM-Owned Autonomous Vehicle Company Cruise
Our streets and highways aren’t safe and never will be safe. There are far too many variables. So, we each have to accept the risk and be the best operator that we can be. Your precious won’t be safe, ever, so educate and train them.
I am distressed when I hear a parent demanding that the roads be made safe for their kid. It shows just how ignorant they are and how they don’t accept responsibility for helping the kid survive.
I think it’s pretty sad that 68% of the (probably) small group of Americans the AAA sampled get scared so easily…
Were they also asked whether they are afraid of regular cars piloted by their fellow humans?
Computers screw up because they are programmed poorly or have hardware issues. People fuck up for those same reasons AND because it’s fun to do bad things.
Fully autonomous cars are such an inefficient, backwards solution compared to rail/trolleys/buses/taxis. Uber and Lyft are just union-busting taxi companies that somehow got past regulations because money, and they want to not even have to pay drivers.
I’m all for blind spot monitoring/auto braking/adaptive cruise but think that should be about the limit for cars. If you’re not able to drive yourself then call for the community bus.
68% are afraid and 32% are too stupid to live. That’s just an expression, I’m not wishing ill on anyone.
I hate those motherfuckers, I commute to SF and these damn things are all over getting in the way.
This report officially puts us in the “trough of disillusionment” in the hype cycle
My issue would be knowing whether the car knows of my existence or whether the system is even enabled. Like, on the highway, is that Tesla in Autopilot mode, and do I need to keep a closer eye on it since it could decide to do something crazy with a driver who isn’t prepared to take over?
Just wait until you try to turn the steering wheel and the vehicle responds with:
https://media.tenor.com/Iv6oKRuAhVEAAAAd/hal9000-im-sorry-dave.gif
“It’s possible that some people are realizing that the bar for safe autonomous driving is incredibly high”
They could have realized it much sooner if certain automotive industry types weren’t making unfounded and insanely optimistic claims based on vaporware. If the industry had led with caution (yeah, I know) and managed expectations appropriately, there would probably be less backlash against AVs.
Instead consumers were smacked in the face with the huge delta between the alleged positives and the reality-based negatives. If the AV idea simply didn’t work and the cars just put themselves into parking spaces or whatever, that would be one thing, but when they actually cause traffic jams, accidents, and injuries, the promised positive is replaced with the witnessed negative.
A self-driving car is only as good as the code that governs it. In almost 30 years of working in IT, I have *never* seen code that was perfect. Even if you could make defect-free code, it can’t possibly consider the myriad ways imperfect humans around the self-driving car will react – especially the children among the surrounding pedestrians. Am I “afraid” of autonomous cars? No. Do I trust them to be on our streets? Also no.
As someone with a QA background, I *will* find a way to break that perfect code 😉
Why are companies doing this? Why are they developing autonomous cars? Are we consumers asking for this? Is the government incentivizing this? Do they THINK we’re asking for this? Because from every article and comment thread and conversation I see/am a part of, we’re not.
Came here to say the same thing. What’s the point of all of this?
The point as I understand it, is mostly to replace commercial drivers (Uber, taxis, semis) and realize the savings therein. Any benefit to regular drivers is basically incidental.
To make money.
While no one has asked for this, there is that famous Henry Ford quote where he states that if he’d asked people what they wanted, they would have said “faster horses.” Everybody in this space is just trying to “innovate” and create something new that they hope will be the “next big thing.”
Shit, at this point I’m asking for it. Ever since 2016 and especially the pandemic shutdowns, people have been goddamn maniacs on the road. If they’re not distracted by phones or food and weaving all over the road, they’re being hyperaggressive douchecanoes and damn near causing wrecks every 5 minutes. Either we eliminate that problem by eliminating the problems, or we eliminate it by developing and releasing self-driving cars and then make it one fuckton more difficult to get and keep a manual driving license.
I will never trust self-driving cars, since they are mostly designed and programmed by imperfect humans who I don’t trust as drivers either.
I agree, but define “trust”.
I don’t have much trust for other drivers, but I trust them enough to actually drive on the same road as them. I trust an Uber driver enough to climb in his car and take the ride to the airport.
I remain wary of self-driving cars, but I’m not sure I’d say I’m “afraid” of them (to quote the article). I have the same level of apprehension when I’m around them that I do when I’m around a human-operated vehicle, which is to say I’m always on alert that they may do something unpredictable that may put me in danger.
The difference for me is that if there’s a human driver I can yell at them for doing something stupid. What recourse do I have if an autonomous taxi does something obviously dangerous?
On the other hand yelling at people for driving stupid never causes them to take a step back and reevaluate their approach to driving. It might make you feel better in the moment, but it does fuck-all to improve traffic safety. The autonomous taxi, unlike the douchewaffle in the brodozer who just cut you off, will not be susceptible to human foibles such as being the above-mentioned douchewaffle, or getting distracted, or tired, or bored, or psychopathic. If we can get an AI to the same driving *ability* as the average driver, it will vastly outperform that average driver because it will use 100% of its ability every second it’s piloting that vehicle, which is a vast improvement over what the equally-capable human will do.
Well, when the #1 proponent of autonomous cars is a deranged, power-mad Bond villain, it isn’t surprising that people are apprehensive.