Automatic emergency braking is now a common feature on many cars. It’s intended to slow a vehicle when the driver may not be aware of a hazard or is unable to react in time. However, these systems are not infallible, as one recent story out of China demonstrates.
The matter concerns the Li L9, a full-size luxury SUV from Chinese manufacturer Li Auto. Naturally, it’s equipped with all the usual modern safety equipment, including automatic emergency braking. A few months back, the system ended up putting the L9 in the headlines in China—for all the wrong reasons.
As seen in a video shared on Twitter, the L9 was traveling at just under 50 mph on the G70 Expressway when it passed under a billboard advertising vans. In the footage, the vehicle slows abruptly, as if the brakes were slammed on: a dangerous maneuver on a free-flowing highway.
Edge case for vision-based ADAS? A brand-new LI Auto L9 run on ADAS mistakes the vehicle image in the overhead billboard as an actual stationary vehicle in the lane, resulting AEB to kick in, then hit by a vehicle coming from behind. pic.twitter.com/6ZiCbfV0eH
— Ray (@ray4tesla) May 10, 2024
As reported by Pan Daily, the owner claimed this was due to the automatic emergency braking system, which mistook the vans in the advertisement for a real vehicle. Thus, assuming a collision was imminent, it slammed on the brakes. A rear-end collision was reportedly the result, with the owner demanding 20,000 yuan ($2,750 USD) in compensation.
Li Auto eventually confirmed that sequence of events. “The onboard system mistakenly identified an advertisement image as a real vehicle,” a company representative told Pan Daily. The company noted that the AEB software would be improved to avoid this problem in the future.
This incident happened despite the sophistication of the assisted driving system in the Li Auto L9, which features lidar, radar, and ultrasonic sensors in addition to multiple cameras. All of that redundancy wasn’t enough in this case.
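In principle, that mismatch is exactly what sensor fusion should catch: a camera can be fooled by a photo of a van, but a lidar sweep of a flat billboard shouldn’t return an obstacle at bumper height in the travel lane. Here’s a minimal, hypothetical sketch of that kind of cross-sensor gate (the names and thresholds are invented for illustration; this is not Li Auto’s actual code):

```python
# Hypothetical illustration only: NOT Li Auto's actual logic.
# The idea: a camera can classify a painted van as a "vehicle,"
# but lidar should find no physical obstacle at that range in the
# lane, so a fused system would decline to fire AEB.

from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str          # classifier output, e.g. "vehicle"
    est_range_m: float  # monocular range estimate (often noisy)
    confidence: float   # classifier confidence, 0 to 1

def lidar_confirms(lidar_ranges_m: list[float], est_range_m: float,
                   tolerance_m: float = 3.0) -> bool:
    """True if any in-lane lidar return lies near the camera's range estimate."""
    return any(abs(r - est_range_m) <= tolerance_m for r in lidar_ranges_m)

def should_brake(det: CameraDetection, lidar_ranges_m: list[float]) -> bool:
    # Require a confident camera classification AND a physical lidar
    # return at roughly the same range before firing AEB.
    return (det.label == "vehicle"
            and det.confidence > 0.9
            and lidar_confirms(lidar_ranges_m, det.est_range_m))

# Billboard scenario: camera reports a "vehicle" 40 m ahead, but there
# are no in-lane lidar returns at bumper height, so no braking.
billboard = CameraDetection(label="vehicle", est_range_m=40.0, confidence=0.97)
print(should_brake(billboard, lidar_ranges_m=[]))  # False
```

Real systems fuse far more signals than this, of course, but the principle stands: a confident camera label alone shouldn’t be enough to slam the brakes.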
It’s an amusing problem that thankfully did not have a major negative outcome. Nobody was seriously injured in the crash. However, it’s easy to understand how frustrating this would be for the driver. It’s one thing for a vehicle to crash while under fully autonomous control. It’s another thing entirely for the computer to step in while you’re doing a fine job of driving, only to help precipitate an accident on its own.
It’s not the first time Li Auto’s automatic systems have come under scrutiny. In 2023, Car News China reported on a driver who said their car spotted “ghosts” as it drove past a cemetery. As they drove through the area, the car’s camera appeared to be detecting humans and cyclists when there was nobody around. “This is not a supernatural event, but is caused by the limitations of sensor recognition capabilities on the market,” responded the automaker, noting that it was merely a sensor error.
A LiAuto PHEV owners club decided to attempt the world record for the “longest consecutive rear-end collision.”
It appears to have been a great success. pic.twitter.com/I2yscL8i3F
— China EV, Engineering & Food (@ChinaEV_Eng_Lif) June 17, 2024
Thanks in part to that earlier story, Li Auto has more recently made the news for another rear-end incident. This past weekend, a string of Li Auto vehicles was involved in a chain rear-end crash on a highway in Shantou, Guangdong, during an outing of the local Li Auto car club.
In this case, AEB was involved but is not believed to be to blame. The incident was apparently caused when the lead vehicle in the pack stopped for a red light. A following vehicle was traveling too fast for the wet, slippery conditions and was unable to stop in time. Automatic emergency braking engaged but could not prevent the crash.
When it comes to the billboard incident, though, it’s hard to say what drivers can do to avoid getting caught out. It’s a rare failure, to be sure, but also one few drivers would ever think to anticipate. You could drive around with AEB switched off all the time, but then you’d lose the benefit of the system. Leave it on, though, and you accept the risk of coming to grief because of an automatic safety intervention, crashing through no fault of your own.
Ultimately, the solution is better engineering to handle uncommon but foreseeable cases like these. By and large, automakers have gotten automatic safety systems right. But the instances where those systems misinterpret their surroundings will, and should, stand out as an unacceptable danger to the driving public.
Image credits: via Twitter screenshot, Li Auto
We have a ’17 CX-3, love it, but one quirk is that it’s afraid of its own shadow. Literally, when the angle of the sun is just right, the blind spot monitoring reads the shadow on the road as another vehicle.
I can’t believe Jason allowed this header graphic.
And yet it still made me smile and chuckle.
No need to go running around yelling “No wire hangers!” or “Laces out, Dan!”
LOL
With the headlight design on that vehicle it would have looked so strange… but you’re right, the windshield is still wrong. 🙂
True
My ’18 Accord is frightened of semis, vans, cars, and large snowbanks. When it gets close to any of these it engages AEB. It’s selective and only fearful of any of these in white. Any other color in the spectrum it’s fine with.
My ’19 Mercedes is afraid of walls on the outside of turns. Luckily it just beeps at me. Still annoying though.
Occasionally the Honda does that when it gets close to construction barriers.
finally we have found systemic bias against whites in the US. all the hysterical jorts and new balance guys who keep telling me it’s there have been vindicated.
/s if not obvious
Hah, yes, I was a little concerned about the car being perceived as biased when I wrote about it, but the Honda is white too.
Something similar happened in my Tesla a short while ago – I don’t use FSD (It’s crap), nor do I use Autopilot. But “autopilot stuff” is still active when you’re just using the adaptive cruise control – which I used to use until this:
I was in the collector lane of the 401 in Toronto, zipping along, and the car saw a lane ending sign intended for the express lanes. The car went from driving normally to full regen. With drivers doing 130km/h behind me. It was almost as bad as when cruise turned itself off when the side cameras became blocked due to some sleet-y rain. Same max regen moment, on the 401, only with lower visibility.
So I don’t trust the cruise control anymore. I have no idea how anyone trusts Autopilot or FSD. I just assume they have a death wish.
The 401 is insane at the best of times. There is just so much going on that a computer can’t handle it. The collectors/express system is confusing enough for a human. I remember driving with my TomTom GPS about 20 years ago and it kept telling me to get into the collectors, then back to express, then to collectors, then to express. There was no reason it should have suggested the collectors as I was going from Guelph to Kingston.
The only time I had an enjoyable drive on the 401 was May 2020 when it was empty.
Huh. Now I have to ask about AEB for other brands. With my Hyundai, I can override the AEB immediately by quickly pushing the throttle down hard and then lifting the throttle.
I have had a handful of AEB misfires where the car started slamming the brakes and that simple throttle tap immediately disabled the AEB. It is pretty simple and super easy to manage. And if I am stupid enough to override the AEB and it was really needed I only have myself to blame.
I actually use this “override” most often when the car properly engages AEB but wants to fully stop the car when the situation has resolved itself. Think a slower-moving car changing into your lane, triggering the AEB to go full cray-cray emergency braking when in reality it only needs to slow down quickly until the speed differential is eliminated.
Are other manufacturers’ AEB systems that hard to override?
Tesla you can do the same. It’s rather startling to have the car suddenly try to slow down and have to override it. And on a busy highway? That alone is enough for an accident.
I just had my car do this the other day for seemingly no reason at all that I could figure out. It was on a corner, and there were cars parked on the side of the road, but I drive around corners with cars parked on the road all the time.
It slammed on the brakes at about 45mph, so I slammed on the gas to override it, and it shut up, but it scared the crap out of me.
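If you’re curious what that kickdown override might look like in software, here’s a rough, purely illustrative sketch (not any manufacturer’s actual code; the threshold and names are invented):

```python
# Purely illustrative: not any manufacturer's actual code. The 80%
# kickdown threshold and all names here are invented for the example.

class AEBController:
    KICKDOWN_THRESHOLD = 0.8  # fraction of full accelerator travel

    def __init__(self):
        self.braking_active = False
        self.driver_override = False

    def update(self, collision_predicted: bool, throttle_position: float) -> bool:
        # A hard stab of the accelerator while AEB is braking signals
        # "driver knows better": suppress braking for this event.
        if self.braking_active and throttle_position >= self.KICKDOWN_THRESHOLD:
            self.driver_override = True

        if collision_predicted and not self.driver_override:
            self.braking_active = True
        else:
            self.braking_active = False
            if not collision_predicted:
                self.driver_override = False  # reset once the event clears

        return self.braking_active

# False positive: AEB fires, the driver floors it, braking stops.
ctrl = AEBController()
print(ctrl.update(collision_predicted=True, throttle_position=0.1))   # True
print(ctrl.update(collision_predicted=True, throttle_position=0.95))  # False
```

The point is just that the gate is simple and deliberate: a hard pedal input unambiguously outranks the computer’s guess.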
Not to keep flogging a dead horse, but this is the inevitable outcome of asking hundreds of people to contribute thought-work to incredibly complex projects with life-or-death implications if the management is going to be stupid and greedy. It doesn’t matter whether it’s in China or the good ole US of A, it’s the same problem everywhere.
People are always going to take the actions that lead to keeping their jobs, and it is crystal-clear that in a lot of these workplaces, that means, “Turn in a module that meets the specified technical requirements well enough,” not, “Contribute to the overall safety and performance of the vehicle.”
I would bet my bottom dollar that there is no one single person with sufficient education, resources, and authority to review the sum total of driving assistance mechanisms in a vehicle and consider their net effect on safety. They’re added because a competitor has them and considered to be a “feature” like power windows. Except they might kill you. Truly, software engineering is less giants standing on the shoulders of giants, than idiots standing on the toes of idiots.
Get off my lawn, etc.
Nice to know that AEB systems have evolved to the point of interpreting pictures in addition to actual vehicles.
Back around 2015 I had a Chevy SUV that would hit the brakes just about every time it encountered a sweeping right-hander with a car stopped in a left-turn lane. Some days it got interesting, with people behind me going 55 mph and me fighting to get the MFr to not drop to 25 mph.
There is a bridge that I cross once a week, and at its midpoint my vehicle wants to slam on the brakes EVERY TIME I cross that point. The first time, when I’d just gotten the car, I had someone not far behind me and it was raining, and it freaked me TF out. In my immediate panic I had to fight against the brakes with the accelerator in order to not be rear-ended. There was nothing ahead of me for a hundred yards or more, and it full-slammed the brakes at 45 mph.
Because of this I’ve had to turn auto braking off, even if it might save my life in some other situation. I have it set to “alert only,” so now every time I cross that bridge I have an involuntary mental and physical bracing response ramping up as I approach the center, waiting for the screaming beeps.
I would much rather have alert only as the default setting. I wonder why your car hates that bridge.
The one ironclad law in the universe is Murphy’s Law and its corollaries.
I had thought that lidar, seeing in 3D, would prevent this from happening, as the billboard image clearly wasn’t 3D.
So what happens now if some random coyote paints a realistic-looking tunnel on the side of a giant rock in the middle of the road?
Are you a Road Runner, a Coyote or a Freight Train?
You have to remember the principles of the Cartoon Laws of Physics. In this case rule #7 applies. (My personal favorite is rule #9)
Ah, yes, and all American market cars must have AEB that works at highway speeds by 2027 or something.
“Works” is a rather ambiguous term in this case.
I have a great solution for this! Ban advertising billboards from areas near roads!
This!! Billboards are super distracting, not that anyone can see them anyways with their eyes on their phones.
Wacky errors are pretty common on ADAS systems, though this is an extreme case. My Accord will regularly shit its pants when passing a semi on a sunny day. Loads of laughs when it happens on I-94 at rush hour.
Is it bad that my biggest takeaway from this article is that modern front-end designs have actually nullified (I think it was) Jason’s argument against the Cars universe character design? The one in the main image, along with a lot of newer Hyundai hybrids/EVs, would look like Geordi La Forge if you applied the “correct” character design and used the headlights as the eyes…
Torch’s headlights-are-the-eyes assertion only works on half of car “faces” at best. Usually small roadster-type cars.
A whole lot of cars have headlights in line with the grille. Who has a mouth between their eyes?
Outside of a Guillermo del Toro vision, I can’t think of anything… (yes, it’s an odd reference)
The grille doesn’t have to be the mouth. It could be a nose or nose-like feature. Any horizontal feature, hole, bumper, or air duct toward the bottom of the fascia can be the mouth.
“Smart,” yet dumb.
What struck me was the compensation requested. Add four zeros to that figure if it happens in M’urica.
And then subtract it all when the company somehow shifts liability onto the consumer for utilizing the feature. (Or, best case, the company offers a settlement that also includes an NDA so the owner can’t talk about the issue.)
Clearly Jason didn’t review that topshot.
https://jalopnik.com/how-pixar-screwed-up-cartoon-cars-for-a-generation-of-k-5870976
Although this presents a possible edge case for that argument, considering where the headlights actually are on that car… and a bunch of other cars nowadays.
I was literally typing my comment when you posted!
Although ADAS equipment needs to be positioned a certain way to function well with its FOV, wow that fake roof scoop lidar/sensor housing looks like such an afterthought. Reminds me of the 10th gen Civic surround view camera placement on the mirror.
I hope the same company doesn’t build the sensors and alert systems in China’s nuclear defense systems.
Or ours. Or Russia’s
Unfortunately, without a Stanislav Petrov on duty, a false-alarm incident might go less like what actually happened and more like what we feared might happen.
https://en.m.wikipedia.org/wiki/1983_Soviet_nuclear_false_alarm_incident
I don’t get it. If the radar/lidar/sonar/whatever doesn’t conclusively determine that there is an object in front of the car, how can it justify slamming on the brakes when it apparently also doesn’t check whether there’s anything behind the car?! Like, I’m sorry, but if there’s a Freightliner behind me and we’re going downhill, I don’t want the AEB to trigger because Mr. Squirrel darted across the road. How is the opinion of the most fallible component (the machine-learning “vision” thing) sufficient to cause such a dangerous maneuver?
There must be a lot of Boeing engineers shopping their resumes around.
The problem with edge cases is that there are always going to be a lot of them. The world is messy and computers are pretty terrible at contextualizing new or unexpected things.
You just need to press the accelerator to override AEB; this is not rocket science. They all work like that and have driver overrides, because they are all kinda flawed.
Kinda like pressing the brake pedal overrides unintended acceleration.
I don’t have a car with AEB and did not know this was the case. I would argue that we should just get rid of all these driver aids that don’t work right all the time. Having a car randomly slam on the brakes is crazy stupid. If it does it rarely enough that it’s considered to be acceptable, it does it rarely enough that you have to remember what to do to override it before you can take action. If it does it often enough that you have muscle memory to override it, it’s worthless.
Pressing the accelerator seems like a pretty natural response to me…
I think it depends on how hard you have to press the accelerator. To the floor isn’t something that’s part of “ordinary” driving (at least not for me). Either way it still takes measurable time to realize what’s happening and react. Until they can work out substantially all of the possible errors, I don’t think these “driver aids” should be on cars.
Reminder: the dismissability of an edge case is inverse to the potential hazards incurred should that edge case be encountered.
I’m confident any extant self-driving system would fall apart under a rigorous FMEA.
Remember, Murphy was an engineer…