I’ll be honest with you: I don’t want our site to become a place that just posts about every Tesla Cybertruck wreck, just because (and David’s review of the truck was actually fairly positive; it’s an impressive machine) — even the wrecks that may be at least in part caused by Tesla’s Level 2 semi-automated driving software, still confusingly-named Full Self Driving (FSD). But sometimes there’s just a perfect storm of irony and eye-rollery and embarrassing fanboyism and powerful, important lessons about the sometimes serious limitations of Level 2 driving systems. This is one of those times, thanks to a tweet that is going viral which hits all of these key points, and has a dramatic picture as well. It’s really the total package. We have to talk about it, at least a little.
I should mention that while the Cybertruck here looks pretty badly messed up and is likely totaled, nobody was actually hurt, so that’s good. Modern cars are very good about not killing the beings inside them in wrecks, and we’re seeing a great example of that here.
In fact, that’s one of the key points of the tweet, which has been seen by 2.4 million people:
Here’s a bigger screenshot of the text:
Complimenting The Car That Helped Cause The Crash To Keep The ‘Haters’ Away
What many seem to find amazing, and the reason the tweet is going viral, is that the poster seems worried about what the dashcam footage of his wreck will do for the “bears/haters” after his car crashed itself into a pole.
There’s an awful lot here to unpack. Of course, the first thing that jumps out to most people is how the Cybertruck owner thanks Tesla for “engineering the best passive safety in the world.” I don’t know if that’s actually true, though the Cybertruck does have some impressive passive safety features, and like many modern vehicles, it is generally safe in a wreck, and it is genuinely amazing that the owner here came out without a scratch. And then in the next sentence the owner states that the Cybertruck “failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down” which resulted in the Cybertruck hopping a curb and smacking right into that large streetlight pole, as you can see in the photograph included with the tweet.
Notice I said “owner” there instead of “driver,” because that leads to the next, far more important point. There’s the irony, of course, of thanking Tesla for making a vehicle that’s so safe in a wreck, and then going on to mention that the very same vehicle caused the wreck. It’s very good at dealing with the problem it created!
It’s Proof That These Level 2 Systems Are Problematic
Now, where this gets more complex, and leads to the owner/driver distinction I mentioned before, is that exactly who or what was in control of that 6,000+ pounds of Cybertruck is not clear. The tweet definitely makes it seem like Tesla’s FSD software (noted in the tweet to be version 13.2.4, which is less than a month old) was in full control of the car and made the decisions that led to crashing into that light pole. And the fact that the human inside did nothing to stop the situation, even as the car headed towards the pole, pretty clearly shows they were not in control or paying attention.
This, of course, is precisely how one is not supposed to use a Level 2 system, which requires constant driver vigilance, and is also the fundamental reason why all semi-automated driving systems are doomed: because people are terrible at using them. And, even worse and more paradoxically, the better a job the Level 2 system does, the worse it is for people to actually use and the more likely it is to be abused, as we see in this situation.
By the way, if you don’t remember what those automated driving levels are, here’s the chart. Remember, the levels don’t indicate how good or advanced the system is, but rather dictate the relationship between the human and the machine when it comes to how the driving responsibility is shared:
All Tesla FSD systems on the road today are Level 2 systems, meaning the human is always responsible, no matter how well the system works or how long it’s been since your last intervention. Under FSD, you have to be ready to take over at a moment’s notice, unless you’re cool with, say, smashing your $100,000 stainless steel truck into a huge post.
Other tweets by the owner seem to suggest a pattern of taking FSD at its name and just treating it like an actual Level 4 or 5 self-driving system – which, again, it isn’t – as you can infer here:
Sometimes I decide to go somewhere and turn on @Tesla FSD and then I forget where I decided to go and then it starts turning into Taco Bell or whatever and I'm like wtf is it doing and then I'm like oh right Taco Bell
thank you for coming to my TED talk
— Jonathan Challinger (@MrChallinger) January 6, 2025
So, the Twitter-poster here gets in his Cybertruck and turns on FSD and is so distracted he doesn’t even remember where he asked it to go? Even when there’s Taco Bell at the end? Bizarre.
This is, of course, precisely the worst way to use any sort of Level 2 driving-assist system and remains the biggest issue with such systems. People are genuinely confused about just how much a driver-assist system does, and even when they’re not confused, people still get incredibly distracted when using driver-assist systems that they’re supposed to be monitoring, because of course they do.
We’ve known this would be a problem since 1948, when Mackworth’s famous study “The Breakdown of Vigilance during Prolonged Visual Search” demonstrated that maintaining intense, prolonged vigilance over a mostly independently-operating system is simply not something people are good at. At all. Tesla’s FSD not only calls itself a “Full Self-Driving” system, which sets up unrealistic expectations, but it’s also generally quite good at what it does, which means it is very easy for people to end up in the same situation as Mr. Pole-Pegged Cybertruck over here: too comfortable and then ultimately fucked.
Because no matter how good FSD may be most of the time, it’s not actually a Level 4 or 5 system, and even if it’s been doing fantastic for months, you have no idea when it may suddenly decide to not realize a lane is ending and crash into a pole. I know it’s tempting for ardent defenders of FSD or other automated driving systems to blame these sorts of wrecks on lighting or weather or confusing environmental setups, calling them “edge cases” and suggesting that these circumstances are uncommon, and not worth making a big fuss about.
But here’s the thing about “edge cases”: the world is full of edge cases. That’s what reality is! Messy, chaotic edge cases, with some veneer of order slapped down over everything. Look at where this wreck happened, for example, on that map of a section of Reno, Nevada there. That’s not a particularly challenging area, really. It looks like it was dark, but we’re not talking about some labyrinthine narrow, crowded streets in an ancient European city or anything like that: these are big, well-lit roads that aren’t even particularly dense, traffic-wise. I’m sure the FSD instance in that very same Cybertruck has navigated far more complex situations than this (FSD is a thoroughly impressive system). But, this time, for whatever arcane reason made up of ones and zeros, it got confused and drove right smack into a pole, like an idiot.
The owner should have been able to see this coming with plenty of time to react. If I understand what happened correctly, then the wreck is his fault, no question. He was using this software in a stupid way, and thankfully that stupidity didn’t hurt anyone else. But while the owner is likely entirely to blame, at the same time, that blame is shared with the very concept of a Level 2 driving system itself, which has been shown time and time again to foster this sort of dangerous inattention.
Yes, it appears our Cybertruck-lover wasn’t being particularly smart here, but he was guided to that idiocy by a technology that is technically very advanced, but practically crude and foolish. You can’t ask a machine to take over 90% or more of a task and then expect a person to remain alert and vigilant, at least not all the time. This wreck seems stupid on pretty much every level possible, the driver’s stupidity and the inherent stupidity of Level 2 semi-autonomy working hand-in-hand to make sure this person no longer has a working car.
We’ve reported on these sorts of wrecks before, we’ve told you about the studies and findings and research, and we will continue to do so, because wrecks like this show that the message is still not being heard. FSD, no matter what version number, no matter how few interventions you saw on some YouTube video of a drive, does not drive a car by itself. There still must be a human there, ready and able and willing to take over when it screws up. Which it may never do. Or it may do in the next minute. The point is, you just don’t know.
It’s sort of ridiculous that we’re still letting all this happen without really trying to find better solutions for the goal of self-driving. Sure, L2 is a great way to get self-driving data, but the way these systems are set up is kind of backwards. When the concept is that the machine is in more control and the human is watching or observing or monitoring, the tendency will always be less vigilance on the human’s part. If this were reversed, and the human was always in control with the software ready to assist in emergencies or to compensate for driver failure, you could get all the data without introducing these vigilance-induced problems.
Of course, that’s a lot less sexy, and people want to not pay attention if they don’t have to. So here we are.
New IIHS Study Confirms What We Suspected About Tesla’s Autopilot And Other Level 2 Driver Assist Systems: People Are Dangerously Confused
This New Study Is Maddening Because We All Know This: People Don’t Pay Attention When Using Autopilot Or Other Driver Assistance Systems
These Tesla FSD Hands-Free Mod Devices Are Such A Bad Idea And Promoting Them Is Dumb
(Topshot: X, Jonathan Challinger)
I purposely bought a version of my current car that doesn’t have lane keep assist or auto braking of any kind. My last car (Corolla Hatch) was a stick, and the system didn’t like the fact that my foot wasn’t on the brake when it thought it should be. It didn’t understand gearing down, rev matching or anything else. On an auto trans car they may be fine, but in a stick, no thanks.
“….I’m hesitant because I don’t want the attention and I don’t want to give the bears/haters any material.”
Soooooo, I’ll just post this where everyone can see it along with all the other things about my life I share.
Narrator:
He 100% wanted the attention.
More fuel to my argument from yesterday.
If 100 Teslas “self drove” over the edge of a collapsed bridge Elon’s fanbois would praise the vehicles for how quickly they maneuvered past all of the barriers.
So I preach at work about utilizing the Hierarchy of Controls, which is a framework to mitigate risk. The higher the level, the better managed the risk. From highest to lowest they are:
Elimination
Substitution
Engineering
Admin/Training
Personal Protective Equipment
Why do I mention this? Because Tesla failed, and continues to fail, the top 4 almost completely, with the exception of engineering. It is their only solution: engineer safer algorithms based on an involuntary public beta test, while ignoring where they could eliminate risk or substitute different solutions (like maybe lidar?). Meanwhile, an untrained “controller,” not driver, turns around and praises Tesla for the last, and absolute minimum acceptable, control: PPE.
Well done all around. Insert 3rd place podium celebration meme here.
I’d argue that their testing isn’t Engineering level, but Admin/training. They’re training a machine instead of a person, but they’re not engineering any new tech in to make FSD work. They’re just trying to digitally legislate around the tech they’ve already engineered.
We already know they use “Substitution” a few seconds before accidents, in an attempt to “Eliminate” liability.
If only they had lubed their lamp poles, the car would just have slipped past!
“I sat on my ass staring at a giant pole as my car ran straight into it at high speed.”
People have trouble fully paying attention to driving when they are actively driving. “Full Self Driving” seems designed to highlight the risk of semi-autonomous driving over and over, like it’s the primary thing it was designed to do.
1000% this dude was face down staring at his phone or drunk or asleep. There is no way on earth someone paying even 10% of their attention to the road could miss something as obvious as his car barreling into a damn pole at high speed.
Wow. This one hit my brain hard for some reason. I’m already having trouble processing the fact that I have to share the road with people who have no idea why they’re in the left lane.
Do you mean to tell me I’m already sharing the road with people who have no idea why they got in their car in the first place?
Not to defend them, but I think that happens to all of us… I am sure everyone has set out to go get groceries and then driven their car halfway to work instead.
Hell, I once set out to go somewhere 10 minutes from my house, got into a conversation with my girlfriend, and didn’t realize until we’d been driving for an hour that I’d gotten on the highway going north instead of south because I was auto-piloting to my old apartment.
It’s bad enough that we have people in the left seat with no business being in the left seat.
Proof that meritocracy is dead in America. How is this guy able to make the coin for the truck? Because, well, bless his heart.
Agreed, but was it ever alive?
Touche. Well let me rephrase: “the illusion of meritocracy is dead in America”. Better?
84 month financing?
FSD is fairly impressive, all the way until it isn’t. I’ve driven my Tesla a few thousand miles on v13 thanks to a subscription for a long road trip and then free trial after canceling the subscription. This is exactly the sort of scenario where FSD will mess up! Wet roads at night with ambiguous lane markings are a difficulty for it. The AI will “learn” from human intervention and get progressively better. But the human has to intervene!
Bad human.
This guy is like people who thank god they didn’t die in the disaster that wiped out their home and destroyed everything they own. They just can’t see that they are thanking the root cause of their problem, or “god works in mysterious ways”. Well, I guess Tesla does too.
The Cybertruck giveth and the Cybertruck ploweth into a street lamp.
Someone should write a book that clearly yet entertainingly explains how autonomous driving should work and why half-arsing it is stupid and dangerous.
Maybe with charming illustrations with robots in them.
Words are hard.
Can you just send a 20 second Tik-Tok instead?
Because Americans get bored after 25 seconds.
Anyone who gets bored reading Torch’s book doesn’t deserve eyes.
“Of course, the first thing that jumps out to most people is how the Cybertruck owner thanks Tesla for “engineering the best passive safety in the world.” I don’t know if that’s actually true”
The 2015 Nissan Altima – blessed with the ultra reliable, ever smooth, and hyper efficient Jatco Xtronic CVT – enters the chat:
https://www.carscoops.com/2021/11/nissan-altima-driver-somehow-survives-being-crushed-by-semi-truck/
And to think it MSRP’d for only $23k.
The Cybertruck doesn’t seem to have been crash tested anywhere official yet, or at least, no results have been released.
It’s ok though, because they did their own testing and told the NTSB that it’s fine. (Which is true of most cars sold in the US).
If you look at the table and figure that the main difference between level 2 and level 3 is who is liable for a collision…
It means we’ll never get past level 2.
Didn’t Mercedes release a Level 3? Limited in area: https://media.mbusa.com/releases/release-1d2a8750850333f086a722043c01a0c3-conditionally-automated-driving-mercedes-benz-drive-pilot-further-expands-us-availability-to-the-countrys-most-populous-state-through-california-certification
Oh my word I used to live down the street from that site. It’s such a shame that this is what gets my little corner of Reno into the conversation.
My 2 cents — it’s a damn shame Tesla decided to handicap itself by only using visual spectrum cameras. There’s a whole world of light that we can’t see that would be infinitely useful for a system that needs total situational awareness to react to changing environments.
My 2004 Evo did much the same in a crash. You’d think that 20 years later you’d do better than a Mitsu econobox.
That wasn’t an accident.
It was just a collision with all the flourishes of an accident.
As far as I can tell CyberCucks believe they gain social capital by praising Elon and extolling the virtues of their wank panzers.
I don’t trust Partial Self Driving and Semi Autopilot further than I can throw a bricked inCel Camino. At least the lesser Teslas do less damage when they hit something
Admirable write up. Keep restating the obvious facts, and just maybe some minds will come around. That’s a lot of damage to not have sheared the breakaway bolts holding the pole.
“That’s a lot of damage to not have sheared the breakaway bolts holding the pole.”
My thoughts exactly.
Yeah, I’d really like to know how fast the thing was going. Seems like the businesses, the pole, and the stoplight would suggest it wasn’t all that fast. Lots of opportunity to avoid the accident…and damage that seems extreme for the likely speed.
There are a lot of questions, but this guy definitely won’t release details to anyone who would ask them, since we’re all bears and haters, I guess.
I’m glad that he blatantly incriminated himself in a public forum for distracted driving. Hopefully, we can keep this one off the streets for at least a few years.
Or at the bare minimum discourage his insurers from paying for the damage.
Was really pleased to hear someone refer to a cyber truck as an Incel Camino the other day. Had to wonder what the age range of folks getting that joke would be though.
I’ve heard Teslas referred to more generally as ‘swasticars’ which made me smile.
Austin! Are you ready for Cybercab Thunderdome?!? Because it’s coming to your town!!
No, thank you for asking.
I was stuck behind a Tesla Model X yesterday on a road that had street parking on the right side and no turn-offs for at least half a mile. The Model X was clearly using the level two automation systems, because it kept stopping at every. Single. Parked. White. Car. Every fucking one of them. Full on hard brake. After the second time I held back like five car lengths and just watched it do this, lurch forward, then do it again. When we finally got to the intersection that marks the end of that road, the fucking thing stopped right as the light turned red because another white car was in the cross traffic. I don’t know why the owner didn’t disable it long before then, after it had stopped multiple times.
I fully believe this thing crashed into a lightpole based on what I saw yesterday and am surprised it hasn’t happened more.
Honestly what I’ve seen of Tesla drivers it could very well not have been FSD. I’ve learned to stay a few car lengths further back from them – they’re the most inconsistent, distracted drivers on the road, and that in combination with the regenerative braking and a zillion horsepower makes them just impossible to pace.
I dunno. Stopping at every white car, then stopping in the middle of the intersection on a yellow until it turns red because there’s another white car? I think in the evening light the sensors were falsely detecting the bright white reflections as headlights or something and the car kept freaking out.
I pay more attention to where I’m going than these people and I’m high as fuck all the time.
Don’t believe me? I’ve never missed my exit… no matter how many lanes I must veer across to catch it at the last second.
To be fair it’s fairly easy when Taco Bell is the only place you ever drive to.
A bad driver NEVER misses their exit.
When your Tesla would rather be a Polestar…
COTD.