I’ll be honest with you: I don’t want our site to become a place that posts about every Tesla Cybertruck wreck just because (and David’s review of the truck was actually fairly positive; it’s an impressive machine), even the wrecks that may be at least in part caused by Tesla’s Level 2 semi-automated driving software, still confusingly named Full Self-Driving (FSD). But sometimes there’s just a perfect storm of irony and eye-rollery and embarrassing fanboyism and powerful, important lessons about the sometimes serious limitations of Level 2 driving systems. This is one of those times, thanks to a viral tweet that hits all of these key points and has a dramatic picture as well. It’s really the total package. We have to talk about it, at least a little.
I should mention that while the Cybertruck here looks pretty badly messed up and is likely totaled, nobody was actually hurt, so that’s good. Modern cars are very good about not killing the beings inside them in wrecks, and we’re seeing a great example of that here.
In fact, that’s one of the key points of the tweet, which has been seen by 2.4 million people:
Here’s a bigger screenshot of the text:
![Ct Tweet 1](https://images-stag.jazelc.com/uploads/theautopian-m2en/ct_tweet_1.jpg)
## Complimenting The Car That Helped Cause The Crash To Keep The ‘Haters’ Away
What many seem to find amazing, and the reason the tweet is going viral, is that the poster seems worried about what the dashcam footage of his wreck will do for the “bears/haters” after his car crashed itself into a pole.
There’s an awful lot here to unpack. Of course, the first thing that jumps out to most people is how the Cybertruck owner thanks Tesla for “engineering the best passive safety in the world.” I don’t know if that’s actually true, though the Cybertruck does have some impressive passive safety features, and like many modern vehicles, it is generally safe in a wreck, and it is genuinely amazing that the owner here came out without a scratch. And then in the next sentence the owner states that the Cybertruck “failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down” which resulted in the Cybertruck hopping a curb and smacking right into that large streetlight pole, as you can see in the photograph included with the tweet.
Notice I said “owner” there instead of “driver,” because that leads to the next, far more important point. There’s the irony, of course, of thanking Tesla for making a vehicle that’s so safe in a wreck, and then going on to mention that the very same vehicle caused the wreck. It’s very good at dealing with the problem it created!
## It’s Proof That These Level 2 Systems Are Problematic
Now, where this gets more complex, and leads to the owner/driver distinction I mentioned before, is that it’s not clear exactly who or what was in control of that 6,000+ pounds of Cybertruck. The tweet definitely makes it seem like Tesla’s FSD software (noted in the tweet to be version 13.2.4, which is less than a month old) was in full control of the car and made the decisions that led to crashing into that light pole, and the fact that the human inside did nothing to stop the situation, even as the car headed towards the pole, pretty clearly shows they were not in control or paying attention.
This, of course, is precisely how one is not supposed to use a Level 2 system, which requires constant driver vigilance, and is also the fundamental reason why all semi-automated driving systems are doomed: because people are terrible at using them. And, even worse and more paradoxically, the better a job the Level 2 system does, the worse it is for people to actually use and the more likely it is to be abused, as we see in this situation.
By the way, if you don’t remember what those automated driving levels are, here’s the chart. Remember, the levels don’t indicate how good or advanced the system is, but rather define how driving responsibility is shared between the human and the machine (short version: at Levels 0 through 2 the human is always driving and always responsible, even when the system is steering and braking; at Level 3 the system drives in limited conditions but the human must take over when asked; at Levels 4 and 5 the system handles everything, with no human fallback required):
All Tesla FSD systems on the road today are Level 2 systems, meaning the human is always responsible, no matter how well the system works or how long it’s been since your last intervention. Under FSD, you have to be ready to take over at a moment’s notice, unless you’re cool with, say, smashing your $100,000 stainless steel truck into a huge post.
Other tweets by the owner seem to suggest a pattern of taking the FSD name at face value and just treating it like an actual Level 4 or 5 self-driving system – which, again, it isn’t – as you can infer here:
Sometimes I decide to go somewhere and turn on @Tesla FSD and then I forget where I decided to go and then it starts turning into Taco Bell or whatever and I'm like wtf is it doing and then I'm like oh right Taco Bell
thank you for coming to my TED talk
— Jonathan Challinger (@MrChallinger) January 6, 2025
So, the Twitter-poster here gets in his Cybertruck and turns on FSD and is so distracted he doesn’t even remember where he asked it to go? Even when there’s Taco Bell at the end? Bizarre.
This is, of course, precisely the worst way to use any sort of Level 2 driving-assist system and remains the biggest issue with such systems. People are genuinely confused about just how much a driver-assist system does, and even when they’re not confused, people still get incredibly distracted when using driver-assist systems that they’re supposed to be monitoring, because of course they do.
We’ve known this would be a problem since 1948, when Mackworth’s famous study “The Breakdown of Vigilance during Prolonged Visual Search” demonstrated that intense and prolonged vigilance over a mostly independently operating system is simply not something people are good at. At all. Tesla’s FSD not only calls itself a “Full Self-Driving” system, which sets up unrealistic expectations, but it’s also generally quite good at what it does, which means it is very easy for people to end up in the same situation as Mr. Pole-Pegged Cybertruck over here: too comfortable and then ultimately fucked.
Because no matter how good FSD may be most of the time, it’s not actually a Level 4 or 5 system, and even if it’s been doing fantastic for months, you have no idea when it may suddenly decide to not realize a lane is ending and crash into a pole. I know it’s tempting for ardent defenders of FSD or other automated driving systems to blame these sorts of wrecks on lighting or weather or confusing environmental setups, calling them “edge cases” and suggesting that these circumstances are uncommon, and not worth making a big fuss about.
But here’s the thing about “edge cases”: the world is full of edge cases. That’s what reality is! Messy, chaotic edge cases, with some veneer of order slapped down over everything. Look at where this wreck happened, for example, on that map of a section of Reno, Nevada there. That’s not a particularly challenging area, really. It looks like it was dark, but we’re not talking about some labyrinthine narrow, crowded streets in an ancient European city or anything like that: these are big, well-lit roads that aren’t even particularly dense, traffic-wise. I’m sure the FSD instance in that very same Cybertruck has navigated far more complex situations than this (FSD is a thoroughly impressive system). But, this time, for whatever arcane reason made up of ones and zeros, it got confused and drove right smack into a pole, like an idiot.
The owner should have been able to see this coming with plenty of time to react. If I understand what happened correctly, then the wreck is his fault, no question. He was using this software in a stupid way, and thankfully that stupidity didn’t hurt anyone else. But while the owner is likely entirely to blame, at the same time, that blame is shared with the very concept of a Level 2 driving system itself, which has been shown time and time again to foster this sort of dangerous inattention.
Yes, it appears our Cybertruck-lover wasn’t being particularly smart here, but he was guided to that idiocy by a technology that is technically very advanced, but practically crude and foolish. You can’t ask a machine to take over 90% or more of a task and then expect a person to remain alert and vigilant, at least not all the time. This wreck seems stupid on pretty much every level possible, the driver’s stupidity and the inherent stupidity of Level 2 semi-autonomy working hand-in-hand to make sure this person no longer has a working car.
We’ve reported on these sorts of wrecks before, we’ve told you about the studies and findings and research, and we will continue to do so, because wrecks like this show that the message is still not being heard. FSD, no matter what version number, no matter how few interventions you saw on some YouTube video of a drive, does not drive a car by itself. There still must be a human there, ready and able and willing to take over when it screws up. Which it may never do. Or it may do in the next minute. The point is, you just don’t know.
It’s sort of ridiculous that we’re still letting all this happen without really trying to find better solutions for the goal of self-driving. Sure, L2 is a great way to get self-driving data, but the way these systems are set up is kind of backwards. When the concept is that the machine is in more control and the human is watching or observing or monitoring, the tendency will always be less vigilance on the human’s part. If this were reversed, and the human was always in control with the software ready to assist in emergencies or to compensate for driver failure, you could get all the data without introducing these vigilance-induced problems.
Of course, that’s a lot less sexy, and people want to not pay attention if they don’t have to. So here we are.
New IIHS Study Confirms What We Suspected About Tesla’s Autopilot And Other Level 2 Driver Assist Systems: People Are Dangerously Confused
This New Study Is Maddening Because We All Know This: People Don’t Pay Attention When Using Autopilot Or Other Driver Assistance Systems
These Tesla FSD Hands-Free Mod Devices Are Such A Bad Idea And Promoting Them Is Dumb
(Topshot: X, Jonathan Challinger)
For a guy who doesn’t want any attention, it seems odd that he’d add no fewer than 12 @’s to his post, including one in particular that’s probably getting a whole lot of attention these days. And then he asks people to “spread his message.”
dude put a lot of thought into composing that tweet, that’s for sure – many layers of putting out there what happened, while also providing all kinds of cover for all kinds of considerations – likely to maintain some kind of in-universe consistency that is all just bs
This looks to be an almost picture perfect IIHS small overlap test. I’m very curious as to the actual vehicle speed and if the passenger side door will still open or not.
From the driver’s writing it also seems that he does not accept fault and will most likely do it again
Sounds like the typical mindset of that entitled type
Never at fault, never sorry
If he had a Cobb shifter, he wouldn’t have even gotten into the crash in the first place
I’m just imagining a delrin Cobb shifter tacked onto the capacitive touch surface on the roof. You’d look like a train conductor pulling/pushing a lever up there, which would be 20% less silly than using the stock capacitive system.
This is a microcosm of current American society in general right now.
Create a flawed system, then launch a gigantic PR campaign complete with a ton of backup press coverage to convince people the system is better than it really is.
Then when the system fails to live up to what’s promised, convince everyone to start deflecting the fault onto owners, society at large, whoever, just not the actual creator of the system or the PR machine.
They took the old customer service mantra about how it’s better to give a good apology than to make a mistake; then they applied it to life-threatening technology. Well played.
Now try it again, but this time hit a baby stroller first and let’s see how it goes.
It’s sad that it’s going to take a tragedy for people to wake up to the danger of these systems, but as they say: “Regulations are written in blood.”
Why even listen to some fool who could have prevented the accident in the first place but didn’t?
I read the article and now realize I’m going to be afraid any time I see a muskratrod coming at me as the person in it may not be paying any attention or care if they hit a bear(me) because they think they will be passively “safe”.
If his hero Elon gets his way Tesla won’t be bothered to make that POS crashworthy.
This was not a software error or limitation; it was a hardware limitation. The cameras, for whatever technical reason, simply did not ‘see’ the pole. Radar or lidar would have. And until they add some non-optical sensors, literally no version of this software will ever be even remotely safe. It’s just not possible. Cameras are not eyes, no matter how much Musky wants them to be. Furthermore, it seems like FSD is not using any sort of GPS map processing, or else it might have been aware of the lane reduction.
I repeat, cameras just won’t do it. Ever. I think all that purple light, probably from a bad streetlight LED lens, just washed out the pole into the background. Humans struggle with this sort of thing, but cameras just have no chance. Light plays tricks, that’s just a fact. Relying on something like that will always fail. It’s just built in.
However, a decision was made and the Sunk Cost Fallacy is in full force. Anybody got a Social Media company they want to sell? I know a guy.
If FSD was using GPS map data, I would hope it would have been aware of the lane reduction AND the pole.
Objects near the road that increase the danger and severity of an accident should be included in the dataset.
so much this – how is the car not, at a bare bare minimum, overlaying map data and position data and deciding that driving forward in this lane is not an option, period?
I fear the NHTSA is about to disagree with you.
The NHTSA is still a thing?
I figured it would have been DOGE-d by now.
The camera missed seeing the pole and didn’t see the curb in front of the pole either.
FSD is no better than a distracted human.
Typical Elon programming, it’s already attacking the poles.
Sometimes I decide to go to France and turn on @Tesla FSD and then I forget where I decided to go and then it starts turning into Belgium or whatever and I’m like wtf is it doing and then I’m like oh right Maginot Line
Eldolf’s version starts in Texas and spreads from there. No vaccine available at the moment. Suggested treatment: painted-white or reflective-tape-treated concrete barriers.
Brilliant
COTD
Good thing all those cybercabs will be running around with no drivers this year! Clearly the software is ready to go!!!
Also he doesn’t understand what passive safety is. Crumple zones, airbags etc are active safety. Passive safety is avoiding the accident in the first place.
Look – the passenger door has a safety margin of 5 because it only crumpled 20% of the way back!
Sorry, no. You got it backwards:
“Passive Safety Features:
Passive safety features are parts of the vehicle that protect you in case of an accident. This type of safety feature is part of the whole structure and design of the vehicle. Here are some examples of passive safety features:
Crumple zones in front-end collisions absorb collision energy to keep passengers safe.
Seats with head restraints prevent neck injuries when the person in front seat suddenly stops.
Seat belts.
Airbags.
Side-impact beams prevent or reduce injury from a sideways impact.
Anti-lock braking systems (ABS) help you maintain control during sudden braking.
High strength and shatterproof glass.
Active aerodynamics work together
Active Safety Features:
Active safety features are types of mechanisms that increase your safety by actively trying to avoid car accidents. This is done in many ways. Some systems accomplish their goal by communicating with you and your vehicle. Here are some examples of active safety features:
Active cruise control keeps a safe distance from the cars in front of you.
Active self-correcting steering systems or lane assist makes sure to stay in your lane.
Active brake assist automatically applies brakes harder for maximum braking power when it senses an emergency stop.
ABS prevent wheel lockup by monitoring driver commands and correcting skidding during hard stops.
Active headlight beam aiming systems highlight and follow the road ahead more precisely than conventional headlights.”
https://www.eckellsparks.com/2022/03/02/difference-active-passive-safety-features/
https://www.roadsafetyfacts.eu/passive-safety-systems-what-are-they-and-how-do-they-work/
https://roadsafetyfacts.eu/active-safety-systems-what-are-they-and-how-do-they-work/
You are right, must have been having a brain backwards day
No worries. It happens to all of us.
This would have gone very differently if it was not a pole that was hit but a pedestrian or another vehicle. In that case, the guy would have been in jail, using his one phone call to congratulate a stupid car for his incarceration.
I praised the Lord every day that my Audi lease ended AFTER St. Elon showed his true self.
To think I could have ended up driving a Tesla these days *shudder*
How long ago was that?! You know, my wife and I were trying out names for our first born child (who is now 6, so it was a while ago). She suggested Ilan (a Hebrew name) and I pointed out that a: I hate how capital I and L when they are next to each other look the same, and b: the name sounds almost exactly like “elon” and while he’s seen as a pretty quirky Tony Stark type guy now, I’d hate to name my kid after someone who ends up being a fucking prick in the future. Wow, did we dodge a bullet there!
At least you didn’t go with X Æ A-12.
If that’s his kid’s name, what’s Elon’s wifi password? Aidan?
100x this – and I don’t think I am the kind of person who would get rid of a car because of something like what’s happened with Musk, but I would for sure not buy one!!
There’s a common Australian insult that I think applies to this guy.
I think “fucking idiot” is understood by all English speakers.
Fuck him, and I hope his stupid tweet means his insurance won’t cover it.
This is the second time today we see all the proof someone’s insurance needs to invoke the “You’re A Dumbass” clause.
Yup. It makes me feel dirty to be rooting for an insurer to deny a claim (as if they need any encouragement), but the kind of dumbass who risks the lives of other people using the Man From INCEL’s ‘full self driving’ to control their divorce-mobile doesn’t deserve to have their broken shit made good.
Replace the light pole with elon musk and it would make my day. I’ve been hoping for a while now that he will get run over by a cyber truck “driving itself”
These people are fucking weird
It’s a genuine cult
Yet the stock continues to rise.
It’s a faith-based stock; it goes up because people keep buying, but the fundamentals are not there. The cult is a lot who’s buying daily and buying elevated futures.
The greater fool method of investing. Kind of like bitcoin.
It’s all cults these days.
(…looks around here…)
No no, WE’RE not a cult. Just a bunch of… really committed enthusiasts. It’s different, you see…
Looking at the street view, it is 600 ft from the arrows indicating the lane is ending to the crash site. That’s two football fields of length over which this driver wasn’t paying attention.
Exactly! About 300 feet from where the arrows begin to direct you over to where the lane lines curve over then ANOTHER 300 feet until the curb curves over to close the lane.
My best guess is that either they came off the highway exit and just stayed in the right lane, or they tried to pass on the right and got sandwiched. Either way it’s frightening to me (as the person not protected by the passive safety systems).
And WTF to the guy incriminating himself – in a public forum. Can’t expect his insurance will be fond of him
I think FSD actually stands for Fool Self Driving. That Afrikaans accent can be tricky.
Maybe Faux Self Driving. It’s fancy that way.
I’ll go with Financial Sucker Driving.
$8,000 for the luxury of lulling yourself into a false sense of security, while simultaneously increasing your likelihood of dying (assuming the fatality numbers on that CT vs Pinto article I read are correct).
Pro tip – go with the $99/mo subscription. That way you’re not out all the extra dough when the car is totaled or you’re unfortunately ended by your AI driver not living up to your expectations.
If DOGE works as well as FSD then everything will turn out fine. Right.
As long as the MAGA hat endured no damage, I don’t see what the issue is here. /s
I’m sure he’ll replace it with another one, so that should help boost Tesla’s next quarter sales numbers.
This sad toolbag should never be allowed to drive again. It’s obvious he doesn’t want to, anyway.
The thing I probably hate most about Teslas, aside from all of it, is that if you are making cars for people who don’t want to drive, why do they have to be either shockingly quick or incredibly heavy?
He shouldn’t be allowed to breed again either.
Darwin missed his chance.
I agree, but held myself back. Believe it or not, I try to restrain some of my aggressive negativity on here.
Suicidal Swasticar.
Given the Nazi-friendly sympathies of EM (flies off to Germany to congratulate a far-right politician – who openly parades in Nazi regalia – and calls him the future of the country), is it any wonder a Cybertruck would attack Poles?
We need our own Alternativ fuer Douchebag. 🙂 And my apologies for making the Pole joke above, I missed yours. Credit where due!
No worries. Besides, there will be some who don’t see my comment, but will see yours, so they’ll still get a chuckle.
Why does everyone think he will attack the Poles? They have many of the same policies in place that the AfD want to implement. In fact, much of the EU is adopting the same policies, as that is what they want.
Why does everyone think he will attack the Poles?
Lebensraum! And those Poles have experience building rockets!
https://wsmrmuseum.com/2020/07/27/von-braun-the-v-2-and-slave-labor/4/
Me when my car crashes itself into a light pole ‘but it’s ok because it’s so safe!!!1111!’
it’s the Stockholm Syndrome
The type of people who call others “bears” when it comes to criticizing something that ought to be criticized have indeed fallen victim to Stockholm Syndrome, or at least toxic brain rot, which seems to be most of these cybertruck bros.
You bears just don’t understand the appeal of crystal meth, it’s totally safe if you ignore all the deaths! They’re just edge cases, if you use it properly you’ll be fine! It’s the future, accept it already!
‘Bear’ as an insult is a new one on me; is it a reference to the stock market or something?
Yes, accusing people of wanting the stock price to go down.
Thanks, I’ve been sitting here for awhile now wondering why in the world ‘bear’ was an insult!
I thought it was an insult because Incels prefer twinks
I thought it was an insult because women generally feel safer with Bears.
Yes, as others explained here, it refers to people who want the stock price of company x to go down. Pretty much the only people who use this “insult” are tesla stans explaining that haters of tesla are only doing it to short the stock, rather than pointing out the fantastical delusions of the owners of tesla and the company’s current CEO alike.
that’s not the only SS musk is interested in
That was exactly what I texted my friend when I sent him this story.
Don’t you mean Helsinki Syndrome, as in Helsinki, Sweden?
You’re thinking of Oslo.
I see what you did there – yippie ki yay 😉
I hope the light pole is okay.
It should sue for damages.
If it is damaged, the pole owner (most likely the electric power company) will sue – that is what happens in these types of cases.