
Tesla Owner’s Viral Tweet Praises Safety Of Cybertruck In Wreck Caused By Cybertruck

[Topshot: Cybertruck FSD wreck]

I’ll be honest with you: I don’t want our site to become a place that posts about every Tesla Cybertruck wreck just because, not even the wrecks that may be at least in part caused by Tesla’s Level 2 semi-automated driving software, still confusingly named Full Self-Driving (FSD). (And David’s review of the truck was actually fairly positive; it’s an impressive machine.) But sometimes there’s just a perfect storm of irony and eye-rollery and embarrassing fanboyism and powerful, important lessons about the sometimes serious limitations of Level 2 driving systems. This is one of those times, thanks to a viral tweet that hits all of these key points and has a dramatic picture as well. It’s really the total package. We have to talk about it, at least a little.

I should mention that while the Cybertruck here looks pretty badly messed up and is likely totaled, nobody was actually hurt, so that’s good. Modern cars are very good about not killing the beings inside them in wrecks, and we’re seeing a great example of that here.


In fact, that’s one of the key points of the tweet, which has been seen by 2.4 million people:

Here’s a bigger screenshot of the text:

[Tweet screenshot: Twitter]

Complimenting The Car That Helped Cause The Crash To Keep The ‘Haters’ Away

What many seem to find amazing, and the reason the tweet is going viral, is that, after his car crashed itself into a pole, the poster seems mostly worried about what material the dashcam footage of his wreck will give the “bears/haters.”


There’s an awful lot to unpack here. Of course, the first thing that jumps out to most people is how the Cybertruck owner thanks Tesla for “engineering the best passive safety in the world.” I don’t know if that’s actually true, though the Cybertruck does have some impressive passive safety features and, like many modern vehicles, is generally safe in a wreck; it is genuinely amazing that the owner here came out without a scratch. And then, in the very next sentence, the owner states that the Cybertruck “failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down,” which resulted in the Cybertruck hopping a curb and smacking right into that large streetlight pole, as you can see in the photograph included with the tweet.

Notice I said “owner” there instead of “driver,” because that leads to the next, far more important point. There’s the irony, of course, of thanking Tesla for making a vehicle that’s so safe in a wreck, and then going on to mention that the very same vehicle caused the wreck. It’s very good at dealing with the problem it created!

It’s Proof That These Level 2 Systems Are Problematic

Now, where this gets more complex, and leads to the owner/driver distinction I mentioned before, is that exactly who or what was in control of that 6,000+ pounds of Cybertruck is not clear. The tweet definitely makes it seem like Tesla’s FSD software (noted in the tweet to be version 13.2.4, which is less than a month old) was in full control of the car and made the decisions that led to crashing into that light pole. And the fact that, despite the car heading towards the pole, the human inside did nothing to stop the situation pretty clearly shows they were not in control or paying attention.

This, of course, is precisely how one is not supposed to use a Level 2 system, which requires constant driver vigilance, and it is also the fundamental reason why all semi-automated driving systems are doomed: people are terrible at using them. And, even worse and more paradoxically, the better a job a Level 2 system does, the worse people become at actually monitoring it and the more likely it is to be abused, as we see in this situation.

By the way, if you don’t remember what those automated driving levels are, here’s the chart. Remember, the levels don’t indicate how good or advanced the system is, but rather dictate the relationship between the human and the machine when it comes to how the driving responsibility is shared:


[Chart: SAE levels of driving automation]

All Tesla FSD systems on the road today are Level 2 systems, meaning the human is always responsible, no matter how well the system works or how long it’s been since your last intervention. Under FSD, you have to be ready to take over at a moment’s notice, unless you’re cool with, say, smashing your $100,000 stainless steel truck into a huge post.

Other tweets by the owner seem to suggest a pattern of taking FSD’s name at face value and just treating it like an actual Level 4 or 5 self-driving system – which, again, it isn’t – as you can infer here:

So, the Twitter-poster here gets in his Cybertruck and turns on FSD and is so distracted he doesn’t even remember where he asked it to go? Even when there’s Taco Bell at the end? Bizarre.


This is, of course, precisely the worst way to use any sort of Level 2 driving-assist system and remains the biggest issue with such systems. People are genuinely confused about just how much a driver-assist system does, and even when they’re not confused, people still get incredibly distracted when using driver-assist systems that they’re supposed to be monitoring, because of course they do.

We’ve known this would be a problem since 1948, when Mackworth’s famous study “The Breakdown of Vigilance during Prolonged Visual Search” demonstrated that intense and prolonged vigilance to a mostly independently-operating system is simply not something people are good at. At all. Tesla’s FSD not only calls itself a “Full Self-Driving” system, which sets up unrealistic expectations, but it’s also generally quite good at what it does, which means it is very easy for people to end up in the same situation as Mr. Pole-Pegged Cybertruck over here: too comfortable and then ultimately fucked.

Because no matter how good FSD may be most of the time, it’s not actually a Level 4 or 5 system, and even if it’s been doing fantastic for months, you have no idea when it may suddenly decide to not realize a lane is ending and crash into a pole. I know it’s tempting for ardent defenders of FSD or other automated driving systems to blame these sorts of wrecks on lighting or weather or confusing environmental setups, calling them “edge cases” and suggesting that these circumstances are uncommon, and not worth making a big fuss about.

[Map: site of the wreck in Reno, Nevada]

But here’s the thing about “edge cases”: the world is full of edge cases. That’s what reality is! Messy, chaotic edge cases, with some veneer of order slapped down over everything. Look at where this wreck happened, for example, on that map of a section of Reno, Nevada there. That’s not a particularly challenging area, really. It looks like it was dark, but we’re not talking about some labyrinthine narrow, crowded streets in an ancient European city or anything like that: these are big, well-lit roads that aren’t even particularly dense, traffic-wise. I’m sure the FSD instance in that very same Cybertruck has navigated far more complex situations than this (FSD is a thoroughly impressive system). But, this time, for whatever arcane reason made up of ones and zeros, it got confused and drove right smack into a pole, like an idiot.


The owner should have been able to see this coming with plenty of time to react. If I understand what happened correctly, then the wreck is his fault, no question. He was using this software in a stupid way, and thankfully that stupidity didn’t hurt anyone else. But even so, the owner’s blame is shared with the very concept of a Level 2 driving system itself, which has been shown time and time again to foster this sort of dangerous inattention.

Yes, it appears our Cybertruck-lover wasn’t being particularly smart here, but he was guided to that idiocy by a technology that is technically very advanced, but practically crude and foolish. You can’t ask a machine to take over 90% or more of a task and then expect a person to remain alert and vigilant, at least not all the time. This wreck seems stupid on pretty much every level possible, the driver’s stupidity and the inherent stupidity of Level 2 semi-autonomy working hand-in-hand to make sure this person no longer has a working car.

We’ve reported on these sorts of wrecks before, we’ve told you about the studies and findings and research, and we will continue to do so, because wrecks like this show that the message is still not being heard. FSD, no matter what version number, no matter how few interventions you saw on some YouTube video of a drive, does not drive a car by itself. There still must be a human there, ready and able and willing to take over when it screws up. Which it may never do. Or it may do in the next minute. The point is, you just don’t know.

It’s sort of ridiculous that we’re still letting all this happen without really trying to find better solutions on the path to self-driving. Sure, Level 2 is a great way to gather self-driving data, but the way these systems are set up is kind of backwards. When the concept is that the machine is in control and the human is watching or observing or monitoring, the tendency will always be less vigilance on the human’s part. If this were reversed, with the human always in control and the software ready to assist in emergencies or to compensate for driver failure, you could get all the data without introducing these vigilance problems.

Of course, that’s a lot less sexy, and people want to not pay attention if they don’t have to. So here we are.

Related:

New IIHS Study Confirms What We Suspected About Tesla’s Autopilot And Other Level 2 Driver Assist Systems: People Are Dangerously Confused

This New Study Is Maddening Because We All Know This: People Don’t Pay Attention When Using Autopilot Or Other Driver Assistance Systems

These Tesla FSD Hands-Free Mod Devices Are Such A Bad Idea And Promoting Them Is Dumb

 

(Topshot: X, Jonathan Challinger)

145 Comments
Shinynugget
15 hours ago

I purposely bought a version of my current car that doesn’t have lane keep assist or auto braking of any kind. My last car (Corolla Hatch) was a stick, and the system didn’t like the fact that my foot wasn’t on the brake when it thought it always should be. It didn’t understand gearing down, rev matching, or anything else. On an auto trans car they may be fine, but in a stick, no thanks.

KYFire
15 hours ago

“….I’m hesitant because I don’t want the attention and I don’t want to give the bears/haters any material.”

Soooooo, I’ll just post this where everyone can see it along with all the other things about my life I share.

Hangover Grenade
13 hours ago
Reply to KYFire

Narrator:

He 100% wanted the attention.

TheDrunkenWrench
15 hours ago

More fuel to my argument from yesterday.

IRegertNothing, Esq.
15 hours ago

If 100 Teslas “self drove” over the edge of a collapsed bridge Elon’s fanbois would praise the vehicles for how quickly they maneuvered past all of the barriers.

sentinelTk
16 hours ago

So I preach at work about utilizing the Hierarchy of Controls, which is a framework to mitigate risk. The higher the level, the better managed the risk. From highest to lowest, they are:

Elimination
Substitution
Engineering
Admin/Training
Personal Protective Equipment

Why do I mention this? Because Tesla failed, and continues to fail, the top four almost completely, with the exception of engineering. It is their only solution: engineer safer algorithms based on an involuntary public beta test, while ignoring where they could eliminate risk or substitute different solutions (like maybe lidar?). Meanwhile, an untrained “controller,” not driver, turns around and praises Tesla for the last and absolute minimum acceptable control: PPE.

Well done all around. Insert 3rd place podium celebration meme here.

TheDrunkenWrench
15 hours ago
Reply to sentinelTk

I’d argue that their testing isn’t Engineering level, but Admin/training. They’re training a machine instead of a person, but they’re not engineering any new tech in to make FSD work. They’re just trying to digitally legislate around the tech they’ve already engineered.

We already know they use “Substitution” a few seconds before accidents, in an attempt to “Eliminate” liability.

Andy the Swede
16 hours ago

If only they had lubed their lamp poles, the car would just have slipped past!

Jeff Elliott
17 hours ago

“I sat on my ass staring at a giant pole as my car ran straight into it at high speed.”

People have trouble fully paying attention to driving when they are actively driving; “Full Self Driving” seems designed to highlight the risk of semi-autonomous driving over and over, like that’s the primary thing it was designed to do.

The World of Vee
14 hours ago
Reply to Jeff Elliott

1000% this dude was face down staring at his phone, or drunk, or asleep. There is no way on earth someone paying even 10% attention to the road could miss something as obvious as his car barreling into a damn pole at high speed.

Sid Bridge
17 hours ago

Wow. This one hit my brain hard for some reason. I’m already having trouble processing the fact that I have to share the road with people who have no idea why they’re in the left lane.

Do you mean to tell me I’m already sharing the road with people who have no idea why they got in their car in the first place?

Vetatur Fumare
14 hours ago
Reply to Sid Bridge

Not to defend them, but I think that happens to all of us… I am sure everyone has set out to go get groceries and then driven their car halfway to work instead.

Andreas8088
5 hours ago
Reply to Vetatur Fumare

Hell, I once set out to go somewhere 10 minutes from my house, got into a conversation with my girlfriend, and didn’t realize until we’d been driving for an hour that I’d gotten on the highway going north instead of south because I was auto-piloting to my old apartment.

Urban Runabout
11 hours ago
Reply to Sid Bridge

It’s bad enough that we have people in the left seat with no business being in the left seat.

Horizontally Opposed
17 hours ago

Proof that meritocracy is dead in America. How is this guy able to make the coin for the truck? Because, well, bless his heart.

Pupmeow
17 hours ago

Agreed, but was it ever alive?

Horizontally Opposed
16 hours ago
Reply to Pupmeow

Touché. Well, let me rephrase: “the illusion of meritocracy is dead in America”. Better?

Jeff Elliott
17 hours ago

84 month financing?

Drive By Commenter
18 hours ago

FSD is fairly impressive, all the way until it isn’t. I’ve driven my Tesla a few thousand miles on v13, thanks to a subscription for a long road trip and then a free trial after canceling the subscription. This is exactly the sort of scenario where FSD will mess up! Wet roads at night with ambiguous lane markings are a difficulty for it. The AI will “learn” from human intervention and get progressively better. But the human has to intervene!

Bad human.

Spikersaurusrex
18 hours ago

This guy is like people who thank god they didn’t die in the disaster that wiped out their home and destroyed everything they own. They just can’t see that they are thanking the root cause of their problem, or “god works in mysterious ways”. Well, I guess Tesla does too.

IRegertNothing, Esq.
14 hours ago

The Cybertruck giveth and the Cybertruck ploweth into a street lamp.

Captain Muppet
21 hours ago

Someone should write a book that clearly yet entertainingly explains how autonomous driving should work and why half-arsing it is stupid and dangerous.

Maybe with charming illustrations with robots in them.

Urban Runabout
11 hours ago
Reply to Captain Muppet

Words are hard.
Can you just send a 20 second Tik-Tok instead?
Because Americans get bored after 25 seconds.

Captain Muppet
8 hours ago
Reply to Urban Runabout

Anyone who gets bored reading Torch’s book doesn’t deserve eyes.

Cheap Bastard
23 hours ago

“Of course, the first thing that jumps out to most people is how the Cybertruck owner thanks Tesla for “engineering the best passive safety in the world.” I don’t know if that’s actually true”

The 2015 Nissan Altima – blessed with the ultra reliable, ever smooth, and hyper efficient Jatco Xtronic CVT – enters the chat:

https://www.carscoops.com/2021/11/nissan-altima-driver-somehow-survives-being-crushed-by-semi-truck/

And to think it MSRP’d for only $23k.

Phuzz
17 hours ago
Reply to Cheap Bastard

The Cybertruck doesn’t seem to have been crash tested anywhere official yet, or at least no results have been released.
It’s OK though, because they did their own testing and told NHTSA that it’s fine. (Which is true of most cars sold in the US.)

Óscar Morales Vivó
1 day ago

If you look at the table and figure that the main difference between level 2 and level 3 is who is liable for a collision…

It means we’ll never get past level 2.

Totally not a robot
1 day ago

Oh my word I used to live down the street from that site. It’s such a shame that this is what gets my little corner of Reno into the conversation.

My 2 cents — it’s a damn shame Tesla decided to handicap itself by only using visual spectrum cameras. There’s a whole world of light that we can’t see that would be infinitely useful for a system that needs total situational awareness to react to changing environments.

Xt6wagon
1 day ago

My 2004 Evo did much the same in a crash. You’d think after 20 years you’d do better than a Mitsu econobox.

Rollin On Donuts
1 day ago

That wasn’t an accident.
It was just a collision with all the flourishes of an accident.

Slow Joe Crow
1 day ago

As far as I can tell, CyberCucks believe they gain social capital by praising Elon and extolling the virtues of their wank panzers.
I don’t trust Partial Self Driving and Semi Autopilot further than I can throw a bricked inCel Camino. At least the lesser Teslas do less damage when they hit something.

Hoonicus
1 day ago

Admirable write up. Keep restating the obvious facts, and just maybe some minds will come around. That’s a lot of damage to not have sheared the breakaway bolts holding the pole.

Cheap Bastard
23 hours ago
Reply to Hoonicus

“That’s a lot of damage to not have sheared the breakaway bolts holding the pole.”

My thoughts exactly.

Drew
15 hours ago
Reply to Hoonicus

Yeah, I’d really like to know how fast the thing was going. Seems like the businesses, the pole, and the stoplight would suggest it wasn’t all that fast. Lots of opportunity to avoid the accident…and damage that seems extreme for the likely speed.

There are a lot of questions, but this guy definitely won’t release details to anyone who would ask them, since we’re all bears and haters, I guess.

Ricardo Mercio
1 day ago

I’m glad that he blatantly incriminated himself in a public forum for distracted driving. Hopefully, we can keep this one off the streets for at least a few years.

Amberturnsignalsarebetter
1 day ago
Reply to Ricardo Mercio

Or at the bare minimum discourage his insurers from paying for the damage.

bomberoKevino
1 day ago

Was really pleased to hear someone refer to a cyber truck as an Incel Camino the other day. Had to wonder what the age range of folks getting that joke would be though.

Phuzz
18 hours ago
Reply to bomberoKevino

I’ve heard Teslas referred to more generally as ‘swasticars’ which made me smile.

StillNotATony
1 day ago

Austin! Are you ready for Cybercab Thunderdome?!? Because it’s coming to your town!!

JKcycletramp
1 day ago
Reply to StillNotATony

No, thank you for asking.

Vee
1 day ago

I was stuck behind a Tesla Model X yesterday on a road that had street parking on the right side and no turn-offs for at least half a mile. The Model X was clearly using the level two automation systems because it kept stopping at every. Single. Parked. White. Car. Every fucking one of them. Full on hard brake. After the second time I held back like five car lengths and just watched it do this, lurch forward, then do it again. When we finally got to the intersection that marks the end of that road the fucking thing stopped right as the light turned red because another white car was in the cross traffic. I don’t know why the owner didn’t disable it long before then after it stopped multiple times.

I fully believe this thing crashed into a lightpole based on what I saw yesterday and am surprised it hasn’t happened more.

Roofless
1 day ago
Reply to Vee

Honestly, from what I’ve seen of Tesla drivers, it could very well not have been FSD. I’ve learned to stay a few car lengths further back from them – they’re the most inconsistent, distracted drivers on the road, and that in combination with the regenerative braking and a zillion horsepower makes them just impossible to pace.

Vee
1 day ago
Reply to Roofless

I dunno. Stopping at every white car, then stopping in the middle of the intersection on a yellow until it turns red because there’s another white car? I think in the evening light the sensors were falsely detecting the bright white reflections as headlights or something and the car kept freaking out.

Rollin On Donuts
1 day ago

I pay more attention to where I’m going than these people and I’m high as fuck all the time.

Rollin On Donuts
1 day ago

Don’t believe me? I’ve never missed my exit… no matter how many lanes I must veer across to catch it at the last second.

Harvey Park Bench
1 day ago

To be fair it’s fairly easy when Taco Bell is the only place you ever drive to.

Data
16 hours ago

A bad driver NEVER misses their exit.

Patrick
1 day ago

When your Tesla would rather be a Polestar…

Harvey Park Bench
1 day ago
Reply to Patrick

COTD.
