Tesla Owner’s Viral Tweet Praises Safety Of Cybertruck In Wreck Caused By Cybertruck

I’ll be honest with you: I don’t want our site to become a place that posts about every Tesla Cybertruck wreck just because (and David’s review of the truck was actually fairly positive; it’s an impressive machine) — not even the wrecks that may be at least partly caused by Tesla’s Level 2 semi-automated driving software, still confusingly named Full Self-Driving (FSD). But sometimes there’s just a perfect storm of irony and eye-rollery and embarrassing fanboyism and powerful, important lessons about the sometimes serious limitations of Level 2 driving systems. This is one of those times, thanks to a viral tweet that hits all of these key points and comes with a dramatic picture as well. It’s really the total package. We have to talk about it, at least a little.

I should mention that while the Cybertruck here looks pretty badly messed up and is likely totaled, nobody was actually hurt, so that’s good. Modern cars are very good about not killing the beings inside them in wrecks, and we’re seeing a great example of that here.

In fact, that’s one of the key points of the tweet, which has been seen by 2.4 million people:

Here’s a bigger screenshot of the text:

Screenshot: Twitter

Complimenting The Car That Helped Cause The Crash To Keep The ‘Haters’ Away

What many seem to find amazing, and the reason the tweet is going viral, is that the poster seems worried about what the dashcam footage of his wreck will do for the “bears/haters” after his car crashed itself into a pole.

There’s an awful lot to unpack here. Of course, the first thing that jumps out to most people is how the Cybertruck owner thanks Tesla for “engineering the best passive safety in the world.” I don’t know if that’s actually true, though the Cybertruck does have some impressive passive safety features and, like many modern vehicles, is generally safe in a wreck; it is genuinely amazing that the owner here came out without a scratch. And then, in the very next sentence, the owner states that the Cybertruck “failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down,” which resulted in the Cybertruck hopping a curb and smacking right into that large streetlight pole, as you can see in the photograph included with the tweet.

Notice I said “owner” there instead of “driver,” because that leads to the next, far more important point. There’s the irony, of course, of thanking Tesla for making a vehicle that’s so safe in a wreck, and then going on to mention that the very same vehicle caused the wreck. It’s very good at dealing with the problem it created!

It’s Proof That These Level 2 Systems Are Problematic

Now, where this gets more complex, and what leads to the owner/driver distinction I mentioned before, is that exactly who or what was in control of that 6,000+ pounds of Cybertruck is not clear. The tweet definitely makes it seem like Tesla’s FSD software (noted in the tweet to be version 13.2.4, which is less than a month old) was in full control of the car and made the decisions that led to crashing into that light pole. And the fact that, despite the car heading toward the pole, the human inside did nothing to stop the situation pretty clearly shows they were not in control or paying attention.

This, of course, is precisely how one is not supposed to use a Level 2 system, which requires constant driver vigilance, and it’s also the fundamental reason why all semi-automated driving systems are doomed: people are terrible at using them. Even worse, and more paradoxically, the better a job a Level 2 system does, the harder it becomes for people to stay vigilant while using it, and the more likely it is to be abused, as we see in this situation.

By the way, if you don’t remember what those automated driving levels are, here’s the chart. Remember, the levels don’t indicate how good or advanced the system is, but rather dictate the relationship between the human and the machine when it comes to how the driving responsibility is shared:

[Chart: the SAE levels of driving automation]

All Tesla FSD systems on the road today are Level 2 systems, meaning the human is always responsible, no matter how well the system works or how long it’s been since your last intervention. Under FSD, you have to be ready to take over at a moment’s notice, unless you’re cool with, say, smashing your $100,000 stainless steel truck into a huge post.

Other tweets by the owner seem to suggest a pattern of taking FSD at its name and treating it like an actual Level 4 or 5 self-driving system – which, again, it isn’t – as you can infer here:

So, the Twitter-poster here gets in his Cybertruck and turns on FSD and is so distracted he doesn’t even remember where he asked it to go? Even when there’s Taco Bell at the end? Bizarre.

This is, of course, precisely the worst way to use any sort of Level 2 driving-assist system, and it remains the biggest issue with such systems. People are genuinely confused about just how much a driver-assist system does, and even when they’re not confused, they still get incredibly distracted when using driver-assist systems they’re supposed to be monitoring, because of course they do.

We’ve known this would be a problem since 1948, when Mackworth’s famous study “The Breakdown of Vigilance during Prolonged Visual Search” demonstrated that intense and prolonged vigilance over a mostly independently-operating system is simply not something people are good at. At all. Tesla’s FSD not only calls itself a “Full Self-Driving” system, which sets up unrealistic expectations, but it’s also generally quite good at what it does, which means it is very easy for people to end up in the same situation as Mr. Pole-Pegged Cybertruck over here: too comfortable and then ultimately fucked.

Because no matter how good FSD may be most of the time, it’s not actually a Level 4 or 5 system, and even if it’s been doing fantastically for months, you have no idea when it may suddenly fail to notice that a lane is ending and crash into a pole. I know it’s tempting for ardent defenders of FSD or other automated driving systems to blame these sorts of wrecks on lighting or weather or confusing environmental setups, calling them “edge cases” and suggesting that these circumstances are uncommon and not worth making a big fuss about.

[Map: the crash site, in Reno, Nevada]

But here’s the thing about “edge cases”: the world is full of edge cases. That’s what reality is! Messy, chaotic edge cases, with some veneer of order slapped down over everything. Look at where this wreck happened, for example, on that map of a section of Reno, Nevada. That’s not a particularly challenging area, really. It looks like it was dark, but we’re not talking about a labyrinth of narrow, crowded streets in an ancient European city or anything like that: these are big, well-lit roads that aren’t even particularly dense, traffic-wise. I’m sure the FSD instance in that very same Cybertruck has navigated far more complex situations than this (FSD is a thoroughly impressive system). But this time, for whatever arcane reason made up of ones and zeros, it got confused and drove right smack into a pole, like an idiot.

The owner should have been able to see this coming with plenty of time to react. If I understand what happened correctly, the wreck is his fault, no question. He was using this software in a stupid way, and thankfully that stupidity didn’t hurt anyone else. But even if the owner is the one at fault, that blame is shared with the very concept of a Level 2 driving system itself, which has been shown time and time again to foster this sort of dangerous inattention.

Yes, it appears our Cybertruck-lover wasn’t being particularly smart here, but he was guided to that idiocy by a technology that is technically very advanced, but practically crude and foolish. You can’t ask a machine to take over 90% or more of a task and then expect a person to remain alert and vigilant, at least not all the time. This wreck seems stupid on pretty much every level possible, the driver’s stupidity and the inherent stupidity of Level 2 semi-autonomy working hand-in-hand to make sure this person no longer has a working car.

We’ve reported on these sorts of wrecks before, we’ve told you about the studies and findings and research, and we will continue to do so, because wrecks like this show that the message is still not being heard. FSD, no matter what version number, no matter how few interventions you saw on some YouTube video of a drive, does not drive a car by itself. There still must be a human there, ready and able and willing to take over when it screws up. Which it may never do. Or it may do in the next minute. The point is, you just don’t know.

It’s sort of ridiculous that we’re still letting all this happen without really trying to find better solutions on the path to self-driving. Sure, L2 is a great way to get self-driving data, but the way these systems are set up is kind of backwards. When the concept is that the machine is mostly in control and the human is watching or observing or monitoring, the tendency will always be less vigilance on the human’s part. If this were reversed, with the human always in control and the software ready to assist in emergencies or to compensate for driver failure, you could get all the data without introducing these vigilance-induced problems.

Of course, that’s a lot less sexy, and people want to not pay attention if they don’t have to. So here we are.

Related:

New IIHS Study Confirms What We Suspected About Tesla’s Autopilot And Other Level 2 Driver Assist Systems: People Are Dangerously Confused

This New Study Is Maddening Because We All Know This: People Don’t Pay Attention When Using Autopilot Or Other Driver Assistance Systems

These Tesla FSD Hands-Free Mod Devices Are Such A Bad Idea And Promoting Them Is Dumb

(Topshot: X, Jonathan Challinger)

145 Comments

Rad Barchetta
1 day ago

For a guy that doesn’t want any attention, it seems odd that he’d add no fewer than 12 @’s to his post, including one in particular that’s probably getting a whole lot of attention these days. And then asks people to “spread his message”.

A Reader
8 hours ago
Reply to  Rad Barchetta

dude spent a lot of thought on composing that tweet, that’s for sure – many layers of putting out there what happened, while also providing all kinds of cover for all kinds of considerations – likely to maintain some kind of in-universe consistency that is all just bs

subsea_EV-VI
1 day ago

This looks to be an almost picture perfect IIHS small overlap test. I’m very curious as to the actual vehicle speed and if the passenger side door will still open or not.

1978fiatspyderfan
1 day ago

From the driver’s writing it also seems that he does not accept fault and will most likely do it again

S gerb
20 hours ago

Sounds like the typical mindset of that entitled type

Never at fault, never sorry

Ranwhenparked
1 day ago

If he had a Cobb shifter, he wouldn’t have even gotten into the crash in the first place

Ricardo Mercio
1 day ago
Reply to  Ranwhenparked

I’m just imagining a delrin Cobb shifter tacked onto the capacitive touch surface on the roof. You’d look like a train conductor pulling/pushing a lever up there, which would be 20% less silly than using the stock capacitive system.

LTDScott
1 day ago

This is a microcosm of American society in general right now.

Livernois
1 day ago
Reply to  LTDScott

Create a flawed system, then launch a gigantic PR campaign complete with a ton of backup press coverage to convince people the system is better than it really is.

Then when the system fails to live up to what’s promised, convince everyone to start deflecting the fault onto owners, society at large, whoever, just not the actual creator of the system or the PR machine.

Ash78
1 day ago
Reply to  LTDScott

They took the old customer service mantra about how it’s better to give a good apology than to make a mistake; then they applied it to life-threatening technology. Well played.

Now try it again, but this time hit a baby stroller first and let’s see how it goes.

Ben
7 hours ago
Reply to  Ash78

It’s sad that it’s going to take a tragedy for people to wake up to the danger of these systems, but as they say: “Regulations are written in blood.”

Dogpatch
1 day ago

Why even listen to some fool that could have prevented the accident in the first place but didn’t?
I read the article and now realize I’m going to be afraid any time I see a muskratrod coming at me, as the person in it may not be paying any attention or care if they hit a bear (me), because they think they will be passively “safe”.

NosrednaNod
1 day ago

If his hero Elon gets his way Tesla won’t be bothered to make that POS crashworthy.

Crank Shaft
1 day ago

This was not a software error or limitation, it was a hardware limitation. The cameras, for whatever technical reason, simply did not ‘see’ the pole. Radar or Lidar would have. And until they add some non-optical sensors, literally no version of this software will ever be even remotely safe. It’s just not possible. Cameras are not eyes no matter how much Musky wants them to be. Furthermore, it seems like FSD is not using any sort of GPS map processing, else it might have been aware of the lane reduction.

I repeat, cameras just won’t do it. Ever. I think all that purple light, probably from a bad streetlight LED lens, just washed out the pole into the background. Humans struggle with this sort of thing, but cameras just have no chance. Light plays tricks, that’s just a fact. Relying on something like that will always fail. It’s just built in.

However, a decision was made and the Sunk Cost Fallacy is in full force. Anybody got a Social Media company they want to sell? I know a guy.

PaysOutAllNight
23 hours ago
Reply to  Crank Shaft

If FSD was using GPS map data, I would hope it would have been aware of the lane reduction AND the pole.

Objects near the road that increase the danger and severity of an accident should be included in the dataset.

A Reader
8 hours ago

so much this – how is the car not, at a bare bare minimum, overlaying map data and position data and deciding that driving forward in this lane is not an option, period?

Horizontally Opposed
14 hours ago
Reply to  Crank Shaft

I fear the NHTSA is about to disagree with you.

Urban Runabout
8 hours ago

The NHTSA is still a thing?
I figured it would have been DOGE-d by now.

Clear_prop
11 hours ago
Reply to  Crank Shaft

The camera missed seeing the pole and didn’t see the curb in front of the pole either.

FSD is no better than a distracted human.

Ash78
1 day ago

Typical Elon programming, it’s already attacking the poles.

Ricardo Mercio
1 day ago
Reply to  Ash78

Sometimes I decide to go to France and turn on @Tesla FSD and then I forget where I decided to go and then it starts turning into Belgium or whatever and I’m like wtf is it doing and then I’m like oh right Maginot Line

LMCorvairFan
1 day ago
Reply to  Ricardo Mercio

Eldolf’s version starts in Texas and spreads from there. No vaccine available at the moment. Suggested treatment: painted-white or reflective-tape-treated concrete barriers.

Jb996
9 hours ago
Reply to  Ricardo Mercio

Brilliant

Dan Roth
1 day ago
Reply to  Ash78

COTD

Shooting Brake
1 day ago

Good thing all those cybercabs will be running around with no drivers this year! Clearly the software is ready to go!!!

Nathan Williams
1 day ago

Also he doesn’t understand what passive safety is. Crumple zones, airbags etc are active safety. Passive safety is avoiding the accident in the first place.

Ranwhenparked
1 day ago

Look – the passenger door has a safety margin of 5 because it only crumpled 20% of the way back!

Cheap Bastard
20 hours ago

Sorry, no. You got it backwards:

“Passive Safety Features:
Passive safety features are parts of the vehicle that protect you in case of an accident. This type of safety feature is part of the whole structure and design of the vehicle. Here are some examples of passive safety features:

Crumple zones in front-end collisions absorb collision energy to keep passengers safe.
Seats with head restraints prevent neck injuries when the person in front seat suddenly stops.
Seat belts.
Airbags.
Side-impact beams prevent or reduce injury from a sideways impact.
Anti-lock braking systems (ABS) help you maintain control during sudden braking.
High strength and shatterproof glass.
Active aerodynamics work together

Active Safety Features:
Active safety features are types of mechanisms that increase your safety by actively trying to avoid car accidents. This is done in many ways. Some systems accomplish their goal by communicating with you and your vehicle. Here are some examples of active safety features:

Active cruise control keeps a safe distance from the cars in front of you.
Active self-correcting steering systems or lane assist makes sure to stay in your lane.
Active brake assist automatically applies brakes harder for maximum braking power when it senses an emergency stop.
ABS prevent wheel lockup by monitoring driver commands and correcting skidding during hard stops.
Active headlight beam aiming systems highlight and follow the road ahead more precisely than conventional headlights.”

https://www.eckellsparks.com/2022/03/02/difference-active-passive-safety-features/

https://www.roadsafetyfacts.eu/passive-safety-systems-what-are-they-and-how-do-they-work/

https://roadsafetyfacts.eu/active-safety-systems-what-are-they-and-how-do-they-work/

Nathan Williams
14 hours ago
Reply to  Cheap Bastard

You are right, must have been having a brain backwards day

Cheap Bastard
14 hours ago

No worries. It happens to all of us.

CampoDF
1 day ago

This would have gone very differently if it was not a pole that was hit but a pedestrian or another vehicle. Then again, the guy would probably have been in jail, using his one phone call to congratulate a stupid car for his incarceration.

SNL-LOL Jr
1 day ago

I praised the Lord every day that my Audi lease ended AFTER St. Elon showed his true self.
To think I could have ended up driving a Tesla these days *shudder*

CampoDF
1 day ago
Reply to  SNL-LOL Jr

How long ago was that?! You know, my wife and I were trying out names for our first born child (who is now 6, so it was a while ago). She suggested Ilan (a Hebrew name) and I pointed out that a: I hate how capital I and L look the same when they’re next to each other, and b: the name sounds almost exactly like “elon” and while he’s seen as a pretty quirky Tony Stark type guy now, I’d hate to name my kid after someone who ends up being a fucking prick in the future. Wow, did we dodge a bullet there!

Nlpnt
1 day ago
Reply to  CampoDF

At least you didn’t go with X Æ A-12.

If that’s his kid’s name, what’s Elon’s wifi password? Aidan?

A Reader
8 hours ago
Reply to  SNL-LOL Jr

100x this – and I don’t think I am the kind of person who would get rid of a car because of something like what’s happened with Musk, but I would for sure not buy one!!

Buzz
1 day ago

There’s a common Australian insult that I think applies to this guy.

Phuzz
14 hours ago
Reply to  Buzz

I think “fucking idiot” is understood by all English speakers.

Gilbert Wham
1 day ago

Fuck him, and I hope his stupid tweet means his insurance won’t cover it.

Nlpnt
1 day ago
Reply to  Gilbert Wham

This is the second time today we see all the proof someone’s insurance needs to invoke the “You’re A Dumbass” clause.

Gilbert Wham
10 hours ago
Reply to  Nlpnt

Yup. It makes me feel dirty to be rooting for an insurer to deny a claim (as if they need any encouragement), but the kind of dumbass who risks the lives of other people using the Man From INCEL’s ‘full self driving’ to control their divorce-mobile doesn’t deserve to have their broken shit made good.

Who Knows
1 day ago

Replace the light pole with elon musk and it would make my day. I’ve been hoping for a while now that he will get run over by a cyber truck “driving itself”

Nsane In The MembraNe
1 day ago

These people are fucking weird

No More Crossovers
1 day ago

It’s a genuine cult

LMCorvairFan
1 day ago

Yet the stock continues to rise.

Jeff Elliott
12 hours ago
Reply to  LMCorvairFan

It’s a faith-based stock; it goes up because people keep buying, but the fundamentals are not there. The cult is a lot who are buying daily and buying elevated futures.

Baltimore Paul
8 hours ago
Reply to  LMCorvairFan

The greater fool method of investing. Kind of like bitcoin.

Parsko
1 day ago

It’s all cults these days.

(…looks around here…)

Gilbert Wham
10 hours ago
Reply to  Parsko

No no, WE’RE not a cult. Just a bunch of… really committed enthusiasts. It’s different, you see…

Clear_prop
1 day ago

Looking at the street view, it is 600 ft from the arrows indicating the lane is ending to the crash site. That’s two football fields of road over which this driver wasn’t paying attention.

John E runberg
12 hours ago
Reply to  Clear_prop

Exactly! About 300 feet from where the arrows begin to direct you over to where the lane lines curve over, then ANOTHER 300 feet until the curb curves over to close the lane.

My best guess is that they either came off the highway exit and just stayed in the right lane or tried to pass on the right and got sandwiched. Either way it’s frightening to me (as the person not protected by the passive safety systems).

And WTF to the guy incriminating himself – in a public forum. Can’t expect his insurance will be fond of him

Canopysaurus
1 day ago

I think FSD actually stands for Fool Self Driving. That Afrikaans accent can be tricky.

Colin Buckhurst
1 day ago
Reply to  Canopysaurus

Maybe Faux Self Driving. It’s fancy that way.

I don't hate manual transmissions
1 day ago
Reply to  Canopysaurus

I’ll go with Financial Sucker Driving.

$8,000 for the luxury of lulling yourself into a false sense of security, while simultaneously increasing your likelihood of dying (assuming the fatality numbers on that CT vs Pinto article I read are correct).

Pro tip – go with the $99/mo subscription. That way you’re not out all the extra dough when the car is totaled or you’re unfortunately ended by your AI driver not living up to your expectations.

Col Lingus
1 day ago

If DOGE works as well as FSD then everything will turn out fine. Right.
As long as the MAGA hat endured no damage, I don’t see what the issue is here. /s

Cerberus
1 day ago

I’m sure he’ll replace it with another one, so that should help boost Tesla’s next quarter sales numbers.

This sad toolbag should never be allowed to drive again. It’s obvious he doesn’t want to, anyway.

No More Crossovers
1 day ago
Reply to  Cerberus

The thing I probably hate most about Teslas, aside from all of it, is that if you are making cars for people who don’t want to drive, why do they have to be either shockingly quick or incredibly heavy?

Urban Runabout
8 hours ago
Reply to  Cerberus

He shouldn’t be allowed to breed again either.
Darwin missed his chance.

Cerberus
8 hours ago
Reply to  Urban Runabout

I agree, but held myself back. Believe it or not, I try to restrain some of my aggressive negativity on here.

Jnnythndrs
1 day ago

Suicidal Swasticar.

Canopysaurus
1 day ago
Reply to  Jnnythndrs

Given the Nazi-friendly sympathies of EM (flies off to Germany to congratulate a far-right politician – who openly parades in Nazi regalia – and calls him the future of the country), is it any wonder a Cybertruck would attack Poles?

Ash78
1 day ago
Reply to  Canopysaurus

We need our own Alternativ fuer Douchebag. 🙂 And my apologies for making the Pole joke above, I missed yours. Credit where due!

Canopysaurus
1 day ago
Reply to  Ash78

No worries. Besides, there will be some who don’t see my comment, but will see yours, so they’ll still get a chuckle.

Edward Hoster
20 hours ago
Reply to  Canopysaurus

Why does everyone think he will attack the Poles? They have many of the same policies in place that the AfD want to implement. In fact, much of the EU is adopting the same policies, as that is what they want.

Cheap Bastard
19 hours ago
Reply to  Edward Hoster

Why does everyone think he will attack the Poles?

Lebensraum! And those Poles have experience building rockets!

https://wsmrmuseum.com/2020/07/27/von-braun-the-v-2-and-slave-labor/4/

Alexander Moore
1 day ago

Me when my car crashes itself into a light pole ‘but it’s ok because it’s so safe!!!1111!’

Live2ski
1 day ago

It’s the Stockholm Syndrome

CampoDF
1 day ago
Reply to  Live2ski

The type of people who call others “bears” for criticizing something that ought to be criticized have indeed fallen victim to Stockholm Syndrome, or at least toxic brain rot, which seems to describe most of these Cybertruck bros.

Ricardo Mercio
1 day ago
Reply to  CampoDF

You bears just don’t understand the appeal of crystal meth, it’s totally safe if you ignore all the deaths! They’re just edge cases, if you use it properly you’ll be fine! It’s the future, accept it already!

Phuzz
14 hours ago
Reply to  CampoDF

‘Bear’ as an insult is a new one on me, is it a reference to the stock market or something?

Jeff Elliott
12 hours ago
Reply to  Phuzz

Yes, accusing people of wanting the stock price to go down.

Moonball96
12 hours ago
Reply to  Jeff Elliott

Thanks, I’ve been sitting here for a while now wondering why in the world ‘bear’ was an insult!

Baltimore Paul
8 hours ago
Reply to  Moonball96

I thought it was an insult because Incels prefer twinks

Urban Runabout
8 hours ago
Reply to  Baltimore Paul

I thought it was an insult because women generally feel safer with Bears.

CampoDF
11 hours ago
Reply to  Phuzz

Yes, as others explained here, it refers to people who want the stock price of company X to go down. Pretty much the only people who use this “insult” are Tesla stans explaining that haters of Tesla are only doing it to short the stock, rather than pointing out the fantastical delusions of the owners of Tesla and the company’s current CEO alike.

No More Crossovers
1 day ago
Reply to  Live2ski

that’s not the only SS musk is interested in

JerryLH3
1 day ago
Reply to  Live2ski

That was exactly what I texted my friend when I sent him this story.

GFunk
1 day ago
Reply to  Live2ski

Don’t you mean Helsinki Syndrome, as in Helsinki, Sweden?

Harvey Park Bench
21 hours ago
Reply to  GFunk

You’re thinking of Oslo.

Moonball96
12 hours ago
Reply to  GFunk

I see what you did there – yippie ki yay 😉

A. Barth
1 day ago

I hope the light pole is okay.

OttosPhotos
1 day ago
Reply to  A. Barth

It should sue for damages.

Peter d
1 day ago
Reply to  OttosPhotos

If it is damaged, the pole owner (most likely the electric power company) will sue – that is what happens in these types of cases.
