
Why The Back Seats Of Pickup Trucks Are More Dangerous Than The Front Seats

IIHS crash test

There’s a big report out from the Insurance Institute for Highway Safety (IIHS) today that rates half-ton pickup trucks “poor” in rear-seat passenger crash tests. In fact, front-seat passengers are now arguably safer than rear-seat passengers in trucks and other vehicles. Surprisingly, this is sort of good news.

Today’s The Morning Dump will be safety-oriented, and you can read it if you want to, you can leave your friends behind. [Ed Note: This is a song reference. -Guy who never gets pop culture references (DT)]. General Motors reportedly announced it’ll stop production of its driverless van, just as Britain says that carmakers will be liable in self-driving crashes.


And, finally, we’ll talk about V2X technology, which offers the promise of reducing crashes by a significant amount. It’s a complex issue but an important one. And then we’ll introduce you to some friends.

IIHS: Pickup Trucks Need To Work On Rear Seat Safety

IIHS crash test: Ford

There’s a concept in modern pedagogy called teaching to the test. It’s the fear that kids only learn what will help them pass tests, leaving less room for other knowledge. I’m not going to get into that debate, but it’s clear that automakers do this as well. Specifically, automakers want to look good in tests performed by the IIHS, an insurance-industry-funded group that performs some of the most rigorous crash testing in the world.


Because every car has a person in the driver’s seat, the initial focus on crash safety was on the driver. That attention shifted to front passengers and, recently, has moved to the rear passengers. Here’s IIHS’s description of its new testing regimen, instituted in 2022:

In 2022, we updated the [moderate overlap] test to address lagging protection for rear occupants. Now, a second, smaller Hybrid III dummy has been added. The second dummy, whose size represents a small woman or an average 12-year-old, is positioned in the second-row seat behind the driver. The updated evaluation incorporates new metrics that focus on the injuries most frequently seen in rear-seat occupants.

It’s understandable that you’d assume the back seat is safer than the front because, in a head-on collision, the rear seat passenger is farther from the point of impact.

At least within the narrow constraints of IIHS testing, that’s not entirely the case. The IIHS tested the four best-selling large trucks in North America (Ram 1500, F-150, Silverado, Tundra) and found that, in the new moderate overlap front test, all of the trucks ranked “poor” for overall rear passenger restraints and kinematics, though the Tundra was rated slightly higher overall than the others.

That’s not great, but there’s a twist here: Rear seats are only relatively less safe because front-seat passenger safety has gotten so much better (in other words, the rear seat may not be horribly unsafe in absolute terms, but compared to the front seats, it lags; note that children should still remain in the rear, as we underline in a quote below). Look at this graphic from the IIHS for the new front moderate overlap test rear passenger evaluation:

IIHS Truck Ratings
Chart: IIHS

IIHS basically says as much itself:


IIHS launched the updated moderate overlap front test last year after research showed that in newer vehicles the risk of a fatal injury is now higher for belted occupants in the second row than for those in front. This is not because the second row has become less safe. Rather, the front seat has become safer because of improved airbags and advanced seat belts that are rarely available in back. Even with these developments, the back seat remains the safest place for children, who can be injured by an inflating front airbag, and the rating does not apply to children secured properly in child safety seats.

Those are important caveats about children. Some moderate overlap “overall ratings” went from the highest rating of good to the worst rating of poor solely because of the new testing. The good news is that, in a new (as of 2021) side test that uses a heavier barrier than before, traveling at a higher speed, the trucks mostly did well:

IIHS side crash test truck ratings
Chart: IIHS

Cars and trucks continue to get safer and, perhaps, there’s a point of diminishing returns where it becomes less cost-effective to chase tests from a non-governmental organization like the IIHS. With over 40,000 passenger vehicle deaths last year, we’re nowhere near that point, however.

In many ways, the next and most important frontier for vehicle safety is not what happens inside the car, it’s what happens outside, as roadway deaths for pedestrians and cyclists have risen by over 60% in the last decade. I’m going to come back to this point later.

Cruise Halts Driverless Van Production

Cruise Origin
Photo: Cruise

Almost exactly a year ago, Ford bailed on its driverless car startup Argo.ai, and signs point to serious struggles at GM-owned Cruise. First, Cruise had its permit revoked following a crash in which a robotaxi dragged a pedestrian down the street. Then Cruise announced it was suspending all of its driverless taxi service.

Forbes has a pretty big scoop this morning thanks to a recording of an all-hands meeting at Cruise wherein CEO Kyle Vogt says that the company is going to pause production of the fully autonomous Origin van being developed with Honda:


According to audio of the address obtained by Forbes, Vogt remarked on the company’s recent decision to halt driverless operations across its entire autonomous vehicle fleet, telling staff that “because a lot of this is in flux, we did make the decision with GM to pause production of the Origin.”

[…]

Vogt told staff during the meeting that the company has produced hundreds of Origin vehicles already, and that is “more than enough for the near-term when we are ready to ramp things back up.”

“During this pause we’re going to use our time wisely,” he added, noting that Cruise was in active discussion with partners and regulators.

This isn’t great news for Cruise but it makes sense. Until the company has buy-in from local governments and more trust with regulators, there’s no reason to keep pouring capital into a project you can’t use. Curiously, Honda showed the vehicle at the Japan Mobility Show, so this decision is likely recent.

Britain: Companies Are Responsible For Self-Driving Crashes

2024 Tesla Model 3 Rear

In the United States, our layered legislatures and generally reactive posture mean that new concepts and technologies are often evaluated first by the judicial branch. That’s true of driverless cars, where two successful court defenses by Tesla have fostered a sense that individuals, not automakers, are responsible for incidents involving automated driving.

The opposite is often true in Britain, where the government is on the verge of passing a law that clarifies that the maker of a driverless car is responsible for an incident if it occurs while the car is driving itself.

As Reuters reports, this is good for consumers but also probably good for driverless carmakers:


Self-driving industry experts have warned that building out national regulatory frameworks and establishing legal liability are crucial for public acceptance of autonomous vehicles and for insurers to provide coverage.

The bill will establish new processes to investigate incidents and improve the safety framework, and will also set the threshold for what is classified as a self-driving car.

If a vehicle falls short of that threshold, the driver will be responsible at all times, the government said, and the bill will prohibit misleading marketing so that cars that fall short of the safety threshold can’t be marketed as self-driving.

This!

US DOT: V2X Could Predict 12% Of Crash Scenarios

2024 Porsche Cayenne Dashboard
Photo: Porsche

Imagine your car telling you that you’re about to drive too fast to stop in time for a crosswalk. Or that a police vehicle is on the side of the road around the corner. Or that a bunch of bicyclists are going to soon be ahead of you on the road.

The technology for this exists. It’s called Vehicle-to-Everything, or V2X, and the U.S. Department of Transportation thinks it could predict, and help drivers avoid, 12% of potential crash scenarios.
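To make the first of those scenarios concrete, here’s a toy sketch of the kind of check a V2X alert could drive: given a hazard’s broadcast position and your current speed, is the hazard inside your stopping distance? Everything below (the function names, the 1.5-second reaction time, the 0.7 friction coefficient, the safety margin) is an illustrative assumption for this sketch, not part of any V2X standard or product.

```python
# Illustrative only: a toy "can you stop in time for the crosswalk?" check.
# All numbers and names here are assumptions, not from any V2X specification.

def stopping_distance_m(speed_mps, reaction_s=1.5, friction=0.7, g=9.81):
    """Reaction distance plus braking distance on dry pavement."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * friction * g)

def should_warn(speed_mps, distance_to_hazard_m, margin_m=10.0):
    """Warn if the hazard sits inside stopping distance plus a safety margin."""
    return distance_to_hazard_m < stopping_distance_m(speed_mps) + margin_m

# At 50 km/h (~13.9 m/s), stopping takes roughly 35 m, so a crosswalk
# broadcast 40 m ahead triggers a warning...
print(should_warn(13.9, 40))   # True
# ...while the same crosswalk 80 m ahead does not.
print(should_warn(13.9, 80))   # False
```

The point of the demo setups described below is exactly this: the math is simple, and the hard part is getting the hazard’s position into the cockpit early enough for the extra seconds to matter.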

Automotive News sent a reporter to a meetup of automakers and regulators in Michigan trying to push this technology:

The 5G Automotive Association hosted a demo last month at the University of Michigan’s Mcity test track in Ann Arbor to illustrate V2X that is ready for deployment, including alerts prompted by a staged construction site, a demo wheelchair user in a crosswalk and a simulated emergency response from a police car.

At each of these road dangers, a signal beamed from the hazard into the cockpit of the vehicle and displayed the warning on a tablet or a built-in infotainment center, accompanied by a blip or a chime, to alert the driver.

“These are experiences we all have, where an emergency vehicle is approaching and we don’t know what to do,” said Jyoti Sharma, the senior manager of network planning at Verizon. “If you get an alert, you have some time to react.”

There are some hurdles, though, including deciding who is going to take all this data and disseminate it (probably cellular networks). The biggest hurdle is probably the FCC, which ruled that the 5.9 GHz cell band can be used for V2X applications, but hasn’t explained exactly how yet.


Welcome Aftermath To The Independent Media

Aftermath crew
Drawing: Aftermath

First, there was Defector, made up of primarily ex-Deadspin folks. Then there was us, The Autopian, made up of a lot of former Jalopnik staffers.

We’re excited to welcome Aftermath to the informal club of ex-GMGers, which includes Gita, Riley, Chris Person, Luke, and a bunch of other writers you probably know and love.

Here’s their pitch:

As workers and owners, we’re beholden to no one but ourselves, and to you, our readers. When you subscribe, you’ll get access to writing that pursues the truth and casts a critical eye on gaming and the internet, that doesn’t need to placate capital or kowtow to PR. You’ll be supporting the kind of journalism our past experience has shown us you like best: honest and irreverent, written for people rather than SEO.

If you love video games please follow and support them!

The Big Question

Who should be responsible for a car crash with a driverless car?

52 Comments
Space
11 months ago

Wait so if V2X can only stop 12% of crashes then does that mean 88% of accidents are unavoidable?
What falls under that? Maybe:
Drunk drivers
Weather
Car/tyre defects
Distracted pedestrians

Manwich Sandwich
11 months ago

“Who should be responsible for a car crash with a driverless car?”

The driver. It has always been the driver. And if there is no driver (as in the case of a driverless car), then the owner of the vehicle takes responsibility.

That’s what many legal precedents have said and continue to say.

This has never been a complicated issue or even a subject of debate.

Last edited 11 months ago by Manwich Sandwich
Younork
11 months ago

This is very much a complicated issue as well as a debated topic. Making the manufacturer take the brunt of the liability makes a lot of sense because the end consumer has zero input in the software or hardware which makes the vehicle self-drive. While the manufacturer very much has input into those things. It is important to note that this is likely talking about a full self-driving system, not a driver-assist system similar to Tesla’s, Blue Cruise or Super Cruise.

Manwich Sandwich
11 months ago
Reply to  Younork

The manufacturer is only liable if there is a defect. That isn’t complicated either.

Now in the case of self driving not working the way it should, then you can argue that it’s a safety defect

Younork
11 months ago

Wouldn’t any time a self-driving vehicle crashes constitute it not working the way it should and therefore be a defect?

Manwich Sandwich
11 months ago
Reply to  Younork

Depends on the system. What we have right now in Teslas and other vehicles is just an advanced driver *assistance* system… where they make it clear that you, as the driver, still need to pay attention and keep your hands on the wheel.

Now a self-driving car on the other hand MIGHT be considered as having a defect if there is a crash… but not if the crash happened because someone else did something stupid.

Rad Barchetta
11 months ago

but it’s clear that automakers do this as well. 

100% true. A bunch of years ago I built some automation equipment for a company that made car doors. This machine processed both driver and passenger doors on the same car, which had different internal structures. I asked my customer why that was. His reply: “They don’t crash test the passenger side”.

SonOfLP500
11 months ago

Curiously, Honda showed the vehicle at the Japan Mobility Show, so this decision is likely recent.

I spoke to the Honda representative in charge of the Origin at the Japan Mobility Show on October 26th, and as far as he was aware Honda was all-systems-go for driverless taxi trials in Tokyo by 2026. Next morning the news was Cruise pulling out and everything on hold.

Last edited 11 months ago by SonOfLP500
Droid
11 months ago

Who should be responsible for a car crash with a driverless car?
j.torchinsky’s excellent book “robot, take the wheel” provides a framework: SAE J3016 level 3 is “self driving” but is commercially vaporware. level 2 is the highest performance available, but has a fatal flaw: “when in trouble, delegate” -i.e. if the level2 system hiccups with a situation it can’t handle, it gives command back to the operator, with little or no warning – but humans suck at monitoring such a system and chaos ensues.
so i suspect that legally the operator of a level2 driver aid gets the liability – but i think that ethically it lies with the producer.

LuzifersLicht
11 months ago
Reply to  Droid

” but i think that ethically it lies with the producer”

IMO only if the producer has been misleading about what they’re selling. And British lawmakers seem to agree.
If you sell a car as “has advanced driver aids that take over part of your workload so you can concentrate on traffic while the car does the rest” then any driver should be able to understand that they are responsible for the vehicle.
If, on the other hand, you sell a car that “does all the work for you, – just lean back and enjoy the ride as our advanced drivers aids make sure you get to your destination” then that sounds like the car is going to do the driving and in this case the manufacturer better make damn sure it can actually do so.

Last edited 11 months ago by LuzifersLicht
Droid
11 months ago
Reply to  LuzifersLicht

yes, there needs to be clarity about what is being sold: it’s level2.
level2 = legal operator liability.
if a level2 system is misrepresented as level3, e.g.fsd, then my sense is that the ethical liability tends to the manufacturer.

Dangerous_Daveo
11 months ago
Reply to  Droid

I think it needs to be situation based too… If my car (which it has done..) decides halfway through braking in normal traffic when in radar cruise, it’s done with braking, that is kind of a dick move.

As is if it decides something is something else at just the wrong time, such as maybe steering down an off ramp as it is following a line, and you’ve got to wrestle the thing back to stay on the highway, but timing is bad and you hit something as a result, it is a poor example.

These are maybe instances where if you were in full control yourself just wouldn’t be an issue, I know I’m not going down the off ramp, so I just don’t think about it doing something different. So either the system works, or it’s redundant because it is the thing causing the issue that wouldn’t exist if it wasn’t for it doing it’s own thing.

Don’t get me wrong though, I really like radar cruise, and it works great most of the time, but I’m constantly hovering the brake, and often brake earlier than it is.

Anyway, way I see it, unless everything can talk to everything all the time, we’ll never see anything fully self driving. And to get to that point, a hell of a lot of things need to get banned, and people need to accept that a lot more of their life is going to be tracked for the convenience.

Fuzzyweis
1 year ago

V2X could be really great, but wondering how much it will be fought by those opposed to being tracked.

I really like the concept of Waze but in my daily commute there’s not enough folks around me using it to get traffic or emergency vehicle warning, unless something’s been there long enough like a busted traffic signal and police doing the waving.

Also if it works like Waze where we create some alerts that may lead to info overload. I’m currently in a battle over ‘what is a pothole’ with another user, almost every day I get the ‘pothole ahead’ alert in the same spot, it’s a patch, which yes does have a slight dip, but it is not a pothole, whoever keeps reporting it as a pothole needs to drive New England roads for a season, then come back and tell me how a patch that was put in years ago and is still fully intact is somehow a ‘pothole’. So I say no longer there, next day, it’s back! Some days it’s not there and I worry about my mysterious nemesis, are they ok? Did the pothole get them? But after a few days the alert comes back, and I feel better, but the battle continues.

Mrbrown89
1 year ago

The most similar thing available right now to V2X for all the drivers using a smartphone (and much better if integrated to AA/Carplay) is Waze. When I see alerts that there is a hazard ahead of me, a car on the side of the road (in case I need to stop for a reason), those alerts make me pay way more attention to my surroundings

The only “issue” with Waze is that it’s based on users registering the data, not actual vehicle data that can register if there are slippery conditions, a car activated their airbags for some reason. If only Waze could have access to vehicle data and automatically send reports based on your driving (but anonymously) that would be awesome

DaChicken
1 year ago

Who should be responsible for a car crash with a driverless car?

The first step is actually coming up with a legal definition of “driverless car.” Right now, we have a lot of marketing lingo and numerous interpretations of the technology but nothing concrete to work from.

To me, driverless means that the human has no requirement or expectation to control the car when it’s in a “driverless” mode. At that point, any error (not necessarily every accident) is the manufacturer’s problem. If the mode only meets the definition of a driver aid and the human is required/expected to take control at a moment’s notice then it’s the human’s problem.

Even then, I’d expect most cases to pass through the courts to determine responsibility, anyway. It’s pretty much that way already with the humans in charge. There’s too many edge cases and differences in interpretation to make things easy, but having some laws with clear lines would help frame the arguments and eventually get case law on the books to work from as the tech shakes out.

TDI_FTW
1 year ago

At least within the narrow constraints of IIHS testing, it’s not entirely the case. The IIHS tested the four best-selling large trucks in North America (Ram 1500, F-150, Silverado, Ram 1500, Tundra) and found that, in a new moderate overlap front test, all of the trucks ranked “poor” for overall rear passenger restrains and kinematics, though the Tundra overall was rated slightly higher than the others.

The only way RAM could keep up is by double counting themselves!

Shop-Teacher
1 year ago

Who should be responsible for a car crash with a driverless car?”

When they begin actually selling those, the manufacturer. But nobody has yet, so still the driver.

But I do believe Tesla is guilty of deceptive marketing, and bears some level of culpability. “Full self driving” just is not what the name suggests. And the controls on the system are crucially lax.

Harmanx
1 year ago
Reply to  Shop-Teacher

How about if they just renamed it to “Aspirational Full Self Driving” until it’s no longer in beta?

Shop-Teacher
1 year ago
Reply to  Harmanx

HA!

Drew
1 year ago
Reply to  Shop-Teacher

Well, Cruise was operating some driverless cars. Then they hid data during a nasty accident, so they shouldn’t be operating now. I think the question, when it arises, will initially be between manufacturer and company operating them, because I think we’re going to see driverless cabs more often than driverless personal vehicles when and if we see them really rolling out.

Shop-Teacher
1 year ago
Reply to  Drew

Good points.

Canopysaurus
1 year ago

By “driverless car” are you referring to vehicles with literally no human operator or so-called “self-driving” cars?

If it’s the former, obviously the company is at fault for accidents. If it’s the latter, that is a can of worms.

Drivers are ultimately supposed to be in control whether operating their vehicles directly or in self-driving mode as the recent judgement for Tesla in CA affirms. However, in the case of a self-driving system failure in a high speed environment, it doesn’t seem reasonable or fair to expect even the most attentive and lightning-reflexed driver to take control and compensate for that system failure before disaster occurs. So, muddy waters, legally.

Parsko
1 year ago

V2X

Thank you for bringing this back up again. This, IMHO, is by far the single most important safety solution we can possibly implement. I like the way it’s suggested here too. If I had 3 more seconds to respond to events ahead of me, the number of crashes is going to plummet. Not disappear, but plummet. This is the first time i’ve seen it be suggested as not active, but rather just an alert and a sound. I like this as a first implementation. It still keeps 100% of the control on the driver. This is going to be important while people are still coming to grips with relinquishing control of their car over to computers.

If even 25 or 50% of all cars had a simple ping implemented, it would still do the work to alert the other drivers on the road, passively. If a driver panic-stopped in front of you, and 25% of cars had this system, presumably only the first 3 cars would be at risk before the alert communicated to driver 4 (and subsequently to drivers 4+).

After time and confidence build, add more to the system. Every “upgrade” would improve vehicular safety in good leaps. More than incremental, but less than the shock of fully implementing these systems with no transition.

I’m so ready for this. I can reduce my gap from 1-2 seconds to 0.5-1 second and still maintain a level of safety I am comfortable with at highway speeds.

DadBod
11 months ago
Reply to  Parsko

Just imagine how the anti-vax election-denier crowd is going to respond to V2X

Parsko
11 months ago
Reply to  DadBod

Roughly equivalent to complaining that your clothes dryer lets you know it’s done drying your clothes!

DadBod
11 months ago
Reply to  Parsko

Don’t bring reality into this

Parsko
11 months ago
Reply to  DadBod

🙂
Sucks having a brain, doesn’t it. All that extra work.

DadBod
11 months ago
Reply to  Parsko

“The world is made for people who aren’t cursed with self-awareness.”

Parsko
11 months ago
Reply to  DadBod

ROFL, stop, you’re killing me.

Amschroeder5
1 year ago

This just in, trucks are bad passenger vehicles. Weird. Almost like there should be restrictions and regulations around that. Weird.

Rust Buckets
1 year ago
Reply to  Amschroeder5

Does there need to be regulations limiting using pickups as passenger vehicles? It was a non issue for the first 100 years of pickups existing.

Maybe we need to get rid of the regulations promoting pickups as passenger vehicles and it’ll sort itself out.

Amschroeder5
1 year ago
Reply to  Rust Buckets

It has been an issue for decades tbh, and has only gotten worse.

Rust Buckets
1 year ago
Reply to  Amschroeder5

Decades? To me it doesn’t seem like it was an issue at all until it started with the 1997 SuperCrew f150. Even then it was another 8 years before crew cab family pickups were all that common. I think this issue has existed for 15-18 years now, but most of the severity has come in the last 6 years or so.

Amschroeder5
1 year ago
Reply to  Rust Buckets

Extended cab pickups have been child ferrying mall warriors for more than 20 years (including 90s kids such as myself), but yes, it is absolutely true that the proliferation of crew cabs has made it more evident. Not that the body on frame SUVs (and precursors) from the 80s/90s were really all that different or safer, with rollover risks that might well have been worse.

Baja_Engineer
1 year ago
Reply to  Rust Buckets

The SuperCrew debuted in 2001. And Crew Cab sales took off about 3 years later with the 11th gen F150. By then every automaker already had a half ton Crew Cab pickup on their lots

Rust Buckets
11 months ago
Reply to  Baja_Engineer

Thanks, I wasn’t sure that the crew cab actually came out with the jellybean in 97

Ben
1 year ago
Reply to  Amschroeder5

Citation needed. It’s not clear that any other vehicle type would do better. The point of the article is not that trucks are bad, it’s that automakers design vehicles to pass the existing tests and when you add a new one they tend to do poorly at it. The same applies to smaller vehicles too.

Amschroeder5
1 year ago
Reply to  Ben

NHTSA estimates accidents with light pickups are 23% more deadly than other passenger vehicles, as well as 3x the rollover rate, which is the deadliest accident class as a whole.

AceRimmer
11 months ago
Reply to  Amschroeder5

Weird, I’m 4 days late to this and no V10guy to shoot you down and sing the praises of trucks being perfectly reasonable DDs.

MrLM002
1 year ago

IIHS says we need more single cab pickups!

Taco Shackleford
1 year ago

Manufacturers should be responsible, however this opens a whole can of worms that will end up with consumers still being responsible for the costs of said car crashes. If manufacturers are to blame, they will need large insurance policies for each vehicle equipped with self driving. These insurance costs will be passed on to the customer, who may also need to have insurance in the state they live to be able to register the vehicle. The company making self driving may be “liable” but customers will foot the bill entirely.

Drew
1 year ago

Who should be responsible for a car crash with a driverless car?

The regulatory body that approved it?

Seriously, though, that’s going to require a legal framework that I don’t have faith in. We’re likely to end up with all sorts of weird things going on. There’s going to be overlapping responsibility with:

  • manufacturers who built the things to operate within the given parameters
  • owners who may fail to ensure the vehicle is updated, in good repair, and in operation where/when/how appropriate
  • operators who choose to turn on systems in conditions outside normal parameters
  • those who maintain signage and roadways, if signs and markings are allowed to be obscured or degrade
  • other drivers/outside influences, if they would be liable in other situations (this includes things such as exploiting a glitch with imagery that will fool the computer, though that liability should be shared with the manufacturer)

The terrible spot we’ll be in will be these groups all pointing fingers at each other. We’ll need fair and unbiased agencies reviewing each incident and taking all variables into account. All data should be available without potential for manipulation (stored locally on the vehicle, ideally). If it must be stored elsewhere, there should be strict controls and audits to ensure no one manipulates the data.

All investigations should be compiled in a centralized reporting agency that can track patterns and require companies to address any recurring issues or vulnerabilities found.

Geoffrey Reuther
1 year ago
Reply to  Drew
  • Owners/operators who modify the system or disable components within the system, or otherwise tamper with the vehicle in a way that affects driving dynamics or autonomous systems.

Doesn’t matter how much manufacturers lock down their hard/firm/software, there will always be someone out there that figures out a way to modify it because they don’t like X or want it to do Y.

I bring this up because of the sheer number of people who cut or unplug the speakers on their EV to kill the low speed pedestrian warning system because “it wakes up the neighbors in the middle of the night” (Translation: “I just don’t like it, and this is the thinnest veiled excuse I could think of.”)

Drew
1 year ago

Yep, I knew a guy who bought an EV and immediately cut the speakers. Not, like, try them to see how much the sound impacted him…just cut right away, because he didn’t think he should have to keep them. I’m sure he’d try to make a self-driving vehicle ignore parameters, because he also already made sure his DJI drone would ignore geofencing.

V10omous
1 year ago

Who should be responsible for a car crash with a driverless car?

I feel as though this should be answered by thousand dollar an hour consultants rather than little old me in a comment section but I’ll take a stab at some simple guidelines:

-If the self-driving system is manual, ie you must turn it on or off, the driver is responsible for ensuring that it is only used in conditions safe enough to foreseeably prevent a crash. Not when snowing, not when icy, not when towing, not in heavy traffic, etc. depending on the capabilities of the system. These capabilities/warnings must be printed in the owners manual and visible on the screen of the car before using.

-If the system is automatic, and/or if it has been turned on in a safe situation as outlined above, the manufacturer/programmer is responsible for at-fault accidents.

-If a crash involving a self-driving car occurs, the same forensics to determine fault must take place as they do between two human-driven vehicles.

-Crucially, any monitoring software must be at least readable by third parties in the aftermath of a crash. In no cases will an OEM be allowed to determine or contest their own fault in a crash simply by reading the software internally.

Last edited 1 year ago by V10omous
Parsko
1 year ago
Reply to  V10omous

I agree with this, completely. Also, in theory, the number of crashes should drop, so that 40,000 should become 36,000 in year 1. As such, the litigation should drop as well.

ElmerTheAmish
1 year ago
Reply to  V10omous

I like that last point, which commonly isn’t covered. There could absolutely be laws that cover information sharing from the OEM if they want to advertise & sell an AV.

Thinking about point one: Last year I bought my first car with adaptive cruise control. On one of the few snowy days we had last winter, we got a type of heavy, wet snow, that covered the main sensor, and rendered my adaptive cruise inoperable; I get why that choice was made. I also wanted to turn on “regular” cruise, just to learn the system. I was shut out from even that.

If my car has some sensibility programmed in to tell me “hey, I can’t adaptively cruise for you, which means you probably shouldn’t be cruising at all” (which was appropriate in the conditions I was in), why can’t the OEMs work some of that programming in for safety’s sake?

Typing this out and thinking it through a bit, I’d bet if your basic framework were to proceed, OEMs would do something similar to protect themselves, if nothing else. As much as I believe the OEMs will need to be responsible if the car is in a self-driving mode, they shouldn’t have to account for idiots using self-driving when the car obviously can’t handle it. What’s the line? “The problem with designing something to be completely foolproof is that one naturally underestimates the complete fool.”

V10omous
1 year ago
Reply to  ElmerTheAmish

they shouldn’t have to account for idiots using self-driving when the car obviously can’t handle it. 

I think this needs to be handled by the vehicle being able to detect unsafe conditions and refusing to go into self-driving mode. Luckily, this aligns with manufacturer incentives to reduce liability, so will probably appear sooner rather than later.

StillNotATony
1 year ago

Who is responsible when a driverless car crashes? Easy. The manufacturer.

The problem only arises when there IS a driver that theoretically COULD be in control. At that point, I’d still say most of the liability lies with the manufacturer, especially if the system is in ANY way described as “self driving”.

RustyBritmobile
1 year ago
Reply to  StillNotATony

Easy – the operator (driver, pilot in command…). A system for dealing with crashes in this way is in place. The current system assesses blame (there will be some driverless car crashes that are not the fault of the driverless vehicle and are not preventable), allocates cost, etc. Arguable cases (see StillNotATony, above) will be settled within an existing framework with existing methods. Operators will of course continue to need insurance, insulating them financially. Importantly, this WILL penalize manufacturers who build unsafe self-driving vehicles – insurance rates for their cars will be higher (as a result of insurers analyses of self-driving performance), people will be hesitant to buy cars that crash a lot, and you can always sue or negotiate compensation for a ‘manufacturing defect’ – as you can now. The existing system works (at least sort of); replacing it with one that is entirely new will be a nightmare.

Wally_World_JB
1 year ago

If your friends don’t read, then they’re no friends of mine

Data
1 year ago
Reply to  Wally_World_JB

I feel like this has to be from the classic “Weird Al” song The Brady Bunch.

https://www.youtube.com/watch?v=WkFMMCEa5AE
