Comments

Chippopotanuse t1_j95dj4t wrote

Weird how Teslas keep crashing into emergency vehicles. It happens so frequently that NHTSA launched an investigation:

https://www.autoweek.com/news/green-cars/a37425353/another-tesla-hits-police-car/

> The latest Tesla crash into a first responder vehicle comes just two weeks after the National Highway Traffic Safety Administration opened an investigation into 11 instances of Tesla drivers hitting parked emergency vehicles while using the Autopilot driver-assist system in the US. The incidents date back several years and follow a surprisingly common pattern: First responder vehicles, including police cars and fire trucks, stop to assist disabled vehicles on road shoulders or traffic lanes, using emergency lights to direct traffic around them, as Tesla vehicles with Autopilot engaged collide with them either with or without attempts by the driver to brake in the seconds prior to impact. Some crashes have resulted in serious injuries, as they happened at highway speeds.

195

NLJeroen t1_j95h2ck wrote

Probably because emergency vehicles aren't encountered often enough to have a significant impact on the learning models.

77

Agent_Angelo_Pappas t1_j95u8sw wrote

Except other automated systems like SuperCruise and ProPilot don't seem to have this same issue. Tesla's automation hits emergency vehicles disproportionately often relative to how many such systems are on the market.

77

Koksny t1_j95utd4 wrote

Decent developers and engineers no longer want to work for balding manbaby, underpaid and overworked.

Gig for Mercedes is much better than slaving for the archcunt.

84

TenderfootGungi t1_j9704bx wrote

Possibly, but they have had several incredible people leading the program. It probably comes down to Musk insisting they do it with cameras only. Everyone else also has LiDAR.

16

schiffty1 t1_j96uwwo wrote

Oh no, you've angered the muskrat horde.

15

VegasKL t1_j96xz98 wrote

>SuperCruise

That's also different technology, AFAIK. I think GM maps various roads with lidar vehicles and then loads those maps into the cars for cross-referencing against their position -- done this way so they don't need a bunch of lidar units on the vehicle processing in real time. They likely have some forward-facing lidar or radar (or both) units.

Elon wants to be cheap and do it solely with cameras.

8

razorirr t1_j95x698 wrote

Are they though?

In 2019, an estimated 2,500 vehicles crashed into firetrucks parked as blockers (6.8 crashes every day or 16% of all firetruck collisions).

https://www.workzonebarriers.com/emergency-response-firetruck-collision-crash-facts.html#:%7E:text=In%202019%2C%20an%20estimated%202%2C500,of%20all%20traffic%20fatalities%20nationwide

Tesla has had around a dozen, but that's over 5 years.

There are around 2 million Teslas, and they all have AP at this point, out of 248 million cars total: 0.8% of the fleet. About 2 of the 2,500 yearly firetruck crashes are Teslas, or 0.08%. So Tesla is 10x better than everyone else.
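
Spelled out, that back-of-envelope comparison looks roughly like this (a minimal sketch using the figures as stated above, all of which are rough estimates rather than official data):

```python
# Reproducing the comment's back-of-envelope math; every input here is the
# commenter's rough estimate, not an official figure.
teslas_with_ap = 2_000_000            # estimated Teslas on the road, all with AP
total_us_cars = 248_000_000           # estimated total US vehicles
fleet_share = teslas_with_ap / total_us_cars               # ~0.8% of the fleet

tesla_firetruck_crashes_per_year = 2      # "around a dozen over 5 years", rounded
all_firetruck_crashes_per_year = 2_500    # parked-as-blocker crashes, 2019 estimate
crash_share = tesla_firetruck_crashes_per_year / all_firetruck_crashes_per_year  # ~0.08%

print(f"fleet share {fleet_share:.2%}, crash share {crash_share:.2%}, "
      f"ratio {fleet_share / crash_share:.0f}x")   # -> roughly 10x
```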

Also, I feel you don't hear about the others because their systems are in an insignificant number of cars and usable in an insignificant number of places. Once they scale to "yeah, it works everywhere," their numbers will go up.

−30

Agent_Angelo_Pappas t1_j967fmm wrote

https://www.nytimes.com/2022/06/15/business/self-driving-car-nhtsa-crash-data.html

NHTSA makes manufacturers with automated assist systems on the market report crashes involving those technologies. Despite having only a minority share of the ADAS market today, Tesla accounted for about 70% of the reported crashes.

27

razorirr t1_j9685bf wrote

Can't read it, paywall.

How much of a minority, and how many miles do the others have? Ford, for example, is all happy theirs has been active for 16 million miles recently; Tesla is around 3 billion since it came out, adding about a billion a year. So every 1 Tesla is worth 97 Fords.

−17

razorirr t1_j98uvkc wrote

Hahahaha. That report is a news article talking about the NHTSA report I got my 2 AP crashes from.

If you take the estimated miles driven on AP and the estimated miles driven by everything else, AP has a crash rate of 0.0009 per 1,000,000 miles into first responder vehicles, and that's assuming both crashes reported in that report were Teslas. All cars overall broke out to 0.001 per 1,000,000 miles.

So forcing everyone to use AP would reduce crashes into parked firetrucks by 290 a year or 11.5%.

So if you want to use that article as a reason against AP, feel free, as it's actually a reason to ban humans and use AP.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

It's page 7 of nine, second chart, which shows ADAS crashes broken down by the type of vehicle struck. Further, the data in that report can contain double counts: if Tesla reported a crash and the police separately reported it, it's counted twice. It also does not mean "the Tesla crashed into me." If you rear-end a Tesla and the Tesla was on AP, it goes in the report, even though it was not the Tesla's fault.

1

nfollin t1_j98wgyn wrote

Bro, I just linked it for you, relax.

1

jayfrancy t1_j95zvzu wrote

How many other autonomous systems have had these accidents? That's the apples-to-apples comparison.

22

razorirr t1_j963cf0 wrote

How much do you want to actually read? I can answer this.

NHTSA has a standing order on ADAS crashes. All manufacturers are required to provide telemetry and to report any crash that occurred with ADAS either on, or on within the previous 30 seconds. This reporting started in July 2021 and is still current.

You can read their June 2022 findings here; the next report will be next June.

In that whole time period, only 2 crashes were confirmed into first responder vehicles total, for any brand.

So every other article you have seen from June 2021 through May 15th, 2022 (the cutoff date in that report) is bullshit. It's the press going "oh, it's a Tesla and a responder vehicle, let's accuse AP/FSD, get a shitload of clicks from people on Reddit, then not release a retraction months later when it's found not to have been the cause."

As to my significantly insignificant bit: yeah, both crashes might have been Teslas (the report does not break it down to that detail). But their system works everywhere, and is on way, way more cars than Ford's or GM's. Ford was happy when they hit 16 million miles driven total; Tesla's system does north of a billion a year. If Tesla was both crashes on 1 billion miles and Ford has 0, you can claim "well, Ford is perfect." No, Ford just has not had enough time to be statistically relevant.

The only other brand with a significant number of vehicles is Honda, with about 5 million. Their system, however, does not function everywhere, so there's the question of whether they are better at not crashing or just don't have a crashes-per-mile figure out there, as they have not released usage mileage. I can't do apples to apples with them: Tesla has shown their apple, and the others all have a black box they say may or may not have a fruit in it.

−10

woody60707 t1_j95xg1d wrote

Look, no one has time to read your wall of text! Is Tesla bad, yes/no?

−21

razorirr t1_j95yrg5 wrote

Tesla great! They crash into firetrucks 10x less than all cars when figured as crashes per car. They just make the news because dumbasses click, then bitch.

−21

TheLaGrangianMethod t1_j95zcce wrote

Does this account for the autopilot variable? Which is kind of what this whole thing is about? Tesla autopilot not seeing the first responders?

15

razorirr t1_j964ohq wrote

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

This report does. Manufacturers are required, as of July 1, to have the cars monitor when the systems are on or off, and to report any crash that happened with the system on, or on within the 30 seconds prior to the crash.

From July 1, 2021 to May 15, 2022, only 2 crashes total were into first responder vehicles. The report does not specify which brand was involved, but even if it was Tesla for both, it's probably inevitable: Tesla reports about 1 billion miles a year where the car is driving; Ford reported 16 million in a press release.

If we find that 1 crash in 500 million miles is the average, Ford's 16 million miles is only 3.2% of that distance. It's not that Ford never crashes; it's that they have not done enough driving to hit the point at which a crash was statistically probable yet.
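
To make that concrete, here is a minimal sketch of the expectation argument; the 1-in-500-million rate is the purely hypothetical average from the comment, and the mileage figures are the rough ones quoted above:

```python
# Expected crash counts under an assumed (hypothetical) average crash rate.
assumed_rate_per_mile = 1 / 500_000_000   # hypothetical: 1 crash per 500M miles

ford_adas_miles = 16_000_000              # Ford's publicized total, per the comment
tesla_ap_miles_per_year = 1_000_000_000   # rough Tesla AP mileage per year

print(f"expected Ford crashes:  {ford_adas_miles * assumed_rate_per_mile:.3f}")        # ~0.03
print(f"expected Tesla crashes: {tesla_ap_miles_per_year * assumed_rate_per_mile:.1f}")  # ~2.0
```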

4

SporkofVengeance t1_j95q0qn wrote

As the training set is basically using real cars as alpha/beta testers, that’s likely true for Tesla. Most other companies now are using synthetic data to train their AVs and so test a far wider range of scenarios.

19

VegasKL t1_j96ymhz wrote

It shouldn't be an issue of training data anymore. Tesla uses a lot of synthetic (3D-generated) data now, so they can train the same exact scenario with a ton of variables swapped out over and over again. Nvidia (IIRC) did a presentation on the tech.

They also had this issue with box trucks, if I remember correctly.

3

Raspberries-Are-Evil t1_j96hgmc wrote

Or because Teslas are not "self driving." The driver is responsible for the car, and this is no different than someone in a Honda hitting a truck and killing themselves.

−5

razorirr t1_j965fji wrote

Weird how you brought that up when this article does not even attempt to blame it on Autopilot / FSD.

Frankly, there are 2,500 crashes like this a year across the whole vehicle fleet, 6.8 of them per day. So if that was 1 article, where are the other 5.8 articles? Oh wait, they don't involve a Tesla, so they won't make the news.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

From July 2021 to May 2022, there were 2 crashes proven to have been ADAS of any brand's "fault." And by fault I mean the NHTSA's order of "if the system was off, but had been on up to even 30 seconds beforehand, it counts."

Tesla does about a billion miles a year right now. Ford was happy to put out a press release about their cars having hit 16 million total. So even if both of the 2 in that report are Teslas, and we find out the average is 1 crash per 500 million miles driven, 16/500 = 3.2%. Ford simply has not had enough usage to have their 1-in-500,000,000 happen yet.

11

bobjoylove t1_j96hyfd wrote

Nevertheless, with ADAS this exact collision type should be 100% avoidable absent extenuating circumstances (ice on the road, impact from another human-driven vehicle). The reason it's not is Tesla's refusal to use ranging technology like radar, and its insistence on cheaper visible-light cameras.

17

razorirr t1_j96oqwi wrote

Your statement shows you don't know how car radar works.

Cars use radar to measure Doppler shift. This is how they tell whether the car in front of you is moving faster or slower than you. Because the speed of the signal is a known constant, it can also give you distance.

In driving conditions, you have to throw out any measurement of something that isn't moving, such as that parked firetruck, and mark it as invalid. This sounds ridiculous, but it's for a simple reason.

Pretend you are in a car with radar and you are driving down into a valley. The car will see the bottom of the valley, where you would start driving up the other hill, as a static object, and the car would stop. With radar alone, you can't tell that valley from a police car.
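
A minimal sketch of the kind of static-return filtering being described here; this is an illustration of the general idea, not Tesla's or any vendor's actual implementation:

```python
# Illustrative only: many radar-based cruise pipelines discard returns whose
# velocity relative to the ground is ~zero, because road surface, overpasses,
# and roadside clutter all look like "stationary object ahead".
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the detection
    rel_speed_mps: float  # closing speed from Doppler (negative = approaching)

def moving_targets(returns, ego_speed_mps, tol_mps=1.0):
    """Keep only detections that are themselves moving.

    A detection whose closing speed equals our own speed is stationary in the
    world frame (the road at the bottom of a valley, a parked truck), so it
    gets filtered out -- which is exactly the limitation described above.
    """
    return [r for r in returns
            if abs(abs(r.rel_speed_mps) - ego_speed_mps) > tol_mps]

# Example: at 30 m/s, a lead car doing 25 m/s closes at 5 m/s and is kept;
# a parked firetruck closes at 30 m/s and is dropped.
detections = [RadarReturn(80.0, -5.0), RadarReturn(120.0, -30.0)]
print(moving_targets(detections, ego_speed_mps=30.0))
```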

−3

bobjoylove t1_j96wdpy wrote

Your statement shows how you don’t know how software works.

You augment the camera with the RADAR. When the two diverge significantly the system will error and hand back control to the driver.

11

razorirr t1_j96yztn wrote

That wouldn't do anything

Like I explained, the radar in this case would give "All clear"

The camera in this case should have given "firetruck" but gave "all clear."

Erroneous camera "all clear" + radar "all clear" by design = all clear = crash.

Camera "firetruck" + radar "all clear" = stop.

The radar "all clear" in this case is unneeded, because it will never be anything but all clear, and the divergence-triggered stop is not needed because the firetruck-triggered stop would apply anyway.

From a QA guy telling the (probably) developer guy your logic is bad: you could program the radar to always return "blocked" if it sees any static object, but then that causes a problem.

  1. If the radar says blocked and the camera sees something, that is a stop due to agreement.
  2. If the radar says blocked but the camera does not see anything, that is a stop due to divergence.

Your car would never be able to go anywhere in the system you proposed other than on an unblocked flat surface.
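
As a toy illustration of the decision table being argued over (purely hypothetical logic, not any production system's fusion code), note that the failure mode only appears when both sensors report clear:

```python
# Toy decision logic for the camera/radar combinations discussed above.
# Purely illustrative -- not Tesla's, GM's, or anyone's actual fusion code.
def decide(camera_sees_obstacle: bool, radar_sees_static_block: bool) -> str:
    if camera_sees_obstacle and radar_sees_static_block:
        return "stop (sensors agree)"
    if camera_sees_obstacle:
        return "stop (camera alone is sufficient)"
    if radar_sees_static_block:
        # The disputed case: if a raw static radar return always forced a stop
        # (or a divergence hand-back), the road surface on a downhill approach
        # would trigger it constantly.
        return "stop (divergence / static return)"
    return "proceed"

# The failure mode in the thread: camera misses the firetruck, radar has
# already discarded it as a static return -> both report clear -> proceed.
print(decide(camera_sees_obstacle=False, radar_sees_static_block=False))
```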

Love all the instant downvotes all my posts are getting. Seems a lot of people don't know what they are talking about but think they do.

−3

bobjoylove t1_j971ijk wrote

The Radar is used for ranging. It provides a distance and a rate of change over a reasonably narrow aperture. The bottom of the valley does not get close enough to warrant emergency intervention from the braking system.

The fact that the majority of cars with dynamic cruise and automated pedestrian braking all use 60 GHz radar as the detection method should tell you it is possible and already shipping.

8

razorirr t1_j9731y8 wrote

>The Radar is used for ranging

Correct

>The bottom of the valley does not get close enough to warrant emergency intervention from the braking system.

Incorrect

You are driving down the hill; it's a 1-mile slope from top to bottom, then it curves and goes up the next hill.

You are right that while it's far away you can ignore the reading, because the ranging is saying "yeah, I see something, but it's 3,000 feet off, who cares," or it just sees nothing because it's not looking that far out.

But since it's not moving and you are, eventually you will be 200 feet from the bottom. Radar sees this as an object blocking your path, and it's now close enough that the car goes "yeah, I see something, it's 200 feet away, let's stop."

Since the ground is never going to move, radar will always say stop. A camera with sufficient data-labeling ability can overcome this because it can tell context; radar never can, because it is a binary "blocked / clear."

Also, here's Chevy explaining how they do their pedestrian braking: https://www.chevrolet.com/support/vehicle/driving-safety/brakes/front-pedestrian-braking. It uses cameras, not radar.

5

bobjoylove t1_j977j8u wrote

Ok, let's agree to disagree on the technical aspects of a known working collision avoidance system that is shipping on millions of cars, including my own.

It’s good to have a secondary system to cross-check the cameras. I have noted that many (not this one) cases of the Tesla systems failing have been at night. Adding RADAR or LIDAR augments the cameras. BTW the answer in the back of the book is Tesla have realised that they actually do need RADAR and have begun adding it. https://electrek.co/2022/12/06/tesla-radar-car-next-month-self-driving-suite-concerns/

6

razorirr t1_j978ckm wrote

No. This is a technical conversation about how a technical system works. You can't agree to disagree on those aspects, or it's impossible to come to an agreement at all. The only way to settle this would be for you to show the car would not stop forever on that hill once the radar and the camera diverge, if divergence = stop, or, in the case of pure radar, sees blockage = stop.

I agree augmenting is good. The radar can measure the range of an object better than camera vision can guesstimate it. But what I was talking about is a known limitation of radar. You cannot "augment" around that; you have to throw the data out, and if you are throwing it out 100% of the time, you don't need it.

0

bobjoylove t1_j97ch2n wrote

Do you ever think that, even when provided with a link proving me right (specifically, Tesla adding radar to fix their issue), and you still argue that isn't the resolution, you might just be stubbornly wrong?

3

razorirr t1_j97dmf4 wrote

Did you ever think that they could be putting the radar in to augment all the other situations where radar is helpful, but that due to the limitations of radar, this is not one of those situations?

Actually read and comprehend that article. The OG radar my car has was insufficient compared to just cameras. The one they are putting in can see much farther, but it will still have the issues I've explained above, as that is a fundamental issue with radar.

So now instead of seeing the bottom of the valley at 200 feet, it sees it at 400 feet. All of the same problems occur, and the car still cannot proceed to the bottom of the hill if it's programmed to always stop on a radar blockage or a radar-vs-camera divergence. Radar will never be helpful for static objects in the path, but it will be really helpful for letting the car know something is in motion 400 feet away.

0

bobjoylove t1_j97ekv4 wrote

It’s clear they have reviewed the data and made a decision. If that’s not enough to convince you, then I’m not sure what more you might need.

Have a good day.

3

razorirr t1_j97g29v wrote

I will freely admit radar can help in situations. This situation is not one of them because of how radar works. You have convinced yourself otherwise and now refuse to correct your incorrect opinion.

Have a good one.

1

TenderfootGungi t1_j97ac0b wrote

>In the conditions you have driving, you have to throw out any measurement of something not moving, such as that parked firetruck and mark it as invalid.

That should depend on where it is. Is it on the side of the road? Not an issue. In my lane? Real issue.

It is telling that no other self driving tech is having trouble with this. Everyone else has this figured out.

3

razorirr t1_j9an80n wrote

You would think that, but no. If you consider anything in front of you that isn't moving as a reason to stop, then put yourself on a hill: your radar is now pointing down the slope, so as you approach the bottom your vision will tell you "I'm ok to proceed, it's just the slope leveling off," while radar will tell you "oh shit, there's a stationary object, brake now." Stationary objects in the path are a limitation of using radar, which can't tell what the object is.

1

smoke1966 t1_j96ns58 wrote

Blinded by the light, probably. Also useless in snow, and it can't see a semi trailer against a white sky.

1

tapac333 t1_j96ehdu wrote

Emergency vehicles run red lights. There are more Teslas than any other brand's self-driving cars on the streets, so the probability of Teslas hitting vehicles that don't abide by traffic signals would be higher.

−8

Chippopotanuse t1_j96mwhw wrote

Okay…but these are emergency vehicles that are on the side of the road and stopped.

Literally the excerpt from the NHTSA report I pasted says these emergency vehicles were stopped on the side of the road to help folks.

The flashing lights on emergency vehicles confuse the Tesla AI. It's been a known problem for years. Elon and his fanboys try to gloss over it or play whataboutism games to avoid having to address it with any substance.

4

razorirr t1_j96xtfb wrote

So, we don't have all the official numbers for things, but we can take a crack at this.

https://lexfridman.com/tesla-autopilot-miles-and-vehicles/ Lex Fridman, an MIT research scientist, has sat down with the sales figures and the AP miles-driven numbers Tesla has occasionally given out, and as of the last update he posted, roughly 1.8 billion miles would have been driven between 4-22-2020 and 1-1-2021. For a full year that's 2.662 billion miles, or 221.9 million per month.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

NHTSA says that over the course of 10 months, there have been two confirmed ADAS-related crashes into first responder vehicles.

https://smartfinancial.com/average-miles-driven-per-year

If you take the 12,785 miles-per-driver 2020 average, which they calculated from the 2020 FHWA report, and 228 million drivers, that's about 243 billion miles a month.

https://www.workzonebarriers.com/emergency-response-firetruck-collision-crash-facts.html

This report shows that 2500 trucks a year parked as blockers get hit. 250 per month

250 accidents per month / 243000 million miles = .001

.2 accidents a month / 222 million miles = .0009

So if you take all the different reports in context with each other: non-Tesla driving hits 0.001 firetrucks per million miles driven; Tesla AP hits 0.0009 firetrucks per million miles driven.

Tesla AP is slightly better than all humans plus the other AP systems. If we replaced everything else with Tesla AP, we would have reduced the accident count by 291.5.
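
Laid out in one place, the arithmetic above looks like this (a sketch that simply reproduces the comment's estimates; the double-counting and attribution caveats from the NHTSA report still apply):

```python
# Reproducing the comment's estimate of firetruck-crash rates per million miles.
# Every input here is a rough figure from the sources quoted above, not a
# precise measurement.
tesla_ap_miles_per_month = 222e6          # ~2.66B per year per the Fridman estimate
us_fleet_miles_per_month = 243_000e6      # 228M drivers * 12,785 mi/yr / 12

ap_firetruck_crashes_per_month = 2 / 10   # 2 crashes over the ~10-month NHTSA window
all_firetruck_crashes_per_month = 250     # per-month figure used in the comment

ap_rate = ap_firetruck_crashes_per_month / (tesla_ap_miles_per_month / 1e6)
fleet_rate = all_firetruck_crashes_per_month / (us_fleet_miles_per_month / 1e6)

print(f"AP:    {ap_rate:.4f} crashes per million miles")     # ~0.0009
print(f"Fleet: {fleet_rate:.4f} crashes per million miles")  # ~0.0010
```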

−1

[deleted] t1_j95oz1a wrote

[deleted]

−34

Chippopotanuse t1_j95psyk wrote

How many of those cars claim to have "full self driving" and have people doing everything from reading the newspaper on the highway to taking a nap while the car is going 70 miles an hour?

Sure, the fine print says “but there has to be a human ready to take control at all times!“

But there’s only one auto maker that brags about their self driving hardware. And so yeah, it is newsworthy when the one company who claims they can do it is failing spectacularly at it. And getting people killed.

38

[deleted] t1_j95qsbq wrote

[deleted]

−35

Chippopotanuse t1_j95rlgv wrote

They all had autopilot engaged. You can ignore reality if you’d like.

35

jenkinsleroi t1_j95tq03 wrote

Your statistic is meaningless unless it's normalized by number of miles driven. And I suspect the number of miles driven by non Tesla cars is way higher.

Plus autopilot should mean that they're much less likely to crash into a stopped vehicle on the side of the road.

−17

Chippopotanuse t1_j95uvbc wrote

It’s not my statistic. It’s the NHTSA investigating crashes.

If you think your hand waving is enough to overcome the legitimate concerns the NHTSA has here…please see if Tesla will hire you as their general counsel in charge of regulatory oversight.

18

[deleted] t1_j95xvvj wrote

[deleted]

−19

Chippopotanuse t1_j95zckj wrote

You are less informed than you think.

Elon talks about autopilot and full self driving interchangeably.

Go listen to him in 2019 when speaking with Cathie Wood’s Ark Podcast:

“My guess as to when we would think it is safe for somebody to essentially fall asleep and wake up at their destination: probably toward the end of next year. I would say I am certain of that. That is not a question mark.”

It’s 2023 now. Not 2020.

And Tesla isn’t anywhere near capable of having someone fall asleep and arrive at their destination in any safe manner.

Elon didn’t speak unequivocally back in 2019 about this. He didn’t say “hey we are working on something cool that might happen someday.”

He guaranteed it: “I would say I’m certain of that. It’s not a question mark.”

He has promised consumers for years that they can do these things.

He’s a liar and you’re being duped. And more than one person who believed his statements are now dead.

17

FrontierRoad t1_j95somg wrote

I was thinking the same thing. I'm sure plenty of Ford Pintos are plowing into things, but you never hear a word.

−13

Chippopotanuse t1_j969uy9 wrote

Can you point me to the press release where the Ford CEO guaranteed that Pinto owners could fall asleep behind the wheel and safely arrive at their destination?

3

wolf-bot t1_j95f2g2 wrote

Could have been avoided if there was a child standing on the side of the road.

87

buttergun t1_j95rotz wrote

The Florida State Legislature is debating a bill that would replace construction zone traffic cones with 8 year old children. It seems like they would work for emergency responders too.

32

WeArePanNarrans t1_j95zj31 wrote

About time someone recognizes the importance of putting children to work. The sooner they learn the value of labor the better.

14

VegasKL t1_j96z2pm wrote

Added bonus that their small stature and fingers can get into tight spaces! Plus, they'll work for literal peanut M&M's.

/Industry Rejoices

4

bobjoylove t1_j96hh5j wrote

The only problem is stacking them for transport.

3

buttergun t1_j96kr1c wrote

Just charter some jets and fly 'em in from Texas. Duh.

5

VegasKL t1_j96za8w wrote

Those planes only fly to democratic states.

1

Thegarbagegamer97 t1_j954b19 wrote

This is the problem with AI driving. It knows the rules and how to respond to things that follow the rules; however, it struggles to compensate when something goes against what it considers the rules. That requires on-the-spot thinking, and we still seemingly have a ways to go before it can replace humans there.

68

Perfect-Height-8837 t1_j957xuy wrote

This is the main reason I want to buy a Tesla. It doesn't even need to be in self driving mode, but the Media will report my death to the world just because I was driving a Tesla.
No other car manufacturer can offer this level of post-mortem notoriety.
You never read headlines such as "some nobody dies when his Skoda crashed into a firetruck."
But put Mr Nobody in a Tesla and he's worth reporting about.

90

writingt t1_j95n96j wrote

People crashing their cars has been happening for over a hundred years and, as horrible as it is, is nothing new. A car crashing itself and killing its driver is much more newsworthy.

30

Velocity_LP t1_j968du1 wrote

> A car crashing itself and killing its driver is much more newsworthy

Are you confusing this article with some other incident? Or did you find another source I must've missed? There's zero mention of Autopilot or anything else in this article that suggests the car caused the accident.

11

Perfect-Height-8837 t1_j96zlk6 wrote

I think you may have fallen for the false assumption the title was leading you to. They want you to assume Autopilot only for you to learn it was just a bad driver. Therefore, Tesla is still good.

−1

LogicisGone t1_j95o6o7 wrote

I mean, my local news covers pretty much every car accident with a death, and I don't remember a single instance of anyone hitting a parked fire truck, let alone also killing someone and injuring 4 firefighters. I would actually call this newsworthy.

12

Perfect-Height-8837 t1_j96zahq wrote

Do they mention the make of car in the headlines?
Usually not.

Funny thing is, Murdoch's son is on the board of directors at Tesla.

1

jedi_trey t1_j95sudx wrote

Well that's some hard hitting data

−2

hwangjae45 t1_j95flpf wrote

https://youtu.be/jiKzoO3tuSw

They say in the video they do not know if Autopilot was on. I'm not a Tesla advocate or anything, but let's wait until the facts are out.

16

Neospecial t1_j95gjji wrote

Isn't it always "OFF"? As in, intentionally turning itself off seconds before a crash to avoid liability? I don't know and don't care to find out; just something I read or heard somewhere at some point.

I'd not trust an AI driving me regardless.

18

hwangjae45 t1_j95gyg8 wrote

From what I know, Tesla cars keep a record of when Autopilot turns on and off, and from what I've seen, it does seem to record that it was on. With that said, I think Tesla had a recall due to Autopilot, so it does seem to be a huge problem.

2

razorirr t1_j96c9l7 wrote

Nah. NHTSA requires reporting of all accidents up to 30 seconds after it turns off.

So if you think it's turning off to avoid being counted, that means you think it's not able to avoid crashing, but is able to realize it's going to crash half a mile up the road and turn itself off (which it notifies you it's doing), and then the driver ignores that Minority Report self-shutoff, does not take over, and crashes.

2

TenderfootGungi t1_j97bg88 wrote

They were caught turning it off a split second before many crashes and then stating something like "Autopilot was not engaged." In many cases it was engaged until less than a second before the crash, though. They have now started asking whether it was engaged within some number of seconds before a crash (e.g., 10 seconds, but I cannot find the exact figure).

−1

GarbageTheClown t1_j97l6q1 wrote

You have a source for that? As long as I can remember, they count anything within the last 5 seconds; it's on their website.

1

JohnPlayerSpecia1 t1_j95i94u wrote

not to worry, Tesla black boxes will always "turn off" autopilot just seconds before any crash to shift blame away from Tesla.

7

ryan_m t1_j95urf9 wrote

They absolutely do not and this gets repeated constantly. Tesla counts any crash that happens within 60 seconds of AP/FSD disengaging, which is longer than the NHTSA requires.

−8

code-sloth t1_j95z1ur wrote

It gets repeated constantly because it's true.

https://www.washingtonpost.com/technology/2022/06/15/tesla-autopilot-crashes/

> Tesla's vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

4

ryan_m t1_j95zsyt wrote

Read the claim I responded to fully and then read what you posted. The first half is true that it turns off, but the core of the claim (that it is done to shift blame away) is entirely bullshit, because the cutoff for reporting is 30 seconds, and Tesla counts a minute before.

It makes sense that autopilot will shut off before a crash if you think about it for more than a couple of seconds. What behavior do you want a system like that to have when it encounters a situation it can’t handle? It should alert the driver and disengage. If you’re being a responsible driver, you should be paying attention the entire time anyways and ready to take control to specifically avoid things like this.

The anti-Musk circlejerk has gotten so insane at this point that people are no longer thinking about what they’re saying.

6

Raspberries-Are-Evil t1_j96hp1d wrote

It doesn't matter if the driver was using Autopilot. The driver IS RESPONSIBLE. Teslas are not "self driving." Self-driving is not legal yet; the driver IS responsible.

1

smoke1966 t1_j96okqm wrote

The number one problem with programming has always been making the list of EVERY possible scenario that could happen. You have to think of anything that could possibly happen ahead of time.

2

Walui t1_j974x5s wrote

Lol that's the whole point of machine learning. You have no idea what you're talking about.

1

smoke1966 t1_j9790e7 wrote

If it is programmed to learn correctly. I've done programming, and there's always the one thing you forgot. If you don't believe that, this is just a prime example of the problems with these cars.

1

GarbageTheClown t1_j97lf96 wrote

ML and traditional programming are not really in the same bucket, and no, it doesn't work like that.

1

whyvas t1_j956v73 wrote

Where does it say the car had full self driving enabled?

−14

kuahara t1_j956wrq wrote

AI was not navigating this vehicle; a human was.

Also, we didn't need to know the make of the vehicle. The outcome would have been the same in a Chevy, Ford, BMW, etc. The reporting agency is trying really hard for a specific reaction to a completely irrelevant detail.

−29

Thegarbagegamer97 t1_j957au6 wrote

Unless you have a source, all the articles I can find say it hasn't been established one way or the other yet.

27

niceguybadboy t1_j955p98 wrote

But...for every dumb mistake an AI driver makes, human drivers probably make fifty.

−32

Thegarbagegamer97 t1_j95684x wrote

Likely so, but with mistakes like this, if the self-driving was in use, I have to go with human drivers, simply for the ability to make sense of the scene and at least attempt to navigate, slow down, or stop as needed. The AI seems to be at the beginning stage of development, where it says "hmm, that looks like road, so full speed ahead," whereas most people, assuming they aren't distracted or under the influence of some substance, will see flashing lights and a big object blocking the road and try to navigate around the firetruck, not plow through it like they're in some action film. Some day I'm sure self-driving will get there, but we are a LONG way off.

7

ViciousNakedMoleRat t1_j958sau wrote

With automated driving the question is simply: How much more do we value an overall reduction of crashes compared to having to live with crashes that a driving human would've easily avoided?

On a societal level, we should theoretically be in favor of self-driving cars as soon as they cause fewer crashes than human-operated cars – even if it's just a couple of percent.

However, on a personal level, it probably takes a much more significant margin to convince many individual drivers, because the vast majority of drivers think of themselves as above average.

The perceived stupidity of automated driving accidents, like driving straight into objects or coming to a stop in the middle of the road makes them particularly likely to be picked up by the media, which raises the exposure of people to these issues. The hundreds of daily crashes caused by inattention or other human error just slide by without being noticed.

This causes a similar situation as fear of flying. It's much safer to fly than to drive a car, but plane crashes become huge news stories, which causes some people to develop an irrational fear of flying, while having no issue with driving.

−9

Thegarbagegamer97 t1_j959416 wrote

Self-driving will be a wonderful thing one day, but when it has the potential to break laws by ignoring the rules of certain stretches of roadway, and to plow straight into a stalled or stopped vehicle like there's nothing there, I think I'll hold off a little longer and keep my personal judgement. Humans aren't perfect drivers, and I don't expect AI to be either. But I'd prefer not having to babysit the entire time simply because it can have a tendency to suicide-rush a fire truck or go straight from a turn-only lane.

12

[deleted] t1_j95qlzj wrote

[deleted]

58

redander t1_j974ul8 wrote

Then you have the cops who are being ridiculous and arresting those firefighters. Yet it's the safest way for them to park.

6

WelcomeScary4270 t1_j98h7da wrote

That's been in the news once. It's not like it's happening every time an engine turns out.

9

Sol_Invictus t1_j95lec1 wrote

Maybe the Head Tweeter will adjust the algorithm.

21

MidwestAmMan t1_j95ptb1 wrote

We need to know if it was using FSD

13

Raspberries-Are-Evil t1_j96hs56 wrote

Why does it matter? The Driver is responsible.

5

BasroilII t1_j96kk39 wrote

Absolutely agree that ultimately it's the driver's responsibility.

It's more to shut up the nits that are making this 100% about self-driving (or just dogpiling tesla) without knowing the actual cause.

2

vbob99 t1_j9eyxld wrote

Or those defending self-driving, assuming it was off in this crash without knowing so.

1

BasroilII t1_j9gnava wrote

Given the track record that is the more likely. However, I'm willing to wait and see what the actual fault was either way.

1

vbob99 t1_j9i751t wrote

I'd say it's about equally likely both ways.

1

MidwestAmMan t1_j978fza wrote

It’s a sticky wicket tbh. The Tesla-over-the-cliff all survived story was incredible. Teslas are clearly much safer on average. But sudden braking, battery fires and “FSD” causing striking of emergency vehicles are woeful concerns.

If humans are a greater risk than FSD maybe FSD can be modified to require the driver take over when approaching emergency vehicles. But we need to know if FSD was engaged here.

1

Raspberries-Are-Evil t1_j97fv7z wrote

> But sudden braking, battery fires and “FSD” causing striking of emergency vehicles are woeful concerns.

As a Tesla owner myself, I understand that I am in control of the car at all times. This is no different than some idiot on cruise control slamming into a stopped car in front of him.

FSD requires your hands to be on the wheel. In fact, every 30 seconds or so it reminds you, and if it doesn't detect your hands on the wheel via a slight movement of the wheel, it will disengage.

So even IF driver was using FSD, its his fault for not slowing down when approaching a fire truck.

3

GarbageTheClown t1_j97kt4n wrote

If FSD knew it was approaching emergency vehicles, then it would know it needed to stop. The problem is it doesn't know it's approaching emergency vehicles.

2

WirelessBCupSupport t1_j98kopd wrote

I watched this on the news. They couldn't determine it, as the driver died on the scene, but the passenger was alive and airlifted to the hospital. The firefighters who were there said this isn't the first time they've been hit. And while dealing with the crash, they almost got hit again!

3

razorirr t1_j966747 wrote

From June 2021 through May 15th, 2022, there have been 2 crashes into emergency vehicles with ADAS of any type, from any brand.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

So you can either believe the government, which is forcing the makers to publish this data, that this is a non-issue, or you are just here talking about it because you don't like autonomous vehicles, and you are being dishonest with your comment anyway.

0

YEETMANdaMAN t1_j96baua wrote

I love how every time you post that link people stop replying to you. I thought everyone trusted the NHTSA’s collision and recall data?

−1

razorirr t1_j96cpv8 wrote

People only trust government reports when they back what they already think. The second they don't, they don't trust the government.

−2

thisismynewacct t1_j95upo6 wrote

Terrible situation, but at least the only fatality was the driver, who should've been paying attention, and not the passenger or a firefighter just out doing their job.

3

Osama_bin_laughin t1_j9awrxj wrote

Okay, but was the Tesla self-driving or not? The article doesn't say. If not, then this article is some anti-Tesla nonsense.

2

KnucklesMcGee t1_j96pet0 wrote

Miraculously, FSD disengaged seconds before impact.

1

razorirr t1_j96w1v9 wrote

Miraculously, NHTSA requires any disengagement up to 30 seconds prior to be reported as if the system was on.

So good on you for thinking Tesla can predict it will hit a parked firetruck 31 seconds ahead of time, yet can't figure out how to not hit said truck in those 31 seconds.

2

xnago_tyr_sires t1_j97wug0 wrote

Why is it always in Walnut Creek that someone does something stupid in a Tesla?

1

iamaredditboy t1_j96g1bh wrote

Teslas need to be banned, period, till they turn off self-driving on all their vehicles. No one knows when self-driving is engaged, and Tesla drivers are worse than "break my windows" drivers…

−1

MidwestAmMan t1_j97b57c wrote

The tricky part is that even with anomalies it is still safer than sleepy, texting drivers. But it should require driver takeover in situations it handles poorly.

1

iamaredditboy t1_j97ipxr wrote

That’s a pretty lame excuse to make for how unsafe it is. Sleepy texting drivers does not legitimize introducing something new that is known to be unsafe on the road.

−2

ariceli t1_j98k3en wrote

I guess other makes of cars don’t crash except Teslas.

−1

Mr_Mons_of_Nibiru t1_j95pery wrote

And we come to it. The necessary sacrifices that need to be made in the name of progress.

Can't wait for all those automated semis to hit the road.

−10

UsedToBsmart t1_j9552c6 wrote

I thought Tesla fixed this issue?

−12

ImoJenny t1_j955h8n wrote

21

UsedToBsmart t1_j956cyi wrote

I was thinking specifically about crashing into emergency vehicles. It looks like they said they fixed it:

https://www.autoweek.com/news/green-cars/a37694444/tesla-autopilot-will-now-detect-emergency-lights-but-only-at-night/

Unfortunately that may be another Tesla lie.

16

diezel_dave t1_j95x4o2 wrote

Mine would dangerously decelerate when it saw emergency lights flashing. Didn't matter where they were though, just any lights flashing anywhere vaguely ahead of you would cause immediate and unexpected hard braking. I always feared someone would rear end me or road rage me for brake checking them. So glad I sold that thing.

7

KnucklesMcGee t1_j973xc4 wrote

Elon says a lot of things.

Once in a while you might get something true, but this wasn't one of those times.

3