
pitchforksNbonfires t1_j1wpta1 wrote

Driverless vehicles

Governor Wolf signed a bill in November that allows for the regulation and operation of “highly automated vehicles with or without a driver.”

“This technology brings the potential for significant advancements in vehicle safety and mobility, and offers economic development benefits across Pennsylvania,” Wolf said in a signing letter.

Just four months earlier...

https://www.govtech.com/fs/nhtsa-releases-new-data-about-autonomous-vehicle-crashes

Vehicles with driver assistance systems and autonomous technologies have been involved in hundreds of crashes in the last year.

Newly released data from the National Highway Traffic Safety Administration (NHTSA) details when crashes occurred in vehicles equipped with Advanced Driver Assistance Systems (ADAS), the driver assistance features found on many cars, and Automated Driving Systems (ADS), which refer to autonomous technologies being tested — and in some cases deployed — on public streets and roadways.

Vehicles with ADAS have been involved in 392 crashes in the last year, according to the federal highway safety agency. Six of those were fatal, five resulted in serious injuries, and 41 resulted in minor or moderate injuries. Four involved a “vulnerable road user,” such as a cyclist or pedestrian.

——————————————————-

The new PA law does not make our roads safer - it does exactly the opposite.

18

cardboardunderwear t1_j1xl54j wrote

>The new PA law does not make our roads safer - it does exactly the opposite.

That's only true if the technology in question is more dangerous than people driving without it. It's not clear to me from the article that that's the case.

18

pitchforksNbonfires t1_j1xus8s wrote

That there are currently several NHTSA safety investigations into different Tesla models indicates how serious the concerns about self-driving vehicles are.

In the instances below, it appears that driving with the technology in question was indeed more dangerous than driving without it:

https://www.yahoo.com/entertainment/tesla-driver-watched-horror-another-125137176.html

This one burst into flames after hitting a barrier.

“He got out and spoke to the driver of the crashed Tesla, who was not injured in the incident. The driver told Kaplan he had his 2018 Model X in Autopilot but ‘it suddenly veered hard to the left and stopped against the wall.’”

—————————————————

https://www.msn.com/en-us/autos/news/tesla-in-full-self-driving-mode-caused-8-car-pile-up-report/ar-AA15zmQJ

“The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph. That led to a chain reaction that ultimately involved eight vehicles to crash, all of which had been traveling at typical highway speeds.”

“Tesla’s driver-assist technologies, Autopilot and ‘full self-driving,’ are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs ‘without warning, at random, and often repeatedly in a single drive.’”

−2

Weary_Ad7119 t1_j1yj9bo wrote

And you think human drivers don't do this? Pointing out individual events is worthless.

6

pitchforksNbonfires t1_j1zobq4 wrote

Of course drivers are fallible. There’s never been a question about humans making mistakes, being distracted, etc.

The argument is whether technology - sensors and computers - can take the place of human senses and judgement.

Is there a difference when a distracted driver hits a wall, vs a technology-assisted vehicle doing exactly the same thing because the computer took a crap, or the vehicle was hacked?

In the outcome, there’s no difference.

But maybe there is a difference in prevention. A driver-driven vehicle can, at the last minute - if the driver regains alertness - possibly avoid a collision. A computer-driven vehicle may not have the same ability. We’d have to have confidence that the computer could recover as quickly as a human could, and we don’t know that it can. There are instances, described in the articles I linked, where the computer doesn’t recover and the accident happens.

1

Misbemisbe t1_j1wueui wrote

ADAS is not the same as Autonomous Vehicles

11

pitchforksNbonfires t1_j1xv6nv wrote

True.

Drivers should always have the option of disabling optional features that they don’t feel comfortable with.

Sensors and computers can and do malfunction. They can be hacked. It is the driver’s responsibility to determine what comfort level, if any, they have with technology that has the potential to interfere with their ability to control the vehicle.

2

AdlerFMT t1_j1wvrjo wrote

While I don't particularly like self-driving / level-x autonomy / electric / whatever you want to call it, and the nannies that come along with it, I could play devil's advocate, because it depends on how you look at the data. This is only my unprofessional opinion, but please consider the following...

If you look at it this way, 392 crashes and 6 fatalities are fairly small numbers when annual roadway deaths are in the tens of thousands.

"The National Highway Traffic Safety Administration today released its early estimates of traffic fatalities for the first half of 2022. An estimated 20,175 people died in motor vehicle traffic crashes, an increase of about 0.5%..."

​

I also managed to locate a PDF from nhtsa showing a large number of ADS cars were rear ended. Take that information as you will but it would be hard to squarely put the blame on ADS cars on that.

I absolutely think these things should move more slowly, and it looks like NHTSA does too, but I'd say that just straight-up calling ADAS cars dangerous isn't really accurate.
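To make the "it depends on how you look at the data" point concrete, here's a minimal back-of-the-envelope sketch. The crash counts are the NHTSA figures quoted above; the mileage numbers are made-up placeholders (I don't have real exposure data). The only point is that raw counts can't be compared without normalizing by how much each fleet actually drives:

```python
# Back-of-the-envelope crash-rate comparison - NOT real exposure data.
# Crash counts come from the NHTSA figures quoted above; the
# vehicle-miles-traveled (VMT) numbers are placeholder assumptions,
# chosen only to show why raw counts alone aren't comparable.

adas_crashes = 392                # ADAS-equipped vehicle crashes reported to NHTSA (past year)
all_fatalities_h1_2022 = 20_175   # NHTSA early estimate of traffic deaths, first half of 2022

# Hypothetical exposure figures (assumptions, for illustration only):
adas_vmt_millions = 5_000         # assumed miles driven by the ADAS-equipped fleet, in millions
all_vmt_millions = 1_600_000      # assumed total US miles driven in the same period, in millions

adas_crash_rate = adas_crashes / adas_vmt_millions              # crashes per million miles
overall_fatality_rate = all_fatalities_h1_2022 / all_vmt_millions

print(f"ADAS crashes per million miles (assumed exposure): {adas_crash_rate:.4f}")
print(f"Overall fatalities per million miles (assumed exposure): {overall_fatality_rate:.4f}")

# Note the two rates don't even measure the same thing (all crashes vs. fatalities),
# which is exactly why "392 vs. 20,175" by itself settles nothing either way.
```

Plug in real vehicle-miles-traveled numbers and the comparison could swing either way; that's exactly why the raw 392 figure doesn't settle the question on its own.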

10

pitchforksNbonfires t1_j1xsz52 wrote

ADAS vehicles are here to stay, although I’ve read accounts of drivers being surprised/alarmed at how the vehicles react at certain times.

Driverless vehicles are downright scary.

While there are bad drivers on the road, the selling point of this technology is that it is less apt to be fallible in certain circumstances than a human being.

Onboard computers can and do malfunction. They can be hacked. They are fallible, no less than a human driver. Sensors and a computer can’t take the place of eyes, ears and (hopefully) an informed, experienced driver.

The NHTSA article doesn’t mention how ADAS/ADS vehicles factor into the data, though there are currently several NHTSA investigations into some Tesla models and their Autopilot software. There have been accidents, injuries and fatalities.

——————————————-

https://www.yahoo.com/entertainment/tesla-driver-watched-horror-another-125137176.html

This one burst into flames after hitting a barrier.

“He got out and spoke to the driver of the crashed Tesla, who was not injured in the incident. The driver told Kaplan he had his 2018 Model X in Autopilot but ‘it suddenly veered hard to the left and stopped against the wall.’”

As for the prevalence of ADS vehicles being rear-ended, some of it could be due to sudden and unexpected braking, as happened on Thanksgiving on the Bay Bridge in San Francisco:

https://www.msn.com/en-us/autos/news/tesla-in-full-self-driving-mode-caused-8-car-pile-up-report/ar-AA15zmQJ

“The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph. That led to a chain reaction that ultimately involved eight vehicles to crash, all of which had been traveling at typical highway speeds.”

“Tesla’s driver-assist technologies, Autopilot and ‘full self-driving,’ are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs ‘without warning, at random, and often repeatedly in a single drive.’”

0

NotNowDamo t1_j1wyxw9 wrote

That is only one data point. It doesn't tell us if ADAS is safer or less safe than driving without ADAS.

6

cpr4life8 t1_j1xbq6i wrote

Agreed. Of course, my only comment when I mentioned it was "nooooooo," and people who apparently don't give a fuck about motorcyclists, bicyclists, or pedestrians disagreed.

3

Super_C_Complex t1_j1ylt39 wrote

There are over 6 million car crashes a year.

Yet 392 makes our roads less safe? Got it. Yup.

2

chickenonthehill559 t1_j1wvunw wrote

Facts would differ from your opinion. Driverless cars are less likely to be involved in an accident. There are too many bad drivers on the roads today.

1

pitchforksNbonfires t1_j1xu08q wrote

> Facts would differ from your opinion.

The various NHTSA investigations into Tesla’s Autopilot software indicate that there are many safety concerns about ADS vehicles.

https://www.yahoo.com/entertainment/tesla-driver-watched-horror-another-125137176.html

This one burst into flames after hitting a barrier.

“He got out and spoke to the driver of the crashed Tesla, who was not injured in the incident. The driver told Kaplan he had his 2018 Model X in Autopilot but ‘it suddenly veered hard to the left and stopped against the wall.’”

————————————————-

And last month’s Thanksgiving day 8-car pile up on the Bay Bridge in San Francisco:

https://www.msn.com/en-us/autos/news/tesla-in-full-self-driving-mode-caused-8-car-pile-up-report/ar-AA15zmQJ

“The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph. That led to a chain reaction that ultimately involved eight vehicles to crash, all of which had been traveling at typical highway speeds.”

“Tesla’s driver-assist technologies, Autopilot and ‘full self-driving,’ are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs ‘without warning, at random, and often repeatedly in a single drive.’”

−1

chickenonthehill559 t1_j1xvsqy wrote

I agree driverless is not perfect. Compare that to how many DUIs are recorded every day. Add in the number of dumbass drivers who are distracted by their phones. But hey, there were a handful of driverless crashes.

4

pitchforksNbonfires t1_j1xyspp wrote

I’m not looking forward to sharing the road with either occupied autonomous vehicles or driverless ones.

If an impaired driver is in full self-driving mode, would they still get a DUI? Very likely yes, because they’re still supposed to be alert enough to oversee or supervise the self-driving function. And they always have to be ready to take over driving. In these instances an argument can be made that the self-driving function can be safer (once/if they work the bugs out).

Drivers who are distracted by their phones when actually driving their vehicle will likely do exactly the same thing when in self-driving mode. But again, they’re supposed to be constantly monitoring the screen for real-time data on the car’s operating functions. Same argument as the DUI.

Technology is fallible. We’ll see.

1

OhioJeeper t1_j1z7rfm wrote

> If an impaired driver is in full self-driving mode, would they still get a DUI?

Why should anyone care if they do or don't get a DUI? Is this about punishing alcoholics or saving lives?

>Drivers who are distracted by their phones when actually driving their vehicle will likely do exactly the same thing when in self-driving mode. But again, they’re supposed to be constantly monitoring the screen for real-time data on the car’s operating functions. Same argument as the DUI.

Same question: do you care more about punishing distracted drivers or saving lives? These are two separate concepts; law enforcement is not the same as accident-prevention technology, even if they might be aiming at the same end result.

>Technology is fallible. We’ll see.

So are people, as you're demonstrating now. This is something that's already been heavily studied:

https://www.nhtsa.gov/technology-innovation/automated-vehicles-safety

I was pretty fortunate to be taking a law and ethics class as part of my master's with someone whose close family member was killed when their Tesla collided with the side of a semi truck that made an improper turn. The story was widely publicized, and what got reported was so far off from what any official investigation found actually happened based on the evidence. This was all at least 5 years ago, and there's been a ton of research done by NHTSA on the tech making a strong case for exactly the opposite of what you're saying.

I'm genuinely curious if you're sitting on some credible evidence of cars with automated driving tech actually being more dangerous than those without, because from where I'm sitting it seems like you're going off the same type of "logic" people use to argue against seatbelts saving lives.

1

pitchforksNbonfires t1_j1zsmdv wrote

> Why should anyone care if they do or don't get a DUI? Is this about punishing alcoholics or saving lives?

I’m fairly certain that the point of citing and punishing impaired drivers is to make roads safer in order to save lives. Meaning that the “saving lives” part is the primary, not secondary, purpose of DUI (and distracted-driving) laws.

My points about DUI and distracted driving are relevant as they would directly affect the operation of technology-assisted vehicles, from both a safety and law-enforcement standpoint.

The primary purpose of traffic law is public safety. To say it a different way, the concept and development of traffic laws were borne out of the need for public safety. Is there any question about this?

What flows from that original purpose - the punitive nature of law enforcement - in no way reduces or alters the original and primary intent of traffic laws.

The NHTSA article you link does indeed extol the virtues of technology-assisted vehicles. Interestingly, the agency continues to have numerous investigations into safety problems with technology-assisted vehicles. Weird dichotomy.

> The story was widely publicized, and what got reported was so far off from what any official investigation found actually happened based on the evidence.

A personal anecdote has no validity unless you can provide some specifics or documentation.

> This was all at least 5 years ago, and there's been a ton of research done by NHTSA on the tech making a strong case for exactly the opposite of what you're saying.

Again: cite the “ton of research by NHTSA.” Links to actual studies would be nice. It’s funny that you say they have research pointing to the safety of technology-assisted vehicles, yet there are numerous investigations by the same agency questioning their safety. Both things can’t be true.

————————————-

From the MSN article I linked:

“As CNN points out, this crash occurred just hours after Tesla CEO Elon Musk made FSD Beta available to all drivers, regardless of if they passed the Tesla safe-driving test conducted by an onboard driver-monitoring system.”

“The National Highway Traffic Safety Administration is also investigating both Autopilot and FSD Beta after years of inaction. Tesla is now reporting fatal crashes involving FSD. Here’s what the probe has found so far, according to previous reporting:”

“First reported in August, NHTSA’s probe targeted 11 crashes involving Teslas. Thursday the NHTSA said that it had identified six more crashes, and that 15 injuries were tied to the crashes, including one fatality. The crashes involve collisions with first responder vehicles, though NHTSA indicated Thursday it would be investigating more than that.”

What’s troubling about this:

> The National Highway Traffic Safety Administration is also investigating both Autopilot and FSD Beta after years of inaction.

“After years of inaction...?”

This contradicts your statement, “there's been a ton of research done by NHTSA on the tech making a strong case for exactly the opposite of what you're saying.”

Sounds like Tesla and the technology got a pass for a long time.

Unfortunately, there continues to be a substantial human toll from this inaction.

1

OhioJeeper t1_j1zw1ys wrote

>My points about DUI and distracted driving are relevant as they would directly affect the operation of technology-assisted vehicles, from both a safety and law-enforcement standpoint.

What are you even talking about? What about being drunk is relevant to the operation of a "self-driving vehicle" that wouldn't also be relevant with a regular vehicle?

>Both things can’t be true.

Multiple things can absolutely be true at the same time, and I see no point in continuing this conversation, as it seems you struggle with that concept, especially when you reject NHTSA research as insufficient so quickly without offering up some credible counterpoints. That MSN article you linked is a single data point, not a definitive source.

>A personal anecdote has no validity unless you can provide some specifics or documentation.

Fucking peak irony right here. I'm not about to post one of my former classmates' personal information so you can tear apart their story, because you'd rather steer conversations toward your own shitty, misguided view of the world than broaden your perspective. I offered that to give some context for how long this "debate" has been going on.

1

pitchforksNbonfires t1_j20wueg wrote

Stay classy - you elevate the dialogue that way. Profanity is always a nice touch.

People like you always get tweaked when you can’t make an argument without insulting the other party. Every time.

What “NHTSA research?”

Where is it? Link? Point it out, for God’s sake. Is that asking too much? Support your argument.

> What about being drunk is relevant to the operation of a "self-driving vehicle" that wouldn't also be relevant with a regular vehicle?

Being impaired is just as relevant in a self-driving vehicle as in a regular vehicle. That was my point.

You wrote: "Why should anyone care if they do or don't get a DUI? Is this about punishing alcoholics or saving lives?"

You were the one diminishing the element of impairment with that statement.

The MSN article is one event. As I’ve repeatedly stated, there are numerous NHTSA safety investigations into accidents involving technology-assisted vehicles - and I provided a link - which you completely ignore because it doesn’t fit your agenda.

Regardless of the fairy tale world you may live in, the safety of these vehicles has not yet been established - based on the numerous accident investigations.

As far as your make-believe anecdote, a story can be relayed without divulging any personal information. You chose not to do that.

You have an elevated view of your debating skills, as evidenced by your dropping “my legal ethics course” and “my master’s” into your original post. That’s supposed to either impress or intimidate. People who do things like that don’t have the confidence for genuine one-on-one debating, so the “credentials” are supposed to give them an advantage.

It didn’t work.

1

OhioJeeper t1_j21ar8r wrote

>Stay classy - you elevate the dialogue that way. Profanity is always a nice touch.

You're right and I'm sorry - I should have realized there was a child present.

>People like you always get tweaked when you can’t make an argument without insulting the other party. Every time.

Calling you out on being wrong isn't an insult when you're wrong.

>What “NHTSA research?”

>Where is it? Link? Point it out, for God’s sake. Is that asking too much? Support your argument.

🖕 I linked you to their site directly; I'm not your mom/teacher/librarian/whoever it was that failed to teach you how to research something on your own. But because I'm in the Christmas spirit, here's a Wikipedia article to get you started.

https://en.wikipedia.org/wiki/Impact_of_self-driving_cars

>Being impaired is just as relevant in a self-driving vehicle as in a regular vehicle. That was my point.

Not when we're talking about technology that would prevent drunks from plowing into pedestrians. Police are either responding to accidents or, hopefully, catching the person before they kill someone. Self-driving tech is always there.

>The msn article is one event. As I’ve repeatedly stated, there are numerous NHTSA safety investigations into accidents involving technology-assisted vehicles - and I provided a link - which you completely ignore because it doesn’t fit your agenda.

I don't have an agenda, but it's starting to make sense why you thought a source offering up the opinions of politicians on NHTSA research was the same as a source that comes directly from NHTSA.

>As far as your make-believe anecdote, a story can be relayed without divulging any personal information. You chose not to do that.

I'm sure more details from my personal anecdote were all it was going to take to convince someone who's arguing against NHTSA based on the opinions of politicians.

>That’s supposed to either impress or intimidate.

The intention was to provide context for the anecdote, but it's absolutely hilarious that you think that could be used to intimidate. Don't let those stupid science bitches make you more smarter.

>It didn’t work.

¯\_(ツ)_/¯ can't save everyone from themselves.

1

WikiSummarizerBot t1_j21asq9 wrote

Impact of self-driving cars

>The impact of self-driving cars is anticipated to be wide-ranging on many areas of daily life. Self-driving cars have been the subject of significant research on their environmental, practical, and lifestyle consequences. One significant predicted impact of self-driving cars is a substantial reduction in traffic collisions and resulting severe injuries or deaths. United States government estimates suggest 94% of traffic collisions are caused by human error, with a 2020 study estimating that making 90% of cars on US roads self-driving would save 25,000 lives per year.


1

chickenonthehill559 t1_j21tr9t wrote

You completely missed my point about drunk and distracted drivers. I would rather be on the road with a driverless vehicle that has a very small error rate than with a drunk or distracted driver. Just because there are laws against driving drunk or distracted does not mean people aren't doing it consistently. Every day there are plenty of dumbasses driving drunk.

1