pitchforksNbonfires

pitchforksNbonfires t1_j20wueg wrote

Stay classy - you elevate the dialogue that way. Profanity is always a nice touch.

People like you always get tweaked when you can’t make an argument without insulting the other party. Every time.

What “NHTSA research”?

Where is it? Link? Point it out, for God’s sake. Is that asking too much? Support your argument.

What about being drunk is relevant to the operation of a "self driving vehicle" that wouldn't be with a regular vehicle?

Being impaired is just as relevant in a self-driving vehicle as in a regular vehicle. That was my point.

You wrote, “Why should anyone care if they do or don't get a DUI? Is this about punishing alcoholics or saving lives?”

You were the one diminishing the element of impairment - by the above statement.

The MSN article is one event. As I’ve repeatedly stated, there are numerous NHTSA safety investigations into accidents involving technology-assisted vehicles - and I provided a link - which you completely ignore because it doesn’t fit your agenda.

Regardless of the fairy tale world you may live in, the safety of these vehicles has not yet been established - based on the numerous accident investigations.

As far as your make-believe anecdote, a story can be relayed without divulging any personal information. You chose not to do that.

You have an elevated view of your debating skills, as evidenced by your dropping “my legal ethics course” and your “master’s” into your original post. That’s supposed to either impress or intimidate. People who do things like that don’t have the confidence for genuine one-on-one debate, so the “credentials” are supposed to give them an advantage.

It didn’t work.

1

pitchforksNbonfires t1_j1zsmdv wrote

“Why should anyone care if they do or don't get a DUI? Is this about punishing alcoholics or saving lives?”

I’m fairly certain that the point of citing and punishing impaired drivers is to make roads safer in order to save lives. Meaning that the “saving lives” part is the primary, not secondary purpose of DUI (and distracted driver) laws.

My points about DUI and distracted driving are relevant as they would directly affect the operation of technology-assisted vehicles, from both a safety and law-enforcement standpoint.

The primary purpose of traffic law is public safety. To put it differently, traffic laws were born out of the need for public safety. Is there any question about this?

What flows from that original purpose - the punitive nature of law enforcement - in no way reduces or alters the original and primary intent of traffic laws.

The NHTSA article you linked does indeed extol the virtues of technology-assisted vehicles. Interestingly, the agency continues to conduct numerous investigations into safety problems with those same vehicles. Weird dichotomy.

“The story was widely publicized and what got reported was so far off from what any official investigation found actually happened based off of evidence”

A personal anecdote has no validity unless you can provide some specifics or documentation.

“This was all at least 5 years ago, and there's been a ton of research done by NHTSA on the tech making a strong case for exactly the opposite of what you're saying.”

Again - cite the “ton of research by NHTSA.” Links to actual studies would be nice. It’s funny that you say they have research pointing to the safety of technology-assisted vehicles, yet there are numerous investigations by the same agency questioning their safety. Both things can’t be true.

————————————-

From the MSN article I linked:

“As CNN points out, this crash occurred just hours after Tesla CEO Elon Musk made FSD Beta available to all drivers, regardless of if they passed the Tesla safe-driving test conducted by an onboard driver-monitoring system.”

“The National Highway Traffic Safety Administration is also investigating both Autopilot and FSD Beta after years of inaction. Tesla is now reporting fatal crashes involving FSD. Here’s what the probe has found so far, according to previous reporting:”

“First reported in August, NHTSA’s probe targeted 11 crashes involving Teslas. Thursday the NHTSA said that it had identified six more crashes, and that 15 injuries were tied to the crashes, including one fatality. The crashes involve collisions with first responder vehicles, though NHTSA indicated Thursday it would be investigating more than that.”

What’s troubling about this:

“The National Highway Traffic Safety Administration is also investigating both Autopilot and FSD Beta after years of inaction.”

“After years of inaction...?”

This contradicts your statement, “there's been a ton of research done by NHTSA on the tech making a strong case for exactly the opposite of what you're saying.”

Sounds like Tesla and the technology got a pass for a long time.

Unfortunately, this inaction continues to take a substantial human toll.

1

pitchforksNbonfires t1_j1zobq4 wrote

Of course drivers are fallible. There’s never been a question about humans making mistakes, being distracted, etc.

The argument is whether technology - sensors and computers - can take the place of human senses and judgement.

Is there a difference when a distracted driver hits a wall, vs a technology-assisted vehicle doing exactly the same thing because the computer took a crap, or the vehicle was hacked?

There’s no difference.

Or maybe there is a difference. A driver-driven vehicle can, at the last minute - if the driver regains alertness - possibly avoid a collision. A computer-driven vehicle may not have that ability. We’d have to have confidence that the computer could recover as quickly as a human, and we don’t know that it can. There are instances - described in the articles I linked - where the computer doesn’t recover and the accident happens.

1

pitchforksNbonfires t1_j1xyspp wrote

I’m not looking forward to sharing the road with either occupied autonomous vehicles or driverless ones.

If an impaired driver is in full self-driving mode, would they still get a DUI? Very likely yes, because they’re still supposed to be alert enough to oversee or supervise the self-driving function. And they always have to be ready to take over driving. In these instances an argument can be made that the self-driving function can be safer (once/if they work the bugs out).

Drivers who are distracted by their phones when actually driving their vehicle will likely do exactly the same thing when in self-driving mode. But again, they’re supposed to be constantly monitoring the screen for real-time data on the car’s operating functions. Same argument as the DUI.

Technology is fallible. We’ll see.

1

pitchforksNbonfires t1_j1xv6nv wrote

True.

Drivers should always have the option of disabling optional features that they don’t feel comfortable with.

Sensors and computers can and do malfunction. They can be hacked. It is the responsibility of the driver to determine what, if any, comfort level they may have with technology that has the potential of interfering with their ability to control their vehicle.

2

pitchforksNbonfires t1_j1xus8s wrote

The fact that there are currently several NHTSA safety investigations into different Tesla models indicates the seriousness of the concerns about self-driving vehicles.

In the instances below, it appears the vehicles were more dangerous with the technology than without it:

https://www.yahoo.com/entertainment/tesla-driver-watched-horror-another-125137176.html

This one burst into flames after hitting a barrier.

”He got out and spoke to the driver of the crashed Tesla, who was not injured in the incident. The driver told Kaplan he had his 2018 Model X in Autopilot but ‘it suddenly veered hard to the left and stopped against the wall.’”

—————————————————

https://www.msn.com/en-us/autos/news/tesla-in-full-self-driving-mode-caused-8-car-pile-up-report/ar-AA15zmQJ

”The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph. That led to a chain reaction that ultimately involved eight vehicles to crash, all of which had been traveling at typical highway speeds.”

”Tesla’s driver-assist technologies, Autopilot and “full self-driving” are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive.”

−2

pitchforksNbonfires t1_j1xu08q wrote

Facts would differ from your opinion.

The various NHTSA investigations into Tesla’s Autopilot software indicate that there are many safety concerns about ADS vehicles.

https://www.yahoo.com/entertainment/tesla-driver-watched-horror-another-125137176.html

This one burst into flames after hitting a barrier.

”He got out and spoke to the driver of the crashed Tesla, who was not injured in the incident. The driver told Kaplan he had his 2018 Model X in Autopilot but ‘it suddenly veered hard to the left and stopped against the wall.’”

————————————————-

And last month’s Thanksgiving day 8-car pile up on the Bay Bridge in San Francisco:

https://www.msn.com/en-us/autos/news/tesla-in-full-self-driving-mode-caused-8-car-pile-up-report/ar-AA15zmQJ

”The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph. That led to a chain reaction that ultimately involved eight vehicles to crash, all of which had been traveling at typical highway speeds.”

”Tesla’s driver-assist technologies, Autopilot and “full self-driving” are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive.”

−1

pitchforksNbonfires t1_j1xsz52 wrote

ADAS vehicles are here to stay, although I’ve read accounts of drivers being surprised/alarmed at how the vehicles react at certain times.

Driverless vehicles are downright scary.

While there are bad drivers on the road, the selling point of this technology is that it is less apt to be fallible in certain circumstances than a human being.

Onboard computers can and do malfunction. They can be hacked. They are fallible, no less than a human driver. Sensors and a computer can’t take the place of eyes, ears and (hopefully) an informed, experienced driver.

The NHTSA article doesn’t mention how ADAS/ADS vehicles factor into the data, though there are currently several NHTSA investigations into some Tesla models and their Autopilot software. There have been accidents, injuries and fatalities.

——————————————-

https://www.yahoo.com/entertainment/tesla-driver-watched-horror-another-125137176.html

This one burst into flames after hitting a barrier.

”He got out and spoke to the driver of the crashed Tesla, who was not injured in the incident. The driver told Kaplan he had his 2018 Model X in Autopilot but ‘it suddenly veered hard to the left and stopped against the wall.’”

As far as the prevalence of ADS vehicles being rear-ended, some could be due to sudden and unexpected braking, as happened on Thanksgiving in San Francisco on the Bay Bridge:

https://www.msn.com/en-us/autos/news/tesla-in-full-self-driving-mode-caused-8-car-pile-up-report/ar-AA15zmQJ

”The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph. That led to a chain reaction that ultimately involved eight vehicles to crash, all of which had been traveling at typical highway speeds.”

”Tesla’s driver-assist technologies, Autopilot and “full self-driving” are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive.”

0

pitchforksNbonfires t1_j1wpta1 wrote

Driverless vehicles

Governor Wolf signed a bill in November that allows for the regulation and operation of “highly automated vehicles with or without a driver.”

“This technology brings the potential for significant advancements in vehicle safety and mobility, and offers economic development benefits across Pennsylvania,” Wolf said in a signing letter.

Just four months earlier...

https://www.govtech.com/fs/nhtsa-releases-new-data-about-autonomous-vehicle-crashes

Vehicles with driver assistance systems and autonomous technologies have been involved in hundreds of crashes in the last year.

Newly released data from the National Highway Traffic Safety Administration (NHTSA) details when crashes occurred in vehicles equipped with Advanced Driver Assistance Systems (ADAS), the driver assistance features found on many cars, and Automated Driving Systems (ADS), which refer to autonomous technologies being tested — and in some cases deployed — on public streets and roadways.

Vehicles with ADAS have been involved in 392 crashes in the last year, according to the federal highway safety agency. Six of those were fatal, five resulted in serious injuries, with 41 resulting in minor or moderate injuries. Four involved a “vulnerable road-user,” such as a cyclist or pedestrian.

——————————————————-

The new PA law does not make our roads safer - it does exactly the opposite.

18

pitchforksNbonfires t1_it277tb wrote

Get a camera. Put up private property and no trespassing signs that are plainly visible.

You also might want to talk to a lawyer.

Anyone on your property is potentially your liability. Although those ATVs are supposed to be registered, titled and insured, it’s likely that some are not.

The DNR website states that all law enforcement agencies in PA are authorized to enforce the Snowmobile/ATV law, including the state police.

You might not want to call them, but if there’s an accident or an incident on your property they will need to respond. It’s their job to cover rural areas when there’s no local PD.

There must also be a sheriff’s office in your county.

42