Submitted by RolfEjerskov t3_111bnwz in Futurology

Over the last month I've researched Waymo, Cruise, Tesla, and Motional. Initially I was focused on comparing Lidar to cameras and Radar. Sensors are important, but it became apparent that there are two essential steps that will lead us to self-driving vehicles.

  1. In order to solve full self-driving properly, you actually just have to solve real-world AI. (This is actually a quote from someone you might know :))
  2. As Level 5 autonomous driving is far away, the main focus right now is to have a fleet that continuously trains the AI model, which means the system keeps improving over time as more data comes in (rough sketch of that loop below).
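
To make step 2 concrete, here is a toy sketch of that fleet "data engine" loop. Everything in it (the function names, the weekly cadence, the 10% flag rate) is hypothetical illustration, not any manufacturer's actual pipeline: cars flag hard cases, those cases grow the training set, and an improved model gets pushed back out.

```python
import random

def collect_flagged_clips(fleet_size: int) -> list[dict]:
    """Stand-in for cars uploading clips where the driver had to take over."""
    scenarios = ["cut-in", "debris on road", "unusual merge"]
    return [{"car": i, "scenario": random.choice(scenarios)}
            for i in range(fleet_size) if random.random() < 0.1]

def retrain(model_version: int, dataset: list[dict]) -> int:
    """Stand-in for labeling the clips and retraining the driving model."""
    print(f"Retraining on {len(dataset)} clips -> model v{model_version + 1}")
    return model_version + 1

dataset: list[dict] = []
model_version = 1
for week in range(3):  # three turns of the loop
    dataset += collect_flagged_clips(fleet_size=1_000)
    model_version = retrain(model_version, dataset)
    # In a real fleet, the new model would now be deployed over-the-air
    # and the loop repeats, which is why fleet size matters so much.
```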

Step 1 is super hard to achieve, mainly because you really need to take a HUGE number of factors into account when driving. It requires much more than AI can deliver right now.

I think that a lot of Tesla owners ask themselves why Radar and ultrasonic sensors have been removed. I believe the answer is that you can invest heavily in sensors, but then you might never obtain "the fleet" that trains your AI model.

Taking everything into consideration, we are probably somewhere between Level 2 (Partial Driving Automation) and Level 3 (Conditional Driving Automation). We have a long way to go before we reach Level 5 (Full Driving Automation).

What is your take on it? Looking forward to hearing your point of view :)!

6

Comments


BigZaddyZ3 t1_j8dvxi0 wrote

I think now that machine learning is beginning to accelerate as a whole, the progress with self-driving cars will accelerate over the next few years as well.

14

RolfEjerskov OP t1_j8ehy74 wrote

I agree, we are definitely on an exponential curve! In my company we work with machine learning, and we see the development...

7

LordGothington t1_j8ej9io wrote

Level 3 is basically here this year -- though barely. Mercedes is shipping Level 3 this year, but it is only enabled at speeds under 37 mph and on some select stretches of highway.

I think Level 3 to 4 will happen pretty quickly. Level 4 to Level 5 could take quite a bit of time. But I think many human drivers haven't reached Level 5 yet.

Level 4 is pretty darn useful though. A Level 4 vehicle may not even need a steering wheel or pedals.

Level 5 is almost an impossible standard. Instead, Level 4 cars will just get more and more capable. What does Level 5 even really mean? Must the car be able to go off-roading and rock crawling with ease? Many humans can't even do that well.

5

RolfEjerskov OP t1_j8eksdm wrote

As a family we plan to buy an autonomous vehicle as a robotaxi when Level 5 has arrived, because at that point you have a business that takes care of itself...

Level 5 is very difficult to achieve, but once it's achieved we believe we can use the vehicle just like when we Airbnb our house.
I think that's the main reason Level 5 will change the world. Your car becomes a business, and fewer people will need a car, because the cars are pooled...

1

LordGothington t1_j8en3ad wrote

I think a level 5 car business is likely to be a poor way to make money.

In such a system your car is just a commodity, and people will take the cheapest option in similar condition. There will be many people who own a car but are so desperate for cash that they will rent their vehicle out for less than it costs to operate.

They will sell a ride that costs them $1 for $0.50 just to get their hands on that $0.50 now, even though it screws them over in the future.

Plus, you will be competing with people who can buy the cars at a much lower price than you -- such as the car manufacturers themselves, dealers, or taxi companies buying huge fleets of vehicles. Their cost of operation will be lower, meaning they can still turn a small profit while you'd be taking a loss.

7

jeremiah256 t1_j8eatwc wrote

I believe manufacturers are just one part of this equation.

If they attempt to do this with each vehicle acting independently, required to react with only the data it can gather and process, then yes, we'll be doing this dance for at least a decade more, regardless of how intelligent each individual vehicle gets.

But if smart highways are added to the picture, where information from each vehicle is gathered and relayed to the other vehicles around it, then the problems get much simpler to solve.
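
As an illustration of what "gathered and relayed" could look like, here's a toy sketch. None of these names match a real V2X standard (the message fields, the 200 m radius, and the `RoadSegment` relay are all made up for the example); the point is just that a car can learn about a hazard it can't see yet.

```python
from dataclasses import dataclass

@dataclass
class StateMessage:
    vehicle_id: str
    position_m: float   # distance along the road segment, in metres
    speed_mps: float    # speed in metres per second
    braking: bool       # hard-braking flag

class RoadSegment:
    """Plays the 'smart highway' role: collects reports and shares them with nearby vehicles."""
    def __init__(self) -> None:
        self.latest: dict[str, StateMessage] = {}

    def publish(self, msg: StateMessage) -> None:
        self.latest[msg.vehicle_id] = msg

    def neighbours(self, me: StateMessage, radius_m: float = 200.0) -> list[StateMessage]:
        return [m for m in self.latest.values()
                if m.vehicle_id != me.vehicle_id
                and abs(m.position_m - me.position_m) <= radius_m]

# Usage: a car is warned about a braking truck that is still out of camera range.
segment = RoadSegment()
segment.publish(StateMessage("truck_1", position_m=450.0, speed_mps=8.0, braking=True))
me = StateMessage("car_42", position_m=300.0, speed_mps=33.0, braking=False)
segment.publish(me)

for msg in segment.neighbours(me):
    if msg.braking and msg.position_m > me.position_m:
        print(f"Slow down: {msg.vehicle_id} braking {msg.position_m - me.position_m:.0f} m ahead")
```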

4

Beemer17-21 t1_j8ebnyx wrote

Yes! This doesn't get discussed nearly enough. It's a logistical nightmare, but if cars could talk to each other and to the road itself, we could drastically improve the ability of self-driving cars *today*.

5

SELECTaerial t1_j8fnqmz wrote

I’m not saying I’m for or against this, but Smart Highways pose a huge privacy risk

5

jeremiah256 t1_j8fomtx wrote

Oh yeah, there will be serious political battles with self-driving cars & our giving up privacy.

No way law enforcement allows what will essentially be land drones without having access to everything about those vehicles.

2

RolfEjerskov OP t1_j8h8aat wrote

It seems highways are the easiest part of self-driving because of their predictability.

The main issue that needs to be overcome, it seems, is that the AI has to be able to interpret the world as well as a human does...

With that said, if cars can interact they can move as one, and that would be a leap forward.

2

BigMax t1_j8mek8m wrote

True, but we are a very long way from smart highways being anything other than an incredibly tiny percentage of the roads.

Just installing those flyover-type tolls on a single highway in my state took YEARS of planning and construction.

2

Devadander t1_j8dvemg wrote

I think you need to expand your research into GM Super Cruise as well as Mercedes' true Level 3 self-driving.

3

RolfEjerskov OP t1_j8h8jmm wrote

>GM Super Cruise

Thanks, Devadander :) I've actually looked into Super Cruise a bit. I saw a clip where it didn't recognise the end of a road and just carried on onto the grass. It seemed a bit off.

I'll dig into Mercedes. Thanks

2

Devadander t1_j8htplc wrote

Lol, and Teslas hit baby strollers. Super Cruise is going to be pretty good. Don't discount it yet.

Gotta remember, all of these other car manufacturers have to put out a functioning product. Tesla gets away with having their owners beta-test software; Mercedes owners won't put up with that. As these self-driving features become available on established marques, it'll be quite interesting to see how Tesla keeps up.

2

gunfell t1_j8hh9bm wrote

Ultra Cruise is the GM tech that is really impressive. It comes out in about 18 months, and it is seriously impressive stuff.

2

Wild_Sun_1223 t1_j8ffr46 wrote

Step 1 requires real intelligence. That's the trick. You need a system that can actually infer and reason, so that it can be "trained" with an amount of data not too different from that needed to train a dog.

3

r2k-in-the-vortex t1_j8hgkfu wrote

First, forget Tesla: their self-driving is overpromised and underdelivered. It was never built to be self-driving; it was built to sell hype.

All the other self-driving companies are a different matter. They have already started real-world deployments and are in the early stages of scaling up. That scaling up is going to take years and years, but it's happening now, and it's not beta -- it's real-world L4 driving.

L5 is an unnecessary complication and a meaningless goalpost; all driving, human or otherwise, has limitations.

3

Most-Resident t1_j8f40hp wrote

One thing that might reduce the beta test time is a few major accidents involving 10+ vehicles. That could cause a push for legislation requiring more stringent testing.

I don't know how much they currently test for bad behavior from other drivers, like unexpected lane changes, sudden stops, and combinations of those. I imagine that would require a large testing facility and a bunch of cars they can wreck repeatedly. It wouldn't be cheap.

I don’t think they can just use /r/IdiotsInCars for training.

2

ElectroNight t1_j8h645j wrote

The discussion is fun and all, but the work that Tesla's computer vision and machine learning teams are doing is groundbreaking, and they are way, way ahead on the bleeding edge of what is possible compared to Detroit, Tokyo, and Germany.

I've worked on trying to use CNNs and image/depth sensor fusion to accomplish far simpler tasks than FSD, and this stuff is really hard and takes a vast commitment of time, energy, and capital.
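
For anyone curious what "image and depth sensor fusion" with a CNN even looks like, here's a minimal early-fusion sketch in PyTorch. It is a generic toy (the layer sizes, the 3-output head, and the `EarlyFusionNet` name are all invented for the example), not Tesla's or anyone else's actual architecture: the RGB image and the depth map are simply stacked into one 4-channel input.

```python
import torch
import torch.nn as nn

class EarlyFusionNet(nn.Module):
    def __init__(self, num_outputs: int = 3):  # e.g. steering / throttle / brake scores
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=5, stride=2, padding=2),  # 4 channels = RGB + depth
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse to one feature vector per frame
        )
        self.head = nn.Linear(32, num_outputs)

    def forward(self, rgb: torch.Tensor, depth: torch.Tensor) -> torch.Tensor:
        x = torch.cat([rgb, depth], dim=1)  # "early" fusion: merge at the input
        return self.head(self.backbone(x).flatten(1))

# Dummy batch: 2 frames of 3-channel RGB plus a 1-channel depth map each.
rgb = torch.randn(2, 3, 128, 128)
depth = torch.randn(2, 1, 128, 128)
print(EarlyFusionNet()(rgb, depth).shape)  # torch.Size([2, 3])
```

Real systems fuse many more streams over time, but even this toy shows where the complexity starts piling up.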

But it indeed might take another decade for a general solution that is at least as good as an average human. But interim performance milestones with minimal but critical occasional human assist could still be very acceptable to some of us

2

AndyTheSane t1_j8hgoeq wrote

General self-driving is an incredibly hard problem.

Even humans make a lot of mistakes and kill people at a significant rate, to the extent that if the concept were introduced today, there is no way that the general public would be allowed to drive as they do; at a minimum, driving tests would be WAY more stringent and applied every 5 years or so.

Driving on a well-maintained motorway/highway (so all the road markings are present!) is the easiest problem to solve. Even so, the self-driving system has to keep track of all the other cars in all sorts of light conditions, and react appropriately to unexpected queues or just bad driving by other users. But I think that full autonomous motorway driving is at least possible with current technology. And that would be useful: drive to the motorway, press the 'self drive' button, snooze for a few hours, and get a wake-up alarm 10 minutes before your turn-off.

Now, once you get off the highway, onto poorly maintained rural routes, or even worse, towns and cities with all of their bad roads, missing markings, visual distractions, cyclists, people, and all the rest, that's a whole new level of complexity. As a human, you can easily work out that a child is about to step off the pavement in front of you, because evolution has hard-wired us to judge the intentions of others. For a computer, that's an incredibly hard task. Just working out what is road and what is pavement in an old town centre can be tough. That's a real general-intelligence problem, and it's why full all-roads self-driving is still a way off.

2

Steamer61 t1_j8i9dcb wrote

Autonomous highway driving is a very different animal from autonomous driving in a city. Highway driving is relatively easy compared to city driving, and there are plenty of other driving situations with very different levels of complexity.

I just think about how many accidents I have avoided in the past. There have been times when I just knew that someone at the stop sign was going to blow through it; I couldn't tell you how I knew, but I did. I have seen other instances where I just knew someone was going to do something totally boneheaded; again, there were no obvious clues, but my brain picked up on them. As humans we process a ton of data in real time, and when there isn't enough data, our brains fill in the blanks. We have intuition, and I have no clue how you would give any machine intuition. I just don't know how you could ever program for those kinds of things.

In the end, I don't think we will ever have a fully autonomous driving system that works in all situations. Yeah, I know, humans make a lot of mistakes, and maybe autonomous cars will make fewer mistakes; I don't know.

A bigger concern is how my autonomous car would make decisions. What are the priorities? Is the life of the passengers the top priority? Will my car sacrifice me to save other people? The questions can get into some pretty philosophical/moral areas.

2

Surur t1_j8duljj wrote

> Step 1 is super hard to achieve, mainly because you really need to take a HUGE number of factors into account when driving. It requires much more than AI can deliver right now.

But maybe not for too much longer.

1

speculatrix t1_j8dzudq wrote

I've been wondering if they should use cameras which can see into the infrared and UV, to give the image processing more information to work on.

1

RolfEjerskov OP t1_j8etnr8 wrote

I think that's the real reason why Radar is used. Lidar and cameras see pretty much the same things, whereas Radar's wavelength can penetrate things like rain and snow, and thereby provide good "sight" under those conditions...

I think going full "predator" mode with infrared and UV would not make so much sense.

3

Test19s t1_j8dzzqv wrote

I hope it's possible and that we'll continue to see major progress this decade. I've seen some pessimists argue that we're reaching the limits of what silicon can achieve vs. mammalian brain tissue, and I hope they're wrong (as self-driving is basically the gateway to any real AI takeover of the physical world).

1

AsuhoChinami t1_j8gyiix wrote

Why in the hell was this downvoted? Is there nobody here but Luddites and technoskeptics?

1

gunfell t1_j8hhiv1 wrote

The ones talking about limits are the ones who are creating the processors; they are not pessimists, just realists. Silicon has quite a bit of runway left, but other materials will be able to help in the future.

Frankly, there may come a time when we use a hybridization of classical, quantum, and biological computation, since they all have their own strengths and weaknesses. At least classical/quantum hybridization is considered an obvious future.

1

dontpet t1_j8e6xid wrote

I know it isn't easy to sort this issue out, but if a distracted ape can achieve it, a computer can and will. And it won't take all those high-tech sensors either, as that ape doesn't need them.

1

RolfEjerskov OP t1_j8etboy wrote

Lol, and you can argue that Teslas actually have way more cameras than we have eyes...

1

Dry-Influence9 t1_j8g5fh6 wrote

Well, it took the ape hundreds of thousands of years of evolution to achieve that level of awareness. These computers have been working on it for less than 20 years.

1

NecessaryCelery2 t1_j8fkyai wrote

It's already spent well over a decade in beta; the DARPA self-driving car challenge was completed back in 2005.

How long it's taking to get out of beta makes me suspect there is a fundamental problem, possibly a fundamental design decision, at the root of it. Possibly trying to use machine learning.

It would be ironic if better and better AI helps them write self-driving software which, while being written by AI, does not itself use AI algorithms.

My point is that any time a tech is 99% done but then takes longer and longer to reach 100%, it's most likely a fundamental design problem. And the solution is to drop most of the design and do something radically different.

1

RolfEjerskov OP t1_j8h98ou wrote

Thanks for the insight!

It also seems like this rabbit hole is much deeper than we initially expected. I saw clips of Elon going all the way back to 2015, saying pretty much every year that they expect FSD to be fully self-driving next year. The AI is just not there yet, and maybe the tools we have right now are just not sufficient...

2

tizuby t1_j8fw53v wrote

A couple of decades at least.

And that's just for the tech to be ready; you can tack on another decade or two for the legal system to figure out wtf to do about it and to iterate on the dumbassery that'll make its way into the first few bills.

Who gets tickets if/when there's a violation of road law, where liability falls when a fully self-driving car inevitably t-bones a school bus (there will not be a 0% accident rate; bugs and glitches happen), regulations and standards around connectivity security, whether people still need a DL to operate a fully autonomous vehicle, etc... etc...

1

OwlBeneficial2743 t1_j8g60xe wrote

I suspect this is more of a people problem than a technical challenge. Every time there's an accident involving an autonomous vehicle, it makes the news, and people with too much time on their hands (like me) will pile on. So does the tech need to be ten times safer than the traditional way, a hundred times, a thousand times? Who knows.

At least the issue hasn't become political yet, which is surprising.

1

RolfEjerskov OP t1_j8h81uf wrote

I think that in general autonomous driving is already safer than human driving. Try checking out this guy's experiences driving with FSD:
https://www.youtube.com/watch?v=FGXuVNl8YYc

That said, it's quite clear that the AI is just not there yet. It stops in so many places where a human wouldn't. You simply can't rely on the system...

I'm actually also surprised that autonomous driving hasn't become political yet.

1

Redditing-Dutchman t1_j8ksj71 wrote

What about making it easier for self-driving cars to navigate our world? Perhaps special road markings/signs made for self-driving AIs. That could close the gap faster.

1

emp-sup-bry t1_j8hxfg4 wrote

This reallllllllly seems like Tesla shilling. And poorly done at that.

0

AsuhoChinami t1_j8gyb3h wrote

Stupid, stupid thread full of stupid, stupid people. Two decades? What in God's name is wrong with you? I hate this stupid fucking sub and I hate everyone here.

−2