speedywilfork t1_jdwo2mg wrote

>If the AI cannot recognize an obvious drive-through it would be the AI's fault, but why do you suppose that is the case?

I already told you: "drive-through" is an abstraction, a concept. It isn't any one thing; anything can be a drive-through, and AI can't comprehend abstractions. Sometimes the only clue you have to perceive a drive-through is a line of cars. Not all lines are drive-throughs, and not all drive-throughs have a line. Both are abstractions, and there is no way to "teach" an abstraction. We don't know how we know these things; we just do.

Another example would be "farm." A farm can be anything: it can be in your backyard, on your windowsill, inside a building, or the thing you put ants in. So asking an AI to identify a "farm" wouldn't be possible.

Surur t1_jdwqzq5 wrote

You are proposing this as a theory, but I am telling you an AI can make the same context-based decisions as you can.

speedywilfork t1_jdx4i47 wrote

So I have four lines, and three of them are drive-throughs. You are telling me that an AI can tell the difference between a line of cars in a parking lot, a line of cars on a road, a line of cars parked on the side of the road, and a line of cars at a drive-through? What distinguishing characteristics does each of these lines have that would tip off the AI as to which three are the drive-throughs?

Surur t1_jdx9cvb wrote

The AI would use the same context clues you would use.

You have to remember that AIs are actually super-human when it comes to pattern matching in many instances.

speedywilfork t1_jdxdkr6 wrote

I have already told you that anything can be a drive-through. So what contextual clues does a field have that would clue an AI in to it being a drive-through if there are no lines, no lanes, no arrows, only a guy in a chair? AI doesn't "assume" things. I want to know specifics; if you can't give me specifics, it can't be programmed. AI requires specifics.

I mean seriously, I can disable an autonomous car with a salt circle. It has no idea it can drive over it. Do you think a 5-year-old child could navigate out of a salt circle? That shows you how dumb they really are.

Surur t1_jdxeibf wrote

> anything can be a drive-through

Then that is a somewhat meaningless question you are asking, right?

Anything that will clue you in can also clue an AI in.

For example the sign that says Drive-Thru.

Which is needed because humans are not psychic and anything can be a drive-through.

> AI requires specifics.

No, neural networks are actually pretty good at vagueness.
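To make that concrete, here is a toy sketch, pure Python with all labels and example phrases invented for illustration (nothing like a real perception stack): even a crude similarity-based classifier gives a graded best guess for a fuzzy description it has never seen, rather than needing an exact hand-written rule for every case.

```python
from collections import Counter
import math

# Toy nearest-prototype "classifier" over bag-of-words vectors.
# The only point: similarity-based models handle vague inputs by
# degree of match, not by exact rules. Everything here is invented
# for the sketch.

def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A few rough "prototype" descriptions per concept.
prototypes = {
    "drive-through": "line of cars window menu speaker order pickup lane",
    "parking lot": "rows of parked cars painted stalls empty spaces",
    "traffic": "cars moving road lanes signals queue at light",
}

def classify(description):
    v = vectorize(description)
    return max(prototypes, key=lambda label: cosine(v, vectorize(prototypes[label])))

# A vague description it has never seen still lands on the closest concept.
print(classify("a slow line of cars inching toward a pickup window"))
# -> drive-through
```

A real system would use learned embeddings rather than word counts, but the graded-similarity idea is the same.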

> I mean seriously, I can disable an autonomous car with a salt circle.

That is a story from 2017, five years old.

https://twitter.com/elonmusk/status/1439303480330571780

speedywilfork t1_jdxm0sp wrote

>Anything that will clue you in can also clue an AI in.

>For example the sign that says Drive-Thru.

Why do you keep ignoring my very specific example, then? I am in a car with no steering wheel, and I want to go to a pumpkin patch with my family. I get to the pumpkin patch in my autonomous car, where there is a man sitting in a chair in the middle of a field. How does the AI know where to go?

I am giving you a real-life scenario that I experience every year. There are no lanes, no signs, no paths; it is a field. How does the AI navigate this?

Surur t1_jdxri6v wrote

What makes you think a modern AI cannot solve this problem?

So I gave your question to ChatGPT, and all its guesses were spot on.

And this was its answer on how it would drive there: all perfectly sensible.

And this is the worst it will ever be; AI agents are only going to get smarter and smarter.

speedywilfork t1_jdy1kdc wrote

>What makes you think a modern AI cannot solve this problem?

Because you gave it distinct textual clues to determine an answer: pumpkin patch, table, sign. It didn't determine anything on its own; you did all of the thinking for it. This is the point I am making: it can't do anything on its own.

If I say to a human "let's go to the pumpkin patch," we all get in the car, drive to the location, see the man in the field, drive to the man who is taking tickets, not the man directing traffic, and we park. All I had to verbalize was "let's go to the pumpkin patch."

With an AI, on the other hand, I have to tell it "let's go to the pumpkin patch," then when we get there I have to say "drive to the man sitting at the table, not the man directing traffic; when you get there, stop next to the man, not in front of or behind him." Then you pay. Now you say "drive over to the man directing traffic and follow his gestures; he will show you where to park" (assuming it can follow gestures).

All the AI did was follow commands. It didn't "think" at all, because it can't. Do you realize how annoying this would become after a while? An average human would be better and could perform more work.

Surur t1_jdy3nm7 wrote

GPT-4 is multimodal. In the very near future you will be able to feed it a video feed, and it won't need any text descriptions.

Anyway, if you don't think the current version is smart enough, just wait for next year.

speedywilfork t1_je067dv wrote

You don't understand: in my example it HAS a video feed. How do you think it sees the guy in the field? I am presenting a forward-looking scenario. I have been developing AI for 20 years; I am not speculating here. I am telling you what is factual: it isn't coming next year, and it isn't coming at all. There is no way to program for things like "initiative," and that is what is required to take AI to the next level. Everything is a command to an AI; it has no initiative. It drives to the field and stops, because to it, the task is complete. It got us to the pumpkin patch. Task complete. Now what? You have to feed it the next task, that's what. It won't do it on its own.

Surur t1_je074un wrote

> Everything is a command to an AI; it has no initiative. It drives to the field and stops, because to it, the task is complete.

Sure, but a fully conscious and intelligent human taxi driver would do the same.

AIs are perfectly capable of making multi-step plans, and of course when they come to the end of the plan they should go dormant. We don't want AIs driving around with no one in command.

speedywilfork t1_je09cgt wrote

>Sure, but a fully conscious and intelligent human taxi driver would do the same.

But not me driving myself, and that is the point. My point is that we won't have Level 5 autonomy in anything outside of designated routes and possibly taxis. There are things an AI will never be able to do, and a human can do them infinitely better. So my AI might drive me to the pumpkin patch; then I will take over.

>We don't want AIs driving around with no one in command

This is exactly why they will be stuck at the point they are right now and won't take over tons of jobs like everyone is claiming. They are HELPERS, nothing more. They can't reason, they can't think, they can't discern, and they don't have initiative. People will soon realize initiative is the human trait they are really looking for, not performing simple tasks that have to be babysat on a constant basis.
