MrZwink t1_izh2c3k wrote

It's not intelligence per se. Think of it more as automating cognitive functions. Computers are getting better than humans at many cognitive abilities, but they still lack common sense.

6

Drakolyik t1_izh7o4g wrote

Define common sense.

5

MrZwink t1_iziexmn wrote

They find correlation, not causation.

This means they have notorious difficulty with queries that make no sense. A good example is Galactica, Facebook's scientific-paper AI: ask it for the benefits of eating crushed glass and it tries to answer. It doesn't notice the question is flawed; it just tries to find data that correlates with the query and makes stuff up.
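To make that concrete, here's a rough sketch (not Galactica itself; the model and prompt are just stand-ins): a plain generative language model will continue a nonsensical prompt as readily as a sensible one, because all it models is which tokens tend to follow which.

```python
# Sketch only: any small causal LM shows the behaviour; gpt2 is just a stand-in.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# A flawed premise baked into the prompt...
prompt = "The health benefits of eating crushed glass include"

# ...still gets a fluent continuation. Nothing in the model checks the
# premise against reality; it only extends statistically likely text.
result = generator(prompt, max_new_tokens=40, do_sample=True)
print(result[0]["generated_text"])
```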

It's an open question whether we'll ever be able to teach AI common sense.

6

PeartsGarden t1_izk8bru wrote

Yeah but what if you never told a child about crushed glass? What if that child never dropped a glass, and never cut his/her finger while cleaning the mess? What would a child say?

Would you say that child lacks common sense? Does that child lack experience (a training set)?

2

MrZwink t1_izl1el1 wrote

I'm not getting into a whole philosophical debate. These AIs aren't meant to be a child that gives its opinion on a subject. They're expected to be oracles, and they're just not good enough yet.

2

PeartsGarden t1_izl89y7 wrote

> they're just not good enough yet.

My point is that this specific AI's training set may have been insufficient, just as a child's experiences can be insufficient. I think we can both agree that a child has common sense, at least a budding version of it.

1

MrZwink t1_izlbrmv wrote

It's not the training set that's the problem. It's the way the statistics approach the problem: correlation is not causation. AIs are a tool to automate cognitive processes, nothing more. We shouldn't expect them to be oracles.
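As a minimal illustration of the correlation point (made-up numbers, hypothetical variable names): two series that merely share a trend will correlate strongly even though neither causes the other, and a purely statistical learner can't tell the difference.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(100)

# Both series are driven by the same hidden trend (say, the season)...
ice_cream_sales = 2.0 * t + rng.normal(0, 5, size=t.size)
drownings = 0.5 * t + rng.normal(0, 5, size=t.size)

# ...so they correlate strongly, yet neither causes the other.
r = np.corrcoef(ice_cream_sales, drownings)[0, 1]
print(f"correlation: {r:.2f}")  # close to 1.0
```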

2