
beeen_there t1_j09fimw wrote

Can we just stop confusing vast datasets with intelligence please?

45

Readityesterday2 t1_j09vv0s wrote

This has been my stance until recently. I’d ask, would Von Neumann have named these AI and neural networks? Terms of biology, mind you. Or would he have gone the route of Matrix-Based Linear Function Optimization Techniques, etc etc.

Seeing ChatGPT, and knowing GPT-4 is “like seeing the face of god”, as per one developer, I’m wondering if for all practical and functional purposes we have serious intelligence coming out of these LLMs. You can’t deny how insanely good this shit is. It was giving me jokes from the ancient era and they were fucking hilarious.

14

EvenPalpitation6074 t1_j0e3kr6 wrote

>“like seeing the face of god”

True AGI isn't required for task-trained, purpose-built intelligence, nor does anybody know how to build one. It may arise incidentally from a sufficiently overbuilt task-oriented AI, but that doesn't mean we'll have actually built one as people describe them.

Chatbots are like seeing the inner workings of language, any "face of god" is just anthropomorphism on our part.

1

EverythingGoodWas t1_j0e9omy wrote

People see these LLMs and think they are sentient or have some vast intelligence. They are a tool that is designed to be used with other tools. I appreciate what OpenAI has done to bring visibility to LLMs, but they are by no means “the face of God”.

1

TrekForce t1_j09os14 wrote

It is literally defined as artificial intelligence. It is more than just vast datasets. A large portion of AI programs use “neurons” (not biological ones, but software developed to behave like them). This is why it’s artificial intelligence. ChatGPT for instance (why not, it’s the new big thing) isn’t just quoting things from its vast dataset; it is interpreting the language in the prompt and responding with what it thinks is the most probable response based on its vast dataset. The response is likely to be nowhere inside that dataset, however.
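For what it's worth, the “software neuron” mentioned above is a pretty simple idea on its own. Here's a toy sketch (not any real framework's API, just the textbook form): a weighted sum of inputs squashed by a nonlinearity.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs, plus a bias term
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid activation squashes the result into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

out = neuron([0.5, -1.0, 2.0], [0.8, 0.2, -0.5], bias=0.1)
```

Stack millions of these, let training adjust the weights, and you get the networks being argued about here. The intelligence question is about what the stack does, not about any one unit.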

8

beeen_there t1_j09qct2 wrote

Can we just stop confusing vast datasets plus instructions for output with intelligence please?

OK? Happy now?

13

TrekForce t1_j0ab6jk wrote

It’s not intelligence. It’s artificial intelligence. But ok. Not sure what your point is. You don’t seem to have one. Nobody is confusing anything except you.

4

Fake_William_Shatner t1_j0jc2ts wrote

>It’s not intelligence. It’s artificial intelligence.

It's really not intelligence yet, and therein lies the confusion. There is no THINKING going on. It's Machine Learning with an Expert System and LARGE datasets. With enough samples it can sound like it's smart.

Neurons are vastly more complex than just a bundle of connections. They have protein storage for long-term information (equivalent to memory in a computer). They're supported by glial cells. And they function together in a sort of analog way.

There is one gene of difference that separates human intelligence from chimpanzee intelligence. And I think it might be part of the folding in the brain...

Anyway. The collection of parts in various algorithms might get a very close approximation of intelligence. It might be "insightful." But it won't be conscious and not actually intelligent.

Now, the "gestalt" of various systems tied together could very well be conscious and intelligent. I really thought it was going to require a paradigm shift away from binary computing, but much to my chagrin, it looks like we are less complicated than we thought. Still, there is one more trick we do that nobody is doing yet with these machines, and there might be some interface with quantum effects.

The human brain is doing something like Stable Diffusion on a constant basis. Our perception is slightly in the future, anticipating our environment, constantly.

The functional parts that work together to make a human mind all seem to be in development. And like humans, each alone won't be conscious.

Also, humans I think are only MOSTLY conscious. We rationalize more than we are rational, and we think we are making choices all the time. But we aren't fully aware of things objectively and take a lot of shortcuts.

And a lot of us choose not to be intelligent on a regular basis, so, once the AI gets the conscious bit, it won't be much of a leap to get ahead.

0

PrimeWasabiBanana t1_j0a21q3 wrote

I thought the probable response was just that: based on the probability of what a human would respond with, according to its dataset. So it's still math all the way down, right? I want to say that's not intelligence, it's just doing math. But, I mean, I guess determinism in human thought could be thought of as math too.
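The "math all the way down" part really is this simple at the bottom, at least in sketch form. Here's a toy softmax over some made-up scores (hypothetical numbers, not from any real model): the model assigns each candidate next word a score, the scores become a probability distribution, and the reply is built by repeatedly picking from distributions like this.

```python
import math

# Hypothetical scores ("logits") a model might assign to candidate next words
logits = {"hello": 2.0, "goodbye": 0.5, "potato": -1.0}

# Softmax: exponentiate, then normalize so the values sum to 1
exps = {word: math.exp(score) for word, score in logits.items()}
total = sum(exps.values())
probs = {word: e / total for word, e in exps.items()}

# Greedy choice: take the most probable word
most_likely = max(probs, key=probs.get)  # "hello"
```

Whether stacking enough of that counts as intelligence is exactly the argument in this thread.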

−3

M0romete t1_j0bc3ll wrote

But all brains are prediction machines too, just very very sophisticated ones. At a certain point something called emergence kicks in so you can’t just say AI is statistics or math.

2

beeen_there t1_j0bd9yy wrote

>you can’t just say AI is statistics or math.

well of course you can

−4

M0romete t1_j0beaht wrote

Yeah but you’d be wrong. It’s like saying everything in the universe is subatomic particles. While technically true, it’s misleading and leaves out a lot of details.

3

beeen_there t1_j0bgib8 wrote

what, like the techzealot misconception that statistics, math and a set of complicated instructions somehow = intelligence?

No point in arguing over semantics, but imho it's obvious enough that human experience, wisdom, feel and emotion are essential to intelligence. And computers don't have those.

They can imitate intelligent output, but they are not intelligent.

−2

monsieuryuan t1_j0bjser wrote

>No point in arguing over semantics

Then proceeds to do exactly that.

It's called artificial intelligence because nobody coded those instructions. The model learned them on its own through exposure and experience, just like living things do. This is a huge departure from humans explicitly writing the instructions.

>human experience, wisdom, feel and emotion are essential to intelligence

Experience and wisdom are one and the same. So are feel and emotions. So your definition of intelligence boils down to experience and emotions.

Experience is exactly what these things use to learn.

Emotions: AI models can learn to recognize those and adjust their output accordingly if that is their purpose. If you're talking about them feeling emotions on their own, then you're defining intelligence as sentience, which AI totally has the potential to achieve.

5

beeen_there t1_j0boped wrote

> AI totally has the potential to achieve.

It really doesn't. But you're obviously in religious-faith mode here, a techzealot. Otherwise you wouldn't try to claim experience and wisdom are one and the same, or feel and emotions are one and the same. That demonstrates an incredibly superficial understanding of all those.

An understanding very similar to AI or a bot. An impression of understanding.

−1

monsieuryuan t1_j0c7u6i wrote

Tell me then: what's the difference between experience and wisdom? What's the difference between feelings and emotions that's actually relevant in this discussion?

I'm not a tech zealot at all. I simply understand why they call it 'artificial intelligence', and it's quite justifiable as a moniker.

Edit: I love how you just call anyone who disagrees with you a tech zealot. You haven't made any substantive argument or demonstrated any in-field knowledge, but I suppose the latter would make one a tech zealot, right?

6

beeen_there t1_j0ckc5u wrote

Not feelings, feel - different from emotion; if you're creative you'd know what that is. Do you? Paint or compose or write or cook or whatever?

Are you seriously asking me the difference between experience and wisdom? How about you start with a dictionary, then have a think, then come back if you still don't know.

I don't call anyone who disagrees with me a tech zealot, but there is this tech intensity in some people that is like religious fervour, and as such completely misses the main points.

−1

monsieuryuan t1_j0cs1bs wrote

I get what you mean by feel vs emotions now. Though it's a matter of personal opinion whether that's necessary to characterize something as intelligent.

One can easily argue that wisdom is a consequence of acquiring experience. It's part of the decision-making, or 'instructions' as you put it. So in this sense, wisdom should be captured within experience in this context.

I don't have tech intensity. I don't believe tech will solve everything, or that AI stuff like self-driving will be imminently achievable. I just understand why they give artificial intelligence that moniker: it learns its own instructions, instead of a human explicitly coding them, which is quite justifiable.

4

beeen_there t1_j0m7hd7 wrote

> I just understand why they give artificial intelligence that moniker

it's called marketing

1