
Sashinii t1_iyz2qt6 wrote

Both music and video synthesis will be as advanced as image synthesis is now; text-to-game will emerge; GPT-4 will falsely be called AGI (but it might be proto-AGI if the rumors are true); the scaled-up Gato will at least be approaching proto-AGI; more AI companies will rightfully focus on multimodality; more automation; greater support for basic income because of the aforementioned automation; the ozone layer will continue healing; computer simulations will significantly improve; "The Singularity is Nearer" will be delayed again, as is tradition.

73

zero_for_effort t1_iyz7oi7 wrote

"The Singularity is Nearer" will be delayed again as is tradition - lol this rings true.

33

No_Ask_994 t1_iyzl0rt wrote

"The Singularity is Nearer" will be finished by the first ASI.

8

Lone-Pine t1_iz1egxq wrote

We are asymptotically approaching the release date of "The Singularity is Nearer".

4

Prior-Replacement637 t1_iz9k5zh wrote

Do you predict that GPT-4 will pass the Turing test?

2

Sashinii t1_iza08hl wrote

I think there's a good chance that GPT-4 will indeed pass the Turing test.

3

hducug t1_iyzrg3f wrote

GPT-4 has nothing to do with general intelligence. It's just a language model that predicts what to say next and generates text based on that, not a problem-solving AI. It couldn't score 100 on an IQ test.

−9
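For context on the "predicts what to say next" point being argued here, below is a minimal, illustrative sketch of an autoregressive next-token generation loop. It uses GPT-2 via the Hugging Face transformers library purely as a stand-in, since GPT-4 is not publicly available; the model choice, the greedy decoding, and the 20-token length are assumptions for illustration, not a description of GPT-4's internals.

    # Minimal sketch of autoregressive next-token generation (illustrative only).
    # GPT-2 is used as a stand-in model; GPT-4 may differ internally.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "The Singularity is"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    with torch.no_grad():
        for _ in range(20):                      # generate 20 tokens, one at a time
            logits = model(input_ids).logits     # scores for every vocabulary token
            next_id = logits[0, -1].argmax()     # greedy pick: the single most likely next token
            input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

    print(tokenizer.decode(input_ids[0]))

The loop only ever asks "which token is most likely next?"; whether stacking that objective at scale amounts to anything like reasoning is exactly what the rest of this thread is arguing about.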

Anomia_Flame t1_iz05s1t wrote

Oh, you're working on the project?

9

hducug t1_iz1idpl wrote

What does that have to do with anything? I'm just stating the fact that GPT-4 doesn't have thinking capacity or an IQ. It's just a language model that generates text, which it learned from a large variety of data like books, Wikipedia, web articles, etc. Is this really all you have to say?

PS: I can't believe this community is so childish as to downvote me because I crushed their little optimism ego. Some of y'all really are just NPCs with no thinking capacity sometimes, a lot like GPT-4 actually.

3

PolymorphismPrince t1_iz1w31t wrote

I actually don't think you understand large language models very well. The human brain is almost structurally isomorphic to a stimulus-prediction model if you think about it, and basically every stimulus can be encoded in text.

3

hducug t1_iz22ggk wrote

The human brain has something called logic, which language models don't have. Logic is literally what intelligence is all about. It doesn't matter that prediction models work the same way as our brain; that has nothing to do with GPT-4 being intelligent.

−3