
No_Ninja3309_NoNoYes t1_j30vkg5 wrote

Someone went from 'ChatGPT is a great toy' to 'this is some sort of AGI!!!'. We don't even agree on what intelligence is, or why it should be general. I mean, I know wicked smart people, really smart, and they are nowhere near Einstein when it comes to physics. But that's fine, right? I know very little about economics, yet I wouldn't say I have no general intelligence. Can't tell you what general intelligence means, though.

But I think computer vision, language, spatial awareness, simple logic, and basic knowledge are a must. And possibly seven other things. The Turing test sounds reasonable, and there are IQ tests, but without a PhD in the relevant field I don't want to propagate misconceptions. It seems we're so far into the hype cycle that anything goes.

So I think we have to calm down and think things through. What's the worst that can happen? What's the best that can happen? How likely are they? IMO the worst is killer robots, autonomous or semi-autonomous. I think they are unlikely in the short term, but in ten years maybe not so much. The best thing, in my opinion, would be that we're able to solve many problems and usher in another scientific revolution. Also unlikely, since the Einsteins of the world are not blogging or active on social media. They communicate through scientific papers, and no one except other experts can read those.

And another thing: this talk of parameters is misguided. It all sounds like 'I have a penny; if I had billions of dollars, I could buy the moon'. First, more parameters mean nothing if the data or the programming is bad. Second, you need time and compute to find good values for the parameters. You can think of them as pixels in a picture. This is an oversimplification, of course. You need to find the Mona Lisa, and for that you need the right colour for each dot of the painting. IMO ChatGPT doesn't have all its pixels right, but somehow it beats the competition. The more pixels you have, the harder it is to get them all right: the space of possible combinations blows up exponentially. If you have ten possible colours, two pixels correspond to a hundred combinations, six to a million, twelve to a trillion. And a parameter in a neural network is usually a single- or double-precision floating-point number, 32 or 64 bits, so each one has billions of possible bit patterns rather than ten colours.
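To make that counting argument concrete, here's a minimal Python sketch of the 'pixels and colours' blow-up. It's only a toy illustration of the analogy above (the function name and numbers are made up for this example), not a claim about how models are actually trained:

```python
# Toy model of the combinatorial blow-up: if each "pixel" (parameter)
# can take one of `values` settings, the number of possible
# configurations is values ** pixels.

def configurations(values: int, pixels: int) -> int:
    """Number of distinct ways to fill `pixels` slots, each with `values` choices."""
    return values ** pixels

for pixels in (2, 6, 12):
    print(f"{pixels} pixels, 10 colours each: {configurations(10, pixels):,} combinations")
# 2 pixels  -> 100
# 6 pixels  -> 1,000,000
# 12 pixels -> 1,000,000,000,000

# A 32-bit float has 2**32 (about 4.3 billion) distinct bit patterns,
# so even a single parameter has a far larger "palette" than ten colours.
print(f"Distinct 32-bit patterns per parameter: {2**32:,}")
```

The point of the sketch is just that the search space grows exponentially with the number of parameters, which is why 'more parameters' alone guarantees nothing.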

Overall, we don't have AGI yet (whatever AGI means). There are good and bad things that could happen, but the more you stretch the narrative, the less likely it becomes. It's fun to talk about parameter counts, but it's like talking about brain volume. Also, I don't understand the obsession with AGI. Specialized AI is fine, right? ChatGPT does a good job if you know its limitations.
