Gortanian2

Gortanian2 OP t1_je0yq76 wrote

“An army of Einsteins and von Neumanns in constant, rapid communication that never sleeps, never forgets, and never dies.”

I wonder how fruitful those conversations would be if each one already knows everything the others know. I think it may become something more like a single Einstein-level intelligence with an army of bodies to explore with. A hivemind.

Thank you for your comment; it has given me new ideas to ponder. And I agree. We would not need unbounded exponential growth to drastically shape our reality.

1

Gortanian2 OP t1_je0ww5u wrote

Reply to comment by qrayons in Singularity is a hypothesis by Gortanian2

You make an excellent point. Even a basic AGI would be able to absorb an insane amount of knowledge from its environment in a matter of weeks. Thank you for your comment; it has altered my perspective.

1

Gortanian2 OP t1_jdyxz07 wrote

I don’t believe they’re treated as “normal,” but it’s almost impossible to refute something like faith.

There’s absolutely nothing wrong with being excited about the real possibility of a better future.

1

Gortanian2 OP t1_jdythpp wrote

It seems obvious, right? Just tell the AI to rewrite and improve its own code repeatedly, and it takes off.

As it turns out, recursive self-improvement doesn’t necessarily work like that. There might be limits to how much improvement can be made this way. The second article I linked gives an intuitive explanation.
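One way to see the intuition: a toy numerical sketch (all numbers are hypothetical, chosen only for illustration). If each successive self-rewrite yields a smaller relative gain than the last, total capability converges to a finite ceiling instead of exploding.

```python
# Toy model of self-improvement with diminishing returns.
# Each rewrite multiplies capability by (1 + gain), but the
# achievable gain shrinks geometrically each round.
cap = 1.0    # starting capability (arbitrary units)
gain = 0.5   # first rewrite improves capability by 50% (hypothetical)
decay = 0.6  # each later rewrite finds only 60% of the previous gain

for step in range(1000):
    cap *= 1 + gain
    gain *= decay

# Capability plateaus near a finite value (about 3x the starting
# level with these numbers) rather than diverging to infinity.
print(cap)
```

Whether real self-improvement curves look like this is exactly what's in dispute; the sketch only shows that "improves itself repeatedly" does not by itself imply unbounded growth.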

7

Gortanian2 OP t1_jdxrco9 wrote

The first sentence is true and I agree with you. The second sentence is not. Feral children, those who were cut off from human contact during their developmental years, have been found to be incapable of living normal lives afterwards.

1

Gortanian2 OP t1_jdxp2f9 wrote

It’s truly fascinating. And I agree that it is a possible risk. But I don’t think people should start living their lives as if it is an absolute certainty that ASI will solve all their problems within the next couple of decades.

My point is that people should consider both possibilities: either the singularity will happen, or it won’t. And there are well thought-out arguments for both sides even if we disagree with them.

7

Gortanian2 OP t1_jdxofbi wrote

Thank you. I completely agree with all of this. The criticism I’m raising is against a literal singularity event. As in, unbounded recursive self-improvement where we will see ASI with godlike abilities weeks after AGI gets to touch its own brain.

But I agree that AGI is going to change the world in surprising ways.

21

Gortanian2 OP t1_jdxkjna wrote

  1. Very strong counterargument. Love it.

  2. Again, strong, but I would argue that we don’t know where we are in terms of algorithm optimization. We could be very close or very far from perfect.

  3. I would push back and say that the parent doesn’t raise the child alone. The village raises the child. In today’s age, children are being raised by the internet. And it could be argued that the village/internet as a collective is a greater “intelligence agent” making a lesser one. Which does bring up the question of how exactly we made it this far.

1

Gortanian2 OP t1_jdxdpev wrote

Thank you for your response. The logistical issues I see in these articles that get in the way of unbounded recursive self-improvement, which is thought by many to be the main driver of a singularity event, are as follows:

  1. The end of Moore’s law. This is something that the CEO of Nvidia himself has stated.
  2. The theoretical limits of algorithm optimization. There is such a thing as a perfect algorithm, and optimization beyond that point is impossible.
  3. The philosophical argument that an intelligent entity cannot become smarter than its own environment or “creator.” A single person did not invent ChatGPT; it is instead the culmination of the sum total of civilization today. In other words, civilization creates AI, which is a dumber version of itself.
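To make point 2 concrete, here's a minimal sketch (Python, purely illustrative): binary search over a sorted list already matches the information-theoretic lower bound of ceil(log2(n + 1)) three-way comparisons for searching n sorted items, so no amount of self-optimization can make a comparison-based search asymptotically faster. It is, in this model, already a "perfect algorithm."

```python
import math

def binary_search(arr, target):
    """Return (index, comparison_count); index is -1 if target is absent."""
    lo, hi, comparisons = 0, len(arr) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1  # one three-way comparison against arr[mid]
        if arr[mid] == target:
            return mid, comparisons
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

n = 1_000_000
arr = list(range(n))
_, comparisons = binary_search(arr, n - 1)  # worst case: last element

# Any comparison-based search of n sorted items needs at least
# ceil(log2(n + 1)) comparisons in the worst case; binary search meets it.
lower_bound = math.ceil(math.log2(n + 1))
print(comparisons, lower_bound)  # → 20 20
```

Of course, real systems can still improve by changing the problem model (hashing, better hardware, parallelism), which is part of why I don't think these arguments are airtight.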

I do not believe these arguments are irrefutable. In fact, I would like them to be refuted. But I don’t believe you have given the opposition a fair representation.

3