
TFenrir t1_j7wibx3 wrote

I think there are all kinds of people here, but I know the type you're describing. To a lot of people, the singularity feels like the most likely way they'll get to heaven.

I wonder where I stand on that spectrum when I try to be self-critical. I have a very good life: I make good money, I have a lot of social activities and fun hobbies, and I sincerely love life and always have.

Would I love a best case scenario for AI? Absolutely, who wouldn't?

But that's not the reason I think it's inevitable. I've been following a lot of really smart people - Demis Hassabis, Shane Legg, Ilya Sutskever, and more - people who are actually building this stuff. And I've seen how their language has changed.

I think you'd also be surprised at how many experts are moving up their timelines. Forecasting platforms, for example, show the shift clearly.

Out of curiosity, what experts are you referencing when you say most don't think we'll get anything transformative anytime soon?

48

Give-me-gainz t1_j7xgjt4 wrote

Depends on how you define soon. The median answer in this survey of AI experts is 2061 for AGI: https://ourworldindata.org/ai-timelines

10

GoodySherlok t1_j7z4gxy wrote

I believe this forecast holds up under the assumption that circumstances remain unchanged; however, that is a dubious assumption. (used chatgpt to properly express my thought)

It is hard to imagine that China and India will not change the trajectory in favor of the optimists.

AGI before 2050.

6

Embarrassed-Bison767 t1_j7z28uv wrote

I suppose if you don't believe in a religious paradise, you'll turn your eye to the closest technological analogue.

2