Five_Decades

Five_Decades t1_ja8htww wrote

We really can't predict it because the underlying science that'll make future technology possible probably hasn't been discovered yet. Someone from the 19th century couldn't have fathomed atomic bombs, quantum computers, or 5nm processors, because the underlying science behind them hadn't been discovered yet.

I assume Matrioshka brains and Dyson spheres will exist 200 years from now, and faster-than-light travel too, if it's possible. Beyond that, who knows. Maybe we will know how to change the laws of the universe by then.

4

Five_Decades t1_ja491a6 wrote

It's hard to say. The main reasons people don't have kids are lack of free time, low quality of life, and lack of finances. A world with widespread machine intelligence should change all of those limiting factors.

Also, a post-singularity world would likely be an interplanetary and interstellar civilization, so there would be far more territory to live on.

3

Five_Decades t1_ja1cti3 wrote

If you compare the time before versus after the Industrial Revolution, a lot of things changed dramatically.

Economic growth occurred 50x faster. Population grew by a factor of 15-20x. Total GDP skyrocketed. The pace of advances in STEM, medicine, etc. improved dramatically.

The same thing will happen when we have machine cognition: radical advances in economics, science, technology, and population. However, I don't know when we will hit that period. Hopefully soon, but who knows.

35

Five_Decades t1_j9zq5x6 wrote

> What do people not understand about exponential growth?

Exponential growth in hardware doesn't mean exponential growth in how useful technology is in our lives. Modern gaming consoles are billions of times more powerful than an original Nintendo, but they aren't billions of times more fun and enjoyable.

I don't think Kurzweil is correct in assuming that every 1000x growth in hardware makes AI 1000x more powerful relative to humans. I have no idea where all this will lead, or when.

I think ASI is inevitable, I just don't know what impact it'll have or when it'll arrive.

1

Five_Decades t1_j7dkfr9 wrote

Yup. There are about 60,000 known diseases, each with endless research papers and risk factors tied to it. Machines will be able to navigate the millions of papers and books to find the most likely cause and cure for each disease. Medicine will be in a new golden age by mid-century.

4

Five_Decades t1_ix4cwy7 wrote

> which everyone is working on in some form or another due to the law of accelerating returns.

I'm not seeing accelerating returns. Yes, hardware is growing exponentially, which is great. But that doesn't translate into exponential, or even linear, growth in technology that benefits the human race. Technology as it is applied isn't much better than it was a decade ago, despite hardware being 1000x better.

4

Five_Decades t1_ix4c6va wrote

In some ways, yes. Singularitarianism is just religion for atheist nerds (like myself). It's a desire for a deus ex machina to intervene and help us rise above the boredom, misery, suffering, and helplessness that define the human condition (and that of biology in general).

We really don't know what the timeline for ASI is, or what ASI will be capable of. When Kurzweil wrote The Singularity Is Near, he predicted we'd have a biotech revolution in the 2010s and a nanotech revolution in the 2020s. Neither happened. I think ASI is inevitable on a long enough timeline, but I don't know what that timeline is.

There is also the fact that we live in a global oligarchy, and there is a very real chance that AGI or ASI will be used to help the oligarchs maintain their wealth and power rather than make life better for the masses. China is implementing massive AI-based surveillance. We are all addicted to our smartphones. The rich and powerful largely decide the fate of humanity. It sucks, but it's true.

1