Five_Decades t1_ja8htww wrote
We really can't predict it, because the underlying science that will make future technology possible probably hasn't been discovered yet. Someone from the 19th century couldn't have fathomed atomic bombs, quantum computers, or 5 nm processors, because the underlying science behind them hadn't been discovered yet.
I assume Matrioshka brains and Dyson spheres will exist 200 years from now. Faster-than-light travel too, if it's possible. Beyond that, who knows. Maybe we will know how to change the laws of the universe by then.
Five_Decades t1_ja491a6 wrote
Reply to comment by Melodic_Manager_9555 in The 2030s are going to be wild by UnionPacifik
It's hard to say. The main reasons people don't have kids are lack of free time, low quality of life, and lack of finances. A world with rampant machine intelligence should change all of those limiting factors.
Also, a post-singularity world would likely be an interplanetary and interstellar civilization, so there would be far more territory to live on.
Five_Decades t1_ja1cti3 wrote
Reply to The 2030s are going to be wild by UnionPacifik
If you compare the time before versus after the Industrial Revolution, a lot of things changed dramatically.
Economic growth occurred 50x faster. Population grew by a factor of 15-20x. Total GDP skyrocketed. The pace of advances in STEM, medicine, etc. improved dramatically.
The same thing will happen when we have machine cognition: radical advances in economics, science, technology, and population. However, I don't know when we will hit that period. Hopefully soon, but who knows.
Five_Decades t1_ja1brhn wrote
Reply to comment by spacefarer2245 in The 2030s are going to be wild by UnionPacifik
Maybe an idiot who is too incompetent to do better.
Or maybe we invented the idiot because the reality that natural selection invented life is too scary.
Five_Decades t1_j9zq5x6 wrote
> What do people not understand about exponential growth?
Exponential growth in hardware doesn't mean an exponential growth in how useful technology is in our lives. Modern gaming consoles are billions of times more powerful than an original Nintendo, but they aren't billions of times more fun and enjoyable.
I don't think Kurzweil is correct in assuming that every factor-of-1,000 growth in hardware makes AI 1,000x more powerful compared to humans. I have no idea where all this will lead, or when.
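One toy way to picture the disconnect: suppose hardware keeps doubling on a fixed schedule, but the usefulness we actually feel scales more like the logarithm of compute. Everything here is an illustrative assumption (the doubling period, the log scaling), not Kurzweil's model or anyone's real data; it's just a sketch of why 1,000x more hardware doesn't have to feel 1,000x better.

```python
import math

# Toy model with made-up assumptions: hardware doubles every DOUBLING_YEARS,
# while "perceived benefit" is assumed to grow with log10 of compute.
DOUBLING_YEARS = 2   # assumed Moore's-law-style doubling period
YEARS = 20           # horizon to simulate

def relative_compute(years: float) -> float:
    """Hardware capability relative to year 0, assuming steady doubling."""
    return 2 ** (years / DOUBLING_YEARS)

def perceived_benefit(compute: float) -> float:
    """Hypothetical log-scaled usefulness of that compute (pure assumption)."""
    return math.log10(compute) + 1  # +1 so year 0 starts at 1.0

for year in range(0, YEARS + 1, 5):
    c = relative_compute(year)
    print(f"year {year:2d}: compute {c:10,.0f}x, perceived benefit {perceived_benefit(c):.1f}x")
```

Under these made-up numbers, twenty years of doubling gives roughly 1,000x the compute but only about 4x the felt benefit, which is the shape of the gap I'm describing.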
I think ASI is inevitable; I just don't know what impact it'll have or when it'll arrive.
Five_Decades t1_j7dkfr9 wrote
Reply to comment by [deleted] in What weak signals or drivers of change—that receive limited attention today—are most likely to create significant impacts over the next 10-20 years? Where are the black swans hiding? by NewDiscourse
Yup. There are about 60,000 known diseases, each with endless research papers and risk factors tied to it. Machines will be able to navigate those millions of papers and books to find the most likely cause and cure of each disease. Medicine will be in a new golden age by mid-century.
Five_Decades t1_j2v5t0r wrote
Reply to Asked ChatGPT to write the best supplement stack for increasing intelligence by micahdjt1221
Add in creatine, Dual-n-Back exercises and hyperbaric oxygen.
Five_Decades t1_j2nd03a wrote
Reply to comment by multiverseportalgun in A Drug to Treat Aging May Not Be a Pipe-Dream by Mynameis__--__
When will they make us a baller?
When will they make us a girl who looks good so we can call her?
Five_Decades t1_ix4cwy7 wrote
Reply to comment by HeinrichTheWolf_17 in is it ignorant for me to constantly have the singularity in my mind when discussing the future/issues of the future? by blxoom
> which everyone is working on in some form or another due to the law of accelerating returns.
I'm not seeing accelerating returns. Yes, hardware is growing exponentially, which is great. But that doesn't translate into exponential, or even linear, growth in technology that benefits the human race. Technology as it is applied isn't much better than it was a decade ago, despite hardware being 1000x better.
Five_Decades t1_ix4c6va wrote
Reply to is it ignorant for me to constantly have the singularity in my mind when discussing the future/issues of the future? by blxoom
In some ways, yes. Singularitarianism is just religion for atheist nerds (like myself). It's a desire for a deus ex machina to intervene and help us rise above the boredom, misery, suffering, and helplessness that define the human condition (and the condition of biology in general).
We really don't know what the timeline for ASI is, or what ASI will be capable of. When Kurzweil wrote The Singularity Is Near, he predicted we'd have a biotech revolution in the 2010s and a nanotech revolution in the 2020s. Neither happened. I think that on a long enough timeline ASI is inevitable, but I don't know what that timeline is.
There is also the fact that we live in a global oligarchy, and there is a very real chance that AGI or ASI will be used to help the oligarchs maintain their wealth and power rather than to make life better for the masses. China is implementing massive AI-based surveillance. We are all addicted to our smartphones. The rich and powerful largely decide the fate of humanity. It sucks, but it's true.
Five_Decades t1_jb6tyfa wrote
Reply to What might slow this down? by Beautiful-Cancel6235
Aside from a war with China and/or massive cuts in financial investment in AI, I don't think anything would slow it down.