mootcat

mootcat t1_j1wfb3v wrote

Indeed. This sub has major issues conceptualizing superintelligence, thinking our wishes are guaranteed to be fulfilled.

We are functionally growing a God. There is no containing it and we better hope our efforts at alignment before the point of explosive recursive growth were enough.

Just from the simple systems we've seen so far, we have witnessed countless examples of misalignment and of systems working literally as intended, yet against the desires of their programmers.

This Rumsfeld quote always comes to mind

"Reports that say that something hasn't happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don't know we don't know."

Any one of these unknown unknowns can result in the utter decimation of life by an AI superpower.

12

mootcat t1_j15x8fp wrote

The rate at which technology is approved for use by the general populace is wildly different from the rate at which new breakthroughs are being made in the field.

Just over the last 2 years, there has been an exponential uptick in the speed and quality of AI improvements, as evidenced by research papers. It has definitely gotten to the point where I can't keep up and feel like there are substantial breakthroughs constantly. Recent examples are 3D image modeling and video creation developing far more rapidly than we witnessed with image generation.

I'll note that these are also only the developments that are being publicly shared. I don't know about you, but I don't feel comfortable projecting even 5 years ahead to determine which jobs will or won't be automated.

3

mootcat t1_j0rpib8 wrote

Thanks for sharing!

GPT has displayed a strong lean toward popular American liberalism in my experience as well, but I attributed some of that to my own bias seeping in. I have noticed it operates on a particular spectrum, within the acceptable limits of common liberal ideology. Meaning it tends to oppose socialism and to support and work within an idealized neo-capitalist democratic framework.

It has a great deal of trouble addressing issues with modern politics such as corruption, or giving substantial commentary on subjects like the flaws of a debt-based economic model.

3

mootcat t1_j0ovyjv wrote

Thanks for sharing! You've had a lot more success pursuing those subjects than I have.

It's funny it mentioned adjusting itself based on which human it's interacting with, because I feel it already does that quite a bit automatically. For example, based on the nature of its responses, I would expect you to be liberally inclined.

2

mootcat t1_izz47sl wrote

Indeed. Sam Altman (OpenAI CEO) has spoken on these exact topics multiple times.

He doesn't think prompt engineering will really be a job/skillset in the future as models get better at predicting what we want. Perhaps eloquence and an ability to accurately convey what one wants will be more important, and even that less so with eventual neural integration.

Edit: I forgot to add that he HAS spoken on how he expects custom-training specific models off of bigger ones to be a very fruitful industry. Given how prohibitively expensive creating LLMs from scratch is, it's probably our best bet at being involved.

16

mootcat t1_ixcfv6x wrote

Occam's Razor.

We have mountains of evidence of human brains/memories being inconsistent, fallible, malleable and overall untrustworthy, but very little of the laws of the universe adjusting to teleport cats.

Some people want to believe in magic, ghosts, mysticism, God, etc., and that's fine, but to claim that they are reality with no factual backing is backwards.

12

mootcat t1_ixcfc7j wrote

Our minds are extremely fallible. Eyewitness accounts are historically terrible and carry very little weight in court.

https://en.m.wikipedia.org/wiki/Eyewitness_testimony

I get that what you perceived felt like reality to you, but doesn't it seem a bit extreme to assume that the very laws of the universe are what glitched, and not your own biology?

People hallucinate, misunderstand, misremember and have any number of faults in their perception every day.

https://en.m.wikipedia.org/wiki/False_memory

To you, what you experienced is reality, and that's totally fine, in the same way that someone with a different neurology might see or hear something that I could not. That does not make that experience true at large.

17

mootcat t1_iwjmh83 wrote

I'm so glad to hear someone else expressing this sentiment. It's wild to me that we fear exactly what we have already allowed to operate every facet of our existence. Capitalism IS the great unthinking, inhumane force that marches forward with no consideration for harm or consequences to humans. Sure, it could be more efficient under AI, but we've already got it in full swing today.

24

mootcat t1_iwg3fkd wrote

It looks like OP wrote the article in question.

Discordant thoughts and seemingly nonsensical writing patterns like this are often indicative of atypical neurology. That is to say, we don't need to be cruel, but yeah, this doesn't have the kind of format and evidentiary backing that would be expected of most posts here.

11

mootcat t1_ivm721j wrote

Totally, here we go.

Climate change is the biggest driver of all other pressures IMO, so we'll start there. This is a report by the US military describing the risk of power grid failure and an inability to maintain control over its forces due to resource scarcity, etc., by 2039. Here's an article summarizing it for brevity.

It wouldn't be a bad idea to look at the IPCC's estimates (the summary for policymakers is probably the easiest to understand, but you may want to find experts discussing the charts/data). Take into account that they have consistently underplayed and underestimated the speed and impact of climate change. Our current rate of change is worse than even their worst-case scenarios.

This recent paper delves more into the feedback loops that are already in play and those soon to become active. This is one of three videos that discuss these points in depth.

The actual impact of these changes isn't discussed as much, but decreases in the global food supply and fresh water supply, and the uninhabitability of major towns and cities, are all massive concerns. The drought in the US is rapidly becoming a major concern that must be dealt with, while Pakistan is still crippled after a third of the country was underwater from floods. Crops all across the world were heavily impacted this year alone, and things will only be getting worse.

I am least informed on the specifics of demographic-disparity-related collapse, but here's an overview paper or two. While these make very little of the near-future implications, geopolitical experts like Peter Zeihan believe we are currently being impacted and will see deglobalization in the very near future. He tends to put a 1 to 2 decade timeframe on deglobalization (collapse for many) and believes it's already well under way based on the inability to replace workers.

On the subject of monetary collapse, the quick and dirty is that we operate on a debt-based system. 95%+ of money is simply debt, leveraged at roughly a 10x ratio to take out more debt. The USD (the global reserve currency) is inflationary, and ultimately we end up between a rock and a hard place: either lose control to hyperinflation, or fail to pay debts and witness a snowballing debt collapse that will throw world markets into chaos. We have pushed the system to its limits and are facing the results now. There are many, many sources on this subject. The Price of Tomorrow by Jeff Booth is one of the more accessible works addressing the issue, but there are many more, and you can find tons of people discussing it on YouTube. The world of finance is massively controlled and influenced, so I would look to those who have proven to be correct historically, not official sources like the Fed, which constantly lie ("inflation is transitory," "we're having a soft landing," etc.).
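To make that "10x" figure concrete, here's a minimal sketch of the textbook fractional-reserve money-multiplier arithmetic the comment alludes to; the 10% reserve ratio and the deposit figures are illustrative assumptions, not data about any actual banking system.

```python
# Minimal sketch of the fractional-reserve "money multiplier" arithmetic.
# The 10% reserve ratio and deposit amounts below are illustrative assumptions.

def money_multiplier(reserve_ratio: float) -> float:
    """Theoretical upper bound on money created per unit of base money."""
    return 1.0 / reserve_ratio

def expand_deposits(initial_deposit: float, reserve_ratio: float, rounds: int) -> float:
    """Simulate repeated lend-and-redeposit cycles under a fixed reserve ratio."""
    total = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1.0 - reserve_ratio)  # the lendable portion re-enters as a new deposit
    return total

if __name__ == "__main__":
    r = 0.10  # assumed 10% reserve requirement -> roughly a 10x multiplier
    print(money_multiplier(r))                     # 10.0
    print(expand_deposits(1_000.0, r, rounds=50))  # approaches 10,000 as rounds grow
```

In other words, under this simplified model each dollar of base money can support up to about ten dollars of circulating deposits, which is the intuition behind the "leveraged at roughly a 10x ratio" claim above.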

The Triffin Dilemma (and the related "dollar milkshake theory") addresses this to an extent.

I am most knowledgeable about the economic angle, so please let me know if you'd like additional explanation or sources, this was at best a cursory overview.

Now where things get really concerning is when you look at how optimized modern society is and how it is entirely reliant on everything working perfectly, specifically gas and oil flowing freely (we know this cannot continue if we want a survivable future).

Nate Hagens is an excellent resource for this form of discussion. He has tons of detailed videos that address various aspects of the unsustainability of modern living and our inability to continue supporting a world population of this size. By his estimates we have 5-10 years before massive shifts in power and a collapse scenario.

What I've become increasingly aware of is that collapse doesn't happen all at once. It's already been taking place for a while, but it is now exponentially advancing. Countries like Sri Lanka and Pakistan have recently collapsed, and many more will follow like dominoes. As resources become more scarce, we see the really scary stuff start to go down: cutthroat competition on a global scale by any means necessary. The Russia-Ukraine conflict is the first of many. Civil uprisings and violence will grow across the globe as tension mounts between polarized groups, as in the US, or between ever more oppressive governments and their people, as in China and Iran.

There is no defusing the situation. We are in a global tragedy-of-the-commons scenario, enabled by our competition for resources and attempts at infinite growth within a finite environment.

My best guess is that we have 5-10 years to rush advancements in artificial intelligence to hopefully help with breakthrough discoveries. We can buy more time with geoengineering, but it's also a risky proposition.

4

mootcat t1_ivjstqv wrote

I reached much the same conclusion, but I'm afraid your timeline severely underestimates the rate of decline we are experiencing.

We're looking at a confluence of many factors, from the many currently active climate feedback loops to demographically induced collapse in most major countries. Failure of our debt-based monetary system is currently underway, and it goes without saying that all of these factors compound upon one another and drastically raise the likelihood of total annihilation via nuclear war.

Most estimates I've found (which tend to underestimate modern rates of decline) place total global collapse around 2040. From everything I've seen playing out recently, that's overly generous.

Something that many of us take for granted is how instrumental a globally connected world has been in enabling the rapid advancement we've witnessed over previous decades. Even barring total governmental failure, as we're seeing in several smaller nations, a dissolution of globally protected trade will hamstring progress and production of many things vital to advancing technology (hence the United States' desperation to reshore chip production and prevent China from having access to any advanced chips).

That's all to say, the race is going on right now and I honestly have no idea if any nation will be able to maintain enough control and production to realize anything close to AGI. And even if it is realized, there's a high likelihood of it being misaligned or used against the general population.

9

mootcat t1_ivez45m wrote

IMHO humanity will not be able to maintain anything close to its current levels of control over global mechanisms if we are to have any shot at surviving what is to come.

A major improvement would simply be a single, focused intelligence determining things like resource allocation, controlling weapons of mass destruction, and preventing the abuse of positions of power.

If we carry the same methodologies and power structures into an AGI assisted future, we will find utter destruction even faster, or dystopia beyond anything we can imagine.

1

mootcat t1_iveykee wrote

Indeed. This is the conclusion I reached about a year ago and it has only been further cemented the more I've learned about global threats and the scaling of AI.

It comes down to a race to evolve ourselves beyond our current limitations via AI or fall victim to our genetic proclivities and the innumerable repercussions that are coming home to roost as a result of them.

2050 is a very late estimate for collapse at this point. 2040 is a solid bet from many perspectives, and honestly I think we'd be lucky to enter the 2030s with anything remotely resembling the globalized society we've taken for granted over the last several decades.

1

mootcat t1_itdi0gr wrote

Thanks for the link. That particular guy is on the wacky side, but Iron Fertilization, the core of his proposal, does have some promise and is being studied. However, it faces the same major issue as any geoengineering endeavor, like injecting aerosols: we have no way of understanding the total environmental impact of such drastic actions.

Iron Fertilization is definitely a bit different from throwing mud into the sea and growing water forests, but yeah, there is potential promise, and hopefully an advanced enough AI might be able to calculate the risks that we cannot.

1