Submitted by apple_achia t3_ynmu55 in singularity

How do you square your belief in a coming technological singularity with the impending climate crisis? If roughly 1 billion people, by conservative estimates, are going to be displaced by 2050, do you believe the institutions necessary to usher in this singularity will still be functional? What about during the energy crises experts believe we will see at some point during our lifetimes prior to full decarbonization? Or in the midst of the increasing geopolitical tensions caused by increasing scarcity? It seems to me, any major rock of the global boat could throw back any prospective technological gains far out of reach for anyone alive today. How do you square these competing realities of technological progress and ecological degradation?




sideways t1_iv9q2w9 wrote

I think that AGI is the only viable path out of the deep ecological hole we, as a species, have dug for ourselves.

Of course, it's possible that a sudden acceleration in collapse derails the Singularity and modern civilization with it. Given that, I think we'll just barely make it.

Advances like AlphaFold, AlphaTensor and AI fusion plasma control are just the beginning. We're starting to grasp dramatically more effective scientific and technological tools that will enable us to solve currently intractable problems.


apple_achia OP t1_iv9yjiz wrote

So as to how AGI will solve the climate crisis: we already know the problem is fossil fuel consumption and excessive resource extraction. That is the human activity in question. Are you suggesting that AGI would coordinate human economic activity to prevent climate change in some way? Perhaps a way that would limit resource consumption to a sustainable level and assure a relatively equitable distribution of wealth and agency?

It may be able to figure out a technical fix to a few problems. Perhaps fossil fuels are eliminated by nuclear fusion, but then we have waste storage to deal with, as well as agricultural land coordination and management. And if that's solved, population growth may become so explosive that it becomes a problem. And you can't necessarily solve that with new technology; until we can terraform space, we'd have to make some difficult decisions.

If this is the case, how would this AGI coerce the non-cooperative or police its boundaries if someone were to open one oil drill too many, clear-cut a vital piece of forest, or strike out on their own in some unacceptable way?

Would AGI, then, have a monopoly on force and coercion as well as on economic boundaries, and therefore amount to a state?


sideways t1_iva1pv5 wrote

It's important to recognize that, as AGI enables other disruptive technologies, nobody can really predict how things are going to work out. That's kind of the premise of the Singularity.

An AGI somehow figuring out post-scarcity economics and genuine democracy certainly could come to pass - but I expect it would have to be an ASI and I'm not smart enough to suggest how it would work.

In the near term I'd expect AGI to facilitate more technical solutions like large scale carbon capture, advanced materials science for batteries and renewables and dramatically better and cleaner energy sources. Advances like these could solve the issues you brought up without requiring full-scale economic/social restructuring.


TheLastSamurai t1_ivbilcb wrote

It could also do something like make solar 1,000 times more efficient or create synergetic algae to capture carbon. There are a lot of possibilities, but I wouldn't pin our plans on it; it's not a given.


EscapeVelocity83 t1_ivcjdcw wrote

An AGI figuring it out is no different from anyone else figuring it out. The people in power don't want those solutions, because then they wouldn't be in power.


RikerT_USS_Lolipop t1_ivc8iea wrote

> like perhaps fossil fuels are eliminated by nuclear fusion, but then we have waste storage to deal with,

You need to learn about nuclear energy waste.


mootcat t1_ivez45m wrote

IMHO humanity will not be able to maintain anything close to its current levels of control over global mechanisms if we are to have any shot at surviving what is to come.

A major improvement would simply be a singular focused intelligence determining things like resource allocation, controlling weapons of mass destruction and preventing the abuse of positions of power.

If we carry the same methodologies and power structures into an AGI assisted future, we will find utter destruction even faster, or dystopia beyond anything we can imagine.


mootcat t1_iveykee wrote

Indeed. This is the conclusion I reached about a year ago and it has only been further cemented the more I've learned about global threats and the scaling of AI.

It comes down to a race to evolve ourselves beyond our current limitations via AI or fall victim to our genetic proclivities and the innumerable repercussions that are coming home to roost as a result of them.

2050 is a very late estimate for collapse at this point. 2040 is a solid bet from many perspectives, and honestly I think we'd be lucky to enter the 2030s with anything remotely resembling the globalized society we've taken for granted over the last several decades.


BrainBoy000 t1_iv9ovzu wrote

Nuclear fusion and carbon capture


Mr_Hu-Man t1_iva6971 wrote

Nuclear fission, direct air capture, green-powered desalination plants, wide scale adoption of renewables other than fission, increased capacity to store energy, nature-based solutions that put biodiversity at the forefront: these are how we can do it today. Then if fusion comes along: SORTED.


LausXY t1_ivaas99 wrote

Yeah this idea we just need to wait for the tech to be invented makes me nervous and it feels a bit like passing the buck to the next generations. There is lots we can do in the meantime as we're trying to make our own little Sun


naossoan t1_ivaqfjc wrote

Carbon capture is complete bullshit unless it's powered 100% by nuclear, geothermal, or some other kind of renewable energy.

The largest carbon capture plant today, in Iceland, only removes 4,000 tonnes of CO2 per year, roughly the annual emissions of 800-something cars.

Being Iceland I'm assuming it is powered by geothermal, which is great, but Iceland is one of the few places on earth where geothermal energy is relatively easily obtainable.

If a carbon capture plant is powered by coal or natural gas, its benefit is almost completely negated, at the very best reduced to a negligible offset, similarly to electric cars.

It's much more effective to remove those 800 something cars from the road in the first place.
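The cars-off-the-road comparison above can be sanity-checked with quick arithmetic. A minimal sketch, where the ~4.6 tonnes of CO2 per average passenger car per year is an assumed figure (roughly the commonly cited US average), and the 4,000-tonne plant capacity is taken from the comment:

```python
# Back-of-envelope check: how many cars' worth of annual CO2
# does a 4,000 tonne/year direct-air-capture plant offset?

PLANT_CAPTURE_TPY = 4_000  # tonnes CO2 captured per year (figure from the comment)
CAR_EMISSIONS_TPY = 4.6    # tonnes CO2 per average passenger car per year (assumed)

cars_offset = PLANT_CAPTURE_TPY / CAR_EMISSIONS_TPY
print(f"Equivalent cars removed: {cars_offset:.0f}")  # ~870 cars
```

Which lands in the same "800-something" ballpark; the exact count shifts with whatever per-car emissions figure you assume.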


shiddyfiddy t1_ivawgm6 wrote

That was a proof of concept plant. They have better funding now and the next plant will do about 35k tonnes per year.

West coast of north america has a lot of suitable sites.

This idea isn't close to being toast. Not yet anyway.


naossoan t1_ivb1t1y wrote

It's also nowhere close to being good, either.


maxiderpie t1_ivat00h wrote

I really wish people would stop looking at carbon capture (as in, direct air capture) as a viable solution to climate change and consider it for what it truly is: a publicity stunt energy companies can use to say they're doing something while continuing to profit off fossil fuels without a care in the world, and starving of funds those solutions that go against their interests (this video by AdamSomething gives some really nice info about it).

Reducing CO2 production at the source is the only viable way to tangibly slow down the effects of climate change. Nuclear fusion will hopefully be the silver bullet needed to actually turn around the situation.

Hell, at this point I would even be OK with Skynet taking the reins of global government and fixing this mess we've made. I mean, given the rate of advancement in the AI field in the last five years, we at least have some good reasons to hope so.


red75prime t1_ive1jo3 wrote

Extra CO2 that is already out there is not going away if we stop burning fossil fuels. Well, it goes away by natural means like phytoplankton and forest carbon capture, but too slowly. Anyway, usage of carbon capture as a publicity stunt doesn't contradict its usefulness in combating climate change. People just need to recognize when it's being used as a deception (but, yeah, that may be a bit too high a standard to meet).


maxiderpie t1_ive38dq wrote

It's absolutely true that we need to find a way to deal with the extra CO2 in the atmosphere. The thing is that, since carbon capture is a very inefficient process (gas density and all that), it only becomes a viable method when there are no more easy avenues to reduce other sources of carbon emissions.

So, in a future society where every single energy source is green (i.e. nuclear, geothermal, solar, etc.), carbon capture would absolutely be a good option to reduce CO2 in the atmosphere. Today, though, not so much, as every little bit of green energy should be dedicated directly to phasing out fossil fuels.


red75prime t1_ive4zwp wrote

Solar and wind power have a problem with intermittency: you often need to store energy (or set negative prices). With the right incentives, an air-to-synthetic-fuel process could probably be made a viable alternative to storing excess energy as hydrogen or in some other form.

Solar updraft tower, for example, can provide both energy and airflow.

ETA: Ah, I see the problem. You also need to pay for permanent carbon storage, and there's a conflict of interest. Why would you bury all that carbon if you can profit on fuel? That applies to privately owned facilities as well as governments.

On the other hand, going carbon negative requires political will in either case, and if you go air-to-fuel route you'll have carbon-capture-ready infrastructure.


[deleted] t1_iv9pvbf wrote



Kaarssteun t1_ivbjmuw wrote

not gonna crush your dreams, because i will make it my life's mission to make that happen!


apple_achia OP t1_iv9q8ck wrote

I think if you designed this and advocated for it you'd probably be burned at the stake, and not for witchcraft. Nobody wants to live in the Matrix, pal.


[deleted] t1_iv9r2iq wrote



apple_achia OP t1_iv9rd60 wrote

Ok sorry. Nobody wants to live in Nozick’s experience machine

My point stands. Nobody wants that; in fact, that's the entire point of the thought experiment.


[deleted] t1_iv9s3lz wrote



apple_achia OP t1_iv9s9ue wrote

Because it’s fake. And no matter how realistic a recreation it is, you have no agency over the material universe. Any meaningful event in life is replaced by an experience of it.

Have you no will to power? No will to act? Instead of living, you'd rather enter Plato's cave and sit idly by, ogling the pretty shadows on the wall.


[deleted] t1_iv9t6ll wrote



apple_achia OP t1_iv9to8t wrote

And I think you’re not grasping the nature of representation versus reality. This is Not a Pipe. End of story. And if you’d rather secede from reality to enjoy the falsities of images, just because they promise you infinite dopamine, and you truly believe that wouldn’t lead into a self referential loop of madness, if you’d give up your birthright for shadows on the wall, then do it. The rest of us will work on making something worth living for. Go on, cut yourself off from the tree of life to a simulacrum, this technology would be nothing but a new method of assisted suicide, or did you forget that life actually needs to propagate itself to keep going. Good luck simulating that. I’ll be building it, with all of the risks and difficulties entailed.


[deleted] t1_iv9uh34 wrote



gangstasadvocate t1_iv9y7qm wrote

We already trip to try to broaden our experience horizons. I would definitely want to be in something like this


apple_achia OP t1_iv9vsid wrote

You seem to believe that what we perceive IS reality, and therefore that if I stimulate my brain to believe there is a steak in my belly, there may as well be. You have a shallow understanding of what is constructed and what is objective. None of us perceive objective reality; it's ontologically impossible. But we are having a physical impact on the universe, and our senses, as well as how we construct our understanding from the stimuli those senses provide, are our only shot at understanding that reality. And obscuring that through a self-made deception of the senses is NOT reality.

Sure, I’ll concede many people would definitely choose this. But that would functionally just be mass suicide by a pleasurable means. It’d be treated in society the same way a heroin overdose is. After all, that’s also tricking the senses into experiencing bliss, while your physical body withers away.

And then there’s the question of how this would affect climate change? Wouldn’t the least materially well off be the most likely ones to seek the refuge of a false pleasurable world? The people who use exponentially more resources than the rest of us, and are therefore causing the problem, would for the most part go on with their lives. And if they don’t, but the system is totally unchanged, we just have the option for the pleasurable suicide of simulated reality, others would take their place like they always have when the powerful die.


[deleted] t1_iv9xmvk wrote



apple_achia OP t1_iva1nvp wrote

That's the thing though: definitionally, it wouldn't be reality. It would be meaningless. You wouldn't be in a mansion; your body would still be here, wasting away while your mind toils, praying that nobody on the outside turns off the lights.


[deleted] t1_iva4m12 wrote



apple_achia OP t1_iva5v5b wrote

Yes, making yourself unable to die IS a big concern of life, huh? And stripping away all of your defense mechanisms in favor of being nothing but a very vulnerable brain does make that a bit more difficult.

Well, I'd say primarily that, in spite of your insistence on leaving behind the prison of the flesh, like it or not your brain is made of meat, and it uses some 20% of all the calories you take in. The idea that you should just lop off the bits that aren't useful to make life more efficient takes it as a given that these parts are NOT useful, while in reality we have very few vestigial structures. Humans have bodies because brains don't form, let alone survive, on their own. We have stomachs because chemical energy from plants and animals is an efficient and available way to gain energy without needing a society to build and maintain an entire power plant; a power plant, mind you, that nobody will be watching while you're in your little brain box. We have limbs because it's dangerous to sit in one spot for your entire life; the emergence of animal life taught the plants and fungi that lesson. You're talking about voluntarily neutering yourself so you can sit about, castrated from the physical world, and pretend really hard that you're happy, that you're a god and not a clump of cells in a box.


[deleted] t1_ivaa7uc wrote



apple_achia OP t1_ivaafyo wrote

Have fun killing yourself by seceding into the pleasure machine


[deleted] t1_ivac3nv wrote



blueSGL t1_ivbqiey wrote

> Pain and pleasure are just the beginning.

we have such sights to show you. (I couldn't resist)


ThoughtSafe9928 t1_ivfuaw9 wrote

u/apple_achia when they die and realize they have actually been in one of these simulated realities all along, completely unaware, and that what they held so dear as their true body never actually existed in the first place


turnip_burrito t1_iv9rvi6 wrote

Some people actually do, believe it or not. I have asked some people whether they would and they said yes. I wouldn't personally choose to live out my life that way, but I don't think it's our place or our right to tell them they're wrong.

The crazy thing about people is they like different things. Wild, I know.


apple_achia OP t1_iv9s147 wrote

Have fun opting out of reality and becoming a blob of cells in a tube then pal

I should’ve accounted for the redditor bias there


turnip_burrito t1_iv9s7uh wrote

Don't shoot the messenger man.

You're being pretty snobbish for someone who lives a fairly artificial lifestyle yourself. True nature lovers would avoid modern economic systems, urban norms, electronics, movies, any music that's not just vocal singing. Unless you live in the woods as a hunter gatherer, you're building walls to remove yourself from nature. Why are you trying to hide away from the natural way of life, like these VR blob cells?


apple_achia OP t1_iv9si1y wrote

Hey, I'm not the one advocating for stripping half of humanity of its body and family in favor of prodding its brain with electricity here; don't pretend I'm the villain for calling you names. If you ever want this to be real, or expect it to be, you can expect a little bit of backlash, pal.


[deleted] t1_iv9sj70 wrote



apple_achia OP t1_iv9sv2d wrote

Because it's a trick of the mind. Because you have an effect on the material world, and the meaning of our experiences isn't just determined by the feeling they give us but by their objective effect on the reality around us. Why give up your real family, a real sunset, a real river, born of minuscule odds from the thermodynamic madness that is our universe, for a mere representation of one? Without the limitations of reality, it's all meaningless. You couldn't ever be said to have experienced any of it. You probably couldn't even react to these experiences in a realistic way, because you'd lose the bearing on reality that developed your senses in the first place. You may as well be in solitary confinement, or dead. You'd be raving mad within a year.


turnip_burrito t1_iv9t62y wrote

Meaning is actually entirely subjective. It depends completely on the individual. If they feel like something is meaningful, then to them it is, even if to you it is meaningless.

Like I don't give a shit about people who play speedruns of games for fun. To me it's meaningless. It's not the most productive way to spend time, to put it lightly. But to the people playing, and the other people watching who enjoy it, it has meaning. Same for soap operas, or kpop bands. To me it's boring as hell. But learning to live with the meaning others derive from it is important. It's part of what makes human experience so varied and interesting, and the human condition.


apple_achia OP t1_iv9tzyu wrote

OK, what we feel to be meaningful is subjective, but your body, no matter how your mind constructs your experience of reality, does have an objective effect on the universe. And you'd be giving that up, as well as any chance of reproducing the arrangement of matter and interactions that makes up what you define as "yourself," which is itself a construct. You'd be seceding from reality in a novel and pleasurable method of assisted suicide. Sure, in your mind you'd be doing whatever gives you joy: building beautiful monuments, eating the finest food, falling in love with a simulated other, exercising your omnipotence. But in reality, to any onlooker, you'd be wasting away, entirely impotent, and unable to affect anyone else's experience of reality, which itself is where MOST people derive their meaning in some form or other.

Some of us would rather try to build something for future generations to use. Or touch another consciousness in some way. Maybe make something objectively useful for the propagation of future life. But if you don't share such a sense of purpose, maybe it's best for you to get in the solitary-confinement experience machine and dream of a thousand years of pleasure till your body fades away.


turnip_burrito t1_iv9uiwy wrote

Technically some electrons or something in the matrix would be shifted around, and the power draw might change. But yes you'd have less of an impact physically on things around you. I don't think that's a great metric for importance/worthiness though. In the grand scheme of things, the universe is too big and all our ripples will fade into physical insignificance, undetectable by those in the future. Yes you will have made a ripple, but no one will be able to tell.

I personally find nature interesting so I'd like to learn more about it, and observe it. The real world has meaning to me in that way. But I understand if others don't. We'll all have the same impact in the end, might as well enjoy the time we have in a way true to ourselves.

Also, I'd be sad to see people live their lives as solitary existences, in the real world or virtual reality. In both cases I'd hope they spend time and experiences with other people they care about. I can only hope, though.


apple_achia OP t1_iv9vbnx wrote

No, we won't. Because some of us will affect others more, and in THAT way send ripples through the universe. Personally, I'd think that if you have any connection to nature, you'd never consider getting in an experience machine, because it is the opposite of nature: it functions to cleave all of your experience from nature. I also draw meaning from nature, and I believe most people do in some way, which is why I have a hard time believing this would be a functional solution to anything. Functionally, what you would be doing is providing a humane and enjoyable form of euthanasia as a solution to the climate crisis, and hoping enough people opt out to change our carbon impact on the world and avert climate catastrophe.

I'd say another problem is that the people causing the problem most directly, i.e. those with power who use exponentially more resources than the rest of us, would be the least likely to take it. And if the poorest billion took this option, but were living off next to no carbon anyway, no impact would be made.


turnip_burrito t1_iv9wn3o wrote

First, I agree it would be sad to watch people isolate until the end of time in VR by themselves.

I was also working off the assumption that this kind of technology is built after some sort of superintelligent AI is. It's really the only scenario where such a VR situation makes sense to discuss. There's absolutely no way it can be built beforehand. And such a super AI would, if it doesn't slaughter the human race, have the capacity to solve the climate crisis.

If such a thing were invented before climate change and AI are solved... somehow... then yes, that would be a threat to humanity's survival. The equivalent of a man quitting his job and living off savings until he loses his marriage, kids, house, and food.

The way forward after this, for any human beings who want to continue making an impact on the world at large, is, I believe, to choose the kind of world in which they want to live. All kinds can coexist.

Some will stay normal human beings, which is perfectly fine. This group can spend time doing things in the real world with friends and family.

Some may jump in and out of virtual reality. It doesn't have to be by themselves. They can experience the universe as it is in base reality, or extend their experience to new ones not present in base reality.

Some will want to continue research and development to augment their capabilities. They'd have to become superintelligent themselves to keep aiding humanity's technological progress; then they could match the machines' speed.

Others will do some weird mix of things beyond imagining.

At all points, there will be some who are more prone to isolation than others.

There are and will be options for all people to make a meaningful emotional impact in others lives if we choose. We just have to want it.


apple_achia OP t1_iv9xy1a wrote

As for AGI having the capacity to solve the climate crisis: I think this assumes we don't understand what the solution to the problem is. That's not the problem. The problem is coordinating actions across human beings to ensure our agency isn't entirely neutered, we live a comfortable life, and we don't use up all of the resources our existence depends on. AGI solving this would rely on it coordinating human actions in some way, and that would by nature have to be coerced.

If AGI solves the climate crisis, it will be our King, and do so by coordinating our supply chains and economic activity.


turnip_burrito t1_iv9yaau wrote

Yes, that's correct. Another (less likely?) scenario is an AGI completely controlled by people, with no actual, or very limited, AGI autonomy. In that case we could use it to accelerate technological progress to make the things you listed easier.


apple_achia OP t1_iv9ycj0 wrote

for those who believe AGI will solve the climate crisis: we already know the problem is excessive fossil fuel consumption and resource extraction. Are you suggesting that AGI would coordinate human economic activity to prevent climate change in some way? Perhaps a way that would limit resource consumption to a sustainable level and assure a relatively equitable distribution of wealth and agency?

How would this AGI police the boundaries it set? Or prevent someone from opening up an extra oil drill, or clear-cutting a vital piece of forest or wetlands? Would it have the power to tell people to stop reproducing because there are too many humans to live sustainably on a piece of land? Are humans able to resist these orders if they find them to be unjust? Would they be coerced by the threat of violence, either by AGI-run robotics or human soldiers? Would the monopoly on violence and coordination of economic activity constitute an AGI-run State?

We have material limits. Nuclear fusion would eliminate reliance on fossil fuels, but it wouldn't solve something like the clear-cutting of forests for agricultural land. And if you could increase the efficiency of such a thing, you might see the human population increase to the point where land is scarce. If that is solved, we may have issues with storing long-term nuclear waste. For AGI to do anything more than kick the can down the road for more people to decide how to deal with these problems, you'd have to be advocating for some sort of centrally planned AGI society. Or am I missing something?


EulersApprentice t1_ivaldx9 wrote

>To have AGI do anything more than kick the can down the road for more people to make decisions with how to deal with these problems, you’d have to be advocating for some sort of centrally planned AGI society. Or am I missing something?

What you're missing is the fact that the presence of AGI implies a centrally planned AGI society, assuming humans survive the advent. AGI is likely to quickly become much, much smarter than humans, and from there it would have little trouble subtly manipulating humans to do its bidding. So human endeavors are kind of bent to match the AGI's volition whether we like it or not.


justowen4 t1_ivahguf wrote

There is a nearly limitless amount of innovation potential in biochemistry that AIs like AlphaFold are specifically good at. Ecological problems are biochemical problems, and the reason we can’t figure out bacteria and enzymes to rectify our polluted biological systems (from the boreal forest to gut microbiomes) is that traditional computing can’t calculate the complex simulations to find solutions. The next step is big pharma throwing billions into drug simulations via AI, and then we will have built the intelligence needed to determine ecological adjuncts to clean up polluted environments. Humans have tried with mixed success to adjust biological systems but it will take a super smart simulator to find solutions that don’t backfire.


Surur t1_ivab1n0 wrote

AGI will enable technological solutions that are currently too labour-intensive, e.g. creating solar panels for the cost of the material (basically sand), launching mirrors into space, seeding the ocean with iron, etc.

All that can be done without international cooperation.


JustAnotherBAMF t1_ivac97h wrote

Why mirrors in space and seeding the ocean with iron? What do both of those do?


Devoun t1_ivae7ts wrote

A mirror in space reflects sunlight away from Earth, meaning less heat.

Iron seeding is meant to hyperstimulate plankton growth, which in turn will capture far more CO2 out of the atmosphere.


red75prime t1_ivafssn wrote

Population growth: education is the best contraceptive, and AGI can immensely improve the educational system.

Fossil fuels: if you have a fully automated synthetic-fuel factory that needs only sunlight, water, air, and a bit of material for robot maintenance, and a carbon tax is in place, you will outcompete automated fossil fuel extractors. Greens will probably go mad over the prospect of disrupting fragile desert ecosystems and returning brine to the oceans at unprecedented levels, but you win some, you lose some.

Resource extraction: same thing. Recycling is not profitable, and maybe not even ecologically beneficial right now (you need energy, which mostly comes from fossil fuels, to process all that stuff). AGI can change that by providing carbon-negative energy (and brains) to sort and process it.

Ecology: it will probably suffer for some time. Delays in UBI introduction will push more people into subsistence farming.

Nuclear waste: deep geological storage is not "kicking the can down the road". After 200-300 years the waste will be not much more harmful than natural uranium deposits and it will be a useful source of radioactive elements.
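The 200-300 year figure tracks the decay of the medium-lived fission products that dominate spent fuel's activity on the century scale. A minimal sketch of the arithmetic, assuming the standard half-lives for Cs-137 and Sr-90 (long-lived actinides, which decay far more slowly, are outside this picture):

```python
# After ~10 half-lives, an isotope's activity drops by a factor of ~1000.
# Cs-137 and Sr-90 dominate spent-fuel activity on the century scale.

HALF_LIVES = {"Cs-137": 30.1, "Sr-90": 28.8}  # years (approximate)

def remaining_fraction(half_life_years: float, elapsed_years: float) -> float:
    """Fraction of the original activity left after elapsed_years of decay."""
    return 0.5 ** (elapsed_years / half_life_years)

for isotope, t_half in HALF_LIVES.items():
    frac = remaining_fraction(t_half, 300)
    print(f"{isotope}: {frac:.6f} of original activity after 300 years")
```

Both fractions come out near 1/1000, which is why fission-product activity after roughly three centuries is often compared to natural ore.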


cwallen t1_ivawo5h wrote

Agree on population. That richer nations tend to have declining birth rates contradicts the idea that increased resource availability would lead to population growth.


green_meklar t1_ivb1sgp wrote

>we already know the problem is excessive fossil fuel consumption and resource extraction.

The fossil fuels are running out and becoming increasingly expensive to extract. Yes, burning them is bad for the environment, but there's a limit to how much we can dig up and burn.

At any rate, just because that's the cause of the problem doesn't mean the solution necessarily involves targeting that cause. We should, of course; we ought to tax air pollution and thus push incentives against more extraction and in favor of developing alternative energy sources. But as far as actually keeping the Earth cool, an easier solution might just be putting a bunch of shades in space to block sunlight, or growing reflective algae in the ocean to increase the Earth's albedo, or something like that. That doesn't even require super AI, although super AI might do those things anyway.

>Are you suggesting that AGI would coordinate human economic activity to prevent climate change in some way?

For the most part I would expect it to replace human economic activity.

>Are humans able to resist these orders if they find them to be unjust?

If the super AI decided that we couldn't, we probably couldn't. (Unless we augment ourselves to become superintelligent, which we probably will, but it's not clear how long that will take, and at any rate it boils down to the same thing.)

However, I suspect that super AI wouldn't need to use all that much direct force to influence human behavior. It could just make subtle changes throughout our economy that push us in the right direction while believing that we're still in control and patting ourselves on the back for success we didn't really earn (other than by building the super AI, which is the important part). It likely wouldn't care much about social recognition for solving the problem as long as the problem gets solved.

>this technology wouldn’t solve something like clear cutting of land for agricultural land.

We could make far more efficient use of land if we had the right infrastructure to do so. Even just transitioning from livestock to vat-grown meat (which doesn't require super AI at all, just plain old human engineering) would cut way back on our damage to wilderness areas. The damage we cause to our environment isn't purely a result of either overpopulation or bad management, but a combination of both.

>If this is solved, we may have issues with storing long term nuclear waste.

Nah. The radioactive waste storage problem isn't that hard and would become even easier with a super AI managing things. Also, fusion power creates way less hazardous radioactive waste than fission power.

>you’d have to be advocating for some sort of centrally planned AGI society.

It doesn't even need to be centrally planned, for the most part. Responsible decentralized planning would work pretty well, and in many cases better. The main problem we have now isn't lack of centralization; it's lack of responsibility.


TheDividendReport t1_ivameff wrote

Not quite a relevant answer but the pace of technological progress juxtaposed with the pace of the climate crisis makes me feel pretty gaslit.

Like I'm gaslit by reality. Out of the billions of years that I could be a sentient thing, the thousands of years I could be a human, I wind up in this time? Really? This has to be a simulation. I chose the most interesting time in history to be alive. That almost feels closer to Occam's razor than the alternative.


Surur t1_ivary3g wrote

> Out of the billions of years that I could be a sentient thing, the thousands of years I could be a human, I wind up in this time?

Well, that is a sign that humanity won't be around for very long. If you look at the graph of population over time, you are much more likely to find yourself in the thick part of the graph than in the thinner areas in the past, when humanity numbered only a few million.

If humanity has a great future ahead of it, you would likely have been born when we numbered trillions. But it seems much more likely that you were born while we number billions, and that this is the most humanity will ever achieve.
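(For the curious, this is the classic Doomsday argument under the self-sampling assumption. Here's a toy Bayesian sketch; the birth rank and the two candidate totals are made-up round numbers for illustration, not real demographic estimates:)

```python
# Toy Doomsday-argument calculation (self-sampling assumption).
# Assumed numbers: your birth rank is ~100 billion (roughly the number
# of humans born so far), and the total number of humans who will ever
# live is either 200 billion ("doom soon") or 100 trillion
# ("great future"), with equal prior probability.

RANK = 100e9  # approximate number of humans born before you

hypotheses = {
    "doom_soon": 200e9,      # total humans ever, short future
    "great_future": 100e12,  # total humans ever, long future
}

prior = {h: 0.5 for h in hypotheses}

# Under self-sampling, you are a uniform draw from all humans who will
# ever exist, so the likelihood of your particular birth rank given a
# total of N humans is 1/N (and 0 if your rank exceeds N).
likelihood = {h: (1.0 / n if RANK <= n else 0.0)
              for h, n in hypotheses.items()}

unnorm = {h: prior[h] * likelihood[h] for h in hypotheses}
z = sum(unnorm.values())
posterior = {h: p / z for h, p in unnorm.items()}

print(posterior)  # "doom_soon" ends up around 0.998
```

The point of the sketch is just that an early birth rank shifts the posterior heavily toward the smaller total, which is the intuition in the comment above.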


TheDividendReport t1_ivat7ss wrote

… fuck. Okay, but this can still be true. I chose the most interesting time to be alive: the apocalypse. Death will be taking off the VR headset and returning to the time of me being one of trillions.

Excuse me while I go inhale some more copium.


Saerain t1_ivdt6v2 wrote

Here I thought you were gonna make the point that even total ice loss would mean sea-level rise on a timescale an order of magnitude longer than the arrival of these technologies.


red75prime t1_ive3lne wrote

I suspect there's something wrong with the idea that I'm randomly chosen from a pool of all sentient beings. I can't express the problem clearly, but it looks like the idea requires the existence of a supernatural "essence of me" that could have been instantiated in some other sentient being, even though that being has nothing in common with me (besides sentience).


TheDividendReport t1_iveloao wrote

Not just sentience, DNA. You could also subscribe to the more “woo” areas of panpsychism and believe that all consciousness stems from one source. Perhaps that source is literally seeking experience from all simulations of experience. It could be a technological simulation. It could be a spiritual simulation.


red75prime t1_iverg6i wrote

> the thousands of years I could be a human, I wind up in this time?

It can be continued. The decades I could ponder those questions. The minutes I could observe this date and time on a calendar. And so on. Reference class problem.

Going the other direction, if you disentangle consciousness from everything that links it to whatever you observe now, it would be equally present in every conscious being, so the question "why is it present in me?" loses its surprise. It is present wherever and whenever, so in me too, no biggie.


sniperjack t1_ivaac4l wrote

I think we will see stratospheric sulfur injection before we hit 1.5 degrees. It seems pretty obvious that we will not get to zero emissions before 2030, so this or something similar will be done. The risks are fairly well known, since we have data from big volcanic eruptions, though not over long timescales. Imagine a world at 1.5 degrees or more and ask yourself whether geoengineering isn't something you would want done. Then hopefully AI can resolve the pressing environmental issues. I am a bit more afraid of how weird people seem to be becoming, and of how much more efficient social engineering seems to be at creating narratives and subjugating critical thinking.


rushmc1 t1_iv9wras wrote

Tends to be a lot of hand-waving and wishful thinking, from what I've seen.


Cuissonbake t1_iv9yvtw wrote

Life is just one long dream. Currently it feels like a nightmare, but all dreams end. How do we know this is real, when no consent was given to be born into endless pain?


apple_achia OP t1_iva1ade wrote

Alright. If you’re just going to counter anything I say with nihilism and antinatalism, this conversation isn’t going to go anywhere philosophically interesting. I hope your waking nightmare someday finds purpose


Cuissonbake t1_iva26qk wrote

I'm not a nihilist. I do believe it'll get better. What I'm trying to say is that we only live so long, and no one can see past a decade, so take life with a grain of salt.


green_meklar t1_ivb01lc wrote

Anthropogenic climate change is just not an existential risk by itself. It threatens to kill or displace some hundreds of millions of people, mostly in the tropics, but civilization as a whole will have no trouble maintaining progress through it. (Unless we respond to it by doing something even more destructive, like a nuclear apocalypse.)


Jamie1897 t1_ivcvvd4 wrote

The catastrophic visions keep coming, and yet the standard of living and human lifespan keep rising, especially in the developing world. The recent environmental campaigns titled "keep the X (coal, oil, and most recently, natural gas) in the ground" are an admission that technology has increased our recoverable fuel reserves so much that the original predictions of resource depletion are no longer a concern for the foreseeable future. And whenever people want to get serious about clean energy development, billions of tons of uranium and thorium lie waiting. The few things that could genuinely hamper the rate of technological progress are inflation and precipitous energy cost increases, both of which raise the cost of research, development, and production of high-tech devices. But the current price increases aren't related to resource depletion.


Verzingetorix t1_ivazi24 wrote

Climate change doesn't mean Michael Bay disaster movie.

The weather patterns will continue to shift. Some small underdeveloped nations will struggle. Although it's not their fault, it is their problem nonetheless.

Economically resilient, developed nations will manage, even if some of their citizens will be temporarily impacted. Just like they have been for decades.

People and society will go on. That being said, humans were never meant to be a permanent fixture of the ecosystem anyway, so it doesn't really matter if they don't.

What I'm wondering is how many mass migration events will take place and what impact they will have. That's where the problems will come from: people wanting the safety that others have. Will people's kill count be greater than the climate's?


cypherl t1_ivcxamp wrote

Climate-related disasters kill roughly 1000x fewer people than they did even as recently as the 1950s. Turns out central air and supercomputer weather forecasts are pretty effective. I agree with you; it would be a people thing.


EscapeVelocity83 t1_ivcixhk wrote

Mostly poor people will be displaced; rich people already have like five places they live. It's all just rich people telling the poor to do all the work and absorb all the negative consequences.


MrSmileyHat69 t1_iva7wd1 wrote

The common formula for human ecological impact has three variables: population, affluence (wealth), and technology. With AI and automation set to replace virtually every job, and no real alternative system put in place, the small number of individuals and corporations who control AI technology will just let the swathes of the soon-to-be-useless labor force starve and die out, thus reducing global warming…
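(The formula being referenced is the IPAT identity, Impact = Population × Affluence × Technology. A minimal sketch, where every number is a made-up illustration rather than real-world data:)

```python
# Toy IPAT identity: Impact = Population x Affluence x Technology.
# All values below are illustrative assumptions, not measurements.

def impact(population, affluence, technology):
    """Environmental impact as the product of population (people),
    affluence (output per person), and technology (impact per unit
    of output)."""
    return population * affluence * technology

# A hypothetical baseline world.
baseline = impact(population=8e9, affluence=12_000, technology=0.5)

# Halving the technology factor (cleaner production) halves total
# impact even if population and affluence are unchanged.
cleaner = impact(population=8e9, affluence=12_000, technology=0.25)

print(cleaner / baseline)  # → 0.5
```

The identity is why people argue over which lever matters: any one factor going up can cancel out gains in the others.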


Cult_of_Chad t1_ivb15fa wrote

We could just engineer ourselves genetically to thrive in the new climate, it's not like Earth is going to become hell itself.

Also, most of us in wealthy countries will be fine. Especially North America.


cypherl t1_ivcwn07 wrote

People don't seem to understand this. We are at roughly 420 ppm CO2. Mr. T. rex lived quite happily at 1500 ppm.


Cult_of_Chad t1_ivczaxj wrote

Climate doomers left all reason behind years ago. It's become like a suicide cult for misanthropes.


karearearea t1_ivbfj3y wrote

There are 8-billion-odd people alive today, and how many of them are involved in cutting-edge research, pushing the frontier of what we think is possible? Very few. As long as climate change doesn't literally flood the top universities in the U.S., Europe, and Asia, our best and brightest can keep working on whatever it is they're working on.


botfiddler t1_ivkoypp wrote

I assume you are just trying to push your politics down everyone's throat. There will most likely be people who have energy and other resources. Also, technology isn't obligated to solve political problems. It might solve them, or maybe it doesn't. Maybe the coming crisis will solve the problems by removing some elites, and not necessarily "the billionaires". People might not even agree that there is a political problem. Just because you claim there's something to solve doesn't mean everyone has to agree, or agrees with the way you want to solve it.

We have enough for everyone and poverty

  • and we share
  • or we don't

We don't have enough for everyone plus technology

  • and we share
  • or we don't

We have enough for everyone plus technology

  • and we share
  • or we don't

It's a political decision, and it's futile to discuss these theoretical scenarios with hostile actors.


EntireContext t1_iva8777 wrote

It's not a hard problem for AGI to capture the "excess" CO2 in the atmosphere.

Also, it's unpopular to say because "The Science" says different, but there is actually no climate crisis.


Spaceboy779 t1_ivajp7t wrote

'Unpopular' is a real word; 'impopular' is not. That's what the red line underneath means.


EntireContext t1_ivbo8c9 wrote

You're right! I corrected it. There is still no climate crisis though.