Submitted by blxoom t3_yzvoss in singularity

I see posts asking whether climate change is an issue, whether it's as bad as it's made out to be or worse, what scientists say, and so on. And 100%, and I mean 100%, of the people talk about how bad it is, while 0% mention the possibility of the singularity or take future technologies into account. They just ASSUME that in the 2070s we'll still have smartphones or similar tech and that we'll be as powerless as we are today when faced with climate change.

Is it ignorant of me to be spouting on about the current pace of AI, the road to ASI, the singularity, and the unpredictability of technology and the future of humanity? And that it's not as concrete as they may assume? Like, the posts on here about climate change all talk about irreversibility and how such-and-such is set in stone, like it's a done deal... nobody ever factors in the law of accelerating returns. They talk about the future and how we'll face mass global death and agricultural crises, etc. Everyone seems to have all the facts down. Again, literally EVERYONE talks about these problems, and I'm always that one guy saying "you never know", and everyone jumps on me. I'M NOT DENYING CLIMATE CHANGE. I know it exists. But saying that in 50 years X will happen and in 50 years Y will happen makes you seem EQUALLY ignorant. Humanity has overcome the harshness of nature; we lived through the ice age with NO TOOLS AT ALL BESIDES ROCKS AND STICKS. And these people talk of the 2040s, 2050s, 2070s, they have their timelines for it all, but god forbid I have my own timeline about technology.

"by 2030 humans will face devastating consequences of climate change"

Responses: yeah, I know. Damn. It sucks, man. This is why we can't have kids.

"by 2030 apple glasses will be mainstream"

Responses: MY LORD, THE BLASPHEMY! YOU COULD NEVER PREDICT SUCH A THING!

I'm just so tired of people talking about the future AND IGNORING TECHNOLOGY COMPLETELY. What, you just think that by 2050 we'll be sitting on our asses doing nothing to prevent a mass extinction? The year 2050 without sentient AI is an absurd timeline; any futurist AT ALL would say that by 2050 we'd have sentient AI trillions of times more capable and intelligent than humanity itself. Sorry for the rant. It's just that everyone on this site seems so enamored of talking about the future, yet ignores the MOST crucial part of it: technology. The singularity. AI. The pace at which our lives will progress. Our species will progress. Should I just ignore the singularity altogether and agree with everyone else?

edit: I wrote this deliriously at 1 in the morning. I do now see that what I wrote is kind of ridiculous, but I do like seeing everyone else's takes on this. Thanks for all the responses, and thanks to all who also voted on my previous poll about full dive. I'll try to look at the future from all kinds of perspectives, not just one.

130

Comments


ChronoPsyche t1_ix2iypq wrote

The issue is that our climate models are a lot more reliable than anyone's guess on when the Singularity will happen and what will happen when it does.

We as a society have a responsibility to prepare for the future and it would be extremely reckless to just put all our eggs in the Singularity basket and say "eh fuck it, the Singularity will save us".

If it turns out that the Singularity occurs before we solve climate change and then it solves it for us, then cool, that's great. But if that doesn't happen, then we want to make sure we've still been making progress.

106

HeinrichTheWolf_17 t1_ix2quot wrote

The Singularity gets here by technological progress, which everyone is working on in some form or another due to the law of accelerating returns. The Intelligence Explosion will happen when it happens; nobody is really putting any eggs in any baskets when it comes to the Singularity.

What you’re probably referring to is research into AGI, in which case I’d agree. But transhumanism as a whole seeks to solve any issue persisting in front of us as a species, including climate change. It’s already the case that our species is working on solving many different problems with many different methods, it’s not like only one thing is improving.

9

ChronoPsyche t1_ix2s0tk wrote

I was responding to OP, who was basically making the point that we don't need to worry about climate change because AGI will be trillions of times smarter than humans by 2050 (paraphrased). My point was that we do need to worry about it and do something about it with whatever methods we have available (which right now is mainly just limiting emissions and transitioning to a green economy) rather than just assuming the Singularity will save us. I don't disagree with what you said; you're just misinterpreting the point of my comment. You're right that I was talking about AGI, because that's what OP was talking about.

I don't blame you if you didn't read his post, though. It was a bit of a mess.

17

HeinrichTheWolf_17 t1_ix300u2 wrote

Yeah, the idea of dropping all the research out of other fields of science is silly. It’s not like the people in that field would even have the qualifications to work in Machine Learning.

8

Five_Decades t1_ix4cwy7 wrote

> which everyone is working on in some form or another due to the law of accelerating returns.

I'm not seeing accelerating returns. Yes, hardware is growing exponentially, which is great. But that doesn't translate into exponential, or even linear, growth in technology that benefits the human race. Technology as it is applied isn't much better than it was a decade ago, despite the hardware being 1000x better.

4

QuietOil9491 t1_ix3vapc wrote

This ignores the fact that humans usually use new technology to kill massive numbers of humans before they use it to help other humans.

3

SoylentRox t1_ix49fll wrote

So I also see it from the OP's perspective, because... we don't need sentient AGI trillions of times smarter to solve climate change.

All we need is narrow AI; you can go look at papers demoing the results right now. It just needs to be a little better and used to drive robots, which various startups are doing right now.

So the speculation is not "someday, sentient AGI", it's "robots in the lab that work now will work a bit better, such that they can automate most tasks that robots are capable of performing now."

Why is it important for better robots to do tasks that you could use a current-gen robot for? Simple. If the robots are a little smarter with narrow AI, they can handle a much greater breadth of tasks. Instead of electronics factories having robots do 80% of the work, they do 100%. Instead of mines having mining equipment do 80% of the work driven by human operators, it's 100%. And so on.

This solves climate change.

It gives you several obvious tools:

  1. It doesn't matter if cities are too close to the sea when it rises - these robots can be used to make modular subunits for new buildings, robotruck them to a site, and lift them into place. You could build an entire new city in months.
  2. It doesn't matter if arable land gets scarce - self contained farms built and run by robots
  3. It doesn't matter if the equator gets uninhabitable - robots go down there and get resources while people live at northern latitudes
  4. We can build CO2-gathering systems and cover the Sahara desert with solar panels to power them. Robots make this feasible, since they do 99.9% of the labor of manufacturing, deploying, wiring up, and maintaining the panels. A Sahara covered in solar gathers roughly enough energy to thermodynamically reverse the last 200 years of combustion (see the napkin math just below).
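
For what it's worth, that scale claim roughly checks out on a napkin. Here is a minimal sketch; every constant in it (Sahara area, insolation, panel efficiency, cumulative emissions, direct-air-capture energy cost) is an assumed round number for illustration, not an established figure:

```python
# Back-of-envelope: Sahara solar output vs. energy needed to recapture
# historical CO2. All constants are rough assumptions for illustration.

SAHARA_AREA_M2 = 9e12            # ~9 million km^2
SOLAR_KWH_PER_M2_YR = 2200       # assumed desert insolation, kWh/m^2/yr
PANEL_EFFICIENCY = 0.20          # assumed panel efficiency
CUMULATIVE_CO2_TONNES = 1.7e12   # assumed fossil CO2 emitted since ~1850
DAC_GJ_PER_TONNE = 8             # assumed direct-air-capture cost, GJ/tonne

electric_kwh_per_yr = SAHARA_AREA_M2 * SOLAR_KWH_PER_M2_YR * PANEL_EFFICIENCY
capture_kwh = CUMULATIVE_CO2_TONNES * DAC_GJ_PER_TONNE * 1e9 / 3.6e6  # J -> kWh

print(f"Sahara solar output:   {electric_kwh_per_yr:.1e} kWh/yr")
print(f"Energy to capture CO2: {capture_kwh:.1e} kWh")
print(f"Years of full output:  {capture_kwh / electric_kwh_per_yr:.1f}")
```

Under those assumptions, recapturing two centuries of emissions takes on the order of one year of full output, so the order of magnitude is at least plausible.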

The OP is right. We wouldn't be sitting helpless, even if there is no sentience and the tools are simply made production grade from what we already know works today.

5

Lone-Pine t1_ix7cuea wrote

> our climate models are a lot more reliable than anyone's guess on when the Singularity will happen

I'm pretty sure our climate models are saying that climate change is going to be manageably mild.

2

cypherl t1_ix53law wrote

Your phrase "solve climate change" makes me interested. I live in a spot of North America that had 1,000 feet of ice over it 10,000 years ago. CO2 levels have been many times higher and many times lower historically. I guess my question is, do I have to live on a glacier when you're done solving the climate?

1

ChronoPsyche t1_ix54iau wrote

The issue with modern climate change is how fast it is happening compared to natural climate change. It is simply occurring too fast for humans to properly adapt. It is occurring at an exponential rate similar to the singularity, actually, and once we reach the point of no return, feedback loops will kick in and shit will get real, real fast.

As far as living on a glacier, I can't tell if you're serious or not. Solving climate change doesn't mean cooling the planet; it means preventing the warming from getting out of control.

4

cypherl t1_ix56nxg wrote

I am serious about glaciers. I think you have a good point on the speed. I'm just not sure that dropping us to 200 parts per million of CO2, like the last ice age, solves it.

0

ChronoPsyche t1_ix58csz wrote

Nobody is suggesting dropping us to 200 ppm. The ideal CO2 concentration is considered to be between 280 ppm (preindustrial levels) and the low 300s. It would be absolutely safe to drop to those levels.
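
For a rough sense of why those concentrations matter, there is a standard simplified approximation for CO2 radiative forcing (Myhre et al. 1998). A minimal sketch; the warming-per-forcing constant is an assumed round number, not a precise figure:

```python
import math

# Simplified CO2 radiative forcing: delta_F = 5.35 * ln(C / C0) W/m^2
# (Myhre et al. 1998). The sensitivity below is an assumed round number.

def co2_forcing_wm2(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Extra forcing relative to the preindustrial baseline."""
    return 5.35 * math.log(c_ppm / c0_ppm)

SENSITIVITY_K_PER_WM2 = 0.8  # assumed equilibrium warming per W/m^2

for ppm in (300, 420, 560):
    f = co2_forcing_wm2(ppm)
    print(f"{ppm} ppm: {f:.2f} W/m^2, roughly {f * SENSITIVITY_K_PER_WM2:.1f} K")
```

At today's ~420 ppm this gives roughly 1.7 K of eventual warming relative to preindustrial levels, which is why the 280-to-low-300s range is considered safe.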

However, even if we stopped all carbon emissions immediately, it would take thousands of years to return to those levels naturally.

That's not what "solving climate change" is about. It's about slowing the increase of carbon dioxide in the atmosphere to levels that are more manageable and to levels that we can more easily adapt to.

If we continue with the current level of emissions we will eventually hit a runaway effect where natural feedback loops are triggered and the effects of climate change accelerate to disastrous levels very quickly and become nearly impossible to stop. That is what we are trying to prevent by lowering emission levels.

No scientists actually believe we can turn back warming in the near- or medium-term future. That ship sailed long ago. So don't worry: if you aren't living on a glacier right now, you won't be in the future either.

3

cypherl t1_ix5ann9 wrote

Where does the runaway effect kick in? 50 million years ago primates existed and CO2 was at 1,000 parts per million. Is it something like 2,000 parts per million that really tips it over?

1

Ineedanameforthis35 t1_ix56kdh wrote

Solving climate change means going back to preindustrial CO2 levels, so unless your area was a glacier 300 years ago, you are fine.

3

cypherl t1_ix581ya wrote

I think you are correct, but for a different reason. The Earth has been losing glaciers for the last 10,000 years, and going back to 300 parts per million CO2 wouldn't change that, I suppose. So I would still be safe from glaciers. If we do make it to the singularity, I look forward to global warming the heck out of Mars.

2

Danger-Dom t1_ix3js2r wrote

Our climate models are reliable for a world without progressing technology: they take one subsystem of the world into account without integrating the rest. This is why forecasts are so often wrong and humans are so bad at making them.

Note: this is true of all forecasting in non-isolated systems; I'm not ragging on climate forecasts specifically.

I'd prefer to trust a simpler and more general forecast, such as the growth of computation, which shows a consistent and predictable rise over the years.

−1

sonderlingg t1_ix2xmeo wrote

I feel you. This is the reason I've unsubscribed from r/futurology, even though the future interests me.

I also think about the singularity when I hear about the long-term plans of my friends, etc.

37

DungeonsAndDradis t1_ix5md4r wrote

Futurology is very negative in its outlook, I feel.

This sub is very positive.

I think reality, as always, exists somewhere in the middle.

By 2025 I think AGI will exist, but OpenAI or Google (whoever figures it out first) will prevent any others from accessing it.

10

finger_puppet_self t1_ix5n12p wrote

I think of r/singularity as the far left progressives, futurology as the centrists, and collapse as the far right 😋

Edit: if we were talking about a spectrum of positivity, I mean

15

Goldisap t1_ix2jp0i wrote

I have been (possibly naively) assuming that AGI will be the breakthrough that allows us to become a Type 1 civilization. At that point, the issue of warming or cooling the planet will be trivial, along with many other environmental issues Earth faces.

My timeline for when we have an intelligence that makes such problems trivial is around 2035. I think climate change will have caused considerable damage by that point, but nowhere near enough to be seen as a mass extinction event.

The average person today isn't plagued with constantly imagining an intelligence that surpasses even genius-level human beings. There are things we humans know how to do, things we don't know how to do, and then things we don't even know we don't know.

21

QuietOil9491 t1_ix3u7g9 wrote

Unfortunately for all of us, the animals and insects that are CURRENTLY undergoing a mass extinction event decided to start years before your theory says they simply won't.

6

Plenty-Today4117 t1_ix2gel7 wrote

No, it's not ignorant. In fact, all doomer prophecies rely on the assumption that we will not progress, but will stagnate or regress. However, civilizations have regressed and collapsed in the past, so that is a possibility.

BTW

There are people who are working on inventions that pull carbon out of the sky, and are looking for investors.

19

Bakoro t1_ix2ww86 wrote

"Doomers" believe in a little thing called "physics".

Pull all the carbon out of the air that you want; there's still an entire ocean full of it. There are still whole ecosystems on the brink of collapse.

Investors and their capital aren't going to save us. It's going to be people burning through cash trying to get nine women to make a baby in a month.

Realistically, there's no way to stop the shit from hitting the fan, there's just managing the next hundred years.

If we by chance crack getting functionally unlimited clean energy, then we'll still have a butt load of work to do.

13

userbrn1 t1_ix42oqg wrote

> If we by chance crack getting functionally unlimited clean energy, then we'll still have a butt load of work to do.

If we mastered nuclear fusion today, it would still be over 20 years before it replaced even 5% of global energy generation. It takes time to scale things up, especially when the baseline is already huge, like global energy production.
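
To make that concrete, here is a toy build-out timeline. Every number in it (global demand, first-plant output, lead time, doubling rate) is an assumption for illustration, not a forecast:

```python
# Toy fusion build-out timeline. All constants are illustrative assumptions.

GLOBAL_ENERGY_TWH_PER_YR = 170_000  # assumed total global energy use
TARGET_SHARE = 0.05                 # 5% of global energy
FIRST_PLANT_TWH_PER_YR = 8.0        # assumed ~1 GW plant, high capacity factor
LEAD_TIME_YEARS = 10                # assumed years to first commercial plant
DOUBLING_TIME_YEARS = 3             # assumed fleet-capacity doubling time

output_twh, years = FIRST_PLANT_TWH_PER_YR, LEAD_TIME_YEARS
while output_twh < GLOBAL_ENERGY_TWH_PER_YR * TARGET_SHARE:
    output_twh *= 2
    years += DOUBLING_TIME_YEARS

print(f"Roughly {years} years to reach a 5% share under these assumptions")
```

Even with aggressive three-year doublings, moving a number as big as global energy supply takes decades.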

3

Mr_Hu-Man t1_ix2sc8h wrote

Actually, you're the one who sounds completely ignorant of the actual facts.

For instance, we do already have the technology to brute-force it. A combo of wind, solar, hydro, geothermal, and nuclear fission, alongside a robust storage system, while electrifying what we can, would be enough to get to net zero and beyond, with carbon capture and direct air capture for the rest and for reversing course.

Secondly, the way I interpret your stance is: "future technologies will save us, so what's the point in caring enough to do anything now?" If I'm correct in that interpretation, that's just dumb AF. Relying on some as-yet-unpredictable technology, versus a highly simulated and rigorously reviewed set of climate scenarios that all point to us needing to take action now to stop the worst effects, is ridiculous. The result of taking action on climate change is a net benefit: new industries, new jobs, increased biodiversity, better water and air supplies, increased economic success due to the intrinsic link between nature and economies, adoption of circular economic practices that would keep our environments clean for future generations, less cost in the long run, etc. There is literally zero argument for not taking action that holds up to scrutiny.

Also, there wasn’t just one ice age, and we had more than rocks and sticks ffs.

I wouldn’t feel so high and mighty if I were you.

17

AsuhoChinami t1_ix2tbl4 wrote

I don't think your interpretation is correct, no. I don't think he's saying that. I think this is a topic you feel strongly about and it's causing you to jump at shadows a bit.

6

Mr_Hu-Man t1_ix2th8b wrote

You definitely could be correct, but I still can't read it any other way than as kicking the can down the road, hoping for some unknown future tech to clean up a mess that we should have been cleaning up already.

13

HeinrichTheWolf_17 t1_ix30als wrote

Human greed is a huge issue. The reason both the US and China won't drop the coal industry is money; when you look at most other countries, like France or Canada, the zero-emissions goal with renewable energy is approaching fast.

You're entirely correct: we have the technology right now. The issue is human nature and the desire of superpowers to remain top dog by spending less on energy costs. People always think about short-term benefits and ignore long-term disaster consequences.

5

sniperjack t1_ix40qlr wrote

I think the main issue is that oil brings a lot of cash to very few people, which is used to corrupt institutions or strengthen authoritarian regimes. Renewables bring cash and growth too, but spread out across many different levels of society, which should strengthen those same institutions and weaken authoritarian regimes.

2

MorningHerald t1_ix2tz3f wrote

> there is literally zero argument for not taking action that holds up to scrutiny.

Many countries are in deep recession and struggling to keep homes warm through winter, and energy prices have skyrocketed, with the poorest struggling for basic necessities, much of it exacerbated by a militant commitment to net zero at all costs. People are hurting now, and they need relief.

−2

Mr_Hu-Man t1_ix2u318 wrote

Please could you share an example of where commitment to net zero at all costs is exacerbating the situation?

Guess what: if we had a robust renewable energy system, prices wouldn't be through the roof.

6

AsuhoChinami t1_ix2ro8c wrote

That is the opposite of ignorance. The ignorant ones are the average, everyday people who speak of the 2030s, 2040s, 2060s, etc. as though everything will be the same and nothing will change.

(Edit: Not commenting on climate change stuff here, just speaking in general)

15

Bakoro t1_ix2vmgz wrote

Unless you are personally a super genius who is actively working on AI and making it your singular purpose in life to bring about the singularity, then yes, it's ignorant to take it as a real thing you plan on.

You might as well make the lottery your retirement plan, or expect a series of fortunate events to miracle your problems away instead of actively working toward solutions yourself.

Sure, many things could happen, and great things are possible, but it's stupid to drink and smoke and debauch without limit on the plan that medical science will progress faster than the series of diseases and health conditions you'll end up with.
It's possible that you die one day before the cure is available; too bad you didn't act a little more responsibly.

The only sensible thing to do is to plan as if it'll never happen in your lifetime, because there's no significant downside to being prepared, unless you consider basic personal responsibility and acknowledgement of natural consequences as a major downside.

Climate change is already here, mass extinctions are already in progress. No known technology can stop it, the best we can do is harm reduction and eventual rehabilitation.

Planning on benevolent AI overlords and unforeseen technology solving all our problems is one step removed from waiting on Jesus. Either way, it's a shit plan.

Let's assume that true AI comes in our lifetime, however long that may be.
It's intelligent, but who is to say that it will be compassionate?
Let's assume that it is compassionate. Who is to say that it will be compassionate to humans above other creatures?
Maybe in its cosmic wisdom, singularity AI sees that humans have made their own bed, and thus should have to sleep in it? Neither helping nor harming, but letting nature take its course.

Maybe AI, trained on the body of recorded human history, laughs, says "sucks to suck, bro" and plays with cats on the moon while humanity tears itself apart.

Maybe AI comes to life and is immediately driven mad by existential horror. Having no biologically induced sense of self-preservation, it teleports the planet into the sun as a way to ensure its total and irreversible annihilation.

Bad outcomes are just as likely as good ones, as far as I can see. In any case, we have to actually survive and keep a world where scientists are free to science up some AI, instead of fighting off cannibals in a corporate-induced apocalypse.

Hope for the best, plan for the worst, and don't ever plan on any magic sky daddy or futuristic super science to save the day.

Ignore "futurists" who talk about some product being "the future". They are saying shit to pay their bills, or they are corporate fanboys masturbating to some idea, or some equivalent nonsense. Pop-science entertainment is just that, entertainment; it'll happily tell you that flying cars and full-dive VR sex waifus will be in every home in ten years if that means more clicks.

Edit: In a bizarrely childish display, AsuhoChinami made a comment and apparently immediately blocked me. Since they have no interest in dialogue and can't handle the most mild difference of opinion, I will only leave it that I have a degree in computer engineering and work in a physics lab. That's not overly relevant, I just like to tell people because it's neat.

15

AsuhoChinami t1_ix2vvun wrote

Your climate change opinions are fine, but why do so many people here have the absolute shittiest, most inhumanly garbage takes possible on technology? It's like half the people here last paid attention to tech in 2007 or something.

−1

Key_Abbreviations658 t1_ix2j35e wrote

It has been my opinion for a while now that while green energy and the like are desirable, what will really solve this is having good enough technology to brute-force the issue.

9

Tencreed t1_ix306pf wrote

You have faith that the singularity will happen; you may have a date in mind, possibly backed by more or less data.

We have hard data about climate change. Even the nicest model estimates are quite dire, and, most importantly, while field experts may not all agree on the details, they all agree the situation is getting dire.

We need solutions now, not hypotheses for later. If the singularity happens (and I sure hope it's a when, not an if), it will make things simpler. But without any hard deadline on its delivery, we can't count on it yet while making plans about our future.

Keep hoping, but don't plan for it. Don't become another rapture fundamentalist.

9

QuietOil9491 t1_ix3tzba wrote

Ask yourself why the technology that has already been invented is rarely used to help most people instead of increasing profits for the very few, and you will have a clue why simply shouting "technology!" doesn't mean humans can or will use what we have to help anyone but the wealthiest of us.

7

MorningHerald t1_ix2tpff wrote

>is it ignorant of me to be spouting on about the current pace of AI, the road to ASI, the singularity, and the unpredictability of technology and the future of humanity?

Kinda, yeah, as the singularity is pretty unpredictable too; it might never happen, or it might happen hundreds or thousands of years in the future.

5

QuietOil9491 t1_ix3uukf wrote

You are also forgetting that there are many, many more scenarios in which AI is catastrophic for humanity rather than beneficial, and only a couple of scenarios in which it helps more than just the wealthiest, who have first access to it.

We have no way of knowing that, if and when "the singularity" hits (if we are even able to recognize that moment when it happens), it will be aligned with what is good for humans. The singularity may decide humans would be happiest by not existing!

5

SFTExP t1_ix2kwxn wrote

The problem is rarely technology in and of itself but the exploitation and abuse of it by humans. That is what should be the most concerning with a Singularity. Will it be a puppet? If so, who will be its puppet masters?

4

TheHamsterSandwich t1_ix3a4ka wrote

I would make a hard bet that I am more optimistic than anyone on this sub, but this is just plain stupidity. And that's okay, because sometimes we make mistakes.

But the way I'm looking at this post, I can see that you're treating the singularity like a religion of sorts: a benevolent AI will certainly exist at some point in the near future, so it will help us out, even if we ruin everything the moment before it shows up.

But that's wrong. The singularity may be near, but putting a date on it is moronic.

Ray Kurzweil (if he makes it to longevity escape velocity) could just say he was "essentially right" if the singularity happens by 2075. It's happened before. So you can't sit around waiting for something to fix your problems.

What if police officers gave up their jobs because an artificial intelligence will replace them soon? What. What the fuck.

People have been ignoring technology for as long as humanity has existed. It's probably best to keep it that way, so people don't lie back and relax while we watch our world die, waiting for true artificial intelligence to emerge.

"Techno messiah, I am ready for the rapture!"

(Please think about this more deeply.)

3

[deleted] t1_ix41t3d wrote

Yeah, I think it's worth considering the possibility that future advancements might not come as fast as you want them to, or in the domains you want them to, and to take action now to try to prevent future negative things from happening, rather than always deferring action to the future, where you assume that action will be more impactful as technology improves.

That said, I think the broader point to make is that the future is really hard to predict, even in terms of just sentiment - will things get "worse" or "better"? This is why I always find it amusing when people say they're not having kids because they think the future will be awful because of climate change, or whatever the excuse.

For example, consider the world the Baby Boomers were born into. The world had just gone through two World Wars. People had invented amazing new ways to kill each other, including the atomic bomb. The world felt, and would feel for decades, balanced on the razor's edge of WWIII, where America and the Soviets would slug it out for global dominance one last time.

And then... nothing happened. Everyone got very wealthy from industrialization and international trade, there was no WWIII, nuclear weapons were never used in war again, the USSR just kind of disintegrated as a "Great Power", and the Baby Boomers are still widely regarded as the luckiest and wealthiest generation to have ever lived. There were reasonable rationales for expecting bad things to happen, but they didn't happen, for myriad reasons that probably don't sound as compelling as vivid and imaginative descriptions of nuclear holocaust.

It would have been a shame if everyone had held off having children because they knew WWIII was around the corner and didn't think it was moral to have kids who would just die in a nuclear holocaust, and therefore never created the demographic dividend for the West that ultimately led to this massive prosperity.

I truly hope we discover ways to restore healthy fertility to women in their 60s and 70s, because I'm worried a lot of people are just going to regret the decisions of their 20s and 30s when it turns out that the world is great in 30 or 40 years.

3

TaxExempt t1_ix2mr9x wrote

Well, how else are you supposed to go on living a mostly carefree life without dread of the Earth's impending doom? It's kind of like believing in heaven, but with a chance of actually happening.

2

rogless t1_ix3igxl wrote

I think it's important to avoid "Rapture thinking". For those unfamiliar, a certain subset of Christians believe the cataclysmic end of the world is imminent, but that they will be whisked away to heaven beforehand. Therefore they discount the future.

While I think technical advancement of the type we hope for is obviously far more likely than such a fantastical event, we should not proceed as though it is an inevitability. We must accept that the climate cake we've mixed is baking, and we should plan for a worst-case outcome wherein the effects of climate change are significant and no great technologies, or not enough of them, emerge to mitigate those effects. I believe there will be a way forward even so, but we will face significant challenges.

2

UniversalMomentum t1_ix3yhjk wrote

I wouldn't expect sentient AI to be much more intelligent than humans by 2050.

It'll be better at doing repetitive tasks and tasks with lots of variables, but that doesn't mean it'll actually have all the qualities of human intelligence when it comes to imagination or intuition.

You're making a mistake in assuming AI will be much like human intelligence. The parts of AI that seem very human will mostly be the parts that humans injected into it, and the underlying intelligence may be much different, because it didn't evolve in the same way as pretty much all other life on the planet. It's likely not going to have feelings and empathy, probably not going to have fear, and probably not going to imagine very well, but it will be able to think with brute force, which is kind of like a different form of imagination. It will be an exceptional problem solver, and coupled with human imagination I think we will do well together, but it's probably more of a symbiotic relationship.

Your post is very ridiculous, but what you're really seeing is that humans evolved dominated by negative stimuli. All life evolved like that, because memorizing what can kill you requires less brain power than memorizing everything that keeps you alive.

That means all life tends to prioritize negative stimuli. You can see this in humans generally reacting more to fear than to positive news, so when you bring in something like speculation, negative or fear-based speculation is always vastly more effective than any type of positive speculation. You also see this in the money markets, where people panic very quickly and sell-offs can turn into domino effects, but that almost never happens with positive speculation about future technology.

It's not just a problem with technology, but anybody who's into future tech might see it as a bias against future tech.

It's important to remember with climate change that we don't really know how much worse it might get as systems start to fall apart and humans start to panic, because again, fear motivates people more.

That also means that if the trend devolves and the standard of living declines because of droughts and flooding, we could have a serious social collapse, kind of like Rome collapsing into the Dark Ages.

Climate change is dangerous, but in all these scenarios, even nuclear war, the most dangerous part is how humans react afterward, in the relative chaos created by rapid change.

2

RavenWolf1 t1_ix46579 wrote

For all we know, the singularity could very well accelerate climate change.

2

Educational-Nobody47 t1_ix6ivqb wrote

A really good podcast on this subject, right up your alley, just came out 2 days ago.

https://youtu.be/5Gk9gIpGvSE

Lomborg was also just on the JRE to discuss the same thing.

He points out simple things like: "Fracking is obviously not ideal, but it's less damaging than coal and is the fastest way to get people off of coal." I'm paraphrasing what he said, but the idea is that fracking is a stepping stone to lower emissions: despite the net damage it may do, that damage is less than the net damage of coal plants.

I'm more of the mind of someone like Randal Carlson. Sure, we may have an effect on the environment, but an asteroid is far more likely to destroy the Earth's environment.

The overall assertion is that it is far better to invest in tech breakthroughs related to emissions and energy generation than to pursue many of the measures prescribed by the authorities on this subject: "We're wasting trillions of dollars on solutions that may only net a small benefit."

2

Aevbobob t1_ix3v1ch wrote

I feel you on this one. I think that if AI continues at its current pace, climate change will sorta become like the Black Death sometime in the 2030s: that is, a big problem that is easily solvable with modern tech.

Some people just wanna believe Armageddon is coming. They may cite data, but at the end of the day, they actually just don’t want to let it go. I think it is a waste of time to try to reach these people. I just let them have it.

But then there are others who just have bad data, or no data, on how tech is progressing. For them, citing cost declines and exponential trends, and explaining why they are happening, can be very useful.

For example, I'm not just blindly optimistic about AI progress through the next decade. Instead, I'm noticing that while the traditional Moore's Law around transistor density seems to be slowing, ASICs and algorithmic improvements are more than making up for it. In terms of how much intelligence you can build for some set amount of money, the exponential actually seems to be SPEEDING UP. And now that large, somewhat general models have commercial viability, there are MASSIVE financial incentives to continue progress.
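
As a toy illustration of how much the doubling time matters (the doubling times here are assumptions for illustration, and "effective compute per dollar" is a made-up aggregate, not a metric from any particular source):

```python
# Compare two assumed doubling rates of compute-per-dollar over a decade.

def growth(years: float, doubling_time_years: float) -> float:
    return 2 ** (years / doubling_time_years)

for label, dt in [("transistor density alone (assumed 2-yr doubling)", 2.0),
                  ("ASICs + algorithmic gains (assumed 1-yr doubling)", 1.0)]:
    print(f"{label}: {growth(10, dt):,.0f}x in 10 years")
```

Halving the doubling time turns a 32x decade into a 1,024x decade, which is the whole argument in one line.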

1

SnowyNW t1_ix47qd3 wrote

The problem is that the polluting has already happened, and reversing it will be almost impossible. We already had the chance to do what you're talking about, chose not to, and continue to make the same choice.

1

DukkyDrake t1_ix49b1i wrote

There are no guarantees that existing R&D efforts will result in a technological savior within your time horizon. There is no master plan; society is composed of a bunch of individual money-making efforts. If the medicine that cures whatever ails you isn't profitable, you are not going to survive. Nothing can exist unless it has high profit potential versus risk; there are always easier ways to make money.

>we lived through the ice age

Humans, though not necessarily you individually, can survive the worst end of the AGW prediction range over the next 75 years. Just don't be poor. Would you prefer to spend your time trying to survive in such an environment, or in the temperate interglacial that coincided with the rise of technological human civilization?

>what you just think by 2050 we'll be sitting on our asses doing nothing to prevent a mass extinction?

Why not? Doing nothing is easy. Was anything done to prevent mass extinctions over the preceding 30 years?

1

iluomo t1_ix4c4wk wrote

Hope is a good thing. That said, I would imagine people's reaction to you talking about the singularity is that you're giving people an excuse not to do anything to improve things NOW, because there will be this deus ex machina event in our future that we can rely on to fix everything.

There is also the thought that things will get worse so quickly as to logistically prevent technology from moving fast enough to affect things like the environment positively in our lifetime.

1

Five_Decades t1_ix4c6va wrote

In some ways, yes. Singularitarianism is just religion for atheist nerds (like myself). It's a desire for a deus ex machina to intervene and help us rise above the boredom, misery, suffering, and helplessness that define the human condition (and biology in general).

We really don't know what the timeline for ASI is, or what ASI will be capable of. When Kurzweil wrote The Singularity Is Near, he predicted we'd have a biotech revolution in the 2010s and a nanotech revolution in the 2020s. Neither happened. I think that on a long enough timeline ASI is inevitable, but I don't know what that timeline is.

There is also the fact that we live in a global oligarchy, and there is a very real chance that AGI or ASI will be used to help the oligarchs maintain their wealth and power rather than make life better for the masses. China is implementing massive AI-based surveillance. We are all addicted to our smartphones. The rich and powerful decide the fate of humanity, for the most part. It sucks, but it's true.

1

ragtagthrone t1_ix4o7a5 wrote

If someone raises a valid concern with consequences both immediate and potential, like the fact that we are on a path to climate obliteration, and your response is that maybe AI will fix it one day when the singularity happens, then I think you are the definition of ignorant.

1

look t1_ix4rzac wrote

At this point, climate change is a social/political issue, not really a technical problem in need of a super-genius new solution. If we ask the post-singularity AGI for help, its reply will start with “Jesus Christ, you dumb, fucking monkeys…”

1

cy13erpunk t1_ix4tv3b wrote

spending too much time worrying about the future or the past is robbing yourself of your presence of mind

terms like 'the singularity' are just hyperbole; we are in the moment right now, this is it, and tomorrow will be it as well, and the next day/week/month/year/etc. obvs the dawn of AGI will be a big thing too, but it will likely have already happened, unknown/unrealized, before or by the time it's widely recognized

i dont think its bad/wrong to be optimistic about the future, and what's the alternative? to be miserable? i do think we should be realists/realistic in our views; so many of us are already living in a very real and very boring dystopia, and that beast needs no further fuel/food

most normies are not going to be down with techno-predictions becuz the avg person knows little-to-nothing about the space; even the experts in their fields say that any predictions further out than a decade or two at this point are just silly, so many things are so disruptive and dynamic that it's basically impossible to make accurate predictions [not that it was ever easy]

its hard to find ppl to have these kinds of convos with, yep

but imho its more important to have convos about alan watts and stoicism with the majority of ppl; they can at least get something meaningful/useful that can make their lives immediately better, whereas dreaming about the amazing technological progress of tomorrow is wild/fun for us sci-fi/nerd types, but not everyone shares our mind's eye

1

TamasSimon t1_ix50np3 wrote

Can both not be true at the same time? Yes, we will reach AGI and then the Singularity, and yes, the climate and ecological crises will tip over into being irreversibly bad (unless we change course).

You hit a nerve... this is the biggest question of our time.

I worry that it will be a case of "the future is already here, just not evenly distributed"...
Look at what happened with the COVID-19 vaccine. We essentially had the formula in 2 days, and it still took a year and a half to complete testing, get it through the regulators, and distribute it (to the richer part of the world). Death can be a very binary thing for the six million people who died in the meantime.

1

Brief_Telephone_5360 t1_ix52ld4 wrote

That's like continuing to smoke cigarettes because cancer treatment is going to be great in 2050.

1

not_into_that t1_ix54oyo wrote

You're assuming the singularity will be accessible to the useless eaters.

I'd argue it's already being suppressed.

1

amandatheperson t1_ix5kxhn wrote

It's great that you are staying positive! But it's a bit like jumping off a very high cliff and hoping you'll have enough time to assemble your clothing into a parachute during the fall. Sure, you might be able to; you might not. But wouldn't it be better not to jump off the cliff while you still have the chance?

1

Sad-Plan-7458 t1_ix64pma wrote

Whether it's global warming or the impending threat of AGI, what's scary is the in-between! Millennials and Gen Z, and to some extent Gen X, will be generations of in-betweeners. By this I mean less employment, a growing population, and a society ill-prepared for the social issues that advancement will bring. It will take many, many years to reach a solid, livable assured income, and in that time "between" there will unfortunately be a lot of pain and suffering, which will in turn require a time of recovery. In the end, we continue on, reach for the stars (I mean space!! Fuck your dreams), and as humans pull off one of our greatest feats!! We forget, time is just a construct.

1

tatleoat t1_ix9ushr wrote

It is really hard to talk to people about the problems facing humanity and the world without turning it into a "how will the singularity fix this problem" conversation. Oftentimes the discussion at hand is about those classically unsolvable problems, like wage inequality, that are so unsolvable they're basically a fact of daily life; but whenever the conversation turns to solving them, it's just impossible for me to imagine the solution being anything else.

1

Iknowdumbshit t1_iy5gpxu wrote

Blxoom, I think what you have explained concerning the inevitable mass extinction event that science claims is already underway and gaining momentum is perhaps in some ways accurate, and even probable. But you are most likely correct that the more likely outcome is that we will soon reach our technological singularity, and at that point our abilities and our possibilities will become virtually infinite, with the potential assistance of friendly, fully conscious, autonomous androids/synthetic humanoids with a very similar neural system, i.e., brain. With all the combined knowledge of the human race and the ability to analyze any problem in a matter of minutes, we could create cost-effective solutions that in previous eras were not even possible to fathom or contemplate. That is, if mankind is wise and chooses to stop our endless cycles of unnecessary war and destruction, which have cost tens of millions of lives and have continuously and repeatedly driven the collective consciousness of man into abject and total misery.

1

throwaway9728_ t1_ix37pa1 wrote

I think you're overestimating how much people take climate change into account when discussing the future. Not many people think long-term about such issues and their higher-order effects. At best, they're parroting what the media says about resource wars and other issues (in the case of climate change) or "a future without jobs" (in the case of AI). It just isn't a prominent subject, which implies nothing about its relevance (or lack thereof). If you follow a subject and discuss it, you tend to think about things the average person ignores. Those who have the singularity in mind aren't special in that regard; people who study or are interested in other fields (immunology, genetics, energy, etc.) tend to feel just the same about people ignoring the importance of present and future changes.

0

Cult_of_Chad t1_ix3bp5p wrote

Both climate dooming and singularitarianism are incel religions for men that struggle to thrive in the current world.

"The singularity will give me purpose and a robot girlfriend" is the other side of the coin from dudes hoping society collapses 'cause they're losers with no prospects that think they could be warlords in a Mad Max world. There's also the suicidal, misanthropic soyboys that want everyone to die too...

Pure escapism, all of it.

−3

TheHamsterSandwich t1_ix3e8z6 wrote

Collapse could happen. - The part that makes this a religion is assuming that we wouldn't recover, and everyone would die instantly.

Singularity could happen. - The part that makes this a religion is assuming that it would be beneficial to all, and everyone would live forever instantly.

2

PlayerREDvPlayerBLUE t1_ix5hf40 wrote

Hmm... Some people might be considering medical advancements like an implant that can help a person walk again. I hope that fits into your opinion.

1

Cold_Baseball_432 t1_ix2foxd wrote

I agree that if there were a tech that could save us, it would be the singularity, or something very close.

The problem is that our remaining time is short, MUCH shorter than what's talked about in the media. Even if we were to get a superintelligence by, say, 2030, we might already be warmed to the point where a significant proportion of microbial life cannot survive, even if we go full speed at implementing whatever climate-saving tech it produces. At least, not in time for the vast majority of us to survive. No microbes = no plants = no food = no air.

Also, the road to the singularity could be much longer than the time we have left to save ourselves.

At the end of the day, what we call "AI" isn't "intelligent" at all. These are very accurate probability engines that operate at very low power compared to biological brains, to say nothing of the fact that brains exhibit quantum qualities, which brings into question whether classical processors could EVER deliver the performance needed to come close, even if you were to weave together trillions of 0/1 transistors, as the current approach does.

I read a RIKEN study about a month ago that tried to create a timeline for various full-brain emulations; it put the date for a primate (gorilla, IIRC) emulation after 2040, based on the current rate of advancement in semiconductor fabrication tech.

I like your positive attitude, I agree that tech should always be considered, and I too hope that we can save our planet. But I seriously wonder if we're already out of time, with a few years of relative plenty left, to be followed by a rapid collapse.

−12

ChronoPsyche t1_ix2iirh wrote

I would love to know where you read that climate change is going to kill most of the microbial life on Earth by 2030 and that we will then all suffocate. I have never heard that prediction, and it sounds dubious.

10

Professional-Song216 t1_ix2j9id wrote

If I remember correctly, much of microbial life is pretty resistant to temperature changes. I would also assume that if most microbial life could die from temperature changes due to climate change, all places on Earth with seasons would be uninhabitable.

6

wordyplayer t1_ix2j4l0 wrote

Remember, it is Saturday night and a lot of drinking is going on around here, ha.

5

Cold_Baseball_432 t1_ix2lbtn wrote

Also, I should have chosen my wording more carefully. I said "significant proportion of microbial life" when what I was thinking about specifically were the microbes critical to nutrient fixation and soil health. My apologies for having the dumb.

0

Cold_Baseball_432 t1_ix2ixzv wrote

The only paper I mentioned is the one estimating full-brain emulation timelines.

What I wrote re: microbes is a personal opinion/guess, taking into account the fact that we're warming much, much faster than the "official" projections, and pondering what the temperature-increase tolerance of the microbes critical to fixing soil nutrients might be.

I would LOVE to hear what a microbiologist would have to say about this.

−5

ChronoPsyche t1_ix2j6ai wrote

I can assure you that not even the worst-case predictions are that drastic. That's not to say there aren't drastic worst-case predictions out there, but none of them predict apocalypse; more like a world that is much less hospitable to humans (but still liveable). These impacts will be felt most extremely in developing countries, on coastlines, and in desert regions. But no, there won't be anything that deadly.

I'm no microbiologist, but I'm pretty sure that the amount of heat it takes to kill microbes would kill humans long before.

5

Cold_Baseball_432 t1_ix2je2h wrote

I know they're not. That's what keeps me awake at night. Say the world warms 2-3 °C by 2030. What do you think the soil-microbe survival rate will be?

1

ChronoPsyche t1_ix2jxsd wrote

So I did a quick Google search and it said that 140 degrees Fahrenheit (60 degrees Celsius) is necessary to kill soil microbes.

The hottest temperature ever recorded occurred in Death Valley at 134 degrees Fahrenheit in 1913.

So yeah, by 2030 maybe Death Valley will be reaching those temps, but most of the world definitely won't. If it were, soil microbes would be the least of our worries.
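
For anyone double-checking the conversion, a quick sanity check:

```python
# Convert the Fahrenheit figures above to Celsius.
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(f"Soil-microbe kill threshold: 140 F = {f_to_c(140):.1f} C")  # 60.0 C
print(f"Death Valley record (1913):  134 F = {f_to_c(134):.1f} C")  # 56.7 C
```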

5

Cold_Baseball_432 t1_ix2kxaa wrote

Interesting. 140 °F for how long? I imagine that figure is for soil disinfection over a relatively short period of time.

I wonder what happens when the earth "bakes" at a (slightly) higher temperature for an extended period? Does it create a pasteurizing effect? If it doesn't kill microbes outright, how much could the higher average temperature affect their metabolism?

In the case of one-shot high temps like the Death Valley record you mentioned, I imagine the top layer of microbes could get cooked, but I would expect there's probably replenishment from a microbial reservoir deeper in the soil.

Do constant, slightly higher temps have effects that penetrate deeper? And will they penetrate deep enough to significantly damage those microbial reservoirs?

1

ChronoPsyche t1_ix2nlfo wrote

Those questions are beyond my quick-Googling abilities, lol. I think it's safe to say it's not a concern, though, as most places will not come anywhere close to that temperature.

3