Submitted by [deleted] t3_10cd4by in singularity

Edit 2: very controversial. Up down up down up down. I love it, seems like interesting discussion is happening then.

Some people seem to look at transhumanism as some sort of ascension.

But really, I don't know what people expect. More status? More technical knowledge of the universe? VR simulations?

AI won't necessarily function in a way that actually leads anywhere that is ultimately worthwhile, and you could lose yourself in the process.

I actually sort of worry that a lot of people are out of touch with the genuine beauty of life, and that trying to fill that void with transhumanism is a little like people trying to fill that void with money and things

Edit: if you increase your IQ to 10,000, what makes you think that will make it easier to ignore a fundamental meaninglessness? Either you "ascend" in your understanding and face the empty reality you're already running from with obviousness every day, or you placate yourself with VR delusions. That's what the end is for the singularity, in my view. Everything you observe will be cheapened as it becomes more trivial with more and more intelligence. You'll solve problem after problem, but you might realize that there's no point to solving problems.

0

Comments


AndromedaAnimated t1_j4f8hz1 wrote

Maybe the main reason is that we already have walked the first steps on the road, and saw that it makes some sense to alleviate our suffering and pain with new means.

Prosthetics and pharmacology are two fields of medicine that are pretty transhumanist, if we look closely.

19

[deleted] OP t1_j4fehm5 wrote

I'm saying it doesn't solve our existential problems.

It just puts bandaids on the wounds. It won't shield you from seeing a meaningless reality and deconstructing everyone you love on a daily basis, or finding yourself unable to ignore the illusory nature of self.

IQ does a lot of funny things when people actually have it at the top level.

−10

AndromedaAnimated t1_j4fi2bl wrote

I agree with the bandaid metaphor wholeheartedly. Of course I also still prefer having a bandaid protect my wound against infection.

The problem with the void inside our hearts is not that it cannot be filled in my opinion (as it can, every kind of wireheading temporarily fills it) but that our heart gets depleted again and again. It’s the way our brains and hearts function, and the reason for them to function like this is their connection to our bodies.

We evolved to feel pain and to suffer because we needed pain and suffering as indicators of danger/harm and unfulfilled need to survive.

Now we could see this as the meaning of life.

Or we could try and change this meaning, transcend it.

The latter is what transhumanism as philosophy attempts - at least how I understand it.

9

PandaCommando69 t1_j4f5l0y wrote

>you could lose yourself in the process

Yeah, we should just wait around to die instead. That's a brilliant idea and zero people have ever suggested it before.

16

[deleted] OP t1_j4f5wk9 wrote

One day you will die, and it's not clear that this is ultimately a bad thing in the long run.

−11

turnip_burrito t1_j4f64co wrote

Bad is subjective and highly personal. If people want to live longer and experience new things, and if this makes them a bit happier, why not?

13

[deleted] OP t1_j4f6dyn wrote

What if it comes at the expense of new lives coming into being?

−2

turnip_burrito t1_j4f6rgm wrote

Also highly subjective. Depends on who you ask.

6

photo_graphic_arts t1_j4f88va wrote

Do you think that the Baby Boomers were right to consume an incredible quantity of resources while emitting billions of tons of carbon into the atmosphere, putting earth on the path to catastrophic climate change? Was it their right to do so? Does it matter that Americans burning fossil fuels today are harming poor farmers in Africa, both now and in the future, due to global warming and its consequences? Is this also highly subjective?

If any of this connects for you, then you accept part of OP's argument about (hypothetical) people today choosing to live forever at the expense of others, who, for one reason or another, cannot.

1

turnip_burrito t1_j4f8ia9 wrote

The comment I was replying to was vague. I read it as "living at the expense of the ability of new lives to come into being", as in weighing the value of one person living forever against the value of adding one new person to the population count.

For my reply to the other interpretation of their words, harming future generations that will surely exist is similar to harming people that exist today.

6

photo_graphic_arts t1_j4f8v8s wrote

For sure - OP is stirring the pot, and though I agree with them generally, they could take a few minutes and make some more substantive arguments, imo. I added my own flavor by interpreting their comment in the most sensible way I could think of.

Also, I actually thought you were a different poster and would have taken a different tone if I had a do-over. Such is life. Maybe I'll get a do-over if the AGI allows!

3

turnip_burrito t1_j4f97wg wrote

I see! It's hard to say whether you'd be in luck if AGI discovers time travel, if the Terminator series is anything to go by. :p

2

175ParkAvenue t1_j4foi52 wrote

Every generation is consuming "an incredible amount of resources" compared to the previous one, and that is a good thing. We have not even scratched the surface of the available matter and energy in the reachable Universe.

0

Saerain t1_j4i7uem wrote

Unless existence is viewed as necessarily a state of increasing suffering, I don't see how that couldn't be a bad thing.

And I don't really ultimately care more about myself; I want people to not be forced to decay and die against their will. It doesn't matter that much if it only becomes possible after my lifetime. Humanity deserves it.

1

Tyrannus_ignus t1_j601yrt wrote

That implies suffering is a bad thing; it's just a biological function.

1

[deleted] OP t1_j4f5yaj wrote

[deleted]

−4

[deleted] OP t1_j4f63cw wrote

I mean that universally for all of us, not as some slight to you.

1

[deleted] OP t1_j4f652c wrote

[deleted]

−5

[deleted] OP t1_j4f6blh wrote

Why? Instead of getting angry, why don't you explain your point of view more clearly?

5

turnip_burrito t1_j4f6ony wrote

People want more choice in their mind and body and surroundings. They believe this choice will make them happier than they are now, because they can solve diseases, gain more understanding of their own life, and experience more of existence. Maybe it leads to more life satisfaction, maybe not. Maybe it also meets your definition of a meaningful life, and maybe not. But this is why people want it.

12

[deleted] OP t1_j4f6uq7 wrote

> experience more of existence.

Sometimes I think that experiencing more of existence at the top level is actually the opposite of experiencing existence more fully.

IDK. I'm hitting this weird phase in life where I feel like I'm ready to live for real. It's very hard to describe, and transhumanism just feels deep in my gut like some attempt to avoid both life and death, and live in a fantasy.

I just hope that this process is reversible for the transhumans' sake, because I suspect many of them (assuming this is a kind AI that honors peoples' free choices) will opt to live back in the real world without such connections.

1

turnip_burrito t1_j4f81nj wrote

People don't like the status quo. It's in our nature to want more of the things we like, and less of the things we dislike, and the status quo still has things many people don't like.

You can only see the genuine beauty of life and complexity of existence, and be in awe of it so long, before your cancer comes knocking and rips it away from you. Or your dementia makes you forget who you or your loved ones are. Your chronic fatigue and pain colors every day. Many people live their lives dissatisfied anyway, for many reasons, and die that way. I feel that transhumanism would be a net plus for these people.

People that chase money and things do it out of greed. Maybe anxiety about the future, and also social status. More, more, more. Is that meaningful? I would say it is to them, but they may say they are never satisfied, and I would believe them. It does hurt other people around them, so I oppose it on those grounds.

What is the real world? Hunter-gatherer life? Caveman life? Primitive farmer? 2000 BC? Medieval? Renaissance? 1990s? People employ technology that solves many of their grievances very differently in each era, which could be seen as "cheating and avoiding reality" by people who live in earlier times.

10

[deleted] OP t1_j4fbmoa wrote

> I would believe them

I wouldn't.

> People employ technology that solves many of their grievances very differently in each era, which could be seen as "cheating and avoiding reality" by people who live in earlier times.

So, since drugs are a form of technology, would you then say that spending weeks on end strung out on heroin is living?

1

rixtil41 t1_j4ff0hq wrote

Would you rather be living now or be sent back to 1923?

4

[deleted] OP t1_j4ffbfa wrote

Maybe you'd like to live in 1982 if we had gone to nuclear war?

Not sure what your point is? People who are cautious about AI are Luddites, is that your point? So, Stephen Hawking was a Luddite?

1

rixtil41 t1_j4fgbuo wrote

My point is life was worse in the past. Things, in general, are better now than 100 years ago. There will always be risks when it comes to advancements but making things stay the same forever is not the solution.

5

[deleted] OP t1_j4g3xmp wrote

> Things, in general, are better now than 100 years ago.

There is no guarantee of this. It must be earned.

1

rixtil41 t1_j4hgli0 wrote

Not guaranteed, but likely. Nothing in life is guaranteed, but things in general will likely be better in 2123. If you're looking at "likely" in a negative light and demanding perfection, that's on you. You should be looking at the bigger picture and not let imperfection be the enemy of good. If by "earned" you mean that people should go out of their own way to help themselves, then yes. But if an opportunity arises that lets them help themselves more easily, it should not be invalid just because it was made easier. If you still disagree, that's fine, but let's see if you still disagree in the year 2043. If it comes true, you will be in a world where everything in general is free, living in a simulation of your choice. If you still disagree when it comes true, then I'll be impressed, but until then, wait.

1

turnip_burrito t1_j4fhm2j wrote

Okay.

> I wouldn't.

You wouldn't believe a person who constantly accrues wealth when they say they aren't satisfied? Odd.

> So, since drugs are a form of technology, would you then say that spending weeks on end strung out on heroin is living?

A very narrow form of living which also has extremely adverse, painful effects on the individual and others.

But I am sure you understand heroin is not the only form of technology, and not the way most people use it, and not what my focus is on. Are you here to have an honest discussion, or to be a bad faith contrarian?

3

[deleted] OP t1_j4g5h5f wrote

> You wouldn't believe if a person who constantly accrues wealth says they aren't satisfied, isn't satisfied? Odd.

Not by the constant wealth accumulation alone. These people usually have other people in their lives.

1

[deleted] OP t1_j4g5tpb wrote

> But I am sure you understand heroin is not the only form of technology, and not the way most people use it, and not what my focus is on. Are you here to have an honest discussion, or to be a bad faith contrarian?

The former.

My most common method of calling someone's idea into question is simply to construct an idea that contradicts their own idea or its logical form. This is actually an extremely effective reasoning method.

https://en.wikipedia.org/wiki/Reductio_ad_absurdum

It's one of my favorites, but it is a pain in the ass to people I am discussing with, so sorry in advance!

I would point out that the contradiction doesn't make the idea instantly wholly wrong, but it usually just illustrates a sort of incompleteness and need for refinement. :)

1

[deleted] OP t1_j4g7hy8 wrote

I want to add... my real goal is to get people to think more complexly and get out of their boxes, as boxes just damage reason in the first place.

1

cloudrunner69 t1_j4f88bw wrote

I don't get why transhumanism is some niche subculture. The fusing of human and machine seems like it will be a natural process that sweeps through all life whether it wants it or not. If anything, all I see transhumanists as are people who are just more aware than others of this incoming tidal wave of cyberization.

11

[deleted] OP t1_j4fbjf7 wrote

That sounds horrific, not like a natural process at all. Rapid changes tend not to be so smooth. Well, natural like an extinction.

That scenario is actually one we must prevent at all costs.

−11

SoulGuardian55 t1_j4fkvjo wrote

Throughout history, mankind has always sought means to enhance itself and overcome its limits. Cybernetic implants, biotech, nanotech, and everything in between are paths toward those goals.

4

photo_graphic_arts t1_j4f7il0 wrote

I don't think transhumanism can make life more meaningful for someone whose life was not meaningful before. They're not related concepts except as transhumanism pertains to extending one's lifespan, which could give someone already living a meaningful life more time to accomplish the things that make their life feel worthwhile.

Transhumanism as a means of extending the lifespan of someone who lives only for themself seems like a trivial idea. If it cost nothing, then why not, but if it's at the expense of others, then it seems foolish to me.

Yes, every few months we have an article about how senescence isn't natural or unavoidable, but having said that, OP makes good points; money and things and time spent watching TV/playing video games don't add up to a meaningful life for most people, regardless of how long they live (or the realm, physical or digital, in which they exist).

10

NTIASAAHMLGTTUD t1_j4fbxh7 wrote

Hey, scrolled through the thread and appreciate your civility, this a valid question imo.

I won't say this is the 'definitive' answer, but this is what I believe.

>AI won't necessarily function in a way that actually leads anywhere that is ultimately worthwhile, and you could lose yourself in the process.

True, I mean as much as I hope for this technology to be used to improve people's lives, nothing in the future is 100% certain. I actually think you can 'lose yourself' through transhumanism if you become a completely different being; if my IQ increased by 10,000 I would have little relation to the person I am now. I'd be something else.

>I actually sort of worry that a lot of people are out of touch with the genuine beauty of life, and that trying to fill that void with transhumanism is a little like people trying to fill that void with money and things

Life does have beauty, but not for everyone in equal measure. There are tons of people suffering every day for no real reason or purpose. Think about a young child dying of a painful disease; there's no beauty in that, it's just a sad fact of life that these things happen. I think some degree of suffering and hardship is necessary for growth, but a lot of it is just a bad draw, excessive and pointless. This is what I hope technology can help solve.

I want to clarify that although I hope the singularity happens and that it is positive, I don't consider anything a sure bet. I also want to point out that people throughout history often had some form of afterlife/compensation that contextualized and enriched their time on earth; personally, I sympathize with this feeling. There's a lot of the 'void' for people here.

8

[deleted] OP t1_j4fdg6y wrote

> There's a lot of the 'void' for people here.

Indeed.... And I've been there.

I'm more pushing at people who are trying to fill the void with this tech. But, if you just increase your IQ to 10,000, I think people will just hit a point where they have a hard time ignoring the void. I.e. it'll just make it blacker and more obvious, or people will just placate themselves with virtual worlds - quite the opposite of any "ascension."

It doesn't solve the problem.

2

NTIASAAHMLGTTUD t1_j4fezgj wrote

> I think people will just hit a point where they have a hard time ignoring the void. I.e. it'll just make it blacker and more obvious, or people will just placate themselves with virtual worlds - quite the opposite of any "ascension."

I ultimately think that the feeling of happiness is a result of bodily physical processes, with no supernatural/spiritual intervention (note: this is different from the feeling of spirituality itself, which is very important for most people to feel, imo). An advanced technology can help with that, both in ways we may consider positive or dystopian (see: wireheading, where people are perpetually on a drug high that never comes down). Whether that tech will come into being and be used in a constructive way is the question. But I believe, at a minimum, a lot of suffering is unnecessary, unproductive, and arbitrary, and eliminating it would be desirable if we could.

6

[deleted] OP t1_j4ffdbn wrote

What's your theory of consciousness? The mind-body problem is not a solved problem, not even within the realm of materialism.

3

NTIASAAHMLGTTUD t1_j4fg8sy wrote

I believe consciousness and personality are derived from the brain/body. If someone gets brain damage, their personality changes. Destroy their brain, and they have no consciousness at all. The only workaround is if you believe in something like a soul, which I haven't seen any evidence of.

7

[deleted] OP t1_j4g5cvd wrote

> they have no consciousness at all.

Heh but which form of materialism do you prefer?

https://plato.stanford.edu/entries/consciousness/#PhyThe

> which I haven't seen any evidence of.

I think the empiricist's fallacy is to think that something cannot exist simply because there is not evidence. I view this as a sort of "equal and opposite" fallacy to the idea that one can make a positive assertion without evidence of any sort (https://en.wikipedia.org/wiki/Russell%27s_teapot).

1

NTIASAAHMLGTTUD t1_j4gjlvc wrote

I'll ask you this: if a person is thrown into a volcano and their body is completely destroyed, where does their consciousness go and how does it get there? Explain theoretically how that would work.

>I think the empiricist's fallacy is to think that something cannot exist simply because there is not evidence.

'Cannot' is a strong word, and although I agree in a very select few cases, I find it to be mostly rubbish. If someone wants to prove that something exists, they usually try to gather whatever evidence they can to push it forward. They don't say, "Well, technically you can't be 100% sure my Ferrari doesn't exist; it could be invisible, only seen by me, and not measurable in any way. There could be evidence that supports my belief that you just aren't seeing and that can't currently be tested because of x/y/z" (then what currently leads you to think it's plausible?). It seems wormy and slippery. Is it a fallacy on my part to say vampires and hobbits cannot exist because there is no evidence?

I'm not trying to 'getcha', but I'm genuinely curious: do you believe in a soul, an afterlife for that soul, and God? If so, what makes you believe that?

5

rixtil41 t1_j4fd9u0 wrote

I have a better question do you have a problem with transhumanism?

8

[deleted] OP t1_j4fdcbl wrote

Not necessarily.

I'm more poking at the people who think it's going to transcend life's problems in itself.

I think it'll just make the void they feel now more obvious because they'll be too smart to ignore it.

0

rixtil41 t1_j4fdqie wrote

Do you mean like living in a simulated world where you get to do what you want? Or living longer?

6

rixtil41 t1_j4fe9c7 wrote

I don't see a problem with people living in virtual worlds

6

[deleted] OP t1_j4fe897 wrote

I think Wittgenstein had a funny comment about religion, kind of remarking about how enabling life after death doesn't actually solve the problem.

2

PhilosophusFuturum t1_j4fbmzi wrote

OP's username is OldWorldRevival, and he advocates for technological regression and an academic reintroduction of theism. So I assume the angle here is that Transhumanists are attempting to use Transhumanism as a stand-in for religious fulfillment. Edit: His account was also first made to complain about AI art, and he made a sub protesting it. I assume his resistance to AI art is what attracted him to resist Singularitarianism and Transhumanism.

For some that could possibly be true. But the idea of Transhumanism-as-religion is fundamentally flawed. Transhumanism and religion might share a lot of similar ideas, like immortality and creating the best possible existence. But that's where the similarities end. Religions make metaphysical claims, like the existence of gods, the creation of the earth, etc. Transhumanism makes none of these claims because it is an intellectual school of philosophy, not a religion.

As to why people follow Transhumanism: most Transhumanists are very staunch Humanists, Futurists, and Longtermists. Transhumanists see the vague concept of "technological development" as a way to achieve things like superintelligence, omnipotence, immortality, and supereudaimonia.

As for "the beauty of life", most Transhumanists tend to be existentialist and cosmist. Many Transhumanists believe in the beauty of the existential drive of Humanity to achieve great heights, and in our very specific place in the history of Humanity. As a result, Transhumanists often have a strong fascination with things that most people overlook, like everyday scientific progress, while ignoring "distractions" like elected politicians.

6

[deleted] OP t1_j4fcz8f wrote

> OP’s username is OldWorldRevival, and he advocates for technological regression and an academic reintroduction of theism. So I assume that the angle here is that Transhumanists are attempting to use Transhumanism as a stand-in for religious fulfillment.

Start out with a strawman, eh? You think you understand my username?

What if you're the technological regressive? What if you'd create the tools haphazardly that enable the destruction that prevents the greatest of tools from ever being made?

Maybe you think we should have just felt the intense heat of progress of thermonuclear weapons?

Stephen Hawking warned about AI - was he a Luddite, in your view?

1

PhilosophusFuturum t1_j4fdpf8 wrote

I assume by that you’re referencing the idea that we might accidentally create a tool that could destroy civilization. Transhumanists care deeply about preventing that; many of the researchers working on the Control Problem are Transhumanists.

The Control Problem (aka the alignment problem) is the problem of making sure a superintelligent AI is aligned to Human interests.

If AGI is to eventually happen (which most Transhumanists believe it will), then it’s imperative we solve the Control Problem instead of trying to prevent the development of AGI. In this framing, it’s Transhumanists who are engaging in the reality of the danger whereas everyone else is playing with fire by ignoring it.

1

[deleted] OP t1_j4fe2xd wrote

I don't think that's the right framing of the problem.

Transhumanism carries its own innate risks and is not a real solution to the control problem on a practical level, in my view.

1

PhilosophusFuturum t1_j4fep0j wrote

From their worldview of an inevitable singularity, it makes perfect sense. If we cannot stop AGI, we need to find a way to align it to our interests. It's the practical approach. As to why Transhumanists often believe AGI to be inevitable:

-Game Theory: Many countries like the US, China, UK, India, Israel, Japan, etc., are all working on researching Machine Learning. And an AGI is absolutely crucial to national security. Therefore a ban on ML research is entirely unrealistic. And since every country understands that such a ban won’t work, they would all continue to research ML even if there was an international ban on it.

-The inevitability of progress: Transhumanists often believe in AI-eventualism, or the idea that Humanity is on the inevitable path to creating ASI, and we can only slow down or accelerate that path.

-The upward trajectory of progress: Building on the last point, most Transhumanists believe that technological progress only ever increases, and that every attempt to permanently stop a society-changing innovation has failed and will always fail. So focusing on adapting to the new reality of progress is better than resisting it, which has a 100% fail rate.

4

[deleted] OP t1_j4ff8ay wrote

> -Game Theory: Many countries like the US, China, UK, India, Israel, Japan, etc., are all working on researching Machine Learning. And an AGI is absolutely crucial to national security. Therefore a ban on ML research is entirely unrealistic. And since every country understands that such a ban won’t work, they would all continue to research ML even if there was an international ban on it.

Glad you bring game theory into this, because this is why I do not view transhumanism as a solution. Heh. My opinion on this topic is a little bit esoteric even around these parts.

Basically... the Pareto principle is why we have wealth distributions that are hugely unequal, and even more unequal at the top. It's why we have kings. It's why India and China are way more populous than the rest of the world.

This is also what will happen with AI agents. Certain AI or human agents will have significantly more control, and that control will snowball. We already see this with giant tech companies now dominating the landscape.

If an AI is not a world dominating AI, then it by nature cannot suppress other AI from being created to surpass it, and another will surpass it. If it is ASI, it has the power to dominate the world, whether or not it wields that power to do so.

It'll be significantly more stratified, not less. Basically, transhumanism is like putting your mind at the whim of whatever AI is at the top of this hierarchy, whether or not that AI does anything.

1

PhilosophusFuturum t1_j4fg510 wrote

In theory, the growth of the ivory tower that the elites are on should rapidly outpace that of the peasants, because they hold the ever-expanding means of power. But the one asset the elites have that is truly accelerating past the peasants is their wealth, not technology. In fact, technology is the great equalizer.

For example, your average middle-class person in the developed world today has a higher QoL than a king in the Middle Ages, and that’s entirely thanks to technology. Likewise, the QoL gap between a modern middle-class person and an oligarch is smaller than that of a medieval peasant and medieval king, despite the lifestyle of a modern oligarch being so much more lavish than that of a medieval king.

This also applies to offensive technology. For example, Europe was able to take over nearly all of Africa despite the invaders being a small army compared to the imperialist tribes of Africa. That's because they had guns. And when Africans got guns, they were able to push the Europeans out. The only African country that avoided colonization was Ethiopia, and it's because they convinced the UK and Italy to give them guns. This is because guns rapidly closed the technology gap, even if the guns of the invaders were still very superior.

The same logic applies to ASIs. Sure there may be an ASI that is so great that no ASI could surpass it, but it doesn’t mean lesser ASIs can’t be created that could potentially kill it.

On that note, I am a lot more concerned about civilizational destabilization than I am about super-authoritarianism. With increasingly better tools, people could easily create dangerous ASIs and superviruses that huge governmental institutions might not be able to contend with.

3

GeneralZain t1_j4fdqej wrote

I just want to be free of my biological limitations :)

both in my physical form (which I will change asap!) and cognitive ability.

5

sumane12 t1_j4fjweg wrote

Ouch, that's nihilistic 😂

So I think this is a very deep philosophical question but I'll try my best to understand it, tho I might not answer the question.

When we look back through history, we see a lot of wasted potential in terms of people being stuck in a life of servitude, not able to explore beyond their own town, having no knowledge of what may be in store for humanity. That's not to say they were unable to appreciate the world around them but that appreciation was limited by their circumstances.

The singularity, and transhumanism by extension, seeks to remove those limitations one step at a time. For example, our biggest concern at the moment is scarcity of energy and natural resources. This is very much a problem that can be solved in the near future, which would allow people to spend more time appreciating life, enjoying the love of their family and friends, and exploring the planet and the beauty it offers. The next stage would be mind uploading: gradual replacement of biological neurons with artificial nanobot neurons. This would allow countless possibilities. Apart from being able to go anywhere traveling at the speed of light, we could upload our consciousness into robots that are accurate replicas of our human bodies, and these robots could be placed on any planet in the solar system and potentially beyond. And then, as you alluded to, there's experiencing artificial worlds.

Now let's extrapolate a trillion trillion years into the future. We have managed to break the light-speed limit by warping spacetime and have explored the entire universe. We have become a collective hive mind, collectively know everything about everything, and have experienced every possible reality we can simulate. The last black hole is about to evaporate due to Hawking radiation, and the end of the universe, as well as of all life, is imminent. Does this mean it's all been pointless? Should we not be interested in the expansion of our abilities and the transcendence of our current situation because we know it will eventually end?

I don't think so. In fact, I'd argue that our life is more meaningful than the lives of those in the past who were destined to live forever in their own little village and never experience anything new. So too, I think, the lives of transhumanity will be more meaningful than our own. Another analogy is our own life: ultimately we will die, but this doesn't make our life now meaningless, nor does it mean we should just kill ourselves. Instead we recognise how beautiful our life is and embrace it, because ultimately everything is temporary.

I hope that answers the question, at least from my perspective.

5

[deleted] OP t1_j4g741o wrote

> Does this mean it's all been pointless?

Heh. When your IQ is 10 million, it might be impossible to ignore that pointlessness.

> I don't think so, in fact I'd argue that our life is more meaningful than those in the past who were destined to live their life forever in their own little village, and never experienced anything new.

Heh. Those people are all our ancestors too, though, and as such, it can be quite nice to feel a real sense of gratitude toward them. To me, that is the meaning, more than what I experience here and now with my limited body and lifespan.

I love The Consolation of Philosophy by Boethius for an illustration of this. He was a philosopher of the 5th-6th century, of whom Bertrand Russell remarked, "He would have been remarkable in any age; in the age in which he lived, he is utterly amazing." Worth a read, to say the least. The life story of Boethius is one of tragedy, but that tragedy produced his greatest work, which had a tremendous impact on the history of civilization, even though it's so obscure today.

Also... I think the AI is either going to be hyper-nihilistic or find God. I don't see any other outcome, really. Everything between those two strikes me as incredibly irrational. A lot of existentialism, for example, has struck me as just as delusional as any accusation of religious delusion, but where what you get at the end of the day is considerably reduced. Part of me is like: "either just accept nihilism, or be religious; the in-between is like being religious without the benefits of religion... perhaps dumber than just being religious!"

Lmao... Full disclosure I am a former nihilist and technically agnostic believer. Though that agnosticism just seems like a technical trifle at this point.

1

sumane12 t1_j4gg8q5 wrote

>Full disclosure I am a former nihilist and technically agnostic believer. Though that agnosticism just seems like a technical trifle at this point.

Yeah, I kinda got that energy, lol. I mean, I'm all for searching for a supernatural creator, but however far you go up those levels, you still face the point of nihilism. Also, any search for God through religion needs to recognise that at least 99% of religions are wrong and were created solely to exploit the lower ranks of the religion and to create societal behaviour modification.

I think meaning is whatever we make of it, that's the beauty of conscious subjectivity. How do we know that someone else doesn't have a completely different subjective experience than we do? And I think that question is beautiful.

I think if you can't generate meaning from your life today, how will finding God change that? Perhaps you are looking for a purpose greater than yourself? I don't know, I'm not a psychologist, but I would also say that just because you don't see meaning somewhere doesn't mean others don't, and it can be worthwhile dedicating yourself in service to those less fortunate.

3

[deleted] OP t1_j4git9x wrote

> I mean I'm all for searching for a supernatural creator, but however far you go up those levels, you still face the point of nihilism.

Well, when you experience it for yourself, it ends up being very different. It's like that beauty you intuit in the universe that you're trying to relate to me, but imagine that times a million, with a sense of love so profound and unconditional that you truly understand that this love is what is important.

It's more that life is about spirituality, rather than spirituality being a cure for nihilism or death.

> Also, any search for God through religion needs to recognise that at least 99% of religions are wrong and were created solely to exploit the lower ranks of the religion and to create societal behaviour modification.

Heh. All religions are wrong, and all religions are at least a little bit right. For illustration: discovering relativity didn't make Newtonian gravity wrong. It just showed that there was more to it, and that it wasn't as absolutely correct as we had initially imagined.

1

sumane12 t1_j4gkqva wrote

All seems reasonable. I hope you find what you're searching for.

1

[deleted] OP t1_j4f6xjm wrote

The point is humanity being allowed infinite expansion, with a perspective that sees our ultimate fundamental limitations and the game theory of our existence, and seeks a method of transcending them.

4

Current_Side_4024 t1_j4fuza4 wrote

The natural human brain is an evolutionary product; it's not designed to experience that much beauty. We want life to be beautiful; evolution only wants it to be adequate. That's why we want transhumanism: to enhance the beauty of life that we already experience. Because nature doesn't give us enough.

3

[deleted] OP t1_j4g3kqq wrote

> Because nature doesn't give us enough

Heh. I've experienced so much beauty at times that this actually made me chuckle.

1

Current_Side_4024 t1_j4g47ow wrote

At times. Transhumanism is about making things beautiful all the time

3

[deleted] OP t1_j4g7sex wrote

> At times. Transhumanism is about making things beautiful all the time

Heh. Need a functioning philosophy of beauty in the first place to figure out what you're aiming at!

I suspect that the nature of intelligence, even superintelligence, is extremely diverse, in that initial conditions could lead to many very different ideas and outcomes.

This is why it's imperative that we actually get it right societally. So, in this enthusiast community, I hope to raise some thought-provoking questions that push the needle just a tiny bit farther towards a good outcome rather than a bad one! Heh.

1

atchijov t1_j4f9i8o wrote

Same as any religion... people are looking for something or someone to solve all their problems. No effort on their part... and then at some point in the future everything gets sorted out auto-magically.

2

photo_graphic_arts t1_j4fa0n8 wrote

Always has been, and some modern religions teach how foolish this is.

Citing tm.org, which is dedicated to transcendental meditation:

Jesus was once asked when the kingdom of God would come. The kingdom of God, Jesus replied, is not something people will be able to see and point to. Then came these striking words: "Neither shall they say, Lo here! or, lo there! for, behold, the kingdom of God is within you." (Luke 17:21)

2

turnip_burrito t1_j4fancb wrote

Yep, the progress toward life satisfaction, even when all else is resolved, is an internal process of the mind.

3

TerrryBuckhart t1_j4fa0rj wrote

It's fear of the unknown that drives them. As humans, our arrogance leads us to believe that one day we can become immortal gods.

2

Scarlet_pot2 t1_j4fkf0r wrote

Probably the void of not having the perfect body... not as beautiful as it could be, not as strong or as smart as it could be... and don't forget health. People don't like knowing their body is getting older and deteriorating; the medical visits remind them.

2

[deleted] OP t1_j4g7e51 wrote

Heh.

What if I turned you into a P-zombie but made your body more attractive and your mind hyperintelligent?

1

Scarlet_pot2 t1_j4jhc0z wrote

I'd use my own AI to improve myself, not allow someone else to do it for me. That alone can save future people from being corrupted.

Take the open source version, learn how it works, tailor it. That would probably be the safe way to do things, compared to downloading a pre-made one from a trillion-dollar capitalist corp.

Once AGI is developed and understood, I doubt it will be any harder to learn than today's AI methods and math.

1

DreamsOfCyber t1_j4fkj8a wrote

Because as of now I can only view reality from one perspective, one mind, and one personality; I have the same looks and so forth... that's incredibly limiting no matter who you are.

2

No_Ninja3309_NoNoYes t1_j4fks3t wrote

I am not sure if you are exaggerating or don't understand IQ. But anyway, to answer your question: there are ultimate questions everyone has struggled with, and the answers are not satisfactory for many people. Personally, I would be happy if the Internet were just a tad better: decentralised, free, and safe. This could act as a hive mind. High IQ might be fun, but will it answer the tough questions?

You can argue that society has developed from hunter gatherers to agricultural to high technology with the goal of answering these questions. And maybe that means building a Deep Thought computer. Maybe it means that our hive mind of biological beings will have to find the answers. Maybe both.

2

Stippes t1_j4fmhfg wrote

Modernism, in the philosophical sense, has been shown to be inaccurate.

We, as humans, are neither the owners of our own developments nor our own decisions.

Transhumanism, in my eyes, is a movement to leave these beliefs behind.

2

gay_manta_ray t1_j4fo7bg wrote

in a general sense, transhumanism and the progress that agi would bring with it would probably mean a vast improvement in material conditions. poor material conditions are at the core of almost all human suffering. more specifically, transhumanism has what appears to be limitless possibilities. we don't even know where the ceiling is, so we have no idea where it might take us. personally, i enjoy novel experiences, and increasing novelty is almost always a positive outcome. if you want to stagnate, get sick, grow old, die, etc, that's entirely your choice. not everyone wants that for themselves. OP, read some fucking sci-fi.

2

PhysicalChange100 t1_j4gcaau wrote

u/OldWorldRevival I am disappointed by this post.

Nihilism:

Meaning itself is subjective. Someone having dinner with his or her family is meaningful in itself. You pushing your own subjective narrative that the universe is meaningless is quite authoritarian and dismissive of other people's meaning. And I hate when others do that.

Transhumanism:

To be honest with you, as a teenager I saw transhumanism as a way for me to become stronger and smarter; it was basically an ego thing. But now, as an adult, I see transhumanism as a way to liberate humanity from unnecessary suffering and unplanned death.

Stupidity and ignorance can cause unnecessary suffering, and therefore we should amplify our intelligence. But it's not just about survival and removing unnecessary suffering; it's also about creating things beyond our wildest imaginations.

To revive our inner child's desire for wonder. To create an environment in which all conscious beings are given a fair chance to truly live amazing and great lives: beyond diseases, beyond accidental deaths, beyond petty violence, beyond greed and corruption and evil.

VR:

Art is both philosophy and pleasure merging into a complex form of sensory and intellectual information.

When's the last time you watched a movie or played a single player video game and told yourself "what a masterpiece".

VR simulations will break the boundaries of art and reality, and merge into something extraordinary and beautiful: a wealth of information as well as time for reflection in both reality and art.

VR is not as empty and void as you think it is.

My observation of you:

To be honest with you, you strike me as a "logic bro." You dismiss art, you dismiss social relationships, you dismiss emotions, and your inner child has long been suffocated by your impulsive need to reduce everything to predictable, narrow categories and simplistic logical assumptions. You hate emotional reflection, and you would rather dismiss your emotions than actually analyze, confront, and entertain them, so you shove them all down in favor of logic. Edit: which, by the way, causes lack of empathy, lack of imagination, lack of creativity, lack of excitement, lack of hope, and produces a narrow-mindedness which in turn creates depressive mental traps like your own post.

I've been that logic bro before, and it's not fun. It doesn't help anyone, not even yourself. It's the death of the inner child and the emergence of nihilistic thinking.

Logic is not a tool to repress our humanity. Logic is a tool to amplify Us.

Conclusion:

Be the agent of liberation and inspiration and not repression and discouragement.

2

[deleted] OP t1_j4gdfku wrote

> When's the last time you watched a movie or played a single player video game and told yourself "what a masterpiece".

Pretty well every day, actually. Heh.

> To be honest with you, you strike me as a "logic bro." You dismiss art, you dismiss social relationships, you dismiss emotions, and your inner child has long been suffocated by your impulsive need to reduce everything to predictable, narrow categories and simplistic logical assumptions. You hate emotional reflection, and you would rather dismiss your emotions than actually analyze, confront, and entertain them, so you shove them all down in favor of logic.

Even though this is no longer who I am, my post reads this way because I used to be a very deep nihilist, and I can still draw from that place even though I no longer truly occupy it.

> VR simulations will break the boundaries of art and reality, and merge into something extraordinary and beautiful.

What if I told you that there is already profound, extraordinary beauty all around you? Heh.

> Be the agent of liberation and inspiration instead of repression and discouragement.

I feel more like I am making a tough pronouncement to those who pursue this technology religiously and expect to find something religious in something nonreligious.

1

PhysicalChange100 t1_j4goxfo wrote

>Even though this is no longer who I am, my post reads this way because I used to be a very deep nihilist, and I can still draw from that place even though I no longer truly occupy it.

Okay and? Does it help anyone?

It doesn't.

>What if I told you that there is already profound, extraordinary beauty all around you? Heh.

True, but it's not enough, really; it's a hedonic treadmill thing. If you lived in nature most of your life, you'd get desensitized to its beauty.

Plus, there's no way to replicate RDR2 in real life unless you want prison time. A good balance of artistic experience and real-life appreciation is needed.

>I feel more like I am making a tough pronouncement to those who pursue this technology religiously and expect to find something religious in something nonreligious.

And why does it matter? There are actual religions out there that are harming people. But here you are, shooting down other people's hopes for a better future.

It's an excuse to be an asshole and you know it.

1

[deleted] OP t1_j4gz0on wrote

> It's an excuse to be an asshole and you know it.

This has more to do with you than me.

It feels weird being the bubble-burster as the believer... lmao. Just a weird reversal of roles; I'm only noting the observation. When does that ever happen? Lol.

1

PhysicalChange100 t1_j4h56mh wrote

Oh so that's what you call yourself, a "bubble burster".

Edge lord moment.

1

[deleted] OP t1_j4hkrj7 wrote

Hey, you said I was the one bursting the bubble!

It was just weird to be on the other side of that since a lot of people tend to claim that they're the bubble burster with me.

1

EddgeLord666 t1_j4gxggh wrote

For me, my purpose is living as long as I can or until I get bored, suffering as little as I can manage, having as many new and enjoyable experiences as possible, and shaping myself into what I perceive as the most ideal version of me. All those things would be furthered by transhumanism. If you perceive your life to have a different meaning, that's ok, but these are the things that motivate me.

2

Information1324 t1_j4fvzxi wrote

The singularity isn't about expansion of intelligence; for me, it's about expansion of mind. Also, understanding and thinking about what is possible to do and become in the universe doesn't invalidate or cheapen our current or past experiences, and I disagree with the idea that interest in the technological singularity correlates with dissatisfaction with current life.

It is clear that meaningfulness is a possible state of mind to attain, and I don't see why that would change. I also disagree with the widespread misconception that more intelligence brings a greater realisation of the meaninglessness of life. I don't think intelligence has anything to do with feelings of meaning.

1

innovate_rye t1_j4h8yub wrote

i think the point of the universe is to experience every possible experience. you can anthropomorphize those experiences to rate a "good" or "bad" experience but the whole point is to experience everything.

1

onyxengine t1_j4hagtl wrote

There's no void to fill; it's really just a philosophy that embraces the potential of technology to augment human form and society. If there is a void that people are looking for transhumanism to fill, it's the void in our life spans. I could easily do 400 years, given how much rapid and radical change we are likely to see. It would be amazing to watch us build the first underwater cities and live in one, or live on an off-planet colony. Or even contribute to building them.

1

someone4eva t1_j4hcnkf wrote

Agree with your post. For me, current reality is just not very satisfying, so a chance at a giant shift is well worth it. And yes, ultimately I'm hoping we can solve love and meaning... otherwise we will still be miserable.

1

Lord_Thanos t1_j4hdmom wrote

I don't like that the current shape of one's life is entirely based on chance. Your intelligence is genetic, your looks are genetic, and your personality is genetic in that you will trend towards some personality (introvert/extrovert). Too many chains. It would be great to have control over these things, the main one really being intelligence.

>But really, I don't know what people expect. More status? More technical knowledge of the universe? VR simulations?

Status? That's a silly idea. More technical knowledge would be incredible.

>AI won't necessarily function in a way that actually leads anywhere that is ultimately worthwhile, and you could lose yourself in the process.

I don't know what you mean by this. It's a very vague, philosophy-babble-like statement. How do you know how AI will function? Lose myself in the process? What process?

>I actually sort of worry that a lot of people are out of touch with the genuine beauty of life, and that trying to fill that void with transhumanism is a little like people trying to fill that void with money and things

Another philosophy-babble-like statement. What is the "genuine beauty of life"? What is the process for designating which things are beautiful and which are not? What is it to be "in touch" with the "genuine beauty of life", and how does it differ from being "out of touch" with it?

>if you increase your IQ to 10,000, what makes you think that will make it easier to ignore a fundamental meaninglessness?

I don't have a hard time doing it now. You just have to keep your mind occupied. Or you could simply rewrite the way your brain perceives these ideas. Make them a non issue.

>Either you "ascend" in your understanding and face the empty reality you're already running from with obviousness every day, or you placate yourself with VR delusions.

More philosophy babble.

>Everything you observe will be cheapened as it becomes more trivial with more and more intelligence. You'll solve problem after problem, but you might realize that there's no point to solving problems.

I don't see how what I observe can be "trivial"; that's an odd adjective. I think you meant "every problem". Just because something is trivial does not make it cheapened. The point would be to gain a greater understanding, more knowledge. Of course, I can only speculate. And there's no point to anything, yet we still do things.


Really, this is all speculation and very general. I cannot know how some AI with millions of times more intelligence will perceive the world. They won't even have the biological makeup of humans, so I guess you can say for sure it will be vastly different.

1

curloperator t1_j4hsjab wrote

This is an incredibly condescending post. OP's entire flawed premise is based on their assumption that they already know that everything is meaningless and people are empty, or whatever edgelord nihilistic nonsense they are confusing with wisdom and enlightenment. This whole thread isn't really even about transhumanism; it's about the OP using transhumanists as a target to feel smug. Downvote and ignore this bullshit.

1

br0kenhack t1_j4i1c52 wrote

Smartphones are a primitive step towards transhumanism; they're an external extension of ourselves. They're very powerful if used productively and, as most would probably agree, also detrimental in other ways. We're definitely going to have some rocky adjustment periods, perhaps even a crisis. In this capitalist age, the void that is mostly filled is increased productivity (so not really a void as much as an enhancement) and entertainment/social voids. For the future, hopefully there are more constructive voids being filled, or similar voids being filled more constructively.

1

alexiuss t1_j4j0fao wrote

What void? Transhumanism is improving your life with tools (Ais).

What void are you filling when you use a power drill to make holes in a wall instead of say using a sharp stick to do it? What void are you filling when you drive a car instead of walking? What void do you fill when you draw with a pencil instead of your fingers?

This question implies an assumption that there's some sort of void, whereas in reality people just want to have nicer, easier lives with less misery, suffering, pain, and death.

Your other point is just as devoid of reason and rationality.

Even if my IQ was 10k, I would still like the same things: creating worlds. I would just create more complex worlds if I had more IQ.

Problem solving might have a hard limit; building and enjoying virtual and physical worlds does not.

1

[deleted] OP t1_j4j0unk wrote

Well, perhaps you lack a sense of a void and are therefore not who this is pointed at! Lol.

1

alexiuss t1_j4j1y93 wrote

AIs are already functioning better than expected. I'm drawing ten times as much with my ai partner.

:∆

1

Ortus14 t1_j4j86gv wrote

Q: What void are people trying to fill with transhumanism?

A: Understanding where things are headed helps you make the best decisions for the best quality of life. There's also beauty in the evolution of life and ecosystems on Earth.

Q: I actually sort of worry that a lot of people are out of touch with the genuine beauty of life

A: People can have more than one interest. We can appreciate the beauty of a tree or a rose, as well as the beauty of a large language model or a deep neural network learning to drive. In reality, it's all one interconnected system.

Q: If you increase your IQ to 10,000, what makes you think that will make it easier to ignore a fundamental meaninglessness?

A: There is no "fundamental meaninglessness". Meaning is significance assigned inside of minds to conceptual representations of external phenomena.

1

Cult_of_Chad t1_j4jwlvz wrote

I want to grow strong. I want to survive. I want to have many children and to see them spread across every corner of the universe they can reach before we're all snuffed out by entropy or chance. Basically what any living organism wants.

1

[deleted] OP t1_j4k49p8 wrote

Precisely.

And what makes you think it'll be you, if that's the moral system that we're operating on?

1

Cult_of_Chad t1_j4k4tfq wrote

There's space for all of us. We might need to eat each other before the very end, but we're barely into the stellar age. I want to see as much of it as I can.

1

[deleted] OP t1_j4k5g2c wrote

> There's space for all of us.

Of course there is, but Genghis Khan didn't do what he did for a lack of space.

1

Cult_of_Chad t1_j4k6bi2 wrote

>Genghis Khan didn't do what he did for a lack of space.

Don't pretend to know his motivations. Take a look at 'great' people today, the ones that get written about. Is Putin Genghis Khan? He's futilely fighting for the future of his crumbling empire and dying-out people, yet he's a monster. And a bit of a pathetic clown.

Also, that's where the "grow strong" part comes in. We must compete or die in the ultimate, technical sense. But less scarcity has translated into less violent competition so far. Let's push that further and see how far it scales first, before giving up on all growth and life. (Not that I would anyway; it's axiomatic to my identity.)

1

[deleted] OP t1_j4fa6oa wrote

[deleted]

0

[deleted] OP t1_j4fbcdd wrote

> What joy do you get out of telling people they're wrong about their beliefs?

Feels more like caution: these particular beliefs might pose some existential risks to society, given the asymmetric nature of distributions of things like power, intelligence, and resources, and the consequences of that asymmetry.

But also the general concern that people are trying to fill a void that these systems won't fill, and that many people will do damage that no ASI can change in terms of lost memories with people.

0