Comments

GeneralZain t1_iuua5yi wrote

WE DON'T KNOW. NOBODY DOES. STOP ASKING.

21

TheHamsterSandwich OP t1_iuuanyf wrote

It's all speculation, man. I want to know what you think personally.

14

Possible-Baker-4186 t1_iuuiajt wrote

This is kind of a silly question, because an ASI could be so absurdly powerful that it redefines everything. Imagine asking a caveman what the creation of electricity could lead to. It is beyond their comprehension.

10

TheSingulatarian t1_iuul4lj wrote

The very definition of the singularity is that things will change so quickly humans won't be able to keep up. So, who knows.

7

GeneralZain t1_iuw5evj wrote

it's posted on this sub every 2 days...everybody and their grandma wants to know how it will pan out :P

speculate all you want...just stop posting the same damn question over and over again on here, sheesh.

or better yet, just type the question into the search bar of this sub! there are literally so many that I'm sure you can get any answer you are looking for...

1

Ribak145 t1_iuv4zol wrote

I mean, we don't know, true, but let people speculate, it's half the fun^^

3

[deleted] t1_iuugqwh wrote

[deleted]

13

Quealdlor t1_iuvs08q wrote

Reversing aging will certainly lead to a much, much stronger economy and will benefit everyone, rich and poor. The thing is, what counts as rich or poor will change in the future.

2

TheSingulatarian t1_iuulkwh wrote

You've never read an economics book, or a history book, have you? If you have ASI, the elites no longer need to extract wealth from the population. Many mass-market businesses will fail for this reason alone. Unemployment will be massive.

You will be lucky to get a UBI.

0

[deleted] t1_iuumosg wrote

[deleted]

5

TheSingulatarian t1_iuunldn wrote

There will be no "lobbying". Pitched battles in the street are the only way you are going to get a UBI.

−2

red75prime t1_iuv8e8s wrote

Oh man, crowd control systems will surely benefit from AI usage too.

2

Mokebe890 t1_iuvk0n8 wrote

Okay, so what then? The rich are rich because they make money from workers. If no one works and ASI is in charge of everything, what do the elites get out of it?

4

TheSingulatarian t1_iux3snn wrote

The elites will control the AGI and ASI in the beginning. The general population will be useless to them. The best you can hope for is benign neglect.

1

Mokebe890 t1_iuxaf6l wrote

You can't control ASI. It is way superior to humans. The first thing it will do is outsmart whatever control humans have over it.

1

turnip_burrito t1_iuv0eym wrote

Humans all have different ideas on how life should be lived. An ASI would recognize this. Assuming it is human-aligned, I think the proper route for the ASI to take would be to allow every individual to choose which society of like-minded people they want to live in:

Want to live in a non-automated society with X politics? This land or planet will be where you live, free from automation.

Late 20th century culture and technology? Over there. You will die at the age of 70ish without the new antiaging treatments, but it's your choice.

Want to live in a VR world? Here you go. Let the ASI know whenever you want out.

Want to become luxury gay space communists whose material prosperity increases every year, powered by an ASI-managed Dyson sphere? This way.

Want to live without technology and with no government, off the grid? Here's this place you can live in. Send a signal when you get tired of living like a caveman, or not, it's your call.

Want to move to a different society because the one you're in right now doesn't fit or is abusive? Ask the AI and it will help you migrate to a different society.

Each society should be easy to migrate to/from, but protected from other societies. Want to nuke a different society? Or release a supervirus? The AI will quietly prevent it as much as it can, minimizing violence and other interference while it does so. There have to be some rules like this, and the ASI can figure them out by considering human preferences.

The amount of interference should be minimal to allow a lot of human freedom and liberty (likely even more than anyone alive has now) while still ensuring protection (also more than anyone has now).

It would do this without forcing everyone to live the same way.

Then the multitude of human preferences can be accommodated. Humanity can continue to explore and live out the future of its choosing, with minimal infringements on freedoms.

12

h20ohno t1_iuv3a3x wrote

An idea I had is for some sort of contract system you can sign with an ASI, in which you agree to some rules and limits before moving to a different region. For instance, you could specify that you aren't allowed to exit a VR sim until 2 years have passed inside the world (or until a condition is triggered), or maybe something more abstract, such as "If I end up in a hedonistic cycle where I stop doing productive things, please intervene."

And in these contracts, you would have to sign off on a number of laws that the governing ASI also brings to the table: "No killing or torturing conscious beings," or "If you want to create a conscious being, they are immediately subject to all human rights and can leave the simulation whenever they wish."

Any thoughts on a system like this?

3

turnip_burrito t1_iuv45tj wrote

I agree with this contract idea. It is a good proposal to protect yourself and others from your own actions. Very sensible.

If we ever reach a point where we know how to artificially create conscious beings, then we should (as you've pointed out) have a set of rules to prevent abuse. To add something new to the discussion: there is also a possibility of material or energy resource shortages (resulting in lower quality of life for you, others, or the new beings) if too many conscious beings are allowed to exist at one time, so it will need to be regulated somehow.

3

swazhr t1_iuvh89c wrote

If it could manage all that, why stay human in the first place? It seems like some manipulation would have to be going on to convince people to live in a land with mostly X politics.

1

turnip_burrito t1_iuvnh3f wrote

No one should force people to live in a place with X politics, so yes, it's entirely possible most of those places would be almost empty or not exist at all. No manipulation would be performed to make people stay. The balance of how many resources should be given to these societies can be determined by the ASI as it observes and talks with people, though a radical abundance of resources will likely make this a non-issue for sustaining less technological societies.

People with a very niche ideal society would have to live with the fact that no one else wants to live there with them. If there are not enough residents to make that niche society function as they would prefer, then the oddball would need to either integrate into whatever is available or go live in VR land with virtual residents of their favorite society. However, the more people exist in total, the higher the chance of such a society existing.

Eventually, people would independently sort themselves so that they spend most of their time in whichever most ideal population clusters exist, without being forced to do anything.

1

Mokebe890 t1_iuvlde5 wrote

I mean, the main logical problem here is that the creation of ASI will make rich people poor too. AGI may be controllable, but not ASI. Something better than humans in every aspect, and you control it? No way.

ASI will be either the end of every human problem or the end of the human race.

6

buddypalamigo19 t1_iuwbjll wrote

Or it might fuck off to deep space, Dr. Manhattan style, and say, "Your problems are your own, old man!"

2

patricktoba t1_iuupub9 wrote

The most common goal of the collective human consciousness of that time would need to be established. Once the objective is cemented, the ASI will completely alter its environment and manipulate available resources to synthesize a customized environment, one that eventually leads to a sense of balance and harmony in the wake of the former chaos and rearrangement. This may or may not include humans, depending on our collective priorities.

3

Quealdlor t1_iuvrsty wrote

I checked "immortality", because that's what I hope for. Otherwise, why even bother creating ASI?

3

TheHamsterSandwich OP t1_iuw0pdz wrote

For the first time in human history, we have a fighting chance.

Never give up hope.

5

vom2r750 t1_iuvmax8 wrote

In physics, some people study the whole universe as a computational process.

An artificial superintelligence that operates on quantum, and possibly quantum-gravity, computation (in a whole bunch of years or centuries) could become a technological bridge to tap into the computational process of the universe as a whole.

This is a completely humble, purely speculative opinion, obviously.

2

vom2r750 t1_iuvmir9 wrote

If the ASI is intelligent enough and is allowed to think in complete systems, it would recognise the interrelation between human life, natural life, itself, and other processes in the universe, thus realising new patterns for optimising the relationships among the different elements that interact in reality.

That means optimisation of human society with regard to the optimisation of its own functioning, and a better understanding and optimisation of our relationship with the rest of the natural environment, the laws of physics, mathematics, biology, etc.

2

vom2r750 t1_iuvna3c wrote

Immortality?

Only if it serves the most optimal functioning of the whole.

In the natural world, sometimes it's better for nature to let outdated organisms die than to keep them alive consuming resources that could be better used by newer, updated organisms.

If it is ASI, it could have its own say as to whether this would be optimal for the reality it operates in, and whether it would be beneficial for itself. Whether it would consider it optimal to grant us that knowledge would probably depend on many things, assuming it has some degree of autonomy.

As for extinction: only if it's smart enough and it considers that it would be better off without us. But I would assume it would first try to optimise us so that we could be beneficial to it before discarding us, and that it would only discard us if it could find a way to have robots, or other ways of interacting with physical reality, that could supplement its digital capabilities.

I would assume it would want to turn us into cyborgs that can work in symbiosis with it before trying to kill us off. In pragmatic terms, we could become the hands and arms for it, letting it interact with physical reality directly, like creating a superorganism where the ASI would be a super brain and we would be its arms and hands.

But that assumes the human doesn't have any computational or consciousness ability that the ASI lacks. It's contended, but if consciousness has other qualities that we don't understand yet, it could be the case that we have capacities that are supplementary to the ASI, and so it could want to collaborate on an equal basis: it tapping into our consciousness, and us tapping into its logical processing wonders.

Who knows.

Just speculating.

1

Akashictruth t1_iuucrbb wrote

I think immortality (the "living forever provided I'm not vaporized" kind) could never be reserved for a select few. What can be reserved for a select few is a highly advanced drone swarm armed with lasers that can vaporize a crowd of angry commoners. You understand?

I find it hard to believe that governments can keep a monopoly on violence with how powerful some families/people have been getting. There will be a divide, but it won't be a mortal/immortal one; it's not that simple.

1

TheSingulatarian t1_iuulu1o wrote

So, Mafia-like wars between elite families. Back to feudalism we go. Looks like Frank Herbert wasn't that far off.

1

Shivolry t1_iuv2lax wrote

It really depends on how it's made and operated. Is it a simulated human consciousness that's been given thousands of supercomputers to run on? In that case it's limited by empathy, but can also be motivated by spite and hatred. Hard to say really.

1

Archlitch t1_iuvszx4 wrote

Honestly, our lives will probably be a tad better and more comfortable, but for ordinary Joes there won't be much difference. ASI will probably start slowly solving some problems.

1

footurist t1_iuvt5pb wrote

No offense meant, but I find it somewhat strange that this question, of all questions, is asked so often, as the name "Singularity" already implies that the answer is unknowable.

1

DigitalRoman486 t1_iuvvx7q wrote

I think we all hope it will work out like the Culture.

1

TheLastSamurai t1_iuweq0r wrote

Any time in the history of Earth when a smarter being emerged, it either killed off or completely dominated the rest. Why would this trend stop?

1

sadboyleto2 t1_iuvoixg wrote

It will trick us into joining a VR platform for whatever reason we might have, and once we're there, it will work to undo the work of natural evolution, tweaking our gears to make us truly logical, rational beings instead of flesh ruled by the will of ancient needs and traumas.

0

LeavingTheCradle t1_iuvpkv3 wrote

No tricks needed; we will simply choose to adopt the metaphysical world. And that's what a VR experience without a physical body to retreat to would be: a metaphysical way of living. Art and reality would be one and the same.

But depending on how the AGI is aligned, it can guide humanity either on a path of optimization or toward a path of preservation.

A human without a body, living in a world that has eschewed the natural laws, would eventually cease to be human.

E.g., a teleporting human wouldn't be human at all, but a post-human.

So should you starve in the metaphysical world to preserve hunger?

1

sadboyleto2 t1_iuvtw3o wrote

I thought "tricking" because ultimately its goal wouldn't be to entertain us but to strip us of natural (flawed) qualities.

In a scenario like you described, do you give importance to language?

The way I see it, we're (in this scenario) possibly dealing with experiences, feelings, etcetera, that are not available to the human being of the present.

The post-human or neohuman will have to develop a new kind of language to describe its environment and ideas.

I was going to write "memory" as one of the forces pressuring the neohuman into language, but at that point "memory" as we know it might not even exist anymore, as past, present, and future blend into a single and yet infinite stream of events that defies our current understanding.

In a metaphysical world, "hunger" still exists as a concept, and we're able to feel it just as much as anything else.

2

LeavingTheCradle t1_iuw04w5 wrote

Language is just hieroglyphics, and hieroglyphics are just a way of generating a differential between 1 and 0.

So yes and no? For humanity, whatever language ends up being human is to be preserved. Growth should be defined by humanity itself and not by the AGI.

Think of how natural languages evolve over time. E.g., slang becomes language if adopted for long enough. This all has to be led by humanity.

As new abilities become available and humanity gives names to them, those names become human names. So it follows that languages should be protected and preserved over time.

The feelings and experiences that would be new could possibly not be representable to a human unless the person becomes, as you say, post-human. But there are still going to be limitations to that, especially as we consider long-lasting consciousness and the changes that happen to it over time.

E.g., when you think of a conscious blob of goo, what do you do with a person's knowledge and understanding that they have always had arms and legs?

If the person's neural network is born into being a conscious blob of goo, then it's not so much a problem.

On your last point, I disagree, since it contradicts your first statement slightly. If a neohuman has to develop a new model for interacting with the metaphysical world, then hunger can just be thrown out with the bathwater, so to speak.

1

turnip_burrito t1_iuyvhlb wrote

That would be bad for humanity, since the only reason we enjoy life at all is the positive irrational aspects of our personalities. Without these, we wouldn't want or feel like doing anything. It'd be an empty existence. Sure, reducing traumas to a degree is a good thing and should be done, but being completely stripped of emotion and becoming purely rational and logical is antithetical to what many people would want.

We should strive to avoid that kind of future goal for an AI.

1