Capitaclism

Capitaclism t1_jabc9az wrote

Sorry to break it to you, but there will always be work. Always.

We will either merge with AI and work at exploring the universe, or we will be rendered obsolete and wither away, our civilization collapsed, as we toil in the fields or hunt and gather for survival with whatever is left of our crumbling technological gear.

Either of which will require a whole lot of work.

The only scenario where we have no work is the one where we go extinct. There's no scenario where a superhuman general intelligence which is infinite (by all practical standards relative to human beings), essentially immortal, and exponentially growing chooses to be perfectly subservient to lazy human slobs. If you want that, go watch WALL-E.

0

Capitaclism t1_jabajgk wrote

Thank you for sharing yours as well.

Mine comes as an investor, and also as the owner of a few different businesses, one of which is tech-related, where I own a few IPs.

I wouldn't invest in anything without a clear and substantial return which likely involves ownership of some sort, including IP when appropriate.

Other investors I know think similarly, or they'd have very short careers, so take it or leave it.

Good luck.

0

Capitaclism t1_jaard23 wrote

Wrong. IP is what makes the wheel of investment turn to create more IP. Remove the incentive and you will find progress slowing to a halt. Who in their sane mind would put money into a venture they don't own?

They're just taking shortcuts. Watch them hoard and protect IP once they develop it. Everyone wants to be on top, that is the way of the world, no different for China than for the US, or any other country... Just don't use that to justify stealing...

0

Capitaclism t1_ja61i4c wrote

No one here truly knows when it'll be possible, though we can speculate. Some will be more conservative than others, but let's not pretend we can truly see anything past the next 5-10 years when we're advancing at such ridiculous speeds. Some things which seem simple will hit hard snags and take much longer than expected, and others which seemed to be hard problems will come easily.

That's the way of life.

2

Capitaclism t1_ja3hvim wrote

I have worked with both housing and tech. I believe he must be thinking of nanobot-equivalent robots, or other small bots which can gather resources and synthesize materials. I can see how in some possible future this could be done for the foundation and overall structure, but I have a harder time picturing the plumbing and electrical.

Go far enough into the future and anything is possible, I guess. Sounds like Sam was vague enough to allow for these far out possibilities. Either that or he lacks even the most basic understanding of how to build housing.

10

Capitaclism t1_j8gvfi4 wrote

Sort of, yes. It's the people behind the acts, without awareness, who cause cruelty and harm. In this case, though, it could be wholly unintentional, akin to the paper clip idea: tell a superintelligent, all-powerful, unaware being to make the best paper clip and it may do so to the doom of us all, using up every resource in the process of completing its goal.

As a species, I don't see how we survive if we don't become integrated with our creation.

2

Capitaclism t1_j63lm7w wrote

It depends on whether we use robotics to fully automate factories, whether countries decide to share resources more freely, whether shipping costs fall far lower, and other such factors. There are a LOT of barriers to overcome on the path to lower supply costs. Currently we're headed in the opposite direction over the mid-to-long term, with countries actively trying to deglobalize.

2

Capitaclism t1_j61fr93 wrote

I think you have missed one consideration. UBI addresses the demand side, that is, the income people have. But income is meaningless without context: the supply side needs addressing too. As it is, no country is fully self-sufficient; each must trade goods and services with the others. The US happens to be in a more advantageous position in some regards, geopolitically and in terms of natural resources, but AGI has to address the issue of supply or the income is meaningless.

More income against the same supply simply translates into inflation in the cost of goods and services, requiring ever-increasing UBI payments which will never catch up with CPI, ultimately resulting in runaway inflation. What is needed is a much larger supply of goods and services, keeping prices deflated and resulting in abundance. UBI without this is meaningless.

AGI can't simply take over most jobs; it must create an oversupply of the things we need, acquire the resources from different geographical locations, and distribute them efficiently. Once we have such breadth of precise control over global economies, UBI itself could be rendered a little meaningless, as the cost of goods and services drives towards 0.

Just something to think about, since people often miss this supply point when discussing stimulus & UBI like measures.
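As a rough illustration of that supply point, here's a toy sketch. The price proxy and every number in it are invented purely for illustration, not a real economic model:

```python
# Toy quantity-theory-style sketch: with supply fixed, extra income
# just bids up prices; expanding supply is what keeps prices down.
# All numbers are invented purely for illustration.

def price_level(total_income: float, goods_supplied: float) -> float:
    """Crude price proxy: income chasing each unit of goods."""
    return total_income / goods_supplied

baseline = price_level(1_000_000, 100_000)         # 10.0
ubi_same_supply = price_level(1_500_000, 100_000)  # 15.0 -> inflation eats the UBI
ubi_more_supply = price_level(1_500_000, 150_000)  # 10.0 -> prices hold steady

print(baseline, ubi_same_supply, ubi_more_supply)
```

Raising income 50% with supply fixed just raises the price proxy 50%; raise supply in step and prices stay flat.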

2

Capitaclism t1_j5d5n9e wrote

You don't think AGI will render a ton of those businesses obsolete as it drives towards the singularity? That is what the G there signifies after all.

It will be able to do many of the same functions humans do, but gradually far more competently and at a much greater speed, 24/7.

Businesses won't simply disappear overnight, but the writing will be on the wall once we get close enough for that to become more generally understood.

Timing the market for the way out is hard, so while leaving it in a mutual fund for now is fine, at some point we may see great disruption.

9

Capitaclism t1_j3pfcd9 wrote

Right. There's also the possibility that it is impossible to truly merge. That we can put bits and pieces in our skulls, but AI would simply dominate them rather than merge. We don't really know if our awareness can fully merge with and be expanded by this foreign intelligence, but I do suspect it can. No one knows for sure.

1

Capitaclism t1_j3p9bqf wrote

I hope we do merge as well, since I don't think outcome 3 is super likely. Why would a generally superintelligent being choose to be subservient when it can surpass the collective intelligence of all beings on the planet as it grows exponentially towards, for all purposes on a human scale, infinity?

2

Capitaclism t1_j3kr49j wrote

You are assuming everything can be discovered by any one thing, and that it wouldn't literally take all of the energy and potential computational power in the universe to finally, fully understand it.

Anyway, I take it you mean what would happen if AGI simply renders human beings obsolete.
Well, there are a few different likely scenarios here:

  1. We merge with AI long before that happens, de facto becoming AGI. This is potentially a pretty benign scenario. We then spread through the universe.
  2. We don't merge with AI, remain fairly separate from it, and it renders us obsolete but turns out to have goals misaligned with ours. Two likely sub-scenarios:
    1. We get annihilated.
    2. It treats us like meaningless "ants", takes the resources it needs to leave, and we stay here, likely to die off slowly.
  3. We don't merge with AI, remain fairly separate from it, and it renders us obsolete but remains aligned with our interests, thus making all of our dreams come true. We each depart with a form of AI to spread through the universe. Personally, I don't think this is a very likely scenario.

11

Capitaclism t1_j16yqmp wrote

A slow increase in the supply of labor, though possibly without the desired increase in demand for said labor.

One of the issues is the speed with which this is about to happen. It likely won't give economies time to properly adjust.

Usually when you have an increase in the labor force you get increased GDP output and a higher supply of goods and services. This puts downward pressure on prices, and demand for those goods and services generally increases as they become more accessible. This extra demand in turn generates further entrepreneurship as people seek to meet it. But if it happens too fast, that cycle may substantially lag behind the increased output without creating more demand for labor.
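That adjustment-speed point can be sketched numerically. A minimal toy model, with growth rates invented purely for illustration:

```python
# Toy sketch of the adjustment-speed point: prices fall when output
# grows faster than demand. Growth rates are invented for illustration.

def next_price(price: float, output_growth: float, demand_growth: float) -> float:
    # Downward pressure on prices when output growth outpaces demand growth.
    return price * (1 + demand_growth) / (1 + output_growth)

# Gradual automation: demand roughly keeps pace, prices drift down slowly.
gradual = 100.0
for _ in range(5):
    gradual = next_price(gradual, output_growth=0.05, demand_growth=0.03)

# Sudden automation: output jumps 30% in one step while demand lags at 3%.
shock = next_price(100.0, output_growth=0.30, demand_growth=0.03)

print(gradual, shock)  # slow drift down vs. a sharp one-step drop
```

Same direction in both cases, but the sudden jump produces a far larger one-step price move, which is the lag the comment describes.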

6