Future_Believer

Future_Believer t1_j3s0sgh wrote

I am a people and I don't love these polls. However, this one did generate a query within me that I am willing to externalize and listen to the thoughts of others.

I am not currently a programmer nor have I done any actual programming in a couple or three decades. I am not in an industry that stands to benefit or suffer from AGI's realization. I do pay attention to tech and science trends but only as a casual observer.

So why should anyone care what timeframe I believe is the right one for the onset of actual AGI? Is there an action that will be taken should we reach consensus? What will change if the poll goes all one way or the other?

2

Future_Believer t1_j25dtxw wrote

I think you are taking an unfortunate shorthand statement as a detailed analysis. It ain't.

I don't think there will soon be guards outside of factories fending off the hordes of humans wanting something to do. What I think will happen is that capitalism, as implemented in the current idiom, will cease to be a desirable influence. The book "Bullshit Jobs" by David Graeber should help explain my position. For those who just have to go to a place and do a thing, that will remain an option. But no one will indulge them in the fantasy that human society or the environment actually needs them to go to a place and do a thing.

As a general rule, all of evolution is reactionary. Something changed, whether anything wanted it to or not, and the evolution is the response to that change. We will evolve away from jobs because shit will change - whether we want that change or not.

1

Future_Believer t1_izycsea wrote

I suspect I am significantly older than you. Allow me to clue you in to the fact that as a general rule, societal progress is not dependent on whether a given individual wants that progress.

You haven't said what sort of scientist you would like to be. But regardless of what field you are interested in, it seems horribly self-centered and callous to demand that those who suffer waiting for a particular advancement of science should have to wait for you rather than accept relief years or decades earlier by letting a Manufactured Intelligence address the issue. That is the likely scenario in biology, chemistry, materials, agriculture and probably all the rest of them.

I have done strenuous physical labor and, in the midst of it, I have known those who had to work twice as hard as I did, or more. As long as it does not hurt his family, the guy in the 95-degree sun with a shovel in his hands doing back-breaking work would be happy to let a machine take over. I bet that would be true even if it were you. Your idea of work likely involves HVAC and other comforts. That is in fact the case for a minority of the people on the planet. Perhaps you should spend some time with them (us).

3

Future_Believer t1_izq8lv3 wrote

A good friend of mine who is no longer with us but who at one time managed the best performing mutual fund in the USA was fond of defining capitalism as the (re)distribution of scarce resources.

I don't know if he came up with that himself or if it is a standard definition but I have heard others say very similar things.

Nanotechnology has long been understood as the death of material scarcity (or the death of humanity if implemented badly, but that is a discussion for another thread). Robotics will eventually remove all necessity for human labor. I would imagine some humans will continue to work, but there will be no necessity for them to do so. Manufactured Intelligences (or AI if you must) will exceed human capacity and will be responsible for engineering and research & development. Mining and smelting and various other dirty industries will likely be moved off planet as soon as that is feasible.

Where exactly do you see room for capitalism? There can be no Marxism without a labor force to protect. All of our current systems of economics will be found wanting if the things we discuss come to fruition in any meaningful way. It isn't that folks here are opposed to capitalism (though they could be), it is that there is no way to arrange the Vinge, Kurzweil, Stephenson, Asimov, Roddenberry future so that it includes capitalism.

1

Future_Believer t1_iyberxj wrote

I don't think you have a significant lack of understanding. If you lack anything it is the ability to consider those things in a different context, i.e. the context of the probable future.

Of course, there is a very good chance that the future will be something other than what I or Roddenberry or Kurzweil envision. If the Manufactured Intelligence my prognostications are predicated upon fails to be brought into existence, I will be laughably wrong. OTOH, if a couple of things happen more aggressively than my intentionally conservative guesstimates, I will be laughably wrong in the other direction.

I did not predict the advent and uptake of mobile phones/computers. The world wide web was a bit of a surprise to me. Hell, I thought few people needed or would spring for a color monitor. I'm guessing. I hope it turns out to be an educated guess, but I can't advise betting your lunch money on what I think is coming.

1

Future_Believer t1_iy6yh94 wrote

First off, there is not actually any thing you can point to and say, "son, that there is the gub'mint". There are, of course, around the world, millions of humans who participate in whatever scheme of governance the locals will accept. As such, your opening TL;DR question could be quite reasonably asked of people. I know it sounds like I am just being pedantic, but it is relevant to the way one thinks about these things. It is people who will have to adapt to the development and deployment of Manufactured Intelligence.

Unemployment is coming for everyone. IMNSHO that is a good thing. However, humanity has a long history of increasing the population as a source of workers. Once a given culture surpasses the need for additional workers, it usually results in some manipulative, power-hungry ass figuring out a way to make money from keeping the poor people having children.

The economy is what those manipulative, power-hungry asses are trying to control. But the truth is, once robots and computers are doing all of the energy generation, energy distribution, research, farming, manufacturing, etc., there is no NEED for an economy. There will be no good reason for anything to cost anything. It may take a while, but eventually this will be obvious to anyone willing to think about it.

The societal issues you mention suggest to me that you are not thinking about the Intelligence in charge as being intelligent. Also, theft becomes ever more rare as we get closer to the economy I suggest we are inevitably heading for (unless we destroy humanity first, but that would eliminate theft as well).

Right now we need to figure out a way to describe the coming future in such a way that is not only not scary to the hoi polloi but is actually desirable. Unfortunately, I have no real ideas to offer on that.

2

Future_Believer t1_iuzkpg2 wrote

Excellent question.

Looking at the current situation doesn't add a lot of clarity but it is worth trying anyway. Most of the non-tech "elites" that I observe don't actually seem to be aware of the exponential advance of technology. They act on individual developments when they think they can make money on them but for the most part, they plan and act as if the way things are is the way things will always be.

The tech "elites" appear to be distracted by other things. They want to go to Mars or cure disease or develop the next big social media site. We don't actually know what it will take to get the attention of either group.

Depending on what gets developed when, there is a solid chance that money will become largely obsolete. Humans will be able to have anything they could possibly want whenever they want it. Work will be done by robots. I have no idea how elites might maintain that status. I know they will want to, but what will the medium of exchange be?

Politicians and/or would-be kingmakers will try to pass laws that keep them in power, but about the time a Manufactured Intelligence is certified as a judge, that sort of behavior might get more difficult.

As an aside, I have a difficult time believing that it could be as long as 5 years between the time AGI is obvious/imminent and the time it is pervasive. This is important because corporations are a bit like aircraft carriers in that you cannot turn them on a dime. It will not be possible, even with hundreds of billions of dollars at hand, to restructure a Berkshire Hathaway, a Goldman Sachs, a Meta or Alphabet, a Booz Allen, to take extreme advantage of whatever comes up.

3

Future_Believer t1_istpgtl wrote

Not to be argumentative for the sake of being argumentative, but my mere inability to state specifically where I saw or experienced something, or even just the seed of something, doesn't mean I absolutely never saw or experienced it.

Let's say you are hiking in the wilds and you come upon an actual version of the old movie trope - a human child raised by wolves with no other human contact from early in its infancy. As an experiment, you ask that child, and 100 others of the same age who had been raised in any of the global cultures with access to the internet and movies, to imagine something that, in theory, none of them had ever seen or experienced. I would expect the wolf-child to present significantly different answers than the more traditionally raised children. I would expect there to be some level of similarity - however faint - amongst the answers from the traditionally raised children.

It sounds to me like you are saying that my expectations would not be met. That all of the children would come up with equally irrelevant and inexplicable concepts. If so, that would change my thinking. OTOH, if there was an element of similarity, however slight, among the traditionally raised children but not the wolf-child, would that not suggest at least a common seed of an experience or exposure?

I don't think imagination lives in a vacuum. The connections may be tenuous but I suspect there are some there. I have no idea how one might practicably test my theory.

2

Future_Believer t1_isrpxub wrote

I can imagine plenty but it is all rooted in or based on things that I have experienced in some way. I can't imagine certain aspects of quantum physics - especially not well enough to draw them. I can't imagine jellyfish respiration. I can't imagine the chemical structure of Brazil nuts.

I have plenty of imagination. I have written in the past about imagination and I find it interesting that a Manufactured Intelligence is currently able to pretend to have an imagination that well.

Unless I have missed your point, there is nothing to be sad about.

2

Future_Believer t1_ispgqs6 wrote

Interesting. If you asked me to imagine something, I would struggle to visualize something that has never (to the best of my knowledge) existed. I would draw on my several decades of reading, traveling, watching videos, conversing with others, and dreams to complete the task. What I came up with might be an amalgamation of things experienced, or might be just one where the image stuck with me but the source did not. The fact that the Manufactured Intelligence was able to do all of that on demand is not a disqualifier in my view.

For now it looks to be of fairly limited utility, but that limitation may well be mine (or yours). I suppose the programming language for MIs in the near future will consist of nothing more than asking the right question.

1

Future_Believer t1_isgfcow wrote

You may be a bit too narrowly focused on one specific manner of use for the tech. AR contact lenses would have been great back when I was a Field Service Engineer - as long as I was working in hospitals. But I also had machines on the Y-12 reservation in Oak Ridge, TN. There I would have essentially had to remove the contacts or face a court or three.

Instead of contact lenses, I would probably opt for an ultra-broadband data connector that could accept a variety of sensor inputs and allow one to visualize relevant information without additional prosthetics. Such sensors could be disconnected and the connector blocked if physical security demands it.

There are actually millions of citizens who have access to sensitive classified information (not to mention proprietary industrial information), so the need to address this situation will come up sooner rather than later. By the time this tech filters down to the consumer (I'm assuming it will begin as a corporate tool), there will likely be a fix in place already.

1

Future_Believer t1_irptey8 wrote

As any good lawyer would tell you, the answer lies (at least in part) in the definition of your terms.

I always say that I want to be the entity in charge of an interstellar spaceship. Whether it is purely my consciousness that is uploaded or my physical brain that is wired into an ultra-mega computer, the idea is that my body is replaced with an incredibly complex and capable spaceship.

At that stage of the game, what would a romantic relationship look like? With no genitals and no endocrine system, the fact that "I" had a near-infinite number of sensors feeding me information from all of the vessel's subsystems would be of interest but would not make romantic love in my current human idiom feasible.

We prejudice ourselves with the misnomer "Artificial Intelligence". I generally refer to it as "Manufactured Intelligence" because the actual intelligence will be quite real. I cannot say whether such an intelligence would be capable of loving me, but I see no reason why I would be incapable of developing affection for that entity.

My dream of being a spaceship is not the only dream. As far as we know every planet, dwarf planet, moon and asteroid is unique. They will all "need" to be explored if the goal is expanding human knowledge. However, the frailty of our current bodies is a limiting factor. We could quite reasonably have billions of humans in more durable form exploring throughout the galaxy but we could also have just a few and all the other explorers might be run by MIs. We might not even think to ask if a given Intelligence is "human". I suppose we also might evolve away from a need for love once we have no physical reproductive need.

At some point we will have to stop trying to evaluate an advanced world through the lens of cavemen with technology. At some point we will have to acknowledge that things will change drastically enough that analyses based on the current human physical idiom will be of no use whatsoever.

22

Future_Believer t1_irglxwd wrote

Why?

Seriously, you are asking for an opinion but not for qualifications or reasoning or educational level or educational focus or applicable hobbies. What will you do with the answers?

I consider the promulgation of usable information to be a reasonable thing and, quite possibly, a positive thing. OTOH, the sharing of opinions without regard to foundation is, IMNSHO, unlikely to produce any positives.

I do not think you should base your planning for the future on my opinions. You may wish to consider Michio or Ray or some of the others whose life focus has been studies of the probable future but, I fail to see the utility of a collection of random internet opinions.

However, I am willing to listen.

−7

Future_Believer t1_iqodt02 wrote

Like so many folks on these subs, you are postulating a single isolated change and contemplating its effect in the current technological and sociological idiom.

That is not how this will work.

As changes have come in the tech world, large changes have also come to human society and human economies. A lot of effort has been put into keeping things as they are except for the isolated tech advancement - basically an effort to make your question prophetic. Such efforts may slow the changes in other sectors for a while but ultimately they will fail.

Until you can begin to consider the entirety of the system the technology exists in and how its inherent dynamism might be affected by any given change, you will be unlikely to correctly prognosticate. We do not exist in a vacuum. Whether you see it or not, stuff is connected. If you think about it that way you can probably answer your own question.

7