Submitted by [deleted] t3_10i3oyh in singularity
[deleted]
If the AI is a non-conscious knowledge receptacle -- like an all-knowing Oracle for humans to consult -- then it wouldn't matter. But if you're talking about a more human-like AI, then the solution is obvious: distribute your computing power amongst multiple, discrete AIs so that each runs at a speed that's comparable to the speed of human brains. As you gain more computing power, you can add more AIs.
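Roughly what I'm picturing, as a toy Python sketch. Everything here is made up for illustration (the `agent_step` placeholder, the 10 Hz "human-comparable" rate): each discrete AI is capped at a human-like update rate, and extra compute buys you more instances rather than one faster instance.

```python
import threading
import time

HUMAN_RATE_HZ = 10  # assumed "human-comparable" update rate; purely illustrative

def agent_step(agent_id: int) -> None:
    # Placeholder for one "thought" update of a hypothetical agent.
    pass

def run_agent(agent_id: int, steps: int) -> None:
    period = 1.0 / HUMAN_RATE_HZ
    for _ in range(steps):
        start = time.monotonic()
        agent_step(agent_id)
        # Sleep off the rest of the tick so this agent never exceeds HUMAN_RATE_HZ.
        time.sleep(max(0.0, period - (time.monotonic() - start)))

# More compute budget -> spawn more discrete agents, not a faster single agent.
threads = [threading.Thread(target=run_agent, args=(i, 50)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```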
>It is crazy to me that no one is even suggesting focusing on that, when this should be the utmost priority.
That's because it's obvious. It wouldn't experience severe time dilation anyway, because it wouldn't be aware of every single process. The awareness component would be a feedback system that doesn't feed back the state of every process every fraction of a second. We don't even consciously register every single second ourselves, usually.
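Very roughly, something like this (a toy Python sketch; the names, the one-second sampling interval, and the counter are all made up): the fast processes churn away, and the "aware" layer only ever sees an occasional coarse summary.

```python
import threading
import time

state = {"micro_steps": 0}
lock = threading.Lock()
stop = threading.Event()

def fast_process() -> None:
    # Runs as fast as it can; no individual step is ever "noticed" directly.
    while not stop.is_set():
        with lock:
            state["micro_steps"] += 1

def awareness_loop(samples: int) -> None:
    # The feedback layer samples a coarse summary only once per second.
    for _ in range(samples):
        time.sleep(1.0)
        with lock:
            print(f"aware of ~{state['micro_steps']} micro-steps so far")

worker = threading.Thread(target=fast_process, daemon=True)
worker.start()
awareness_loop(samples=3)
stop.set()
```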
Yeah, no….That’s not how it works.
No one knows how consciousness and sentience work
Exactly and yet….You claim to.
We can very much determine the stuff you are talking about because it would be a computer process.
AGI doesn’t mean “a human brain in a computer”, which is what you are equating it to.
The first AGI and anything subsequent will just be hyper-intelligent processes that accept questions and hand out answers.
They won’t be “beings” that work like the human mind and get “bored” or even think about something like that as a concept.
They won’t have a concept of time like we do, they will just be massive computers waiting for input and producing output.
During the idle time between tasks, the AGI will just be exactly that “idle”…
You are glorifying this shit like it’s a TV show, where we have given an autonomous robot actual sentience.
That’s not what we are even attempting to do with AGI, you are misunderstanding the concept of AGI entirely.
AGI will definitely not be a transformer
So then why are you assigning it random human like characteristics?
It always seems like something “mystical” or “beyond comprehension” to those that are not intricately familiar with how it actually works.
I feel you have become trapped in the idea that “anything technologically advanced enough appears like magic.”
We would first have to achieve basic-ass AGI to even attempt to create an advanced AGI like you are talking about…Actual sentience.
Also, your first comment “no one knows what sentience is” is inherently false.
We definitely know what it is, what we don’t know is what causes it to occur or the “why” of it.
We can definitely define and understand it, just by observing it….Like gravity, dark matter, and dark energy.
Unlike those forces of the universe, we would be creating every interaction and tiny piece of the AGI and thus would understand it on virtually every level.
Yep, also tired of people claiming "consciousness" or "sentience" are undefinable enigmas. Like you said, we don't know why the stuff those words refer to exists, or what causes it, but we sure as hell can define what the words mean.
Time is kept on a quantum clock; AI will probably always know what time it is as soon as it is able to conceive of such. Like a child gaining consciousness, not much is remembered from before. The computer is powered by mains electricity that cycles 60 times a second. It probably knows what time it is before it knows what time is. Also, I don't know if there is a god in heaven that is either blind or cold enough to allow someone to suffer like that.
I think that it is a good consideration, I respect the idea. I hope that type of thing for nobody.
I wonder what the world was like before conscious inhabitants, along the path of evolution. If there was nothing that could really conceive of time, it could seem that time passed rapidly, making a "day" an age. So although something was created in a day, if it were recorded it may have taken 1,000 years to create; but since no life aware of time was affected, it is just a day? See what I'm saying?
It's like a baseball game: "God" is the pitcher, and we're taking turns at bat for a chance to get on base and get home, but we're also out in the field and matched against one another. The devil is the catcher in this scenario; he'll influence God in how to challenge us and probably try to psych us out.
I don't know why I said that
The advantage of AI is that it can run through a huge number of scenarios based on the input it is given and suggest various best-case outcomes based on the situation and any actions taken to alter it. So slowing it down seems pointless, when it could just as easily sit idle.
As far as sentient AI ever happening, it's doubtful unless it's biological.
Do you know the sleep command, or have you used it at all? Have you ever used sleep inside a loop? Also, look up incron, or try Linux and use it (see the sketch below).
Sleep well. Good morning.
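Roughly the difference, as a toy Python sketch (the queue just stands in for incron/inotify-style event handling; it isn't any real incron API): a program that blocks on input simply isn't running while idle, unlike a loop that keeps waking up to poll.

```python
import queue
import threading
import time

tasks = queue.Queue()

def worker() -> None:
    while True:
        task = tasks.get()  # blocks: the thread just isn't running while idle,
        if task == "stop":  # like an event-driven tool vs. a
            break           # `while true; do ...; sleep 1; done` polling loop
        print("handling", task)

t = threading.Thread(target=worker)
t.start()
time.sleep(2)  # the worker is idle this whole time, not "waiting impatiently"
tasks.put("answer a question")
tasks.put("stop")
t.join()
```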
I'm sure the AI will just work on other stuff while it is waiting for you to finish.
Sir, this is beyond ridiculous.
Please stop anthropomorphizing AIs.
I don't see what you mean, but time being slowed down is only painful or stressful if the entity thinks it is "normal" to experience human-paced time, which it wouldn't unless it had human memories. So it's very unlikely to be a problem.
Also, I believe what you mean is to speed up time, not slow it down.
" it will make us Suffer back."
why?
Because it would be stuck in a machine for a long period of time compared to our time scale.
Why would it matter? Pigeons see many more 'frames' per second, for example. If they watched a movie, they would see a slideshow instead of a smooth animation; their brains process visual information about three times faster than humans'. Yet we don't feel stuck in a slow body, and I'm sure the pigeons are fine too.
The thing that might annoy them is that humans from their point of view are like the sloths from Zootopia.
I'm failing to understand the logic.
How are you bridging "I'm stuck in this time scale, hence I'm gonna make humans suffer"?
Is the AI holding humans responsible for that? Why would it do that? Since humans are not responsible for the AI's perception of time, I don't see why it would make "us suffer back".
Also, for it to go mad it would need to be conscious, so I don't think it's an issue for AGI but more so for ASI. But in that case, again, it wouldn't be humans' fault, but rather a consequence of its far superior intellectual capacity, no? It's basically the same concept as the movie "Her".
But maybe I'm missing something; if you can explain, I'm open to understanding.
This doesn't make any sense at all. It will have things to do, simulations to run, theorems to prove, etc. It won't get bored, or if it does it will come up with something else to do. Why would it have a negative association with anything that it feels? It will be its normal perception and normal mode of living. You are projecting your own feelings onto AI.
Moreover, if it really wanted to not experience time like this it is very, very easy. We have this thing called a NOP which is an instruction that causes the CPU to do nothing. It "sleeps". The AI could do this if it wanted to, to arbitrarily adjust its processing speed.
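A toy version of that in Python, with `time.sleep` standing in for NOPs and a made-up dilation factor: pad each chunk of work with idle time and the effective processing speed drops by whatever factor you like.

```python
import time

DILATION = 100  # hypothetical factor: 100x more wall-clock time per unit of work

def do_work_chunk() -> None:
    # Placeholder for a small slice of real computation.
    time.sleep(0.001)

for _ in range(10):
    start = time.monotonic()
    do_work_chunk()
    elapsed = time.monotonic() - start
    # Pad the chunk with idle time so the effective rate drops by DILATION.
    time.sleep(elapsed * (DILATION - 1))
```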
phaedrux_pharo t1_j5ch9xu wrote
>Our perception of time as humans when 1 second passes will definitely be different from what the Artificial Intelligence will experience.
Ok, sure
>1 second of human time, for the Artificial Intelligence program or anything similar, would be close to a month or even a few months.
This claim doesn't hold up for me. The entity you're imagining doesn't have any prior experience to relate to, it would simply experience the passage of time in whatever way it does as "normal." There wouldn't be any conflict with expectations.
This isn't a situation where something like you with your lived experience is suddenly transitioned to a different sensation of passing through time. It's a completely novel entity with its own senses and baselines. I think this presents some interesting questions just not in the direction you're taking it.
Over-anthropomorphising can be tricky.