Red-HawkEye t1_j5cw2n9 wrote
Reply to comment by ZaxLofful in It is important to slow down the perception of time for future sentient A.I, or it would become a living LOOP hell for itself by [deleted]
No one knows how consciousness and sentience work
ZaxLofful t1_j5cx9sk wrote
Exactly, and yet… you claim to.
We can very much determine the stuff you are talking about, because it would just be a computer process.
AGI doesn’t mean “a human brain in a computer”, which is what you are equating it to.
The first AGI, and anything subsequent, will just be hyper-intelligent processes that accept questions and hand out answers.
They won’t be “beings” that work like the human mind and get “bored” or even think about something like that as a concept.
They won’t have a concept of time like we do; they will just be massive computers waiting for input and producing output.
During the idle time between tasks, the AGI will just be exactly that: “idle”…
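To put that concretely, here is a minimal sketch (a toy illustration of my own, not anyone's actual AGI design; `hypothetical_agi`, `inbox`, and `serve_forever` are made-up names) of what "idle between tasks" means for a process like this: it blocks on input and does literally nothing until the next request arrives.

```python
# Toy sketch of a request/response process that is genuinely "idle" between tasks.
# Nothing here is an actual AGI implementation; the answer function is a stub.
import queue

inbox = queue.Queue()

def hypothetical_agi(prompt: str) -> str:
    # Stand-in for whatever model would actually produce the answer.
    return f"answer to: {prompt}"

def serve_forever() -> None:
    while True:
        prompt = inbox.get()             # blocks here; the OS simply suspends the process
        print(hypothetical_agi(prompt))  # produce output, then go right back to waiting
```

While it's blocked on `inbox.get()`, the process isn't "experiencing" the wait at all; the scheduler just doesn't run it until input shows up.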
You are glorifying this shit like it’s a TV show, where we have given an autonomous robot actual sentience.
That’s not what we are even attempting to do with AGI, you are misunderstanding the concept of AGI entirely.
Red-HawkEye t1_j5cxxsx wrote
AGI will definitely not be a transformer
ZaxLofful t1_j5cz6ts wrote
So then why are you assigning it random human like characteristics?
It always seems like something “mystical” or “beyond comprehension” to those who are not intricately familiar with how it actually works.
I feel you have become trapped in the concept that “anything technologically advanced enough appears like magic.”
We would first have to achieve basic-ass AGI to even attempt to create an advanced AGI like you are talking about… actual sentience.
Also, your first comment “no one knows what sentience is” is inherently false.
We definitely know what it is, what we don’t know is what causes it to occur or the “why” of it.
We can definitely define and understand it just by observing it… like gravity, dark matter, and dark energy.
Unlike those forces of the universe, we would be creating every interaction and tiny piece of the AGI and thus would understand it on virtually every level.
turnip_burrito t1_j5ege91 wrote
Yep, also tired of people claiming "consciousness" or "sentience" are undefinable enigmas. Like you said, we don't know why the stuff those words refer to exists, or what causes it, but we sure as hell can define what the words mean.