Red-HawkEye t1_j5cxxsx wrote
Reply to comment by ZaxLofful in It is important to slow down the perception of time for future sentient A.I, or it would become a living LOOP hell for itself by [deleted]
AGI will definitely not be a transformer.
ZaxLofful t1_j5cz6ts wrote
So then why are you assigning it random human like characteristics?
It always seems like something “mystical” or “beyond comprehension” to those that are not intricately familiar with how it actually works.
I feel you have become trapped in the idea that “anything technologically advanced enough appears like magic.”
We would first have to achieve basic-ass AGI to even attempt to create the advanced AGI you are talking about: actual sentience.
Also, your earlier claim that “no one knows what sentience is” is inherently false.
We definitely know what it is, what we don’t know is what causes it to occur or the “why” of it.
We can definitely define and understand it just by observing it, like gravity, dark matter, and dark energy.
Unlike those forces of the universe, though, we would be creating every interaction and tiny piece of the AGI, and thus would understand it on virtually every level.
turnip_burrito t1_j5ege91 wrote
Yep, I'm also tired of people claiming "consciousness" or "sentience" are undefinable enigmas. Like you said, we don't know why the stuff those words refer to exists, or what causes it, but we sure as hell can define what the words mean.