Ivanthedog2013 t1_j76ui9u wrote

You make some good points. OK, so what if we prioritize building only ASI or AGI that isn't sentient, and then use those programs to optimize BCIs in order to turn us into superintelligent beings? I feel like at that point, even if the big tech companies were the first to try it, their minds would become so enlightened that they wouldn't have any desires rooted in hedonism or deceit, because they would realize how truly counterproductive those would be.


Ivanthedog2013 t1_j702ai6 wrote

This was the only logical progression: better for it to master abstract concepts first and then move into more concrete jobs, where the systems are predicated on those abstractions.


Ivanthedog2013 t1_j6etogg wrote

Reply to comment by bluemagoo2 in I’m ready by CassidyHouse

I understand your logic, but it's a little flawed.

You're assuming that in the next 1,000,000 years humanity won't ever develop an intelligence smart enough to figure out a way to avoid the heat death of the universe.

Considering that we are already seeing significant improvements in our ability to manipulate matter at the quantum level, who's to say that once we figure out how black holes truly work, and how they relate to dark energy or dark matter, we won't be able to take advantage of those systems to manipulate the universe and eventually avoid its heat death?

I'm not saying it's guaranteed to happen, but knowing that it could potentially happen directly contradicts your claim that our current inability to control our fate is equal to what our ability would be 1,000,000 years from now. That logical inconsistency validates people's anxiety about missing out on those transcendent, enlightening intellectual experiences.


Ivanthedog2013 t1_j6eqxvh wrote

I'm no computer scientist, but I've always held the belief that humans need to augment their psychophysiology to a degree that eliminates things like greed and deceit while maximizing empathy.

This needs to happen before people ever try to implement radically different economic and governmental systems.


Ivanthedog2013 t1_j6eojfg wrote

Reply to comment by bluemagoo2 in I’m ready by CassidyHouse

No one here is afraid of death itself; they are afraid of what they will miss out on between now and the heat death of the universe.

I would gladly accept death if it meant I got to spend thousands of years unveiling all the mysteries of reality.

But dying now, knowing that I could miss out on all those epic, sci-fi-like experiences, is a fate much worse than the anxiety of death alone.


Ivanthedog2013 t1_j6en0ac wrote

I look at it this way.

The AI creating the synthetic art is the pinnacle of human creativity. The value I place on AI art is the same, if not greater, than if it came directly from a person, because it is literally derived from all the humans who have ever created art. It represents the collective consciousness of humanity, which to my mind is a much more valuable and beautiful concept.