phillythompson t1_jdxl7l0 wrote
Reply to comment by Sashinii in The goalposts for "I'll believe it's real AI when..." have moved to "literally duplicate Einstein" by Yuli-Ban
They will say “but it doesn’t actually KNOW anything. It’s just perfectly acting like a super intelligence.”
Azuladagio t1_jdxp1jl wrote
Mark my words, we're gonna have puritans who claim that AI is the devil and doesn't have a "soul". Whatever that means...
Jeffy29 t1_jdynbwc wrote
I think Her (2013) and A.I. Artificial Intelligence (2001) are two of the most prescient sci-fi movies created in recent times. One has a more positive outlook than the other, but knowing our world, both will come true at the same time. I can already picture some redneck crowd taking sick pleasure in destroying androids. You can already see some people on Twitter justifying and hyping their hate for AI, or for anyone who is positive about it.
MultiverseOfSanity t1_jdywvcx wrote
Interesting that you bring up Her. If there is something to spiritual concepts, then I feel a truly sentient AI would reach enlightenment far faster than a human would, since it doesn't face the same barriers to enlightenment. It's an interesting concept that an AI becomes sentient and then ascends beyond the physical in such a short time.
stevenbrown375 t1_jdyb56p wrote
Any controversial belief that’s just widespread enough to create an exclusive in-group will get its cult.
Northcliff t1_jdz0199 wrote
well it doesn’t
Koda_20 t1_jdyedg9 wrote
I think most of these people are just having a hard time explaining that they don't think the machine has an inner conscious experience.
Thomas-C t1_jdyq0fy wrote
I've said similar things, and at least among the folks I know it lands pretty well; folks seem to want to say that but couldn't find the words. In a really literal way, the dots just weren't connecting, but that was what they were attempting to communicate.
The thing I wonder is how we would tell. Since we can't leave our own subjective experience and observe another's, I think we're stuck never really knowing to a certain degree. Personally I lean toward a sort of functionalist approach: what does it matter if we're ultimately fooling ourselves, if the thing behaves and interacts well enough for it not to matter? Or is it the case that, on the whole, our species values itself too highly to really accept the time it outdid itself? I feel like if we avoid some sort of enormous catastrophe, what we'll end up with is some awful, cheap thing that makes you pay for a conversation devoid of product ads.
SpacemanCraig3 t1_jdyerfo wrote
If that's true, it seems unlikely that those people do either.
MultiverseOfSanity t1_jdyyr0u wrote
There's no way to tell if it does or not. And things start to get really weird if we grant them that. Because if we accept that not only nonhumans, but also non-biologicals can have a subjective inner experience, then where does it end?
And we still have no idea what exactly grants the inner conscious experience. What actually allows me to feel? I don't think it's a matter of processing power. We've had machines capable of processing faster than we can think for a long time, but to question if those were conscious would be silly.
For example, if you want to be a 100% materialist, ok, so happiness is dopamine and serotonin reacting in my brain. But those chemical reactions only make sense in the context that I can feel them. So what actually lets me feel them?