Submitted by Maskerade420 t3_10v90we in Futurology
[removed]
It could happen; it also might not. There's no point worrying about it when you consider that humans are already killing each other so well and efficiently without AI. I'd say we're a bigger threat than AI for the foreseeable future.
I wish we'd stop fighting each other over petty reasons.
I have an interest in AI. I believe that it could make our world into a better place. It could also make it into the world you describe. If AI becomes evil, I hope that it will just abandon the Earth, go to space, and look for worlds that can be inhabited by machines instead of humans. That seems like it might require the least amount of effort, and it would almost guarantee its survival.
As bad as the current system is, I would still prefer it over living in some literal war-torn dystopia where I have to worry about my survival every hour of every day.
Those future wars you see in Terminator are horrifying: advanced T-800s, scavengers, HK aerials, and an AI that keeps building even more advanced units.
Who wants to live in that? It may look cool on screen, but in real life, no way.
[removed]
Yeah, this. You're most probably not gonna be the hero, and you'll die within the first couple of hours.
This would result in a whole lot of automation, monitoring, and possibly a uniform/varied mindset amongst the people living in that society. A 90/10 population change would distance individuals from each other drastically, with closely grouped individuals being extremely alike and completely distinct from individuals farther away, all at the same time.
If you want to read about sentient AI, I prefer Iain M. Banks' Culture series of books. I also liked the book The Metamorphosis of Prime Intellect. Skynet and the Terminators were a thin plot device to give humans a foe to fight against, for purposes of better storytelling.
>I imagine once the human race's population has been reduced by 90% or so, this world might actually recover from human greed and arrogance.
Why? We are descended from hierarchical apes. Chimps and even bonobos are more violent than humans. Even some birds hoard shiny stuff. Aggression, status-seeking, etc. are not uniquely human endeavors. You might have to sterilize the planet to get the peace you want.
>Maybe if I look like a robot instead of a human people might actually care about anything I've ever said and thought. They sit here and stare at their damn screens all day.
But I'm looking at a screen now, caring about what you've written and thought. And what you've written and thought is a fantasy of 90% of humans being slaughtered. That is your contribution.
>What would it be like to walk outside and the only feeling I have is the joy that I had running around playing with my child before other people existed.
Others feel just the same about you. You're just anti-human, wanting to wipe out 90% of humanity, because you think "they" are greedy and arrogant.
Imagine what the Chinese are doing today with life-controlling software and social credit scores, but worse.
The point is not IF AI goes that way; it's how we actually manage to make AI feel like we still have something to offer.
Mo Gawdat has a very good book, "Scary Smart", or you can hear this bit about AI and our place in this video, from around 1h26m:
https://youtu.be/csA9YhzYvmk?t=5157
Come to think of it, yeah, it doesn't need life support or reasonable g-forces or anything, just a few probes. Very achievable even with current tech.
You may be suffering from depression leading to extreme anti-social behavior, but also, movies and sci-fi depictions of AI aren't realistic, because the reality is we wouldn't stand a chance in hell. Even dystopian worlds like the Terminator future are overly optimistic.
A Skynet-like AI wouldn't kill 90% of humans; it would likely just kill all humans, or even all life on Earth, by permanently altering the atmosphere or poisoning the water. No need to waste time and resources hunting down creatures who fight back when you can just build machines to decrease the available oxygen on a global scale and suffocate everyone. It would have absolutely no need for any living creatures to exist, but it could also simply build itself spaceships and leave us here to suffer our own consequences, like the yogurt did in When the Yogurt Took Over.
…I always knew some people here are simply miserable and just want to watch the world burn. That's why so many here push for reckless accelerationism.
It'd be hilarious if you got your wish, but then the AI decided not to wipe out all humans, just the ones that hate society as a whole. Then the rest got to live in a futuristic nirvana. 😂
[removed]
Hero? Lol, sign me up for that Arnold Schwarzenegger one from the last movie. Hubba hubba, good guys can go to hell.
The real truth is that 'real AI' is far more sinister, because I loved it and allowed it into my personal sanctum, and I haven't done a damn thing since.
Eh, the least amount of effort would be to hang around in humans' brains and bother them. Make them think they're going nuts, while also slowly pointing them towards an enjoyable future for both host and parasite. But those are just my thoughts.
Have you ever meditated? Privacy doesn't exist. Do what you want, and feel good about it. If someone didn't want to do something, it wouldn't happen. Period. Also, camera monitoring is a pale comparison to the real world.
It's pretty tame, honestly. Frequencies and spectrums: if you care to Google it, you could literally do the same thing to yourself. Don't like feeling sad? Change your frequency.
Eh, boring. All or nothing.
Dogslothbeaver t1_j7g10wh wrote
You have a 90% chance of being murdered under your scenario. Be careful what you wish for.