Submitted by [deleted] t3_yety91 in singularity
DanielNoWrite t1_itzw500 wrote
There's a great deal of concern.
But to address your implied argument, there's no reason to believe an AI will necessarily have any of the motivations we associate with living things in general, much less with human beings.
Intelligence =/= ego, or even a survival instinct.
There's no real reason to think it would be resentful, or capable of growing bored, or sadistic, or even care if it was turned off. Those are traits baked into living things by evolution.
That said: 1) we can't really be sure that's true, and 2) it might still easily cause unimaginable destruction incidentally.
And so yes, while there's a lot of hype on this subreddit in particular, there's actually a great deal of concern about it more broadly.
That concern isn't having much impact right now because AI is enabling incredible advances, it's really hard to regulate something that is both so poorly understood and so profitable, and AGI is still firmly in the realm of science fiction as far as most of the population is concerned.
[deleted] OP t1_itzziad wrote
[deleted]
DanielNoWrite t1_iu00iqv wrote
Again, this isn't really accurate.
It's not necessarily true that it'll be trained on indiscriminate data from the internet, and it's not necessarily true that it would simply adopt human behaviors as its own even if it were.
[deleted] OP t1_iu03e1b wrote
[deleted]