
[deleted] t1_ixz5txq wrote

You should look up "The alignment problem".

−1

HeinrichTheWolf_17 t1_ixz720g wrote

I worry greatly about the alignment problem, I worry that human beings will create an existential crisis in order to stay in power.

The sooner humans aren’t running the planet the better.

5

sticky_symbols t1_iy2norl wrote

I mean, sure, if you're okay with a world in which everything is turned into that AI's favorite thing. That sounds like the end of every good possibility to me.

1

[deleted] t1_ixz7cjw wrote

You are dangerous.

−5

HeinrichTheWolf_17 t1_ixza3re wrote

Thanks for that, I needed a good laugh today.

Take a minute to look around the world and see what humanity has done, our governments and rich overlords treat their people like expendable cattle being forced to live paycheck to paycheck, Amazon workers can barely take a piss, police harass the homeless just for being poor, powerful old men take swaths of young men away from their families to go die or get crippled a pointless war (look no further than Putin), the environment is crumbling at an accelerating rate and the U.S., Chinese and Russian governments are all adamant about using coal to save on costs at the expense of causing another mass extinction not just of animal life but of billions of humans living in hotter regions, people’s individual rights over their own bodies are being revoked at the hands of men with religious convictions, it’s all about controlling other people.

Agree or not with Q's antics aboard the Enterprise, he was right about humanity being a savage child race. Humanity can surely do better, but being a reactionary isn't how a species gets there. AGI is coming, and the best course of action is to merge with it. I believe shedding our selfish biological tendencies will be far more beneficial than the world we have now.

4