Odd_Dimension_4069 OP t1_jee2cf4 wrote

Yeah sorry bro, but your take is pretty garbo. Dude's only here saying that some form of intelligence surviving our extinction would be a good thing, and you sound like a lunatic arguing that it isn't, just because that intelligence comes from electricity in silicon and metal instead of electricity in cells and fluids...

You are the one who sounds like a religious fanatic, with the way you sanctify human flesh. Personally, I value intelligence in whatever form it takes. Whether that intelligence has emotions doesn't matter, but TECHNICALLY SPEAKING, we do not KNOW whether something without biochemical intelligence can experience reality. And we have no idea what non-biological experience would even look like.

It is not fanatical to withhold judgement for lack of evidence; it is fanatical to pass judgement because you feel your personal values and beliefs are the be-all and end-all. So stop that shit and get some self-awareness.

1

Odd_Dimension_4069 OP t1_jee1370 wrote

You and your conversational partner have different views, but you both make good points. You don't need to agree on the nature of AI, though, to understand something crucial about rights - they didn't come about in human society because "humans have emotions and can feel and cry and suffer and love etc.".

Human rights came about because the humans being oppressed rose up and claimed them. The ones in power didn't give a shit about the lower castes before then.

Rights arise out of a necessity to treat a group as equals, not because of some intrinsic commonality like "we're all human, so let's give each other human rights". They exist because if they didn't, there would be consequences for society.

So you need to understand that, for this reason, AI rights could become as necessary as human rights. It may not seem right to you, but neither did treating peasants as equals seem right to the ones in power back in the day. The people of the future will have compassion for these machines, not because there is kinship, but because society will teach them that it is moral to do so.

1

Odd_Dimension_4069 OP t1_jee0drv wrote

Yeah, look, that's a good suggestion for part of a solution to this problem, which, by the way, I think is precisely the same problem I was talking about. Maybe I didn't make this clear enough, but my whole point was that people are stupid, and because of those stupid people, AI rights will become necessary before AIs are ever sophisticated enough to prove they deserve them.

I like your idea, but I feel like media outlets are going to continue to use humanizing language to make articles about AI more 'clickable'.

1

Odd_Dimension_4069 OP t1_jedzpe1 wrote

Oh god, I can see it happening in the next few years... That's horrifying... Not just the idea of the generated content itself, but the fact that people will react exactly how you'd expect: they'll all rally behind it claiming "clearly they have emotions"... We are in for a rough ride if we don't start educating people.

2