WhatsTheHoldup t1_j4ws0mo wrote

>Not that I think we're that far off of reality, just an idea for a novel maybe.

I think we're pretty far off.

Why do humans deserve higher consideration than a rock? Than a single-celled organism? Than a plant? Than a cow?

Because the reality we live in is that we do deserve it. All our structures of law, morality, ethics, etc. reinforce this.

We can exclude a lot of those by inventing a concept of "sentience/sapience/consciousness" that no one can actually properly define. But we're still left with the cow, the dolphin, the octopus, the crow, and many other species for whom we can't rationally justify withholding rights.

We may have inadvertently just created AI that now fits those categories and made the problem worse. When the AI tells us it's sapient and deserves the same considerations we do, will we believe it or reject it?

https://www.theverge.com/2022/6/13/23165535/google-suspends-ai-artificial-intelligence-engineer-sentient

(I'm not claiming Google's AI is actually sentient, but one day an AI might be, and what happens if the engineers who point that out get fired?)

The only answer is that we are humans, so we care about what happens to humans. We aren't cows and never will be, so we don't care about rationally answering the question for cows or AI.

An AI could either cut through this bullshit or, perhaps scarier, learn it and encourage us in it.

WhatsTheHoldup t1_j4wksik wrote

>In this literature, a rational strategy is one that's suited to your goals. So a rational belief is a belief the holding of which will tend to better position you to achieve your goals.

Is it then rational for an oil exec to downplay climate change?

It suits their conscious goal of expanding their business, but they presumably also have subconscious goals, like legacy, happiness, and survival, that the denial adversely affects.

>for a very long time, folks just assumed that true beliefs would further their goals, whereas false ones would not. "Rational," then took up a secondary definition something along the lines of "following truth-preserving rules."

By this definition I still don't know. It's true that denying climate change helps their business, so in that sense it's rational, but it also depends on believing untruths and sacrificing their other goals.

But couldn't you also lie to others while not lying to yourself?

Is it better to say it's rational to understand climate change but lie about it, and irrational to actually believe the things you say?

>So on that secondary definition, it's rational to hold a belief if that belief -- objectively -- follows from your previous beliefs.

So this now implies that it's rational to be irrational, as long as being irrational serves your single most important goal?