
TMax01 t1_irwak8z wrote

What makes you think they are disentangled? What do you even mean by "disentangled"?

Let me try to address what might be the premise of your question, based on some presumptions about why you would ask it. I believe you may be wondering how humans can be observing rather than creating 'morality' as a real feature of the world if morality has no effect on any part of the world besides humans. I would answer, if that is indeed what you meant, that the situation is no different than any other aspect of the world.

Gravity and aerodynamics did not evolve; they are intrinsic physical principles. Birds did not need to consciously observe gravity or aerodynamics to evolve the ability to fly, nor did they decide whether or not to do so. Unlike birds (please ignore the controversial nature of this premise, just take it for granted for the purposes of this explanation, or substitute insects if it bothers you too much), humans possess consciousness. This allows us to observe birds, gravity, and aerodynamics, and to build airplanes which enable us to fly. Standard philosophy (i.e. normative ethics) expects morality to be like airplanes: consciously constructed based on logical principles. When that approach fails, because morality is an aspect of existence rather than a technological development, many people (both philosophers and laymen) assume morality must be like birds or else it can't exist, and ethics becomes either nothing more than cultural norms or personal preference (or a contrived combination of both: imaginary airplanes).

But morality is not airplanes (real or imagined), nor is it birds; it is aerodynamics. We can (both as individuals and as societies) accept it is real or deny it is physical, and we can use it to rise above the ground and conquer gravity by flying (acting morally), or instead ignore it and act dishonestly and selfishly. But it is real; it is something we observe rather than create, even though, as with the principles of aerodynamics, we can reconstruct those principles by carefully considering causes and results in the world.

I hope that makes sense. Thanks for your time.


Krasmaniandevil t1_irwql69 wrote

A few thoughts and clarifications:

I do not believe that morality exists independent of sentient existence, or what might be called consciousness. For example, I do not think anything of moral significance occurs on Mars or Pluto or the Sun. If the world were to be obliterated in a nuclear holocaust, there would be nothing left to assess the morality of that action, nor of any future actions. Without God, or some other type of observer, the universe is indifferent to our existence, and the notion of morality would retrospectively focus on a tiny sliver of time that wouldn't be relevant for the remainder of existence.

But sticking with the nuclear example, suppose someone took control of the nuclear arsenal of the United States and threatened to launch the missiles unless you tortured them (think of the Joker trying to get Batman to break his rules). One might say torture is inherently wrong, but here it would be necessary to maintain a world of any moral significance. To choose not to torture would be the height of hubris to me, because it puts maintaining an individual moral code on a pedestal at the expense of countless others, most of whom will have different values but none of whom will exist if the deontologist maintains their chastity. I chose this example because I think that the stakes change when we're talking about the continued existence of life itself, which I view as one of the foundations of morality at its most basic level.

I do not think humans are the only animals capable of moral reasoning. Rats save rats they know, but not strangers. Dogs react if you give one of them more treats than the other. Primates adopt orphans that were sired by others, etc. The podcast lists some of these examples, but there are many more (e.g., dolphins protecting humans).

Perhaps you distinguish those examples from morality and view them as instinct, but don't humans have instincts as well? We see it all the time, sometimes in things as simple as reflexively turning our heads when we see an attractive prospective mate, or in body language. I think philosophy has a huge blind spot when it comes to putting humans on a pedestal above other life forms.

You might also distinguish modern humans from our ancestors, but do you believe that early humans were not conscious? Is there a discrete moment in time where human evolution progressed sufficiently to trigger moral duties that did not exist moments before? Would that moral code apply to sentient aliens who evolved from different species in radically different biomes? Putting these notions together, morality is path-dependent, based on adaptive behaviors/intuitions that are species-specific. Although a chimp or gorilla is capable of communicating with humans and otherwise emulating some human behaviors, it would be a mistake to hold it to the same standard as a human who is capable of those same functions, simply because chimps are wired differently.

I resolve these issues by placing perpetuating the species as the prime directive of morality, and taking humans off the pedestal. Humans have a care-intensive, cooperative survival strategy because that's what we're wired for, and as we got more sophisticated we attempted to refine those intuitions into religious/cultural/philosophical beliefs about morality. We see this with practices like cannibalism and polygamy, as well as in science-fiction shows (e.g., Star Trek, with Klingons, Vulcans, Ferengi, etc., all sentient but each having vastly different moral sensibilities).

Some of those beliefs were not adaptive, such as those of Christian sects that held reproduction to be a sin. Some of these moral codes work better than others, but if they're not responsive to the circumstances that surround them, it doesn't really matter whether they were "correct" from a deontological standpoint, at least in my opinion.

Thanks for your time as well.


TMax01 t1_iry0rt5 wrote

>I do not believe that morality exists independent of sentient existence, or what might be called consciousness

It is called consciousness, and you are saying you don't believe morality exists.

>For example, I do not think anything of moral significance occurs on Mars or Pluto or the Sun.

Nothing of moral significance occurs in the absence of moral agents. The events of the physical world are amoral, but the actions of moral agents are either moral or immoral.

>nothing left to assess the morality of that action,

So you're saying that murdering every human in existence wouldn't be immoral? That seems like an odd, and very immoral, position to take.

>a tiny sliver of time that wouldn't be relevant for the remainder of existence.

Much like everything we do. Oh well, everything will end someday and the world does not care, so you might as well murder your family and all the witnesses; then you'll have done nothing wrong.

>threatened to launch the missiles unless you tortured them

When someone makes a threat and then kills someone by following through on it, they have killed someone, regardless of the demands they made or whether anyone tried to appease them. It can be a tough thing to accept, but that's just how it works.

>I do not think humans are the only animals capable of moral reasoning.

Humans are the only animals capable of any sort of reasoning. Humans can interpret the behavior of other animals however they like; the animals cannot know or care, they simply act in response to stimuli however their genetic programming causes them to, without thought or remorse.

>don't humans have instincts as well?

The real question is whether humans have anything other than instincts. Your ability to ask the question proves you have self-determination. You don't need to agree with this for it to be true. In fact, it is your ability to disagree which makes it true. Certainly, you can mischaracterize everything that every human has ever done as "instinct", but it seems more like a cross between semantic bullshit and hiding from reality than a reasonable intellectual position.

>I resolve these issues by placing perpetuating the species as the prime directive of morality,

That is the opposite of morality, just as choosing to do so is the opposite of instinct, despite the fact that it attempts to replicate the results that acting on instinct alone would produce, if every human did the same.

If only your immoral pretense of morality were actually as satisfying as you wish it would be. Then humans would just be apes, and nobody would have to spend any time considering your opinion about anything. 🙁


TR_2016 t1_irzd8kc wrote

"the animals cannot know or care, they simply act in response to stimuli however their genetic programming causes them to, without thought or remorse."

"The real question is whether humans have anything other than instincts. Your ability to ask the question proves you have self-determination."


We are also acting based on our genetic programming and the input from the outside world. Nothing more than an AI in a body, really; we just don't know our exact code. Our actions are nothing more than the output of a model. Randomness might be involved as well, but that is not free will.

"So you're saying that murdering every human in existence wouldn't be immoral? That seems like an odd, and very immoral, position to take."

It wouldn't be moral or immoral. Assume you are testing out a short trial period of a "game", limited to 3 hours. You could do whatever you want; once the trial runs out, none of your actions matter anymore, you are locked out, and the game and your save files are erased forever.

Do your actions actually matter? No. Are we really in a different situation than an AI playing this game? The AI is coded to perform certain actions and preserve itself, while its actions can be heavily manipulated depending on the input presented by other players. Players are different variations of the same AI.

Let's take up morality. The only basis for morality I can see is this: by indoctrinating society that it is "bad" to do certain things or to harm others, we reduce the likelihood that someone else will harm us. It can be argued that we are following the self-preservation task of our code by creating a "morality".

Let's say, however, that you have come to the conclusion that even if you harm others, you will still be safe from others inflicting the same harm on you; what incentive is there now to follow morality? There is another one. Due to a combination of our code and indoctrination, one might avoid harming others to avoid feeling "bad".

In the end it is not so different from a set of conditional jumps in assembly: basically a long series of risk/reward assessments, with everyone having a different threshold. Again, since we can't read our brains like computer code, we don't have the whole formula for the calculation.
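To make that analogy concrete, here is a minimal toy sketch of such a thresholded risk/reward check (in Python rather than assembly; the function name, weights, and numbers are invented purely for illustration, not a claim about how brains actually compute):

```python
# Illustrative toy model only: a "moral decision" reduced to a threshold comparison.
# The weights and threshold are stand-ins for the unknown "code" each person runs.

def decides_to_harm(expected_reward: float,
                    perceived_risk: float,
                    guilt_cost: float,
                    threshold: float) -> bool:
    """Return True if this toy agent 'chooses' to harm someone else.

    expected_reward: what the agent thinks it gains
    perceived_risk:  expected cost of retaliation or punishment
    guilt_cost:      the conditioned "feeling bad" penalty
    threshold:       the agent's personal cutoff -- different for everyone
    """
    net = expected_reward - perceived_risk - guilt_cost
    return net > threshold  # the "conditional jump"

# Two agents given identical inputs but running different "code" (thresholds)
# reach different conclusions on the same moral question.
print(decides_to_harm(5.0, 2.0, 1.0, threshold=0.5))  # True
print(decides_to_harm(5.0, 2.0, 1.0, threshold=3.0))  # False
```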

It might be impossible for some people to arrive at a certain conclusion, no matter which inputs are presented to them, due to their "code"; or, if it is a critical function or linear code, it might be executed regardless of which inputs we feed into the calculation.

Each person might come to a different conclusion on a single "moral" question, even if the inputs are the same, because their code might be different. Or you could indoctrinate two people running the same or very similar code with different inputs during their development stage, and later they might each come to a different conclusion on the same moral question.

Since we don't know our code, our observation is mostly limited to checking which inputs produce different outcomes. There is no objectively correct or incorrect outcome.

It is entirely possible that if you could somehow modify Putin's brain the way we can modify an AI's code, you could easily make him declare peace or launch nuclear weapons, depending on your preference. So where is his free will? How are his actions anything but the output of a complicated model?


TMax01 t1_is0ygq2 wrote

>We are also acting based on our genetic programming and the input from the outside world.

That is merely the starting point of our behavior. Certainly, you can maintain your assumption as unfalsifiable, and believe that humans (including yourself) are nothing but biological robots like all other forms of life, with no conscious self-determination, by defining every act of art, poetry, philosophy, science, engineering, hope, or emotion as "genetic programming" or operant conditioning. But in doing so, you are, admittedly or not, proving the opposite, because animals have no need, desire, or capacity to do such a thing.

>So where is his free will?

This is the root of the problem, indeed. The conventional perspective you are regurgitating is the same dead end philosophy has been mired in for thousands of years. But it is a false assumption that self-determination is or requires "free will".

All it would take for Putin to realize he made a mistake and call off his unjustified and illegal invasion of Ukraine would be recognizing that the teleology he used to justify it is both backwards and upside down. According to your worldview, and therefore as long as everyone else continues to agree with and promulgate the conventional philosophy underlying your worldview, this is effectively impossible: Putin believes he based his choice on logic, and so he will continue to see that decision as logical. In my philosophy, it is merely unlikely, because he has as much faith in the conventional philosophy as you do: he believes he is acting reasonably, despite the rather obvious fact that he is not. If everyone around Putin rejected the conventional belief that self-determination is free will, it wouldn't matter whether Putin did; he would still be much more likely to act reasonably rather than continue to use false logic to remain convinced he is no different than an AI ape.

Thanks for your time. Hope it helps.


Krasmaniandevil t1_is8pwe5 wrote

Stop straw-manning; you can't characterize someone else's position with so much rhetorical flourish so early in your response.

Dirty pool.


TMax01 t1_isancbe wrote

Your contentiousness fails to provide a rebuttal to my comment.


Krasmaniandevil t1_is8pnxj wrote

You've misunderstood my argument and made multiple logical fallacies.

Morality cannot be compared to gravity. Gravity can be measured. It can predict outcomes. It exists regardless of humanity. You've simply presumed the concept of morality (which apparently only humanity can recognize?) can exist independently of the only species which qualifies as a "moral agent." Your argument begs the question in that it presumes that which it seeks to prove.


TMax01 t1_isaln3i wrote

>Morality cannot be compared to gravity.

It's an analogy. Deal with it.

>It can predict outcomes. It exists regardless of humanity.

Hmmm...

>You've simply presumed the concept of morality

I've observed the process of morality. Your assumptions (or my presumptions) about what it is or how it works can be quite inaccurate, and are certainly imprecise, without casting any doubt on the existence of that process.

>can exist independently of the only species which qualifies as a "moral agent."

Indeed, the presumption that morality would be observed by any other species (not in result but in process) that is a moral agent (that experiences consciousness) is a necessary reflection of what morality is. The fact that we know of no other such species does not preclude their existence, and does not change the nature of morality. You may, if you wish, demand that this uncertainty limit your notions of what morality is, but I am not required to join you.

>Your argument begs the question in that it presumes that which it seeks to prove.

My argument seeks to explain, not to prove. Morality isn't quantitative, as you've mentioned, and cannot be 'proved' in the way you are suggesting; it can only be recognized by other moral agents. The truth is, however, that all moral agents (all consciousness) do recognize it, even when they seek to avoid it by claiming blindness to it.

Thanks for your time. Hope it helps.


Krasmaniandevil t1_isapv42 wrote

The analogy strikes at the core of the debate: whether morality is objective (like gravity) or subjective (like art).

Morality is a process now? How does that track with the gravity analogy?

I never said morality was quantitative, but you compared it to a phenomenon that could be.

"Recognizing" morality suggests that, like physics, it exists independent from "moral agents."

If we agree that the existence of moral agents is a necessary precondition of the existence of morality, are you saying that there is some universal standard of morality that applied at the dawn of man, remains the same today, but that did not exist until human evolution passed the "moral agent" threshold?


TMax01 t1_isb1gxj wrote

>morality is objective (like gravity) or subjective (like art).

Your false dichotomy strikes at the core of the conundrum. Morality is objective like consciousness, undeniable regardless of whether it is quantifiable. It is fashionable to assume and insist that gravity is absolutely objective and art is entirely subjective, but the truth is not that simplistic.

>Morality is a process now? How does that track with the gravity analogy?

It shows your reasoning to be a matter of assuming a conclusion, namely, that if morality cannot be simplistically quantified then we should presume it doesn't exist. As if works of art stop existing for those who proclaim "that isn't art!"

I could belabor the point further, identifying how gravity is not directly quantifiable, but can only be measured in terms of mass and acceleration. Does this mean gravity is not real? In some ways it actually does; in some ways it doesn't.

>I never said morality was quantitative,

Is there some other way of interpreting "gravity can be measured" that I'm not aware of? Perhaps you don't want to admit it, but the premise that morality must be quantitative like gravity in order to exist is clearly the foundation of your contention.

>but you compared it to a phenomenon that could be.

I used an analogy. It is a kind of comparison, but relies on a level of engagement you haven't provided.

>"Recognizing" morality suggests that, like physics, it exists independent from "moral agents."

It does more than suggest that; it declares it directly. It is a necessary presumption that anything we perceive exists independently of our perceptions, if it exists at all. Some people believe that they can be amoral intellectual agents without being moral agents, and if they try hard enough they can refuse to recognize morality at all. It is an easy assumption to make, because of the nature of morality (including the ways it differs from time, mass, velocity, momentum, and gravity, though all of these things can be considered useful analogies for understanding what morality is), but it is a mistake nevertheless. To be conscious is to recognize moral truths, despite any difficulty we might have expressing, describing, or comparing them.

>If we agree that the existence of moral agents is a necessary precondition of the existence of morality,

We don't. In fact, you have it backwards; that is the opposite of what I directly said. The existence of moral agents is a necessary precondition for observing the existence of morality, but not for that existence to occur. The existence of morality is a necessary precondition for the existence of moral agents, but recognizing morality is not a precondition for its existence, just as recognizing gravity as gravity is not a precondition for gravity. Gravity is quantifiable even when it isn't quantified.

>are you saying that there is some universal standard of morality

Apart from "morality exists", no standard, universal or otherwise, is necessary for morality to exist.

>but that did not exist until human evolution passed the "moral agent" threshold?

You keep repeating the same error. Morality exists independently of moral agents, but can only be observed by moral agents. Perhaps "time" would have been a better analogy than gravity? Any analogy would suffice if you were interested in understanding it, but no analogy could suffice if you are not.

Thanks for your time. Hope it helps.
