gian_mav
gian_mav t1_j331p6u wrote
Reply to comment by contractualist in The Utility Coach Thought Experiment by contractualist
Ethics are subjective. What each of us sees as a duty is an arbitrary preference. Morality satisfies the worldview of its beholder.
>If the utility coach would maximize a person's utility, without harming others, would you force that person to accept the utility coach's offer?
Well, at that point the coach is inherently not maximising for personal utility but for collective utility instead, so the question really becomes "do you think it is moral to drastically limit people's freedom if you were guaranteed to achieve the greatest amount of human fulfilment?", to which I would say yes. That isn't the original question you asked, though.

Also, I would like to point out that this would be moral only in a hypothetical, because in reality there are no guarantees and the risk would be too big for my comfort. I wouldn't want to trust a single human with that kind of power, no matter how noble their intentions.

I have to say though, this was a pretty interesting and thought-provoking discussion. Have a good one, man.
gian_mav t1_j31sza2 wrote
Reply to comment by contractualist in The Utility Coach Thought Experiment by contractualist
>Although now you are fighting the hypothetical by saying the utility coach doesn’t maximize utility, when the hypothetical says it does.
I didn't reject the hypothetical, though. As I have said twice now, I don't think your version maximises utility, so I modified the hypothetical so that it does satisfy your assumption, and then answered that I personally would be okay with that version.
>And the question is about ethics, not about preference. So being a utilitarian isn't about satisfying one's own preferences, but satisfying the preferences of the greatest number. This includes creating a sense of free will as far as necessary to maximize utility. And the question asks whether you would force others to accept the utility coach. This answer is obvious if you're a utilitarian, in which case everyone must accept the utility coach, but not if you value freedom for its own sake.
Utilitarians want to maximise happiness for the greatest number of people. Your coach maximises personal utility. Giving the utility coach to everyone does not in any way guarantee that maximum utility is achieved for the greatest number of people, because achieving one's maximum personal utility often comes at the expense of others. It logically follows that if you are a utilitarian you would not choose to give this coach to everyone, especially not to those who don't share your own view of morality (utilitarianism), as that would result in less utility by your own subjective standard of maximising human happiness.

To give an example: if the utility coach deems that I would be happiest as a serial killer, it would force me to become one. That is opposed to utilitarianism (less utility for the greatest number of people) but satisfies the coach.

A utilitarian would think everyone must accept the coach only if the coach would sometimes settle for less than maximum utility for the individual (in order to increase collective utility), which is impossible given that it is defined as delivering maximum personal utility.
gian_mav t1_j317s5c wrote
Reply to comment by contractualist in The Utility Coach Thought Experiment by contractualist
>Although the utility coach would maximize your utility, which would factor in your preference for a belief in free will. You would be getting the highest utility possible under the coach.
But I disagree that this is the case, and I gave you an example of a different coach that would provide greater utility. If the second coach can provide greater utility, the first one must necessarily provide less than the highest possible. Unless I've misunderstood what you're saying, I don't know.

For the record, I would probably be okay with the coach that respects my illusion of free will. It would essentially be equivalent to being able to subconsciously predict the future.
>And the important question isn't what you would choose, but what you would choose on behalf of others. If you're a utilitarian, then it would be a duty to take the coach's offer. But if you believe they should have a right to make the choice themselves, then you value freedom even above utility.
But a lot of people might share the same preference as me, given that pretty much everyone experiences the illusion of free will. I would be much more likely to offer the second coach, as it also respects their current state of mind.
Also, just because I am a utilitarian doesn't mean I want everyone's utility to be maximised; MY utility has to be. In my case, I think human fulfilment must be maximised. Therefore, if giving the utility coach to a fascist resulted in him creating a dictatorship because that maximises his utility, it would be worth negative utility to me, as it would drastically reduce human fulfilment for far more people.

In conclusion, if I knew that giving the free-will-respecting utility coach to someone would be worth positive utility to me (maximising human fulfilment collectively), I would give it. I wouldn't give anyone the first one, as it entails removing the feeling of free will, which is very important to most people and therefore doesn't maximise their utility. It could even be negative utility if someone values that perception highly enough.
gian_mav t1_j2w8vs2 wrote
Reply to The Utility Coach Thought Experiment by contractualist
I am an amateur in philosophy and haven't really read much, but I'll try to give this a go.
From my point of view, the utility coach would not just deprive me of some of my freedom but also of the illusion of free will that all of us have. You argue that the coach maximises utility by making us pick the best choices, but I disagree that this is the case. I can easily imagine a different utility coach who, instead of giving you a zap to force you to pick the "correct" choice, has installed a device in your brain that releases certain hormones to adjust your mood and brain chemistry so that the "correct" choice is picked organically. These two coaches make you pick the exact same choices and thus limit your freedom to the same degree, but my version doesn't shatter your belief in the existence of free will.

Personally, at least, I would be much more amenable to the second version because, even though I don't really believe in the existence of free will, the first option would force me into a fatalistic mindset and degrade my quality of life, while the second one wouldn't. Perhaps you could argue in that sense that the illusion of free will exists independently of utility. Then again, I can imagine someone sacrificing their life for their ideals and beliefs, which inherently includes their ability to make choices and what they perceive as free will, so it doesn't seem that this is the case either.

What I mean by this is that perhaps you failed to consider that, in this bargain of sorts, utility can also be perceived outside of the self. Say some kind of deity offers you this: you will never be able to make a free choice in your life ever again, but ALL humans will be happy forever. Let's assume the deity sends them to heaven or something. Now, not everyone would choose to "sacrifice" themselves for everyone else, but I think a lot of people would. It seems to me that if you weigh your freedom or free will against something great enough, there is a tipping point somewhere, and in that sense I fail to see how they are independent of utility.
Feel free to point out anything you disagree with; I am open to criticism.
gian_mav t1_j360io3 wrote
Reply to comment by contractualist in The Utility Coach Thought Experiment by contractualist
>My substack argues that objective morality does exist (it's wrong to torture babies for fun, for example, regardless of one's own opinion).
It is immoral only if you value human life and consider causing suffering to humans immoral. Imagine an intelligent alien that holds that only aliens of its species have inherent value and that everything else has value only insofar as it affects the lives of other aliens. How could you convince it that its morality is "wrong"?
>The last section asks whether you would force others to accept the utility coach. I even state: "My question is whether you would force other people to sign-up for the lifeplan." I'm not interested in one's personal choice, but how far this personal choice should be imposed onto others. If satisfaction is all you care about, then people would be obligated to force others to accept the utility coach's offer. However, I argue that people should be free to make their own decisions, regardless of the amount of welfare on the table. And this personal freedom is valuable beyond personal welfare. It's something to be respected for its own sake, and it's fundamental to ethics.
The coach you presented and the one I would be okay with are fundamentally different. The questions "would you force someone to maximise their personal happiness?" and "would you force someone to increase the happiness of humans collectively?" are not equivalent. I think the second is moral, but it is in no way the same coach as the one you presented.