
Saadiqfhs t1_j4lktnd wrote

What makes a human?

Hey it’s me the resident nerd.

I often read early Inhumans comics, and one of the big moral debates is what to do with the Alpha Primitives. The Alphas are basically cloned Neanderthals created to serve the Inhumans as slaves. But the Inhumans debate whether it is right to enslave them; are they people too?

Then I think of the Star Wars Expanded Universe and Star Trek, and they always ponder the question: when is a robot sentient and deserving of rights? That is kind of the backdrop of George Lucas's Clone Wars and the legacy stories that followed it: the morality of clones and droids shooting each other.

So I want to know: can you justify clone servants even if they are of a lesser "human" species?

Can you argue the humanity of a machine?


Perrr333 t1_j4scb07 wrote

I haven't read those comics, but I will say that anyone who reserves moral value strictly for humans is an idiot. Consider some alien species of equal intelligence and a similar capacity to feel pain coming to visit us. It would be just as wrong to torture them as to torture humans, other concerns aside (like some overriding reason for torture, such as gaining vital information). If some species here on Earth were to develop faculties similar to ours, it would be wrong to deem them of lower ethical value just because they do not share our biology. This is wrong for exactly the same reason that racism is wrong. It is not our biology that defines us, but our faculties. I have a problem with the term "humanism" for this reason, though seeing as there aren't currently any comparable species to us, it's not really an issue for now.

In my opinion this means a good ethical theory must base its ethical value on something other than biology. For me the most viable option is sentience, though others look to the experience of pain and pleasure. Theists have to contort themselves around the musings of ancient books written and rewritten over time by many different people of questionable intelligence, motives and sanity, and I imagine they may run into trouble because the various authors didn't have the forethought to consider non-humans of human-like intelligence, since sci-fi hadn't been invented yet.


Saadiqfhs t1_j4v4a3h wrote

Can you extend that to a machine? If you grant a machine the feeling of pain, of love and loss, can it become free?

Similarly, can I create a sub-human with the same intelligence as a gorilla and make it subservient?


Perrr333 t1_j4xc94j wrote

There's nothing in principle about machines that must forbid sentience and ethical value. Suppose we were able to build a perfect mechanical reproduction of the brain out of computer parts that interact three-dimensionally, in exactly the same manner as neurons. Assume also that, using this replica combined with the necessary input sensors and output body movements, we were able to exactly recreate the electrical activity that occurs within the brain. This artificial brain would then be doing exactly the same thing as a human brain, just on a substrate of silicon or some other material rather than flesh. The use of flesh has nothing to do with sentience, so this artificial brain would exhibit exactly the same sentience and the same kinds of thoughts and feelings as ours. It seems obvious to me that it should then receive exactly the same ethical value as us.

The issue is that there are 86 billion neurons in the human brain, they interact three-dimensionally, chemicals can flow throughout the whole brain via the blood, and failing to provide appropriate input sensors and output movements might cause insanity. So we may never be able to build this. In principle, if we perfectly understood how everything worked and tied together, we could instead simulate the entire thing on an incredibly powerful computer without building it. But we are nowhere near that sort of understanding, so again it's unclear whether we ever can.
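To put rough numbers on the simulation route, here is a back-of-envelope sketch in Python. The synapse count, per-synapse cost, timestep, and ops-per-synapse figures are assumed ballpark values for illustration, not established requirements:

```python
# Rough scale of a neuron-level brain simulation. All figures are
# commonly cited ballparks or outright guesses, not measurements.

NEURONS = 86e9             # ~86 billion neurons in the human brain
SYNAPSES_PER_NEURON = 7e3  # widely quoted average; varies by region
BYTES_PER_SYNAPSE = 12     # assumed: 4-byte weight + 8-byte target index
STEPS_PER_SECOND = 1e3     # assumed 1 ms simulation timestep
OPS_PER_SYNAPSE = 10       # assumed floating-point ops per synapse per step

synapses = NEURONS * SYNAPSES_PER_NEURON
memory_tb = synapses * BYTES_PER_SYNAPSE / 1e12
flops = synapses * OPS_PER_SYNAPSE * STEPS_PER_SECOND

print(f"Synapses:       {synapses:.1e}")       # ~6.0e14
print(f"Synapse memory: {memory_tb:,.0f} TB")  # ~7,200 TB
print(f"Compute:        {flops:.1e} FLOP/s")   # ~6.0e18, i.e. exascale
```

Even with generous assumptions this lands in exascale territory, and that's before the far harder problem of mapping the wiring and chemistry well enough to know what to simulate.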

In sci-fi these real-world concerns about feasibility of construction are pushed aside. But we are also often asked to deal with artificial intelligences which are conscious but different from our own, often radically so. This is the same situation as with the inherently racist/speciesist term "sub-human". Here ethical comparison becomes genuinely problematic, and may differ from case to case. I personally would value any intelligence considered sentient equally; other philosophers talk of "personhood". Importantly, a being's ethical value shouldn't scale with its intelligence, just as humans with higher IQs are not of greater ethical value.


Saadiqfhs t1_j566csf wrote

Do you extend this to animals that show intelligence, but not to the same degree as humans? Like, perhaps instead of sub-human (Neanderthal) clones, it's a Planet of the Apes situation and we found a way to make apes a servant class. Would the ability to complete complex tasks be enough to justify emancipation?


Perrr333 t1_j595vne wrote

Good questions. I think whenever anything starts approaching human-level intelligence we need to ask them. I haven't got any answers for you though, as I would have to look into the literature and have more of a think to make up my mind.


el_miguel42 t1_j5c2kty wrote

Those are all very different questions.

What makes a human? Well, Homo sapiens, the primate species. If you have the requisite genetic structure (DNA, chromosomes, etc.), then you're a human.

A cloned Neanderthal would not be a human... it would be a Neanderthal.

You mention humanity... now that is far more subjective, and has more to do with the difference in how you socially treat a human versus an animal. Hence, if you were treating someone inhumanely, you would be treating them in a manner equivalent to (or worse than) an animal. This distinction exists, of course, because most humans elevate their own importance above all other animal and plant life.

Sentience is very tricky to pin down; it is normally defined as awareness plus the ability to experience feelings or sensations.

Of course, that definition came about back in the 1600s. Feelings and sensations are just a bunch of electrical impulses interpreted by your brain, there to make the human act in a specific manner because, at some point historically, acting in that manner would have increased the odds of survival. So does the definition apply to an android? That depends on whether you insist on keeping the words "feelings" and "sensations" in it...

I personally wouldn't justify keeping a gorilla as a slave (assuming it wouldn't just tear my arms off), so I certainly wouldn't justify Neanderthal slaves.

This can be applied to the modern day. Do you think the great apes deserve a "right to life"? If so, what other animals would you extend that to?
