Submitted by BernardJOrtcutt t3_10df9ua in philosophy
Perrr333 t1_j4scb07 wrote
Reply to comment by Saadiqfhs in /r/philosophy Open Discussion Thread | January 16, 2023 by BernardJOrtcutt
I haven't read those comics, but I will say that anyone who reserves moral value strictly for humans is an idiot. Consider an alien species of equal intelligence and a similar capacity to feel pain coming to visit us. It would be just as wrong to torture them as to torture humans, other concerns aside (such as some justification for torture like gaining vital information). Likewise, if some species here on Earth were to develop faculties similar to ours, it would be wrong to deem them of lower ethical value just because they do not share our biology. This is wrong for exactly the same reason that racism is wrong: it is not our biology that defines us, but our faculties. I have a problem with the term "humanism" for this reason, though since there aren't currently any species comparable to us, it's not really a live issue yet.

In my opinion this means a good ethical theory must ground ethical value in something other than biology. For me the most viable option is sentience, though others look to the experience of pain and pleasure. Theists have to contort themselves around the musings of ancient books written and rewritten over time by many different people of questionable intelligence, motives and sanity, and I imagine they may run into trouble because the various authors didn't have the forethought to consider non-humans similar to humans in intelligence, since sci-fi hadn't been invented yet.
Saadiqfhs t1_j4v4a3h wrote
Can you extend that to a machine? If you grant a machine the feeling of pain, of love and loss, can it become free?
Similarly, could I create a sub-human with the same intelligence as a gorilla and make it subservient?
Perrr333 t1_j4xc94j wrote
There's nothing in principle about machines that must forbid sentience and ethical value. Suppose we were able to build a perfect mechanical reproduction of the brain using computer parts that interact three-dimensionally, in exactly the same manner as neurons. Assume further that, using this replica together with the necessary input sensors and output body movements, we were able to recreate exactly the electrical activity that occurs within the brain. This artificial brain would then be doing exactly the same thing as a human brain, just on a substrate of silicon or some other material rather than flesh. The use of flesh has nothing to do with sentience, so this artificial brain would exhibit exactly the same sentience and the same kinds of thoughts and feelings as ours. It seems obvious to me that it should then receive exactly the same ethical value as us.
The issue is that there are 86 billion neurons in the human brain, they interact three-dimensionally, chemicals can flow throughout the brain via the blood, and failing to provide appropriate input sensors and output movements might cause insanity. So we may never be able to build this. In principle, if we perfectly understood how everything worked and tied together, we could simulate the entire thing on an incredibly powerful computer without building it. But we are nowhere near that sort of understanding, so again it's unclear whether we ever can.
In sci-fi these real-world concerns about the feasibility of construction are pushed aside. But we are also often asked to deal with artificial intelligences that are conscious yet different from our own, often radically so. The same applies to the inherently racist/speciesist term "sub-human". Here ethical comparison becomes problematic, and may differ from case to case. I personally would value any intelligence considered sentient equally; other philosophers talk of "personhood". Importantly, a being's greater or lesser intelligence shouldn't grant it more or less ethical value, just as humans with higher IQs are not of greater ethical value.
Saadiqfhs t1_j566csf wrote
Do you extend this to animals that show intelligence, but not to the same degree as humans? Perhaps instead of sub-human (Neanderthal) clones, it's a Planet of the Apes situation and we found a way to make apes a servant class. Would the ability to complete complex tasks be enough to justify emancipation?
Perrr333 t1_j595vne wrote
Good questions. I think whenever anything starts approaching human-level intelligence we need to ask them. I haven't got any answers for you, though, as I would have to look into the literature and have more of a think to make up my mind.