Submitted by BernardJOrtcutt t3_10df9ua in philosophy
Perrr333 t1_j4xc94j wrote
Reply to comment by Saadiqfhs in /r/philosophy Open Discussion Thread | January 16, 2023 by BernardJOrtcutt
There's nothing in principle about machines that must forbid sentience and ethical value. Suppose we were able to build a perfect mechanical reproduction of the brain using computer parts which interact three-dimensionally and in exactly the same manner as neurons. Suppose also that, using this replica combined with the necessary input sensors and output body movements, we were able exactly to recreate the electrical activity that occurs within the brain. This artificial brain would then be doing exactly the same thing as a human brain, just on a substrate of silicon or some other material rather than flesh. The use of flesh has nothing to do with sentience, so this artificial brain would exhibit exactly the same sentience and the same kinds of thoughts and feelings as ours. It seems obvious to me that it should then receive exactly the same ethical value as us.
The issue is that there are 86 billion neurons in the human brain, that they interact three-dimensionally, that chemicals can flow throughout the whole brain via the blood, and that failing to provide appropriate input sensors and output movements might cause insanity. So we may never be able to build this. In principle, if we perfectly understood how everything worked and tied together, we could simulate the entire thing without building it, using an incredibly powerful computer. But we are nowhere near that sort of understanding, so again it's unclear whether we ever can.
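To give a sense of what "simulating it" would even mean, here is a minimal sketch of a single simulated neuron using the leaky integrate-and-fire model (a standard, drastic simplification from computational neuroscience; the parameter values are illustrative, not biologically fitted). The point is the scale gap: one of these is trivial to run, but a brain-level simulation would need ~86 billion of them, densely interconnected, plus chemical signalling:

```python
# Toy leaky integrate-and-fire neuron: a drastic simplification of a real
# neuron, shown only to illustrate the scale problem. All parameters
# (tau, thresholds, resistance) are illustrative placeholders.

def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_reset=-65.0, v_threshold=-50.0, resistance=10.0):
    """Integrate membrane voltage over time; return spike times in ms.

    input_current: list of external input values, one per time step of dt ms.
    """
    v = v_rest
    spikes = []
    for step, i_ext in enumerate(input_current):
        # Leaky integration: voltage decays toward rest, driven by input.
        dv = (-(v - v_rest) + resistance * i_ext) / tau
        v += dv * dt
        if v >= v_threshold:      # threshold crossed: the neuron "fires"
            spikes.append(step * dt)
            v = v_reset           # voltage resets after a spike
    return spikes

# A constant input for 100 ms drives a regular train of spikes;
# zero input leaves the neuron silent.
spike_times = simulate_lif([2.0] * 1000)
```

Multiply this by 86 billion, wire the outputs of each neuron into the inputs of thousands of others, and add blood-borne chemistry, and the computational difficulty of the project becomes clear.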
In sci-fi these real-world concerns about feasibility of construction are pushed aside. But we are also often asked to consider artificial intelligences which are conscious but different from our own, often radically so. This is the same territory as the inherently racist/speciesist term "sub-human". Here ethical comparison becomes problematic, and may differ from case to case. I personally would value any intelligence considered sentient equally; other philosophers talk of "personhood". Importantly, a being's greater or lesser intelligence shouldn't grant it more or less ethical value, just as humans with higher IQ are not of greater ethical value.
Saadiqfhs t1_j566csf wrote
Do you extend this to animals that show intelligence, but not to the same degree as humans? Like perhaps instead of sub-human (Neanderthal) clones, it's a Planet of the Apes situation and we found a way to make apes a servant class? Would the ability to complete complex tasks be enough to justify emancipation?
Perrr333 t1_j595vne wrote
Good questions. I think whenever anything starts approaching human-level intelligence we need to ask them. I haven't got any answers for you though, as I would have to look into the literature and have more of a think to make up my mind.