
turnip_burrito t1_ivta5az wrote

I'm sorry, but I think we are operating on different definitions of "conscious", which as we know is a common problem since it's a very liberally used word. I think this is causing me to have trouble following. If you would please kindly define it for me, then I think I will understand your statements.

What is the definition of "conscious" in your writing? And in a similar vein, what measurements or observations (if any) could be done to show something "has" it? I think this would clarify a lot for me.

1

marvinthedog t1_ivvbtnr wrote

Ok, I had to look up the ambiguity around consciousness because although I had heard of it I didn't know a lot about it: https://en.wikipedia.org/wiki/Hard_problem_of_consciousness

I read the first half and found a lot of the concepts a little confusing. I am pretty sure I have read this article before even though it was a long time ago.

I guess I am referring to the actual raw conscious experience, you know, the thing that stands out from all other existing things in an infinitely profound way, the thing that could be argued to be the only thing that holds any real value or disvalue in the universe.

So if I get the article right, I guess that's the hard problem of consciousness and not the easy problem. So I don't mean self-consciousness, awareness, the state of being awake, and so on. I mean the actual raw conscious experience. To quote Thomas Nagel: "the feeling of what it is like to be something".

I don't think any truly objective measures could ever be done to test whether something is conscious (has this raw conscious experience). But I do think high-confidence estimates could be made in some or many situations, for instance by looking at the internal mechanics and behaviours of systems and comparing them to other systems that we know are conscious.

I would be happy to clarify further if you have further questions.


So if we go back to my thought experiment: the way I described consciousness with words previously is an output behaviour from a human (me). I think we can both agree that this specific output behaviour is a direct causation of me being conscious and not just a random correlation with me being conscious. It's not like me writing those very specific word sequences previously has nothing to do with the fact that I am conscious, and that the correlation just happened by random chance, right?

So, if a replica outputs a similar sequence of words, it's extremely unlikely that that very specific output behaviour just happened by random chance and has nothing to do with consciousness whatsoever. Don't you agree?

1

turnip_burrito t1_ivvpv6p wrote

Thanks for the clarification. I suspected that is what you intended by the term, but was not sure. My view probably most reflects Chalmers'. I agree with everything you've written except for these last two paragraphs:

>So if we go back to my thought experiment: the way I described consciousness with words previously is an output behaviour from a human (me). I think we can both agree that this specific output behaviour is a direct causation of me being conscious and not just a random correlation with me being conscious. It's not like me writing those very specific word sequences previously has nothing to do with the fact that I am conscious, and that the correlation just happened by random chance, right?

I disagree with this. I agree that it is not a random correlation, but I would say your output behavior as described by an external observer does not require any information about your conscious experience. I would say that for any external observer, the physical, functional processes that occur in your brain are enough description to know what behavioral measurements I will have of you in the future (except for quantum effects), and that your consciousness is the qualia of those brain processes. It is neither a random correlation nor consciousness causing neural activity, but instead a direct, non-random correlation between externally measurable brain states and your consciousness. What this means specifically about who causes what is a little flexible, but I would speculate this:

  1. Physics is inherently a description of how parts of existence interact with other parts. Consciousness is some subset of existence, at the most basic level of existence. If this is the case, conscious experience and physics are the two, and only, fundamental parts of existence. The internal physics of a thing is directly correlated one-to-one with the consciousness of the thing, but we cannot know the correlation. (Also, "thing" is a fuzzy term here.)

  2. As a consequence of (1), physics completely determines output behavior. Consciousness has no useful explanatory power for anything measurable or observable in the external world, but the reverse is also (presently) true: the internal physics of an object cannot be traced by humans to the kind of conscious experience it has, because the correlation cannot be described or known by any method we have access to.

>So, if a replica outputs a similar sequence of words it´s extremely unlikely that that very specific output behaviour just happened by random chance

Yes. But it's because of the physics only, and consciousness is irrelevant.

>and has nothing to do with consciousness whatsoever. Don't you agree?

Consciousness and behavior have a connection, but not one in which consciousness is necessary for any behavior. They are both instead (I would suppose) concurrent. (See speculation in point 1).

Summary: I would say the unconscious (or conscious) machine has a 100% probability of behaving exactly like the conscious human it is modeled after (except for chaos and quantum effects), so we are unable to tell the difference between a conscious and unconscious entity from external observation of its behavior.

1

marvinthedog t1_ivzoiuc wrote

I have carefully read through your post at least 5 times throughout the day. Most of your points are still quite confusing to me, so it's difficult for me to address it all, even though it's interesting.


It almost seems like you are saying that it's impossible to even make probabilistic estimates about consciousness. But what about other humans then, how do you know they are conscious? If it stands between a replica of you on a silicon substrate and another human, which one of them would you be able to give the most confident estimate about whether they were conscious or not? You know you are conscious, and we could certainly make a strong case that the one most identical to you with regards to inner physical functionality is your replica. So it seems like you would be able to give the most confident consciousness estimate to your replica and not the other human. Do you agree?

1

turnip_burrito t1_iw09nxs wrote

I apologize if my wording is unclear. It's also not a very commonly talked about idea, so constructing the vocabulary to discuss it was challenging for me.

>It almost seems like you are saying that it´s impossible to even make probabilistic estimates about consciousness.

Yes, presently impossible except for making probabilistic statements about other humans. I don't know they are conscious for sure, but I think they probably are conscious. This is because I know this: I am conscious and I am biologically human. This is the only sample I have, so when rating probability of consciousness, I would put other human brains at the top of the list (most likely conscious), animal brains next, and everything else in descending probability of consciousness. Something like a frozen rock, I would guess to not be conscious. The further something gets from biologically human, the less certain I am that it is conscious.

>If it stands between a replica of you on a silicon substrate and another human, which one of them would you be able to give the most confident estimate about whether they were conscious or not? You know you are conscious, and we could certainly make a strong case that the one most identical to you with regards to inner physical functionality is your replica. So it seems like you would be able to give the most confident consciousness estimate to your replica and not the other human. Do you agree?

No, I do not agree with this. I think the human is more likely to be conscious because it is made out of the same stuff as me. The robot acts like me, but it's a different substrate of system. Whether the robot is conscious or not is unknown to me. I don't currently see any reason to believe a robot that acts like me must be conscious, even if it says it is.

The other human is most similar to me in actual physics, even if they are a totally different person. Same molecules, structures, activation patterns, etc. The electric fields and quantum structures are similar. The robot brain could work in some bizarre, totally alien way in order to pretend to act like me (like a set of GPUs in a basement), and I have no clue if the physical structure of its "brain" actually correlates with a unified conscious experience like mine.

This is also why "mind uploading" to a different substrate like a computer chip, even if the technology existed, gives me pause. The chip may very well also be conscious, but I don't think I would be able to tell from its behavior or any physical measurements. If I had to kill myself to upload, I'd risk losing my consciousness to produce a chip that might not feel anything. That'd be a waste.

1

marvinthedog t1_iw1qs77 wrote

It seems you might have misunderstood me when you said you agreed with what I proposed in my thought experiment, because what I proposed was actually that your replica provides a lot stronger evidence for consciousness than the other human. You know you are conscious, and the one who has the most functionally similar physical neural architecture to you is your replica.


When all three of you describe consciousness in your own words, the neural processes in your head are a lot more similar to your replica's neural processes than to the other human's. For instance, you and your replica might be thinking mainly in pictures and be wizards at abstract math, while the other human might be thinking mainly in words and be exceptionally good at remembering facts or whatnot. Also, your written-down description of consciousness will be a lot closer to your replica's than to the other human's. So the fact that you seem to think that the human provides stronger evidence than the replica is very perplexing to me.


And you seem to think even some animals provide stronger evidence than your replica, which is even more perplexing. Animals cannot even communicate what consciousness is (at least not in a language we can understand), and their neural architecture is far more different from yours than your replica's is.

1

turnip_burrito t1_iw1s07i wrote

Yes, I misunderstood when I said I agreed; I actually disagree. I just edited my post to reflect that (apologies).

1

turnip_burrito t1_iw1scim wrote

No, the other humans and animals have more similarity to me than my silicon replica does, on a molecular level. They are made of organic compounds, neurons, glial cells, etc. Their internal chemistry is the same as mine. So I'm more confident in their consciousness. Other humans mostly only differ from me in concentration of compounds and specific network connections, but are otherwise the same.

The replica could run on GPUs and be made of silicon. It could also be a series of gears and pulleys. Or some absurd series of jello cups and iron marbles dropped and retrieved over and over to perform computations, which are then read out to a screen as English. That's not a similar molecular makeup to me at all. I don't know if quantum correlations or temporal correlations or whatever is necessary for consciousness are preserved in this new substrate.

Just because we look at the replica and say "it's computing using primarily visual information, like me" isn't helpful for showing consciousness, because we have no evidence of silicon, pulleys, or planet-sized warehouses of jello being conscious. It's like comparing a bat and a bee and saying they both share the same diet because they both fly. A robot me and the real me don't necessarily share the same conscious experience just because our behavior is the same. We could, but how would we know? At least humans are made of basically the same stuff.

As I said, I don't believe consciousness affects behavior. I don't believe consciousness affects a robot's ability to mimic me. I am considering what it is, not what it appears to be. I think physics probably is the only thing that determines behavior, and it leaves no room for any unphysical thing to determine behavior. In other words, a mimic robot could act like me and still be unconscious because it is simply just built to do that and is following physics. It does what it is constructed to do, conscious or not, because the particles that make it up obey physics.

I also think humans do only what their physics makes them do, by the way. But we (probably) also happen to be conscious. So we experience as we move and think, but in a more passive, passenger-type way than we perceive or want to admit.

1

marvinthedog t1_iw8b0sv wrote

I have read your previous response, which you updated, and your last response, which you also updated. At this point I don't think we are going to get a lot further. This discussion really helped me clarify my own mental models about consciousness, so it was very useful. Thanks for an interesting discussion!

3