[deleted] t1_ivpj1jf wrote

Thanks, awesome post.

I am very skeptical of our intuition when it comes to the mind. We have gotten so much wrong that I would be surprised if it really turns out there is something "special" going on in the brain to justify this difference.

Nameless1995 t1_ivpm198 wrote

I think as long as we are not purely duplicating the brain, there will always be something different (by definition of not duplicating). The question then becomes the relevance of the difference. I think there is some plausibility to the idea that some "formal" elements of the brain associated with cognition can be simulated in machines, but would that be "sufficient" for "understanding"? This question partly hinges on semantics. We can choose to define understanding in a way such that it's fully a matter of achieving some high-level formal functional capabilities (abstracting away from the details of the concrete "matter" that realizes the functions). There is a good case to be made that perhaps it's better to think of mental states in terms of higher-level functional roles than "qualitative feels" (which is not to say there aren't qualitative feels, but that they need not be treated as "essential" to mental states -- their roles may as well be realized in analogous fashion without the same feels, or any feels). If we take such a stance, the question of having or lacking phenomenal feels (and phenomenal intentionality) becomes moot, because all that would matter for understanding would be a more abstracted level of formal functionality (which may as well be computational).

If, on the other hand, we decide to treat "phenomenal feels" (and "phenomenal intentionality") as "essential" to understanding (by definition -- again a semantics issue), then I think it's right to doubt whether any arbitrary realization of some higher-level abstracted behavioral form (abstracted away from phenomenal characters) would necessarily lead to having certain phenomenal feels.

Personally, I don't think it's too meaningful to focus on "phenomenal feels" for understanding. If I say "I understand 1+1=2" and try to reflect on what it means for me to understand that, the phenomenality of the experience seems to contribute very little, if anything -- beyond potentially serving as a "symbol" marking my understanding (a symbol represented by me feeling a certain way; non-phenomenal "symbols" may have served as well) -- but that "feeling" isn't true understanding, because it's just a feeling. Personally, then, I find the best way to characterize my understanding is by grounding it in my functional capabilities to describe and talk about 1+1=2, talk about number theories, and do arithmetic -- it then boils down to possession of "skills" (which becomes a matter of degree).

It may be possible that biological materials have something "special" that constitutes phenomenality-infused understanding, but that is hard to make out, given the problem of even determining public indicators for phenomenality.

[deleted] t1_ivppbkm wrote

I love philosophy but I admit I am very out of my element here haha. Never bothered with this subject.

From my naive understanding, the "mind" (which I already think is a bit of an arbitrary notion without a clear limit/definition) is composed of several elements, say X, Y, and Z, each one with its own sub-elements. As you say, unless we build an exact copy we are not gonna have all the elements (or we might even have a partial element, say understanding, without all the sub-elements that compose it).

I think, for example, that whatever elements compose moral relevance are obviously lacking in modern-day AI. That is not controversial. So apart from this, I find it very uninteresting to try to figure out whether a machine can "understand" exactly like a human or not.

So I think as long as we stick with very precise language, we can talk about it in a more meaningful way.

waffles2go2 t1_ivq5apy wrote

The "special" part is what we can't figure out. It's not any math that we are close to solving, and I really don't think we're even asking the right questions.

waffles2go2 t1_ivq412x wrote

Millions of years of evolution, yet we are going to figure it out with some vaguely understood maths that can currently solve only basic problems or produce bad artwork...

So pretty special....

[deleted] t1_ivq6ogt wrote

I am not claiming we are about to solve it, especially not in this field. I am claiming, though, that our intuitions have deceived us about certain concepts (see the entire personal identity debate), and it is very easy to think that we are "more" than we really are.

And we have some evidence of that: in the personal identity debate, for example, we need a certain intuition about our identity that's unifying across our lives, even if it turns out to be a kind of fantasy that's simply constructed this way because of its utility.

So I don't doubt that the same process is going on in our minds with concepts like consciousness and understanding.
