baileyroche t1_ja6w62y wrote
Reply to comment by Yung-Split in AI cannot achieve consciousness without a body. by seethehappymoron
Search “Urbach-Wiethe disease.”
ErisWheel t1_ja851wd wrote
>Urbach-Wiethe disease.
You're misunderstanding the disease that you're referencing. The limbic system is a complex neurological system involving multiple regions of the brain working in concert to perform a variety of tasks: essential hormonal regulation of things like temperature and metabolism, modulation of fundamental drives like hunger and thirst, emotional regulation, and memory formation and storage. It includes the hypothalamus and thalamus, the hippocampus, and the amygdala. Total absence of the limbic system would be incompatible with life.
Urbach-Wiethe patients often show varying levels of calcification in the amygdala, which leads to a corresponding degree of cognitive impairment and a "fearlessness" that would be atypical in a person without that kind of neurological damage. The limbic system is not "absent" in these patients. Rather, a portion of it is damaged and the function of that portion is impaired to some extent.
baileyroche t1_ja8kaqt wrote
Ok fair. It is not the entire limbic system that is gone in those patients.
ErisWheel t1_ja99svb wrote
Yeah, sorry if it seemed nit-picky, but I think these are important distinctions when we're talking about where consciousness comes from, or about which disparate elements might or might not be necessary conditions for it. Missing the entire limbic system and still having consciousness is almost certainly impossible without some sort of supernatural explanation for the latter.
Similarly, with locked-in syndrome, I think there's some argument there about whether we really would know if those patients were conscious in the absence of some sort of external indicator. What does "consciousness" entail, and is it the same as "response to stimuli"? If they really can't "feel, speak or interact with the world" in any way, what is it exactly that serves as independent confirmation that they are actually conscious?
It's an interesting quandary when it comes to AI. I think this professor's argument falls pretty flat, at least in the short summary of it that's being offered. He's saying things like "all information is equally valuable to AI" and "dopamine-driven energy leads to intention," which is somehow synonymous with "feeling" and therefore with consciousness. But those points aren't well supported, so unless there's more that we're not seeing, the dismissal of consciousness in AI is pretty thin as presented.
In my opinion, it doesn't seem likely that what we currently know as AI would have something that could reasonably be called "consciousness", but a different reply above brought up an interesting point - when a series of increasingly nuanced pass/fail logical operations gets you to complex formulations that appear indistinguishable from thought, what is that exactly? It's hard to know how we would really separate that sort of "instantaneous operational output" from consciousness if it became sophisticated enough. And with an AI, just given how fast it could learn, it almost certainly would become that sophisticated, and incredibly quickly at that.
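To make that "pass/fail operations" point a bit more concrete, here's a toy sketch. It's purely illustrative, not a description of how any actual AI system works; the unit count, weights, and thresholds are arbitrary choices. The idea is just that every individual operation is a hard yes/no, yet the aggregate output varies smoothly and starts to look like a graded "judgment" rather than a pile of binary checks:

```python
import random

def make_unit():
    """A strict pass/fail check: fires iff a random weighted sum of inputs crosses a threshold."""
    weights = [random.uniform(-1, 1) for _ in range(4)]
    threshold = random.uniform(-0.5, 0.5)
    return lambda x: 1 if sum(w * xi for w, xi in zip(weights, x)) > threshold else 0

random.seed(0)
units = [make_unit() for _ in range(10_000)]

def macro_response(x):
    """Fraction of units that 'pass': a smooth-looking output built entirely from binary steps."""
    return sum(u(x) for u in units) / len(units)

# Nudging the input slightly shifts the aggregate response gradually,
# even though no single operation is anything but a hard yes/no.
for scale in (0.0, 0.25, 0.5, 0.75, 1.0):
    x = [scale, -scale, 0.3, 0.7]
    print(f"input scale {scale:.2f} -> response {macro_response(x):.3f}")
```

Scale that up by many orders of magnitude and it gets hard to say, from the outside, where "operational output" ends and something thought-like begins, which is roughly the puzzle.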
In a lot of ways, it doesn't seem all that different from arguments surrounding strong determinism in regards to free will. We really don't know how "rigid" our own conscious processes are, or how beholden they might be to small-scale neurochemical interactions that we're unable to observe or influence directly. If it turns out that our consciousness is emerging as something like "macro-level" awareness arising from strongly-determined neurochemical interactions, it's difficult to see how that sort of scenario is all that much different from an AI running billions of logical operations around a problem to arrive at an "answer" that could appear as nuanced and emotional as our conscious thoughts ever did. The definition of consciousness might have to be expanded, but I don't think it's a wild enough stretch to assume that it's "breathless panic" to wonder about it. I think we agree that the article isn't all that great.