
Schopenschluter t1_j6oqfwf wrote

> timeless, thoughtless void

I would argue that time is absolutely essential to anything we call experience and consciousness—these only take place in time. Dreamless sleep is neither experience nor consciousness, but really the absence thereof. We don’t really know what it’s like to be in this “inanimate” state because we always reconstruct it after the fact through metaphors and negations (timeless, thoughtless, dreamless).

In other words, I don't think this is evidence for panpsychism but rather demonstrates that human consciousness shuts down completely at times. So saying that it is akin to the consciousness of, say, a stone would be to say that a stone doesn't have consciousness at all.


tkuiper t1_j6otjpd wrote

But I would also say we experience middling states between dreamless sleep and full consciousness. Dreams, partial lucidity, and heavy inebriation all come with fragmented/shortened/discontinuous senses of time. In those states my consciousness is definitely less complete, but still present. Unconsciousness represents the lower limit of the scale, but it is not conceptually separate from the scale.

What I derive from this is that anything can be considered conscious, so the magnitude is what we really need to consider. AI is already conscious, but so are ants. We don't give much weight to the consciousness of ants because it's a very dim level. A consciousness like a computer's, for example, has no sense of displeasure at all. It's conscious, but not in a way that invites moral concern, which I think is what we're really getting at: when do we need to extend moral considerations to AI? If we keep AI emotionally inert, we don't need to, regardless of how intelligent it becomes. We will also have a hard time grasping its values, which is an entirely different type of hazard.


Schopenschluter t1_j6ozacy wrote

I totally agree about middling and “dim” states of consciousness but I don’t agree that experience or consciousness takes place at the lowest limit of the scale, where there would be zero temporality or awareness thereof.

In this sense, I think of the "scale" of consciousness more like a dimmable light switch: you can bring it very, very close to the bottom and still have some light, but when you finally push it all the way down, the light goes out.

Are computers aware (however dimly) of their processing happening in time, or does it just happen? That, to me, is the fundamental question.
