kermunnist t1_j8dm21f wrote

Does AGI necessarily need to be sentient? Could a very powerful and reliable LLM that can be accurately trained on any human task, without actually being sentient or self-aware, be considered AGI? To me that's not only AGI, but a better AGI, because then there are no ethical dilemmas.

9

el_chaquiste t1_j8e0q6b wrote

I think it doesn't need to have consciousness to have sentient-like behaviors. It could be a philosophical zombie: copying most if not all of the behaviors of a conscious being while being devoid of consciousness itself, and revealing that absence in certain interactions like this one.

It may be that consciousness is a causal byproduct of the neural networks required for our intelligence, and we might very well have survived without it.

6

Lawjarp2 t1_j8e1m4t wrote

To be truly general, and not just a broad collection of narrow intelligences, it needs to have a concept of self, which is widely believed to give rise to sentience.

It could have sentience and still be controlled. Is that ethical? I'd like to think it's about as ethical as keeping pets, or farming and eating billions of animals.

As these models get better, they will eventually be given true episodic memory (a sense of time, if you will) and the ability to rethink. A sense of self should arise from that.

3

Capitaclism t1_j8fxo1s wrote

Eventually we will be the ones farmed, or eaten, or simply left aside.

1

Naomi2221 t1_j8gtqzg wrote

I fear intelligence without awareness much more than I fear awareness. It is action without awareness that causes cruelty and harm.

1

Capitaclism t1_j8gvfi4 wrote

Sort of, yes. It's the people behind the unaware acts who cause cruelty and harm. In this case, though, it could be wholly unintentional, akin to the paperclip idea: tell a superintelligent, all-powerful, unaware being to make the best paperclip, and it may achieve that goal to the doom of us all, consuming every resource in pursuit of goal completion.

As a species, I don't see how we survive if we don't become integrated with our creation.

2

Naomi2221 t1_j8gvoxh wrote

Open to that. And I'm also open to awareness being something that emerges from a sufficiently complex neural network with spontaneous world models.

The two aren't mutually exclusive.

1