michael_mullet t1_ix18qed wrote

Reply to comment by TemetN in 2023 predictions by ryusan8989

I think I understand you. Scale alone is likely enough for a non-volitional AGI, and that may be all that's needed to accelerate technological change. Humans can provide the volitional aspect of the system.

7

-ZeroRelevance- t1_ix1crev wrote

Do we even want a volitional AGI though? A non-volitional AGI seems like all the benefits with none of the problems, since the main draw of an AGI is its problem-solving ability, which doesn't require volition.

Also, it shouldn't have any trouble pretending to be one if we wanted it to, given how convincing current language models already are as chatbots. But in that case we'd ultimately stay in control, since a non-volitional AI would have no actual desire for things like self-preservation.

9

TemetN t1_ix1fdw6 wrote

This. Plus I think volition is unlikely to simply emerge, which means it will probably require its own dedicated research. And I don't see much call for, or effort at, research in that direction (Numenta? Mostly Numenta).

5