
AdditionalPizza OP t1_ivzza75 wrote

The definition of AGI is an AI that can learn any task a human can. Most people presume that also means the AI would have to be equal to or better than a human at those tasks.

I don't know where the idea came from that AGI has to be conscious. As far as I'm aware that's never been the definition. It's a talking point often associated with AGI and brought up in discussions of the Turing Test, but contrary to your experience, I've never heard anyone claim it's a requirement of AGI outside of this sub.

I also see other mixed-up definitions in this sub. A lot of people use "the singularity" to refer to the years (or decades) leading up to it, rather than the actual moment of the singularity itself.

7