
phriot t1_ivzmj0u wrote

I feel like you focused on me leaving "level" out of that sentence, even though I included it earlier in my comment. You're basically just saying that your definition of AGI is more literal than the one I use. The point of my comment was just that, up until maybe finding this subreddit, every time I saw AGI used, it carried the connotation of consciousness.

It's probably splitting hairs, but it seems like people here just want to call any sufficiently capable general piece of software "AGI." Yes, a really great general artificial intelligence will help us in many areas, but that's not what I've always understood "AGI" to be.

2

AdditionalPizza OP t1_ivzza75 wrote

The definition of AGI is an AI that can learn any task a human can. Most people presume that would also mean the AI has to be equal to or better than a human at those tasks.

I don't know where the idea came from that AGI has to be conscious. As far as I'm aware, that's never been the definition. Consciousness is a talking point often associated with AGI and mentioned in discussions of the Turing test, but contrary to your experience, I've never heard anyone claim it's a requirement of AGI outside of this sub.

I also see other mixed-up definitions in this sub. A lot of people refer to "the singularity" as the years (or decades) leading up to it, rather than the actual moment of the singularity itself.

7