
konwiddak t1_j5w49gs wrote

Humor is a good test of whether a language model can create subtle and intricate links between words and concepts - but it doesn't directly indicate sentience. Something like GPT-3 could probably be adapted to write decent jokes; it's an incredible language model that can appear sentient at first. However, it's not sentient, because it's just a model where an input maps to a deterministic output. There's no continuous loop of input-learning-adaptation-output like the one that comes with a sentient being. The learning was a one-shot process, and the model doesn't change again until it's retrained.
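
To make the distinction concrete, here's a toy Python sketch (hypothetical classes and stub functions, not GPT-3's real internals or API) - the only question is where, if anywhere, the weights change:

    def generate(weights, prompt):
        # Stand-in for a forward pass: output depends only on weights + prompt.
        return f"response(weights={weights}, prompt={prompt!r})"

    class FrozenModel:
        """Trained once; answering a prompt never changes the weights."""
        def __init__(self, weights):
            self.weights = weights  # fixed by the one-shot training run

        def respond(self, prompt):
            return generate(self.weights, prompt)  # fixed mapping, no learning

    class OnlineLearner:
        """Hypothetical input -> learning -> adaptation -> output loop."""
        def __init__(self, weights):
            self.weights = weights

        def respond(self, prompt):
            output = generate(self.weights, prompt)
            self.weights += 1  # stand-in for adapting from the interaction itself
            return output

    frozen, online = FrozenModel(0), OnlineLearner(0)
    for prompt in ["tell me a joke", "tell me a joke"]:
        print(frozen.respond(prompt))  # identical both times
        print(online.respond(prompt))  # differs, because the model changed in between

The first class is all you get from a deployed GPT-3-style model; the second is the kind of ongoing loop I mean, and current LLMs don't do it between training runs.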

4