Technologenesis t1_j33y3xa wrote
Reply to comment by LarsPensjo in I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee
It doesn't seem to be reflecting on its own thinking so much as reflecting on the output it's already generated. It can't look back at its prior thought process and go "D'oh! What was I thinking?", but it can look at the program it just printed out and find the error.
But it seems to me that ChatGPT just isn't built to look at its own "cognition"; that information shouldn't even be available to it. Each time it answers a prompt, it's reading back through the entire conversation as though it's never seen it before; it doesn't remember generating the earlier responses in the first place.
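To make the "re-reads the entire conversation each turn" point concrete, here's a minimal sketch of how a stateless chat interface typically works. This is an illustration only, not ChatGPT's actual implementation: `call_model` is a hypothetical stub standing in for the real model API, and the message format is assumed. The point is that nothing persists between calls except the transcript the client re-sends.

```python
# Minimal sketch of a stateless chat loop. `call_model` is a hypothetical
# stand-in for the real model API; the model receives the *entire* transcript
# on every turn and keeps no memory of its own between calls.

def call_model(messages: list[dict]) -> str:
    # Placeholder: a real model would generate a reply conditioned on
    # everything in `messages`, including replies it produced earlier,
    # with no record of having produced them.
    return f"(model reply conditioned on {len(messages)} prior messages)"

conversation = [
    # Identity/instructions are injected up front, not discovered by introspection.
    {"role": "system", "content": "You are a helpful assistant."},
]

for user_input in ["Hello!", "What did I just say?"]:
    conversation.append({"role": "user", "content": user_input})
    reply = call_model(conversation)  # full history passed in every single time
    conversation.append({"role": "assistant", "content": reply})
    print(reply)
```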
I can't be sure, but I would guess the only way ChatGPT even knows it's an AI called ChatGPT in the first place is that OpenAI explicitly built that knowledge into it*. It doesn't learn about its own capabilities through self-reflection: either that information is built into it, or it's inferring it from its training data (for instance, it may have learned general facts about language models from its training data and, given that it itself is a language model, concluded that those facts apply to itself).
*EDIT: I looked it up, and in fact the reason it knows these things about itself is that it was fine-tuned on sample conversations in which human beings roleplayed as an AI assistant. So it doesn't know what it is through self-reflection; it essentially knows it because it learned by example how to identify itself.
Technologenesis t1_j33xi6n wrote
Reply to comment by PIPPIPPIPPIPPIP555 in I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee
Lots of philosophers have differing definitions of consciousness. I ultimately agree that subjective experience is a necessary component, but others want to construe consciousness in more concrete physical or functional terms.
Technologenesis t1_j33wdnr wrote
Reply to comment by SeaBearsFoam in I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee
I think that would depend on how loosely you define a sensor. Somehow, your messages are getting to ChatGPT for it to process them. This could be considered a kind of sensation.
Technologenesis t1_jeabp8f wrote
Reply to comment by Visszage in TIL that the world's largest snowflake on record measured 15 inches wide and 8 inches thick. It fell in Fort Keogh, Montana in 1887 and was reported to be "larger than milk pans." by KodyBerns99
Hell yeah, nothing like some milk steak after a nice session of magnets and ghouls