iamtheonewhorox t1_j66is99 wrote
Thanks, ChatGPT, for a well-written, informative post, and thanks to the OP for providing the posting hardware interface.
iamtheonewhorox t1_j66ilge wrote
One of the few well-written, well-considered and informative posts on this sub or anywhere else on Reddit!
iamtheonewhorox t1_j1qn37r wrote
I think that, considering it's v1 of the first model of its kind, it did pretty well to tell a joke that almost works. The fact that it can create something like that on its own, rather than simply regurgitating jokes it finds in its training data, is pretty amazing.
iamtheonewhorox t1_jdyclah wrote
Reply to The goalposts for "I'll believe it's real AI when..." have moved to "literally duplicate Einstein" by Yuli-Ban
The primary argument that LLMs are "simply" very sophisticated next-word predictors misses the point on several levels simultaneously.
First, there's plenty of evidence that this is more or less what human brains "simply" do, or at least a very large part of what they do. The human mind "simply" imputes, heuristically, all kinds of visual and auditory data that is never actually received as signal; it fills in the gaps. Mostly this works. Sometimes it produces hallucinated results.
Second, the researchers most deeply involved in building these models are clear that they do not fully know how they work. There is a definite black-box quality: the process that produces the output is "simply" unknown, and possibly unknowable. The output has an emergent quality that is not directly reducible to the base function of next-word prediction, just as the output of the human mind is not a direct property of its heuristic machinery. There is a dynamic, self-organizing emergence at play that is not a "simple" input-output function.
Anyone who "simply" spends enough time with these models and pushes their boundaries can observe this. But if you "simply" take a reductionist, deterministic, mechanistic view of a system that is none of those things, you are "simply" going to miss the point.
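Footnote for anyone who wants the "next word predictor" baseline made concrete: below is a deliberately toy sketch in Python, a hand-counted bigram model that is nothing remotely like a transformer. The corpus, names, and structure are mine, purely for illustration. It picks each next word by sampling from the distribution of words observed to follow the current one; a real LLM conditions on a huge context through billions of learned weights, but the output interface is the same, a probability distribution over the next token. The gap between this toy and what people actually observe in these models is exactly the point above.

    import random
    from collections import defaultdict

    # Toy corpus; real models train on trillions of tokens.
    corpus = "the cat sat on the mat and the cat slept".split()

    # Count word -> next-word frequencies (a bigram table).
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1

    def predict_next(word):
        """Sample the next word in proportion to observed bigram counts."""
        followers = counts[word]
        if not followers:  # dead end: word never seen with a successor
            return None
        words = list(followers)
        weights = [followers[w] for w in words]
        return random.choices(words, weights=weights)[0]

    # Generate a short continuation, one predicted word at a time.
    word, output = "the", ["the"]
    for _ in range(5):
        word = predict_next(word)
        if word is None:
            break
        output.append(word)
    print(" ".join(output))  # e.g. "the cat sat on the mat"

Note that even here the output is sampled, not looked up: the model composes a sequence it was never given verbatim. Scale that mechanism up by many orders of magnitude and the "simply a predictor" framing stops doing any explanatory work.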