
Metacognitor t1_j941yl1 wrote

Reply to comment by KPTN25 in [D] Please stop by [deleted]

My question was more rhetorical, as in: what would be capable of producing sentience? I don't believe anyone actually knows, which makes any definitive statements of that nature (like yours above) come across as presumptuous. Just my opinion.

1

KPTN25 t1_j94a1y0 wrote

Nah. Negatives are a lot easier to prove than positives in this case. LLMs aren't able to produce sentience for the same reason a peanut butter sandwich can't produce sentience.

Just because I don't know positively how to achieve eternal youth doesn't invalidate the fact that I'm quite confident it isn't McDonald's.

3

Metacognitor t1_j94ois4 wrote

That's a fair enough point; I can see where you're coming from on that. My perspective, though, is that as the models become increasingly large, to the point of being almost entirely a "black box" from a dev perspective, maybe something resembling sentience could emerge spontaneously as a function of some type of self-referential or evaluative model within the primary one. It would obviously be a more limited form of sentience (not human-level), but perhaps.

0