
phillythompson t1_jar7p83 wrote

We don’t know how other minds work, either. Animals and all that you listed, I mean.

And complexity doesn’t imply… anything, really. And you have a misunderstanding of what LLMs do — they aren’t necessarily “memorizing.” They predict the next token based on patterns learned from a massive amount of data, conditioned on a given input.
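To make “predicting the next token” concrete, here’s a toy sketch — a simple bigram counter, nothing remotely like a real transformer, just an illustration of the idea of predicting what comes next from observed data:

```python
from collections import Counter, defaultdict

# Tiny made-up training corpus (illustrative only).
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each token follows each other token.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(token):
    # Return the most frequent follower of `token` seen in training.
    return counts[token].most_common(1)[0][0]

print(predict_next("the"))  # "cat" — seen twice after "the", vs. "mat" once
```

A real LLM does this over vast text with a learned neural model rather than raw counts, but the core task — next-token prediction, not lookup of memorized strings — is the same.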

I’d argue that it’s not clear we are any different than that. Note I’m not claiming we are the same! I am simply saying I don’t see evidence to say with certainty that we are different / special.
