
ArcticWinterZzZ t1_jdt0dyi wrote

You are correct that chain-of-thought prompting works for this, because it gives the model more steps to run an algorithm and reach the answer. I'm specifically talking about "instant" multiplication. Yes, GPT-4 can multiply, so long as it runs the algorithm manually. We then run into a small hitch because it will eventually hit its context window, but this can be circumvented, and Reflexion and similar methods will also help to circumvent it.
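For illustration (my own sketch, not anything GPT-4 literally executes), the algorithm in question is just schoolbook long multiplication; chain-of-thought prompting effectively walks the model through it one partial product at a time instead of demanding the answer in a single step:

```python
# Sketch of schoolbook long multiplication, the step-by-step procedure
# a chain-of-thought prompt would have the model spell out.
def long_multiply(a: int, b: int) -> int:
    result = 0
    shift = 0
    for digit in reversed(str(b)):                 # take one digit of b at a time
        partial = a * int(digit) * (10 ** shift)   # one partial product per "step"
        result += partial                          # accumulate, as each CoT step would
        shift += 1
    return result

print(long_multiply(1234, 5678))  # 7006652
```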

As for SIMPLE specific tasks, I really don't think there are any GPT-4 can't do, at least not with an introspection step.


Kolinnor t1_jdughns wrote

But I don't understand your point? Humans don't do instant multiplication either. At best, we have mental tricks, which are certainly algorithms too, or we choose to allocate more effort to long multiplication when needed.
