
Objective_Fox_6321 t1_jdrfn25 wrote

It's really simple, actually: the LLM isn't doing the math. Its only goal is to guess which word/token comes next. Depending on the temperature and other sampling settings, the LLM outputs the most heavily weighted candidate.
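For intuition, here's a minimal sketch of that sampling step, using made-up logits rather than a real model's output. The temperature just rescales the scores before they're turned into probabilities:

```python
import numpy as np

# Hypothetical scores the model might assign to a few candidate next tokens
# (illustrative numbers only, not from a real model).
tokens = ["4", "5", "four", "banana"]
logits = np.array([3.2, 1.1, 2.5, -4.0])

def sample_next_token(logits, temperature=1.0, rng=np.random.default_rng(0)):
    # Lower temperature sharpens the distribution (more deterministic);
    # higher temperature flattens it (more random).
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return rng.choice(len(logits), p=probs)

print(tokens[sample_next_token(logits, temperature=0.2)])  # almost always "4"
print(tokens[sample_next_token(logits, temperature=2.0)])  # noticeably more random
```

The point is that nothing in that loop evaluates the arithmetic; it just picks whichever token scored highest for the context it was trained on.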

An LLM doesn't have a built-in calculator unless the user explicitly wires one in.

With LangChain, however, you can definitely have an LLM execute code, call a library, use external tools, etc., and perform tasks that aren't native to the model (rough sketch below).
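Something like this, assuming the early-2023 LangChain agent API (the imports have since been reorganized) and an OpenAI API key in the environment; treat it as a sketch rather than a drop-in recipe:

```python
# pip install langchain openai
from langchain.llms import OpenAI
from langchain.agents import load_tools, initialize_agent

llm = OpenAI(temperature=0)                 # the language model itself
tools = load_tools(["llm-math"], llm=llm)   # "llm-math" wraps a calculator tool
agent = initialize_agent(
    tools, llm, agent="zero-shot-react-description", verbose=True
)

# The model doesn't do the arithmetic; it decides to call the calculator tool,
# which evaluates the expression and hands the result back to the model.
agent.run("What is 1234 * 5678?")
```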

But you need to realize an LLM is more like a mad-lib generator, fine-tuned with specific weights for a particular style of language. Its goal is to model the text and predict the next word/token according to its parameters.
