
EvilKatta t1_jdr3atm wrote

Humans process language multi-modally. We don't just predict the next word (although we do that as well), we also visualize. We decode language into images projected onto an internal screen that we're not consciously aware of (read Louder Than Words by B. Bergen on that). We can imagine 2 as two objects, 3 as three, imagine all kinds of transformations and rotations of those objects, and use all kinds of internal shortcuts to do arithmetic.
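To make the "numbers as objects" idea concrete, here's a toy Python sketch (just an illustration of the mental shortcut, not a claim about how brains or models actually work):

```python
# Toy model of "seeing" numbers as groups of objects: 2 + 3 becomes
# two dots placed next to three dots, then counted.
def as_objects(n: int) -> list:
    return ["o"] * n

def add_by_counting(a: int, b: int) -> int:
    combined = as_objects(a) + as_objects(b)  # put both groups together
    return len(combined)                      # count what's there

print(add_by_counting(2, 3))  # 5
```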

Or we can pick up a calculator and use that. That's another thing language models lack, even though they run on a "computer".

I believe that when AIs are given these capabilities, they'll do math "out of the box", no problem.
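For example, here's a minimal sketch of what "giving a language model a calculator" could look like: the model emits a marker like CALC(...), and the surrounding code does the actual arithmetic. The llm_generate function is a hypothetical stand-in for a real model call:

```python
import re

# Hypothetical stand-in for a language model call; a real system would
# query an actual LLM here.
def llm_generate(prompt: str) -> str:
    return "The answer is CALC(127 * 49)."

def run_with_calculator(prompt: str) -> str:
    draft = llm_generate(prompt)

    # Wherever the model emitted CALC(...), do the arithmetic with a
    # real calculator instead of trusting the model's own math.
    def evaluate(match: re.Match) -> str:
        expression = match.group(1)
        # Only allow plain arithmetic characters before eval'ing.
        if not re.fullmatch(r"[\d\s+\-*/().]+", expression):
            return match.group(0)
        return str(eval(expression))

    return re.sub(r"CALC\(([^)]*)\)", evaluate, draft)

print(run_with_calculator("What is 127 * 49?"))
# -> The answer is 6223.
```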

6

EvilKatta t1_j9inlfj wrote

Predictably, you can't answer this question without defining emotions or at least the lack of emotions.

Let me try: emotions are an extrarational drive that informs the thinking process. This drive is consistent (i.e. it follows some kind of logic), but it doesn't come from the thought process itself. It co-pilots decision-making: for example, it "punishes" the rational mind for "wrong" decisions, "rewards" it for good and timely outcomes, etc.
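As an aside, this maps pretty neatly onto a reward signal in reinforcement learning. A toy sketch (all names and numbers made up) of an "extrarational drive" nudging a decision-maker:

```python
import random

# Toy sketch: a chooser picks actions, while a separate "emotion"
# signal (consistent, but not produced by the reasoning itself)
# rewards or punishes outcomes and shifts future preferences --
# roughly a reward signal in reinforcement learning.
ACTIONS = ["rest", "work", "procrastinate"]
preference = {a: 0.0 for a in ACTIONS}

def emotion(action: str) -> float:
    # Extrarational but consistent: fixed reactions, not reasoning.
    return {"rest": 0.5, "work": 1.0, "procrastinate": -1.0}[action]

for _ in range(500):
    if random.random() < 0.2:                  # sometimes explore
        action = random.choice(ACTIONS)
    else:                                      # usually exploit
        action = max(preference, key=preference.get)
    # The emotional "reward"/"punishment" co-pilots the next decision.
    preference[action] += 0.1 * (emotion(action) - preference[action])

print(preference)  # "work" typically ends up favored
```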

Right now, AIs basically have their training and user prompts for that. In the future, self-guided AIs will have training frameworks of their own in place, like a set of moral values. So I think yes, one way you can describe that is "having emotions".

1

EvilKatta t1_j99kpma wrote

Even with more primitive AI systems like AI Dungeon, you can have fun and gain insights from a conversation. Actually, I think you can do this with a piece of paper if you establish the right process. We humans really do live in our heads, and we don't need much beyond permission to explore our headspace. That's probably where the practice of augury comes from.

1