Surur t1_jcztz42 wrote

If you ask an LLM, it will happily assign a probability to 1+1=2. That probability would not be 100%, but it would be very close.

1
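
(Illustrative sketch, not from the thread: the candidate tokens and logit values below are invented, but the softmax step is the standard way a language model turns raw scores into next-token probabilities for a prompt like "1+1=".)

```python
# Hypothetical example: scoring candidate next tokens after the prompt "1+1=".
# The logits are made up for illustration; a real LLM computes them from its
# learned weights.
import numpy as np

def softmax(logits):
    """Turn raw scores into a probability distribution."""
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

tokens = ["2", "3", "11", "two"]          # assumed candidate continuations
logits = np.array([12.0, 2.0, 1.0, 3.0])  # invented scores favouring "2"

for tok, p in zip(tokens, softmax(logits)):
    print(f"P('{tok}' | '1+1=') = {p:.6f}")

# "2" comes out near 1.0 but never exactly 1.0 -- the model's answer is a
# probability, just an overwhelmingly confident one.
```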

Shiningc OP t1_jczxlg9 wrote

And 1+1=2 is a non-probabilistic answer that can't be arrived at through probabilities.

1

Surur t1_jd004in wrote

We are going in circles a bit, but your point, of course, is that current AI models can't do symbolic manipulation, which becomes very evident when they attempt complex maths.

The real question, however, is whether you can implement a classical algorithm in a probabilistic neural network, and the answer, of course, is yes.

Recurrent neural networks in particular are, in theory, Turing complete, so they can emulate any classical computer algorithm, including 1+1.

1
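
(A minimal sketch of that claim, under the assumption of hand-set rather than learned weights: a linear recurrent update h_t = W_h·h_{t-1} + W_x·x_t with both weights fixed to 1 simply accumulates its inputs, so the "network" computes 1+1 exactly and deterministically. It's a toy, not a proof of Turing completeness, but it shows a classical algorithm living inside a recurrent formulation.)

```python
# Toy recurrent network with hand-set (not learned) weights that performs
# exact integer addition by accumulating its input sequence.
import numpy as np

W_h = np.array([[1.0]])  # recurrent weight, chosen by hand
W_x = np.array([[1.0]])  # input weight, chosen by hand

def run_rnn(sequence):
    """Apply h_t = W_h @ h_{t-1} + W_x @ x_t over the whole input sequence."""
    h = np.zeros(1)
    for x in sequence:
        h = W_h @ h + W_x @ np.array([float(x)])
    return float(h[0])

print(run_rnn([1, 1]))  # -> 2.0, with no uncertainty attached
```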

Shiningc OP t1_jd1d3ns wrote

Again, how would you come up with mathematical axioms with just probabilities?

That contradicts Gödel's incompleteness theorems, which mathematically prove that you cannot derive the axioms of a mathematical system from within that system itself.

Even if you could replicate a biological neural network that happens to be Turing complete, that still says nothing about programming human-level intelligence, which is a different matter altogether.

1

Surur t1_jd2102f wrote

Are you implying some kind of divine intervention? Because, by definition, any one Turing-complete system can emulate any other.

1

Shiningc OP t1_jd217ty wrote

Yes, but in order to emulate something you'd have to program the emulation first.

1

Surur t1_jd219lr wrote

Evolution and exposure to data programmed humans.

1