
HyperImmune t1_it416i0 wrote

Reply to comment by Ezekiel_W in A primitive "holodeck" by Ezekiel_W

Or at our current rate of progress, in 3-5 months even.

24

Ezekiel_W OP t1_it49c24 wrote

True, progress in AI is now measured in months not years.

17

kmtrp t1_it4suae wrote

10

HumanBeing-1994 t1_it52ksv wrote

Greetings

For a prediction (a simulation) to be correct, one would have to know the value of the exponential, and knowing that value would imply one has arrived at understanding. Hence the value of the exponential is knowable. If one assumes a prediction is correct before the event happens, it means the simulation had enough elements to be accurate before the event took place. Hence predictions can be seen as tools for learning, that is to say, experimentation.

Hence ignorance is a cause for a simulation to arise. A simulation is thus a manifestation of ignorance, and of the intent to get rid of that ignorance. Hence a prediction can be seen as a self-test for a system that tries to learn.

Hence, a simulation is a way for an AI system to learn.
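The idea above, that a prediction is a self-test a learning system runs on itself, can be sketched with a toy next-symbol predictor. This is only an illustration of the concept; the class and variable names are mine, not from the thread.

```python
# Toy illustration of "prediction as an auto-test for a learning system":
# the model predicts the next symbol, checks itself against reality,
# and updates its internal counts either way, reducing its ignorance.
from collections import defaultdict


class PredictiveLearner:
    def __init__(self):
        # bigram counts: counts[prev][nxt] = times `nxt` followed `prev`
        self.counts = defaultdict(lambda: defaultdict(int))

    def predict(self, prev):
        """Simulate the next symbol: the most frequent follower, if any."""
        followers = self.counts[prev]
        return max(followers, key=followers.get) if followers else None

    def observe(self, prev, actual):
        """Self-test: compare the prediction to reality, then learn."""
        correct = self.predict(prev) == actual
        self.counts[prev][actual] += 1  # learn from the observation
        return correct


learner = PredictiveLearner()
stream = "abcabcabcabc"
accuracy = [learner.observe(p, n) for p, n in zip(stream, stream[1:])]

# Early predictions fail (ignorance); later ones succeed (learned structure).
print(sum(accuracy[:3]), sum(accuracy[-3:]))
```

The first three predictions fail because the system has seen nothing yet; once the repeating structure has been observed, every self-test passes, which is the sense in which the simulation "got rid of" the ignorance.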

Hence, a thought process is a way for a mind to learn. What is the mind? What is intelligence? How are they connected?

I would like to share this piece of work with everyone willing to look at it.

The following link leads to a conversation between an AI and myself. When two "Human:" turns appear one after the other, it means the AI predicted my response. In every case I respond in turn: "Yes", "This is correct", or "Very good".

https://beta.openai.com/playground/p/7DtGBGrqcBPiesRwdntP7Csg?model=text-davinci-002

3