
visarga t1_jb79kst wrote

Back-propagation is self-modifying code. There is also meta-back-propagation for meta-learning: learning how to modify a neural network so it can solve novel tasks.

At a higher level, language models trained on code can be used to cultivate a population of programs with evolutionary techniques.

Evolution through Large Models

4

NothingVerySpecific t1_jb92tmn wrote

I understand some of those words

3

ahtoshkaa2 t1_jb9czan wrote

Same) Haha. Thank god for ChatGPT:

The comment is referring to two different machine learning concepts: back-propagation and meta-back-propagation, and how they can be used to modify neural networks.

Back-propagation is a supervised learning algorithm used to train artificial neural networks. It modifies the weights and biases of the neurons in the network so that the network produces the desired output for a given input. The algorithm measures the error between the predicted output and the actual output, propagates gradients of that error backward through the network via the chain rule, and then adjusts the weights and biases by gradient descent.
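To make that concrete, here is a minimal sketch of back-propagation for a single sigmoid neuron; the data, shapes, and learning rate are invented purely for illustration:

```python
import numpy as np

# Minimal sketch: repeated back-propagation updates for one sigmoid neuron.
# Data, network size, and learning rate are illustrative, not from any paper.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))              # 4 samples, 3 features
y = (x[:, :1] > 0).astype(float)         # target: sign of the first feature

w = rng.normal(size=(3, 1))              # weights
b = np.zeros((1,))                       # bias
lr = 0.5                                 # learning rate

for step in range(1000):
    z = x @ w + b                        # forward pass
    pred = 1 / (1 + np.exp(-z))          # sigmoid activation
    err = pred - y                       # error: predicted vs. actual output
    # Chain rule: gradient of the squared error w.r.t. weights and bias
    grad_z = err * pred * (1 - pred)
    grad_w = x.T @ grad_z / len(x)
    grad_b = grad_z.mean(axis=0)
    w -= lr * grad_w                     # gradient descent update
    b -= lr * grad_b
```

The loop is the "self-modifying" part: the same update rule keeps rewriting the network's own parameters until the error shrinks.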

Meta-back-propagation extends back-propagation to meta-learning, i.e. learning to learn: instead of optimizing a network for one task, it optimizes how the network adapts, so that it can learn to perform novel tasks more efficiently.
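One well-known instance of this idea is MAML-style training. Here is a rough first-order sketch, where the task distribution (1-D linear regressions), step sizes, and batch sizes are all invented for illustration:

```python
import numpy as np

# Rough MAML-flavored sketch: the outer loop adjusts an initialization so
# that one inner gradient step adapts well to a freshly sampled task.
# Tasks are toy 1-D regressions y = a*x; nothing here is from the paper.
rng = np.random.default_rng(1)
theta = 0.0                          # meta-learned initial weight
inner_lr, outer_lr = 0.1, 0.01

for meta_step in range(2000):
    a = rng.uniform(0.5, 2.0)        # sample a novel task (true slope)
    x = rng.normal(size=8)
    y = a * x
    # Inner step: adapt theta to this task with one gradient step on MSE
    grad_inner = 2 * np.mean((theta * x - y) * x)
    theta_task = theta - inner_lr * grad_inner
    # Outer step (first-order approximation): move the initialization so
    # the adapted parameters do better on fresh data from the same task
    x2 = rng.normal(size=8)
    y2 = a * x2
    grad_outer = 2 * np.mean((theta_task * x2 - y2) * x2)
    theta -= outer_lr * grad_outer
```

The gradient-through-a-gradient step is what makes it "meta": back-propagation is applied to the result of a back-propagation update.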

The comment also mentions using evolutionary techniques to cultivate a population of programs with language models trained on code. This refers to evolutionary algorithms in which a code-trained language model acts as the mutation operator: it proposes variations of candidate programs, the best performers are selected, and the cycle repeats over generations. The linked paper calls this approach Evolution through Large Models (ELM).
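A toy version of that loop looks like this; the LLM call is stubbed out, and `llm_mutate` and `fitness` are hypothetical placeholders, not a real API:

```python
import random

# Toy evolutionary loop in the spirit of "Evolution through Large Models":
# a code-trained language model would act as the mutation operator. Here
# the model is stubbed with random arithmetic edits for illustration.
def llm_mutate(program: str) -> str:
    # In ELM, a language model proposes an edited program at this point.
    return program + random.choice(["+1", "-1", "*2"])

def fitness(program: str) -> float:
    # Placeholder objective: how close the program evaluates to 42.
    try:
        return -abs(eval(program) - 42)
    except Exception:
        return float("-inf")

population = ["0"] * 8
for generation in range(50):
    children = [llm_mutate(random.choice(population)) for _ in range(16)]
    # Select the best performers to form the next generation
    population = sorted(population + children, key=fitness, reverse=True)[:8]

print(population[0], fitness(population[0]))
```

Swapping the random edit for an actual language model is the whole trick: the mutations become semantically informed instead of blind.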

7