
BrotherAmazing t1_iypzqk6 wrote

To be fair, there is research into ANNs that adapt their architecture over time, or that dynamically adjust the plasticity of certain weights during “lifelong learning”, and groups have built such networks. But these are the exceptions: almost always the architecture is fixed and the weights are just updated with standard backprop, which can lead to so-called “catastrophic forgetting” when the dataset’s PDF shifts, if you don’t do anything more advanced than the “vanilla” NN setup.
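A minimal sketch of that failure mode, using a toy one-weight model and plain gradient descent (the tasks, learning rate, and step counts here are made up purely for illustration): train on task A, then keep training on a conflicting task B, and the error on task A climbs back up because nothing protects the old weights.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)

# Two "tasks" whose data distributions conflict (a crude stand-in
# for a shift in the dataset's PDF):
y_a = 2.0 * x    # task A target
y_b = -2.0 * x   # task B target

def mse(w, x, y):
    return np.mean((w * x - y) ** 2)

def train(w, x, y, lr=0.1, steps=200):
    # "Vanilla" setup: fixed architecture, plain gradient descent,
    # no replay, regularization, or plasticity control.
    for _ in range(steps):
        grad = np.mean(2 * (w * x - y) * x)
        w -= lr * grad
    return w

w = 0.0
w = train(w, x, y_a)
err_a_after_a = mse(w, x, y_a)   # small: the model fits task A

w = train(w, x, y_b)             # keep training on the shifted data
err_a_after_b = mse(w, x, y_a)   # large: task A has been overwritten

print(err_a_after_a, err_a_after_b)
```

The weight simply slides from the task-A optimum to the task-B optimum, which is the whole problem: without something like rehearsal or selective plasticity, old knowledge has nothing anchoring it.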
